Strengthening Privacy Protections in COVID-19 Mobile Phone–Enhanced Surveillance Programs
Research Summary | Published Jul 30, 2020
Public health officials worldwide are struggling to manage the lethal coronavirus disease 2019 (COVID-19) pandemic. As part of the response, governments, technology companies, and research organizations are leveraging emerging data-collection and data-analysis capabilities to understand the disease and model and track its spread through communities. Facilitated by a trove of technology-based data sources—in particular, the data generated from the widespread use of mobile phones—these public health surveillance programs could prove especially valuable for preventing successive waves of infections as quarantine orders are relaxed and economies reopen.
Dozens of countries, including the United States, have been using mobile phone tools and data sources for COVID-19 surveillance activities, such as tracking infections and community spread, identifying populated areas at risk, and enforcing quarantine orders. These tools can augment traditional epidemiological interventions, such as contact tracing, with technology-based data collection (e.g., automated signaling and record-keeping on mobile phone apps). As the response progresses, other beneficial technologies could include tools that authenticate those with low risk of contagion or that build community trust as stay-at-home orders are lifted.
However, the potential benefits that COVID-19 mobile phone–enhanced public health ("mobile") surveillance programs could provide are accompanied by the potential for harm. The collection of sensitive data, including personal health, location, and contact data, poses significant risks to citizens. People whose personal information is being collected might worry about who will receive the data, how those recipients might use the data, how the data might be shared with other entities, and what measures will be taken to safeguard the data from theft or abuse.
The risk of privacy violations can also undermine government accountability and public trust. The possibility that one's privacy will be violated by government officials or technology companies might dissuade citizens from getting tested for COVID-19, downloading public health–oriented mobile phone apps, or sharing symptom or location data. More broadly, real or perceived privacy violations might discourage citizens from believing government messaging or complying with government orders regarding COVID-19.
As U.S. public health agencies consider COVID-19-related mobile surveillance programs, they will need to address privacy concerns to encourage broad uptake and protect against privacy harms. Otherwise, COVID-19 mobile surveillance programs likely will be ineffective and the data collected unrepresentative of the situation on the ground.
To help public health officials understand and evaluate the privacy implications of mobile surveillance programs, RAND Corporation researchers developed a concise, standardized, and transparent privacy scorecard. Conciseness is important because privacy policies for data collection and use are often lengthy and written in complex legal jargon that prevents a typical user from reading and understanding them. The RAND team wanted a standardized approach because there are many types of mobile surveillance programs that can be used to monitor COVID-19; public health agencies will need to be able to compare not only the efficacy and usability of such programs but also the privacy protections included in different programs to make good decisions regarding intervention selection. Finally, transparency is critical to building trust with potential users of mobile surveillance program tools.
The research team analyzed documents from diverse sources—including advocacy groups, technology companies, government officials and members of Congress, and key laws (such as the European Union's General Data Protection Regulation, the Health Insurance Portability and Accountability Act, and the California Consumer Privacy Act)—to develop a set of criteria that apply to the mobile surveillance programs being used in the COVID-19 response. "Scores" are assigned based on the extent to which a program satisfies each criterion (as determined by an objective, fact-based evaluative question). The scoring options are (1) fully satisfied, (2) partly satisfied, (3) not satisfied, (4) unclear, or (5) not applicable (N/A).
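To make the structure concrete, a scorecard of this kind could be represented as a small data structure. The sketch below is hypothetical (the `Score` enum, `CRITERIA` mapping, and all names are illustrative, not the RAND team's implementation); it shows the five scoring options and a few of the 20 criteria listed later in this brief.

```python
from enum import Enum

class Score(Enum):
    """The five scoring options used in the scorecard."""
    FULLY = "fully satisfied"
    PARTLY = "partly satisfied"
    NOT_SATISFIED = "not satisfied"
    UNCLEAR = "unclear"
    NOT_APPLICABLE = "N/A"

# Each criterion pairs a (category, subcategory) with its evaluative
# question. Three of the 20 criteria are shown here; the full list
# appears in the criteria table below.
CRITERIA = {
    ("Transparency", "Open source"):
        "Is the program software code open source?",
    ("Purpose", "Narrow scope"):
        "Does the program relate exclusively to the COVID-19 public health response?",
    ("Anonymity", "Real identities obscured"):
        "Does the program anonymize the real identities of the users?",
}

# A completed scorecard is then a mapping from criterion to Score, e.g.:
example_scorecard = {criterion: Score.UNCLEAR for criterion in CRITERIA}
```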
To demonstrate how the scorecard can be used to assess and compare mobile surveillance program tools, the research team scored 40 such programs in 20 countries (including the United States). They found considerable variance across programs, even when those programs are focused on a similar activity (e.g., symptom tracking, contact tracing). For example, Australia's COVIDSafe contact tracing program fully met 16 of the 20 scorecard criteria and partially met two other criteria. By contrast, South Korea's location-based text alert program fully or partially met only six criteria and did not meet nine; the remaining five criteria were either unclear or not applicable. (Completed scorecards for all assessed programs are available in Appendix B of the full report.) The research team did not evaluate the level of penetration or the efficacy of the programs they scored.
| Does the program meet the criteria? | Australia's COVIDSafe | South Korea's Location-Based Text Alerts |
|---|---|---|
| Fully | 16 | 4 |
| Partially | 2 | 2 |
| No | 1 | 9 |
| Unclear | 1 | 2 |
| Not applicable (N/A) | 0 | 3 |
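As a simple illustration of how such tallies might be compared programmatically, the sketch below (hypothetical; the function and variable names are ours) computes the share of criteria each program at least partially met, using the counts from the table above.

```python
# Tallied counts from the table above; the per-criterion scores
# themselves are in Appendix B of the full report.
covidsafe = {"fully": 16, "partially": 2, "no": 1, "unclear": 1, "n/a": 0}
korea_alerts = {"fully": 4, "partially": 2, "no": 9, "unclear": 2, "n/a": 3}

def met_share(counts: dict[str, int]) -> float:
    """Fraction of all criteria that are fully or partially met."""
    return (counts["fully"] + counts["partially"]) / sum(counts.values())

print(f"COVIDSafe: {met_share(covidsafe):.0%}")          # 90%
print(f"Korea text alerts: {met_share(korea_alerts):.0%}")  # 30%
```

The full set of scorecard criteria and their evaluative questions is shown below.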
| Category | Subcategory | Evaluative Question |
|---|---|---|
| Transparency | Policies | Does the program provide answers to all the privacy questions that were identified? |
| | Public audit | Are the data collected by the program auditable by the public or an independent third party? |
| | Open source | Is the program software code open source? |
| | Disclosure of data collected | Are users explicitly told what type(s) of data (e.g., GPS, Bluetooth) are collected? |
| | User-specific data visibility | Can users view and correct the data that pertain to them? |
| Purpose | Narrow scope | Does the program relate exclusively to the COVID-19 public health response? |
| | Secondary use prohibition | Does the program prohibit secondary uses (e.g., data being made available for sale or provided to other entities/companies)? |
| | Law enforcement firewall | Are the data available only to public health officials and not to law enforcement? |
| | Data minimization | Does the app collect only the minimum amount of information necessary to achieve the stated purpose? (For instance, does it collect information about users who have not opted in, or specific details, such as timestamps, if only a general date is necessary?) |
| Anonymity | Real identities obscured | Does the program anonymize the real identities of the users? |
| | Reidentification prohibition | Does the program prohibit efforts to reidentify anonymous information (for instance, within the terms of use)? |
| Informed Consent | Voluntary | Can users opt out of the program without punitive consequences, such as being denied access to certain services or goods? |
| | Specificity | Do users give consent for the data to be used for the program's specific purpose? |
| | Revocable | Can consent be withdrawn (for example, by deleting the app)? |
| | Data deletion | Does the user have the right to delete data that are collected? |
| Temporal Limitations | Sunset clause | Is there a predetermined date when the program will end? |
| | Data time limits | Are there limits to how long specific data are collected, processed, and stored? |
| Data Management | Encryption | Are the data that are collected encrypted? |
| | Local storage | Will data be stored and processed entirely on the user's mobile device? |
| | Policies | Are there clear policies about data management and cybersecurity practices? |
As U.S. public health agencies continue to develop, assess, and promote mobile surveillance programs, there are several actions that federal, state, and local officials can take to strengthen privacy protections for mobile surveillance program users.
There is an opportunity for the federal government to promote a national culture of consumer data privacy to both build trust with the public and prevent the abuse of mobile data.
Google and Apple jointly developed an interoperable protocol (or API) that enables Bluetooth low-energy beaconing to notify a user of a potential exposure resulting from prolonged proximity to another user who is infected. The protocol is designed to protect anonymity by enabling public health authorities to develop applications that use phones' Bluetooth chips to send and receive randomly assigned, changing identifiers. If a user is determined to be infected, the user can choose to have the device's identifiers included on a list associated with individuals who have tested positive for COVID-19. Periodically, user devices check this identifier list and, if a listed identifier is found to have been received by the user's device, the user is notified of a potential exposure. The protocol is now available for app development and will be incorporated directly into mobile operating systems.
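To illustrate the matching logic described above, the following sketch is a deliberately simplified model. The real protocol derives rotating identifiers from a daily key using AES-based key derivation and exchanges them over Bluetooth low energy; this sketch substitutes a SHA-256 hash and in-memory sets to show only the privacy-relevant flow (rotating identifiers, consent-gated publication of keys, and on-device matching). All names here are ours, not part of the actual API.

```python
import hashlib
import os

def daily_key() -> bytes:
    """Each device generates a fresh secret key per day (a stand-in for
    the protocol's daily exposure key)."""
    return os.urandom(16)

def rolling_identifier(key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier for a ~10-minute interval.
    Without the key, observers cannot link one identifier to the next."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

# Device A broadcasts rotating identifiers all day; device B logs what it hears.
key_a = daily_key()
heard_by_b = {rolling_identifier(key_a, i) for i in range(144)}

# If A tests positive and consents, A's daily keys are published. Device B
# re-derives the corresponding identifiers locally and checks its own log;
# no location data or real identity ever leaves either phone.
published_keys = [key_a]
exposed = any(
    rolling_identifier(key, i) in heard_by_b
    for key in published_keys
    for i in range(144)
)
print("Potential exposure detected" if exposed else "No exposure")
```

The research team's completed scorecard for the Google/Apple protocol follows.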
| Category | Subcategory | Fulfillment | Description |
|---|---|---|---|
| Transparency | Policies | Partially | A significant amount of detail about the protocol has been provided, but the status of some of the criteria will depend on the particular applications that are built. |
| | Public audit | Unclear | This depends on the specifics of how apps using the protocol are developed and implemented. There is no clear approach to public audits at this stage. To get access to the API, apps have to "meet specific criteria around privacy, security, and data control";[a] however, there is no specific mention of "audit." |
| | Open source | Fully | The code is open source and available. |
| | Disclosure of data collected | Fully | Although the data collected by each application will to some degree be determined by each public health authority, the API itself only uses Bluetooth beaconing.[b] |
| | User-specific data visibility | Unclear | It is not clear whether users can access the data that are collected or transmitted. |
| Purpose | Narrow scope | Fully | The protocol is used exclusively for contact tracing for COVID-19.[c] |
| | Secondary use prohibition | Fully | Apple and Google state that the protocol will not be available for marketing.[d] |
| | Law enforcement firewall | Fully | Only official government public health authorities will have access to the API, and it cannot be used for any purpose other than COVID-19 response.[e] |
| | Data minimization | Fully | Bluetooth beaconing data are only collected for users who have been confirmed to be infected with COVID-19. User location cannot be tracked, and apps that track location will not have access to the API. |
| Anonymity | Real identities obscured | Fully | The protocol is designed to maintain anonymity by using randomly assigned, changing Bluetooth beacons that can only be resolved to a user's phone with a user-specific key. |
| | Reidentification prohibition | Fully | The design of the protocol, including decentralization and the use of rotating anonymous identifiers, makes reidentification extremely challenging. |
| Informed Consent | Voluntary | Fully | Google and Apple state the specific goals of the approach, and this is an opt-in system.[f] |
| | Specificity | Fully | Google and Apple emphasize that this will be an opt-in system for contact tracing.[f] |
| | Revocable | Fully | Google and Apple emphasize that this will be an opt-in system for contact tracing.[f] |
| | Data deletion | Fully | Users have the option to delete all tracing keys collected by the API. |
| Temporal Limitations | Sunset clause | No | Apple and Google can choose to shut down the API unilaterally on a regional basis. Once it is incorporated into operating system updates, the API may stay on phones indefinitely. |
| | Data time limits | No | Time limits for data retention will depend on how applications using the protocol are developed and implemented. |
| Data Management | Encryption | Fully | Bluetooth beaconing data are encrypted on a user's phone. |
| | Local storage | Partially | The list of Bluetooth beacons stays on the phone; however, a central server is used to collect infected persons' keys and broadcast them to other users. |
| | Policies | Partially | Only some cybersecurity details have been released. Specifics will depend on how the app is developed and implemented. |
[a] Apple and Google, Exposure Notification: Frequently Asked Questions, version 1.0, April 2020.
[b] Apple and Google, Contact Tracing: Bluetooth Specification, version 1.1, April 2020.
[c] Apple and Google, "Privacy-Safe Contact Tracing Using Bluetooth Low Energy," webpage, undated.
[d] Casey Newton, "Apple and Google Have a Clever Way of Encouraging People to Install Contact-Tracing Apps for COVID-19," The Verge, April 14, 2020.
[e] Darrell Etherington, "Apple and Google Release Sample Code, UI and Detailed Policies for COVID-19 Exposure-Notification Apps," TechCrunch, May 4, 2020.
[f] Zack Whittaker and Darrell Etherington, "Q&A: Apple and Google Discuss Their Coronavirus Tracing Efforts," TechCrunch, April 13, 2020.
With the scorecard, end users of mobile surveillance program tools can see which privacy protections specific programs offer and explanations for how the program did or did not meet the identified criteria. In cases where there are privacy trade-offs—for example, if the collection of real identities is necessary or data need to be managed centrally rather than on users' devices—public health officials can explain the reasons for not meeting specific criteria. Transparency about such trade-offs, including a justification based on public health needs, can help build user trust.
This publication is part of the RAND research brief series. Research briefs present policy-oriented summaries of individual published, peer-reviewed documents or of a body of published work.
RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.