The Care19 mobile app, developed by the State of North Dakota to assist in contact tracing during the global COVID-19 outbreak, April 24, 2020. Photo by Paresh Dave/Reuters

commentary

(Inside Sources)

Should Communities Be Concerned About Digital Technologies to Fight COVID-19?


Apple and Google recently caused a stir among cybersecurity experts and privacy advocates with their announcement that they are jointly working on an application to enable contact tracing through their respective devices' Bluetooth systems.

But these digital applications could pose privacy and security concerns of which communities should be aware.

Personal, internet-connected electronic devices, such as smartphones and activity trackers, are offering an unprecedented opportunity to identify, track, map, and communicate about COVID-19 within communities.

For example, increasing use of telemedicine—delivery of health provider services via telecommunications technologies like phones, computers, and tablets—is one important advance that has been used to help those with mild COVID-19 symptoms.

This Apple and Google app would monitor phone location data, instruct users to report if they had tested positive for COVID-19, and alert users if their phone was in close physical proximity to the phone of an infected individual.

It makes sense to leverage increasingly digitally connected communities to gather data and disseminate information. The technology is already out there, collecting and capable of reporting location, signs of disease, and more. However, there are some important trade-offs to consider.

First, an immediate concern is the limited ability to oversee how these types of applications are used and how any results they provide are interpreted. Effective contact tracing depends on several assumptions.

It assumes a high rate of accurate testing, accompanied by clear guidance about what to do if a person has come into contact with an infected person.

Contact-tracing technology is also limited by its inability to discern whether people were physically separated by floors or walls, or whether they were wearing face masks. Furthermore, if the number of users of an app is too small, or the users are not representative of the population (in demographics or in the prevalence of COVID-19 infection), then the resulting data may have limited utility and could even cause harm.

For example, individuals who don't receive any alerts about proximity to an infected person may assume they are “safe” and relax their social-distancing behavior.


Second, privacy protection is somewhat limited and variable across apps. Although many users may be willing to suspend privacy protections in the face of a crisis, there are open questions about whether and how data and digital capabilities could continue to be leveraged in the future.

In particular, contact-tracing apps may depend on users to report that they have tested positive, but individuals with COVID-19 may be especially reluctant to report this information to an app, even if the app promises anonymity, for fear of being identified and harmed.

Consumers are right to have concerns about what technology companies and others may have to gain in the long run from enabling increasingly widespread and invasive health-related data collection.

For example, in the United States, the privacy and security provisions of the Health Insurance Portability and Accountability Act and the Health Information Technology for Economic and Clinical Health Act apply only to information shared by covered health care providers.

Thus, health information that individuals share voluntarily through many apps is not covered; the exception is apps that directly link a person to their medical care team.

Several apps that have arisen since the coronavirus crisis began do acknowledge privacy issues. Approaches generally involve either telling users that their personal information may be reported or guaranteeing some form of data anonymization that may be suspended if a person is believed to be infected with COVID-19.

An additional concern is cybersecurity. A significant increase in cyberattacks on hospitals, already overburdened with caring for COVID-19 patients, has been reported.


The same digital applications that can help treat and track COVID-19 could also introduce vulnerabilities of which bad actors could take advantage—now or in the future—to conduct a range of attacks and surveillance.

For example, a malicious person or group might employ new applications to spread false information and create panic that overloads health or other services, demand a ransom payment after denying services, surveil the movements and contacts of people of interest, or gain access to other valuable cyber targets, such as bank or retirement accounts.

There is tremendous need for information about symptoms, movements, and contacts at the personal level to track COVID-19 infections, identify and isolate close contacts at risk, anticipate coming needs, and support health policy decisions, such as those covering movement restrictions (including those within borders) or distribution of medical supplies.

Taking reasonable precautions that support privacy and security could help ensure that digital tools and information will help keep our communities safe and well during this, and future, emergencies.


Abbie Tingstad is a senior physical scientist and associate director of the Engineering and Applied Sciences Department, Shira Fischer is a physician policy researcher, Erika Bloom is a behavioral/social scientist, and Mary Lee is a mathematician and inaugural fellow for the Center for Global Risk and Security at the nonprofit, nonpartisan RAND Corporation.

This commentary originally appeared on Inside Sources on May 13, 2020. Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.