Now Could Be the Time to Form Policy for Emerging Brain- and Body-Enhancement Technologies


Jan 12, 2021

Digital image of heads with padlocks, photo by maxkabakov/Getty Images


This commentary originally appeared on RealClearDefense on January 12, 2021.

It has recently been reported that U.S. diplomats in China and Cuba were likely the victims of directed microwave radiation, causing physical effects such as headaches, visual problems, nausea, and cognitive difficulties. In some cases, the effects were so debilitating that the diplomats were no longer able to work. While causation has not been established, news outlets have reported on theories as to the intent of the radiation—perhaps to cause harm to the diplomats or to steal secrets from their cell phones.

Assaults such as these against the front lines of the U.S. diplomatic community could represent a stark warning about the implications of emerging technologies such as brain-computer interfaces (BCI) and other Internet of Bodies (IoB) devices. What might a nefarious actor be willing to do if wearable or implanted devices could reveal valuable government information, or if that information could be intercepted and manipulated? How should emerging technology be governed in order to yield the maximum benefit? Balancing benefits and risks is complex and thus deserves considerable attention.

There are numerous potential and beneficial uses for BCI and IoB technologies. Amputees could use BCIs to physically control prosthetic limbs. Head-worn attention-monitoring devices could be used to analyze whether a student is paying attention or whether a driver is getting drowsy. Some have predicted that BCIs will someday become a requirement for those performing jobs such as surgery or flying planes. Internet-connected glasses or wearable cameras could record whatever the user sees. For some BCI developers, the ultimate goal is to be able to read from and write to the brain, and recent research advances have shown that neural signals can be translated into text, which could allow a person to type using only his or her thoughts or allow those unable to speak to articulate their feelings or health symptoms. Yet alongside this expanse of benefits, which may lend a sense of urgency to development, there are risks that should be discussed sooner rather than later. In fact, unmanaged fear of the risks could stifle the ability to leverage the benefits. The events in China and Cuba are reminders that policy regarding emerging technology might best be addressed proactively rather than reactively.


The U.S. Department of Defense is investing in the development of BCIs and IoB devices for military use to, for example, monitor cognitive workload and physical symptoms, control aircraft, and enhance decisionmaking. Although such scenarios may be far in the future, the implications of this technology being hacked by adversaries could be disastrous. An adversary might be willing to deploy weapons that allow it not only to access and manipulate information read from or written to people's brains but also to increase pain or exploit emotions such as fear.

These devices may also present a number of ethical and privacy risks. Data privacy is already a fraught issue for smartphones and other IoB devices and software. There is a patchwork of data privacy laws that vary widely state by state, with no coordinated federal regulations on data brokers (entities that have no direct relationship with a user but can buy and sell that person's information). BCIs and implanted IoB devices may record more intimate data than ever before, which could exacerbate unresolved data privacy concerns. Ethical issues of informed consent and rules of engagement in a warfighting setting have yet to be resolved for either class of technology.

Together with international partners, policymakers might consider developing appropriate policy frameworks for such emerging technologies to ensure that innovations harnessed for societal, economic, or military benefits do not create new vulnerabilities and that governments adequately defend and manage against potential attacks. The technology is quickly moving forward. Policy may need to play catch-up.

Mary Lee is a mathematician at the nonprofit, nonpartisan RAND Corporation. Tim Marler is a senior research engineer at RAND and a professor at the Pardee RAND Graduate School. Anika Binnendijk is a political scientist at RAND.