The Confluence of Brain and Machine May Require Multi-Sectoral Regulation to Maximise Its Gains

Commentary

Jan 30, 2023

Photo by barsrsind/Getty Images

This commentary originally appeared on Inside Sources on January 30, 2023.

It has been more than two years since the Parliamentary Assembly of the Council of Europe spoke up about the risks to individual privacy from brain-computer interfaces (BCIs) and related technology. Yet many policymakers have still not grasped the challenge of comprehensively regulating this accelerating industry, which could bring significant gains in many areas of technology and medical development. Moreover, the conversation on neuro-rights—the right to control one's own identity and retain ultimate control over decisionmaking—remains a slow burn compared to the pace of BCI technological advancement.

Since the 1970s, international scientists have been researching and developing BCIs, devices that connect the brain to a computer and decode brain activity in real-time. The technology has evolved significantly over the years, attracting attention in various fields, with numerous applications potentially revolutionizing fields such as medicine, military, and entertainment. As the Parliamentary Assembly (PDF) of the Council of Europe stated, the development and deployment of this technology “risks profound violation of individual privacy and dignity, with the potential to subvert free will and breach the ultimate refuge of human freedom—the mind.” As the commercialization of this technology expands, it would be prudent for international policymakers to consider how this technology should be regulated, in order to reap its benefits and minimize its risks.

Investment in BCI technology is expanding and the field continues to advance at a rapid pace, with around 59,000 academic publications and 10,000 patents filed in 2021 alone. The global BCI market was valued at $1.52 billion in 2021 and is expected to reach $5.3 billion by 2030. In the United States, the NIH's BRAIN Initiative launched in 2013; China followed with the China Brain Project, and other countries, including Japan, South Korea, Canada, and Australia, have established similar funding programs.

To date, the majority of BCI research and development has been in the medical sector, particularly in assistive technology. Some medical applications aim to replace or restore brain functioning lost to illness or injury, while others can aid diagnosis or rehabilitation. BCI technology has shown promise in motor rehabilitation following stroke, as well as in treating Parkinson's disease and psychiatric disorders. BCI-based interventions have also helped reduce inattentive symptoms in children with ADHD. Other promising medical applications include enabling patients to control assistive equipment (PDF) such as mechanized limbs or wheelchairs, as well as speech translators for patients who have lost the ability to speak.

Outside the medical field, BCI development has spanned across sectors such as entertainment, industrial production, transport, marketing, and defense. A limited number of companies are already marketing BCI headsets, allowing the user to play video games with their mind (PDF). Meta is developing BCI technology for hands-free augmented-reality navigation. BCI applications are also emerging for neuromarketing, including the collection of brain data to measure consumer reaction to campaigns.

In the military sphere, the U.S. Defense Advanced Research Projects Agency has funded BCI research since the 1970s with the aims of augmenting the cognitive capabilities of armed forces and mitigating the human consequences of armed conflict. Projects have included the development of functional prosthetics, memory enhancement and recovery, and telepathic communication. Similarly, the UK Ministry of Defence (PDF) is researching BCI technology to enhance cognitive abilities such as decisionmaking and sensory processing. Other potential military applications of BCIs include control of unmanned aerial vehicles or cyberdefence systems.

Despite their promise, BCIs come with both technical and social challenges. Because invasive BCIs are implanted directly in the brain, they pose a greater risk of tissue damage and infection. Noninvasive BCIs are not exempt from risk either: given the malleable nature of the brain's neural activity, there are calls to better assess and define the risks associated with them. Noninvasive BCIs, such as gaming headsets, are already on the market, but there has been limited research on the long-term effects of these devices.

The use of BCI technology raises additional questions regarding agency and liability. As the consumer market for BCI technology expands, companies may have access to increasing volumes of sensitive data on brain activity, which they may seek to exploit for profit. Additionally, companies may become targets of cyber attacks (PDF) aimed at stealing data or gaining control of BCIs. The possibility of data being used inappropriately or without adequate governance has led academics and policymakers to call for adding mental rights, or neuro-rights, to the existing human rights catalogue. These scenarios may seem far in the future, but given the pace of development and investment in BCI, the timeline may be shorter than policymakers initially anticipated.


In addition to technical, safety, and social issues, there can be significant ethical issues, especially in military contexts. As BCI technology becomes more common on battlefields, its use may contribute to a diffusion of responsibility for use-of-force decisions, making it more difficult to determine legal culpability. Attribution may be particularly challenging given that BCIs can potentially stimulate regions of the brain in ways that reduce soldiers' inhibitions about morally problematic behaviors. If a soldier commits a war crime under such circumstances, would they still be legally responsible for the act?

While some policy developments strive to address these issues, including recent executive orders that bring a new focus on biotechnology in the United States, BCIs could be lost in the mix. In the European Union, medical device legislation looks set to classify most BCI technology as medical devices despite its nonmedical applications, a stance some view as overly conservative.

The Organization for Economic Co-operation and Development (OECD) Council has developed a recommendation on responsible innovation in neurotechnology, aiming to guide governments and innovators in anticipating and addressing the ethical, legal, and social challenges raised by novel neurotechnology while still encouraging innovation in the field. This is a particularly useful tool for policymakers. With applications spanning entertainment, medicine, virtual reality, and the monitoring of human behavior and thought, it may be necessary to develop a cross-sector approach to BCI technology assessment and avoid siloed input into regulatory frameworks. The United Kingdom, in particular, faces an important regulatory choice: maintain the EU's stance of classifying all BCI technology under the Medical Devices Regulation, or regulate the technology in a more nuanced manner that could support more rapid innovation, albeit with greater risk. Relatedly, policymakers might consider whether BCI regulations should be horizontal, reaching across various sectors. A proactive approach now seems imperative.


Sana Zakaria is a research leader working in science and emerging technology at RAND Europe. Joana Beirao is a stagiaire at RAND Europe. Tim Marler is a senior research engineer at the nonprofit, nonpartisan RAND and a professor at the Pardee RAND Graduate School.
