Someone, somewhere is likely collecting data about you right now. In the decade since the introduction of the iPhone, it has become common for smartphones and their built-in sensors to gather and store extensive data about their users. Smart watches place similar sensors on our wrists, making it possible to record our location and details about our health status in real time. With the rapid growth of the so-called Internet of Things (IoT), sensors are increasingly embedded in devices in our homes and cars. These devices are not always transparent about the data they collect and how they protect it. Phone applications can surreptitiously turn on microphones or cameras, and IoT devices send streams of data out of our homes over the Internet with varying degrees of security.
Even our bodies are being invaded by technologies that collect personal data. Implantable medical devices, including advanced pacemakers that treat heart problems and insulin pumps to manage diabetes, also collect a stream of data. Many such devices transmit it to health care providers, helping to improve care and reduce costs. In the future, implanted technology will not just be for treating disease but for augmenting human senses and capabilities—to enhance memory, expand human perception and provide connectivity and communications. We are moving from a world where people often think of their phone as an extension of themselves to one where it is quite literally part of who they are.
The “someone” collecting all this data is therefore often you: knowing where and how far you walked, having a record of your communications with your friends, or monitoring the status of your home from afar is likely why you bought the devices in the first place. But even if you willingly signed on to this, the reality is that the data is also useful to others, and you may not always like the idea of companies analyzing your email to target advertising or the government having ready access to your location or health data for a criminal investigation.
Using data from personal devices to solve crimes and help protect the public is an area of both intense debate and rapid change. A major dispute has centered on whether citizens should be allowed to protect their smartphones with encryption and passwords that make it difficult, if not impossible, for police to access the data they contain. Police in Arkansas investigating a suspected murder have sought records from an IoT device, an Amazon Echo, pursuing a source of evidence that would never have existed but for the integration of sensors into home devices. In Ohio, police executed a warrant that practically reached inside a suspect's body for evidence, using information from an implanted pacemaker in an investigation of arson and insurance fraud.
Even if such data can help solve crimes, should it be made available for that purpose? Though public safety is an important goal, how much of a modern citizen's “digital footprint” should be available for criminal or other investigations? Should it matter whether citizens are aware their phone or car is monitoring them? What if they don't have the freedom to opt out and stop the data collection, which would certainly be difficult if the technology were embedded in their bodies? When does access to that data begin to look less like police searching someone's belongings and more like forcing them to testify against themselves, something the Constitution specifically protects against?
The answers to these futuristic-sounding questions could have far-reaching effects, potentially creating disincentives to build some capabilities into new technologies or pushing criminals to use tools with built-in legal protection from police access. The answers won't just affect the balance between individual rights and public safety. They will echo in the decisions people make about which technologies to use and influence inventive companies' decisions about what new devices and features to create.
When public safety and criminal justice are the concern, the questions are often taken up in the courts; precedent set in individual cases shapes the legal landscape for the nation overall. Is the court system prepared for such complex technological questions? A recent panel of legal and technical experts, convened as part of RAND research for the National Institute of Justice, raised concerns that it is not, identifying more than 30 needed actions. The recommendations include conducting fundamental research to assess which analogies to existing technologies are and are not appropriate, defining policies on collecting real-time physiological data during legal proceedings, and developing better training to prepare judges and lawyers to take on these 21st-century conundrums.
Making sure the court system is prepared to handle these questions is important. But given everything at stake, society shouldn't rely only on lawyers, prosecutors and judges to untangle the issues. For example, the decision to allow data from implanted medical devices to be used in criminal proceedings may affect whether patients are willing to even use such devices, which is a question that goes well beyond the value of data to a specific criminal case. Going one direction might sacrifice a capability that could help solve many crimes, while another might limit the use of a technology that could save many lives. Deciding which trade-offs should be made will require thinking beyond the walls of the police station, courthouse or prison.
Doing that requires the involvement of society as a whole, including legislators, technology companies that have a financial interest in getting it right, and the citizens whose rights are at stake. The controversy over government surveillance in recent years demonstrated that, on issues of technology and individual rights, citizens-as-customers can influence companies. And those companies can respond technologically, as many did by adding end-to-end encryption to their communications apps, and can drive the policy debate as well.
Citizens and civil society should therefore look for opportunities to query the government about how data from new technologies, applications and the Internet of Things is being used. And based on the answers, everyone should look for opportunities to question companies innovating in these areas. Doing so could push them to consider how these issues may affect future customers and their bottom line. It also will keep influential companies in the policy debate and help the nation's legal structures and ideas keep up with technology.
If the trade-offs were easy, it might be all right to allow them to be sorted out on a case-by-case basis, whether in the courts or by legal or policy experts. But they aren't. The answers will shape not just individuals' rights in court processes and criminal investigations, but the pace of technology and innovation in the economy as well.
Brian A. Jackson is a senior physical scientist at the nonprofit, nonpartisan RAND Corporation and a professor at the Pardee RAND Graduate School.