Right now, your web browsing habits are likely being recorded, analyzed, and used to try to influence you by hundreds of companies engaged in “Online Behavioral Advertising” (OBA). The history of websites you have visited is used to show you advertisements designed to entice you to click on them, which translates into revenue for advertisers. By default, browsers are generally set to allow this kind of online tracking automatically, but there may be a better way to ensure consumers decide whether or not they want to let advertisers sniff at their digital footprints.
The OBA firms seek to reassure consumers by saying there is no “personally identifiable information” in their datasets (by which they mean your name or address isn't used to identify you in their records), and that their attention to your browsing history will make the advertisements you see more useful and interesting. They also argue that restricting such advertising will prevent sites from offering the free content and services many consumers want. However, privacy rights advocates say the information is exceedingly easy to de-anonymize and presents a real threat to an individual's freedom to explore the internet without fear of having their behavior used against them by their bosses, their peers, or the government.
Currently, there is heated debate over which way browsers should be set by default: should tracking be allowed with an option to “opt out” (the current system), or should tracking be blocked with an option to “opt in”? So far, there has been no compromise between privacy advocates and OBA firms on the question, partly because very little data exists on which default is better, or more justified, than the other. A default option carries an enormous amount of weight with consumers, who often assume the default is what most people prefer. Many studies have shown that labeling something as a default greatly increases the number of people choosing it.
In order to get past the debate and work toward giving individuals real choice, I argue for a third alternative. Rather than a default with an option to either opt in or opt out, browsers should simply require consumers to choose one or the other, without their decision being influenced by the establishment of a default. This type of system is generally called “forced choice” and is used in areas like organ donation, where neutrality is critical to avoid giving the appearance that one choice is preferable to the other. Online advertisers would then know their advertising, promotions, and free content are only being sent to consumers who want them. And privacy advocates would know that consumers were affirmatively deciding whether or not they want to be tracked by advertisers. An added advantage of forced choice is that it can be implemented unilaterally by any browser at near zero cost.
User choices from such a system could easily be aggregated and used to gauge the public's opinion about such targeted advertising. If data breaches or scandals erupted within the OBA industry, it is likely consumers would begin to refuse tracking and, hence, targeted ads. This is another advantage of forced choice — it better aligns the incentives of online advertisers with consumers' concerns about privacy. Rather than resisting privacy restrictions aimed at the more sinister forms of OBA, online advertisers may come to promote them as a way of reassuring consumers that their privacy and trust are valued and that they should feel comfortable allowing OBA.
Forced choice's strength is also its weakness: users have to choose. And arguably, an uninformed choice is no better than an unexamined default. While OBA is a difficult subject, that alone isn't a reason to ignore it. With forced choice, advertisers would be free to try to promote the benefits of tracking and behavioral advertising to consumers, and privacy advocates would be free to do the opposite. In the end, though, the consumer would have the last word.
Steven Isley is a doctoral fellow at the Pardee RAND Graduate School.
This commentary originally appeared in The Orange County Register on October 8, 2013.