The Christchurch Massacre Was Another Internet-Enabled Atrocity


Mar 20, 2019


Imam Ibrahim Abdul Halim of the Linwood Mosque is embraced by Father Felimoun El-Baramoussy from the Coptic Church, in Christchurch, New Zealand, March 18, 2019

Photo by Edgar Su/Reuters

This commentary originally appeared on The Hill on March 20, 2019.

We are reminded every time we stream a movie, search Google or receive an email of the immense benefits that flow from the internet.

The latest mass killing in New Zealand is a reminder that these benefits are not unalloyed. Trolling, bullying, fake news, election manipulation, hate speech, and international and domestic terrorism have all become internet-enabled abuses, incited, propagated, sometimes organized, and concealed by online activity. Calls to rein in such malign behavior have grown, so far to limited effect.

The ubiquity and the anonymity associated with the online world make fixing responsibility for abuses and designing remedies exceptionally difficult.

Who should be held accountable for abusive content online, the author or the publisher? That is, should it be the creator of the content or the hosting site? And what role should government play, if any, in regulating such activity?

The individual certainly bears responsibility for his or her work. Even in the United States, not all expression is free. Speech involving fraud, conspiracy to commit crimes and, under some limited circumstances, even simple encouragement to commit them is illegal, whether conducted online or in person.

It is forbidden, for instance, to post, transmit or even download child pornography. The originator, the transmitting site and even the recipient can each be prosecuted. But the anonymous nature of much social media authorship, particularly that of a transgressive nature, makes individual accountability difficult to identify and even harder to penalize.

There are also civil penalties under U.S. law for slander, libel and defamation. In addition to the authors, the print publishers and broadcasters can also be sued for carrying such material. Social media sites, however, cannot be sued, having been granted an explicit exemption from such liability by the Communications Decency Act of 1996.

Congress could choose to remove that exemption. The government could also become more active in enforcement against various criminal forms of online behavior, as it has already with child pornography.

Taken very far, however, such enforcement could become expensive, intrusive and quite unpopular without becoming very effective.

There are other ways to encourage social media sites to exercise greater discretion regarding their content. Freedom of the press means freedom to carry or not carry whatever content the publisher chooses. And society can evaluate for-profit enterprises by the choices it makes.

Freedom of speech does not carry with it a freedom to be published. Facebook is no more obliged to accept a posting than the New York Times is to print a submitted article. And so it is the social media companies themselves that bear the social responsibility for their content.

Increasing the legal liability of internet sites to lawsuits for defamation might inspire some greater care on their part, and the threat of broader government regulation is already doing so. But users and advertisers can also have a major impact when they act in a concerted fashion, as they sometimes do.

For the market to provide a potent form of internet discipline, the public would need to hold social media companies morally and commercially responsible for the content they disseminate.

This means moving away from the conception of social media sites as passive transmitters of individual expression, like the phone company, toward seeing them as active moderators whose algorithms sort, organize, cull and display content in a calculated fashion for which they should be held responsible.

Exercising such editorial control presents a major challenge given the immense volume of material that flows through and remains on these sites, but it is ultimately a task only the social media companies themselves have the capacity to accomplish.

Although social media companies could rise to this steep challenge of their own volition, it is more likely that individuals, through their choices as both voters and consumers, and businesses, through their choices as advertisers, will be the ones to nudge them in that direction.

James Dobbins is the former U.S. ambassador to the European Union. He holds the Distinguished Chair in Security and Diplomacy with the RAND Corporation.

More About This Commentary

Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.