Study on Media Literacy and Online Empowerment Issues Raised by Algorithm-Driven Media Services

Published in: European Commission (2019). DOI: 10.2759/27617

Posted on Oct 30, 2019

by Axelle Devaux, Alexandra Theben, Advait Deshpande, Frans Folkvord, Arya Sofia Meranto, Federica Porcu, Tor Richardson-Golinski, Nuria Febrer, Jirka Taylor, Katherine Stewart, et al.

Citizens' rights to access unbiased, verifiable information online are being challenged. Countering the spread of disinformation is proving a Herculean task for societies around the world, and responsibility for stemming it lies with many stakeholders. Social media platforms, the algorithms they use to shape the news people read, and people's own behaviour online all contribute to the rapid spread of disinformation: users' choices and reactions to content can drive the viral dissemination and amplification of harmful material.

Policy initiatives, changes to social media algorithms and media literacy programmes for online users have all been employed to address these challenges, but evidence of what works remains scarce. Research has shown that people tend to be unaware of their own cognitive biases and to underestimate the extent to which algorithms influence their behaviour on social media platforms. Improving consumers' media literacy and reducing their vulnerability to disinformation is a necessary part of the solution. However, our research also suggests that an approach based on behavioural science, one that prompts people to become aware of their own online behaviour, is worth exploring. The rational, analytical thinking that media literacy requires can fall by the wayside as people shift into a more reactive, less deliberate mode while engaging with news content online.

This study proposes three concrete behavioural experiments to test whether social media platforms could counter cognitive biases and trigger more analytical thinking by online users. This low-key approach would be applied at the point of media consumption, i.e. just before a user clicks on an article link, prompting them to pause for a moment and consider what they are about to share or read.

Research conducted by RAND

This report is part of the RAND external publication series. Many RAND studies are published in peer-reviewed scholarly journals, as chapters in commercial books, or as documents published by other organizations.

RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.