The practice of disinformation, the deliberate spreading of false information to deceive, was rife long before the outbreak of COVID-19. As a researcher who has recently examined this phenomenon in depth, I should be immune to disinformation about the virus. And yet it 'infected' me, at least for a few seconds.
The headline of an article, Bonne nouvelle: le Champagne immunise contre le Coronavirus (Good news: Champagne immunises against Coronavirus), caught my attention, and I could barely resist the urge to share it with friends. Then I wondered why. All the warning signs of disinformation were there: a picture of a magazine or journal article that could not be verified (neither the author nor the publication was identifiable), and the message was forwarded to me on WhatsApp, without the context that social platforms such as Facebook can provide. But it came from a doctor, who had received it from another doctor. And it mentioned 'authority organisations.'
Did I really believe that it was true? To be honest I did not even ask that question. I found it amusing. And it was exactly what I wanted to hear right after more bad news about the virus. I knew it would distract my friends for a couple of seconds from their concern and worry.
In uncertain times, emotions can take a greater role than they should in our decision-making process. While emotions are a necessary (and desirable) component of how we make decisions, it is important to be aware of these emotions and understand how they affect our choices. For example, uncertainty makes us more alert to information that could help us make sense of what is going on in the world.
However, feelings of fear and anxiety can also inhibit our natural ability to detect what is true and what is false, and shift our attention away from the possible effects of our actions, such as spreading disinformation. People also naturally lean towards news and information that confirms their existing worldview. This 'confirmation bias' is one of the many cognitive biases that affect our decision-making.
While one could assume that a post claiming drinking Champagne protects against the coronavirus is harmless, with limited effect on health, disinformation can have disastrous consequences. In one extreme example reported by BBC News, 16 Iranians died of poisoning after false rumours spread that drinking alcohol could prevent COVID-19 infection. Disinformation can also give a false sense of protection and lead people to neglect what really protects them, such as social distancing.
In the COVID-19 context, our vulnerability to disinformation is a risk to our health. More generally, it is a threat to the proper functioning of democracy and an opportunity for those who stand to gain politically or financially from disinformation. Information technologies (e.g. the use of algorithms to personalise and amplify content) and social media both make it easier for false news to reach more people, faster and at low cost. Orchestrated online disinformation campaigns were observed around elections, for instance in the 2016 UK vote to exit the European Union, the 2016 U.S. presidential election (PDF) and the 2017 French presidential election.
The strongest weapon against disinformation is our common sense; we need to learn when and how to use it. To support this, all European education systems include some form of critical-thinking learning objectives as part of the citizenship education curriculum in schools. Media literacy is one aspect of critical thinking, and there is evidence that media literacy initiatives in schools tend to lessen children's vulnerability to disinformation. However, RAND Europe's recent research on media literacy shows that there is less evidence about whether interventions outside the school setting work; serious efforts to evaluate such interventions have only recently begun. It is therefore too early to offer a comprehensive overview of which measures work in fighting disinformation that targets voters during elections, or vulnerable populations in times of crisis, such as now.
Like COVID-19, disinformation spreads only if we help it. While we have all been asked to stay at home as responsible citizens to contain the virus, we should also feel responsible for making it harder for disinformation to spread. Experts say that something as simple as washing your hands can make a difference in combating the virus, and it takes only 40-60 seconds. This should become automatic, something we do without even thinking. Thinking twice before sharing a post should become equally automatic, and it takes no longer than washing your hands. While confined at home, why not try pausing for 40-60 seconds before you share a post?
A research leader at RAND Europe, Axelle Devaux is currently researching Truth Decay in Europe, as part of the RAND Truth Decay research agenda.
This commentary originally appeared on Encompass on April 9, 2020. Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.