The Russian "Firehose of Falsehood" Propaganda Model

Why It Might Work and Options to Counter It

by Christopher Paul, Miriam Matthews


Russian President Vladimir Putin speaks with journalists after a live broadcast nationwide call-in, Moscow, April 14, 2016.
Photo by Maxim Shemetov/Reuters

Since its 2008 incursion into Georgia (if not before), there has been a remarkable evolution in Russia's approach to propaganda. This new approach was on full display during the country's 2014 annexation of the Crimean peninsula. It continues to be demonstrated in support of ongoing conflicts in Ukraine and Syria and in pursuit of nefarious and long-term goals in Russia's “near abroad” and against NATO allies.

In some ways, the current Russian approach to propaganda builds on Soviet Cold War–era techniques, with an emphasis on obfuscation and on getting targets to act in the interests of the propagandist without realizing that they have done so.1 In other ways, it is completely new and driven by the characteristics of the contemporary information environment. Russia has taken advantage of technology and available media in ways that would have been inconceivable during the Cold War. Its tools and channels now include the Internet, social media, and the evolving landscape of professional and amateur journalism and media outlets.

We characterize the contemporary Russian model for propaganda as “the firehose of falsehood” because of two of its distinctive features: high numbers of channels and messages and a shameless willingness to disseminate partial truths or outright fictions. In the words of one observer, “[N]ew Russian propaganda entertains, confuses and overwhelms the audience.”2

Contemporary Russian propaganda has at least two other distinctive features. It is also rapid, continuous, and repetitive, and it lacks commitment to consistency.

Interestingly, several of these features run directly counter to the conventional wisdom on effective influence and communication from government or defense sources, which traditionally emphasize the importance of truth, credibility, and the avoidance of contradiction.3 Despite ignoring these traditional principles, Russia seems to have enjoyed some success under its contemporary propaganda model, either through more direct persuasion and influence or by engaging in obfuscation, confusion, and the disruption or diminution of truthful reporting and messaging.

We offer several possible explanations for the effectiveness of Russia's firehose of falsehood. Our observations draw from a concise, but not exhaustive, review of the literature on influence and persuasion, as well as experimental research from the field of psychology. We explore the four identified features of the Russian propaganda model and show how and under what circumstances each might contribute to effectiveness. Many successful aspects of Russian propaganda have surprising foundations in the psychology literature, so we conclude with a brief discussion of possible approaches from the same field for responding to or competing with such an approach.

Russian Propaganda Is High-Volume and Multichannel

Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. This propaganda includes text, video, audio, and still imagery propagated via the Internet, social media, satellite television, and traditional radio and television broadcasting. The producers and disseminators include a substantial force of paid Internet “trolls” who also often attack or undermine views or information that runs counter to Russian themes, doing so through online chat rooms, discussion forums, and comments sections on news and other websites.4 Radio Free Europe/Radio Liberty reports that “there are thousands of fake accounts on Twitter, Facebook, LiveJournal, and vKontakte” maintained by Russian propagandists. According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.5


RT (formerly Russia Today) is one of Russia's primary multimedia news providers. With a budget of more than $300 million per year, it broadcasts in English, French, German, Spanish, Russian, and several Eastern European languages. The channel is particularly popular online, where it claims more than a billion page views. If true, that would make it the most-watched news source on the Internet.6 In addition to acknowledged Russian sources like RT, there are dozens of proxy news sites presenting Russian propaganda, but with their affiliation with Russia disguised or downplayed.7

Experimental research shows that, to achieve success in disseminating propaganda, the variety of sources matters:

  • Multiple sources are more persuasive than a single source, especially if those sources contain different arguments that point to the same conclusion.
  • Receiving the same or similar message from multiple sources is more persuasive.
  • People assume that information from multiple sources is likely to be based on different perspectives and is thus worth greater consideration.8

The number and volume of sources also matter:

  • Endorsement by a large number of users boosts consumer trust, reliance, and confidence in the information, often with little attention paid to the credibility of those making the endorsements.
  • When consumer interest is low, the persuasiveness of a message can depend more on the number of arguments supporting it than on the quality of those arguments.9

Finally, the views of others matter, especially if the message comes from a source that shares characteristics with the recipient:

  • Communications from groups to which the recipient belongs are more likely to be perceived as credible. The same applies when the source is perceived as similar to the recipient. If a propaganda channel is (or purports to be) from a group the recipient identifies with, it is more likely to be persuasive.
  • Credibility can be social; that is, people are more likely to perceive a source as credible if others perceive the source as credible. This effect is even stronger when there is not enough information available to assess the trustworthiness of the source.
  • When information volume is low, recipients tend to favor experts, but when information volume is high, recipients tend to favor information from other users.
  • In online forums, comments attacking a proponent's expertise or trustworthiness diminish credibility and decrease the likelihood that readers will take action based on what they have read.10

The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own. High volume can deliver other benefits that are relevant in the Russian propaganda context. First, high volume can consume the attention and other available bandwidth of potential audiences, drowning out competing messages. Second, high volume can overwhelm competing messages in a flood of disagreement. Third, multiple channels increase the chances that target audiences are exposed to the message. Fourth, receiving a message via multiple modes and from multiple sources increases the message's perceived credibility, especially if a disseminating source is one with which an audience member identifies.

Russian Propaganda Is Rapid, Continuous, and Repetitive

Contemporary Russian propaganda is continuous and very responsive to events. Due to their lack of commitment to objective reality (discussed later), Russian propagandists do not need to wait to check facts or verify claims; they just disseminate an interpretation of emergent events that appears to best favor their themes and objectives. This allows them to be remarkably responsive and nimble, often broadcasting the first “news” of events (and, with similar frequency, the first news of nonevents, or things that have not actually happened). They will also repeat and recycle disinformation. The January 14, 2016, edition of Weekly Disinformation Review reported the reemergence of several previously debunked Russian propaganda stories, including that Polish President Andrzej Duda was insisting that Ukraine return former Polish territory, that Islamic State fighters were joining pro-Ukrainian forces, and that there was a Western-backed coup in Kiev, Ukraine’s capital.11

Sometimes, Russian propaganda is picked up and rebroadcast by legitimate news outlets; more frequently, social media repeats the themes, messages, or falsehoods introduced by one of Russia’s many dissemination channels. For example, German news sources rebroadcast Russian disinformation about atrocities in Ukraine in early 2014, and Russian disinformation about EU plans to deny visas to young Ukrainian men was repeated with such frequency in Ukrainian media that the Ukrainian general staff felt compelled to post a rebuttal.12

The experimental psychology literature tells us that first impressions are very resilient: An individual is more likely to accept the first information received on a topic and then favor this information when faced with conflicting messages.13 Furthermore, repetition leads to familiarity, and familiarity leads to acceptance:

  • Repeated exposure to a statement has been shown to increase its acceptance as true.
  • The “illusory truth effect” is well documented, whereby people rate statements as more truthful, valid, and believable when they have encountered those statements previously than when they are new statements.
  • When people are less interested in a topic, they are more likely to accept familiarity brought about by repetition as an indicator that the information (repeated to the point of familiarity) is correct.
  • When processing information, consumers may save time and energy by using a frequency heuristic, that is, favoring information they have heard more frequently.
  • Even with preposterous stories and urban legends, those who have heard them multiple times are more likely to believe that they are true.
  • If an individual is already familiar with an argument or claim (has seen it before, for example), they process it less carefully, often failing to discriminate weak arguments from strong arguments.14

Russian propaganda has the agility to be first, which affords propagandists the opportunity to create the first impression. Then, the combination of high-volume, multichannel, and continuous messaging makes Russian themes more likely to be familiar to their audiences, which gives them a boost in terms of perceived credibility, expertise, and trustworthiness.

Russian Propaganda Makes No Commitment to Objective Reality

It may come as little surprise that the psychology literature supports the persuasive potential of high-volume, diverse channels and sources, along with rapidity and repetition. These aspects of Russian propaganda make intuitive sense. One would expect any influence effort to enjoy greater success if it is backed by a willingness to invest in additional volume and channels and if its architects find ways to increase the frequency and responsiveness of messages. This next characteristic, however, flies in the face of intuition and conventional wisdom, which can be paraphrased as “The truth always wins.”

Contemporary Russian propaganda makes little or no commitment to the truth. This is not to say that all of it is false. Quite the contrary: It often contains a significant fraction of the truth. Sometimes, however, events reported in Russian propaganda are wholly manufactured, like the 2014 social media campaign to create panic about an explosion and chemical plume in St. Mary's Parish, Louisiana, that never happened.15 Russian propaganda has relied on manufactured evidence—often photographic. Some of these images are easily exposed as fake due to poor photo editing, such as discrepancies of scale, or the availability of the original (pre-altered) image.16 Russian propagandists have been caught hiring actors to portray victims of manufactured atrocities or crimes for news reports (as was the case when Viktoria Schmidt pretended to have been attacked by Syrian refugees in Germany for Russia's Zvezda TV network), or faking on-scene news reporting (as shown in a leaked video in which “reporter” Maria Katasonova is revealed to be in a darkened room with explosion sounds playing in the background rather than on a battlefield in Donetsk when a light is switched on during the recording).17


In addition to manufacturing information, Russian propagandists often manufacture sources. Russian news channels, such as RT and Sputnik News, are more like a blend of infotainment and disinformation than fact-checked journalism, though their formats intentionally take the appearance of proper news programs. Russian news channels and other forms of media also misquote credible sources or cite a more credible source as the origin of a selected falsehood. For example, RT stated that blogger Brown Moses (a staunch critic of Syria's Assad regime whose real name is Eliot Higgins) had provided analysis of footage suggesting that chemical weapon attacks on August 21, 2013, had been perpetrated by Syrian rebels. In fact, Higgins's analysis concluded that the Syrian government was responsible for the attacks and that the footage had been faked to shift the blame.18 Similarly, several scholars and journalists, including Edward Lucas, Luke Harding, and Don Jensen, have reported that books that they did not write—and containing views clearly contrary to their own—had been published in Russian under their names. “The Kremlin's spin machine wants to portray Russia as a besieged fortress surrounded by malevolent outsiders,” said Lucas of his misattributed volume, How the West Lost to Putin.19

Why might this disinformation be effective? First, people are often cognitively lazy. Due to information overload (especially on the Internet), they use a number of different heuristics and shortcuts to determine whether new information is trustworthy.20 Second, people are often poor at discriminating true information from false information—or remembering that they have done so previously. The following are a few examples from the literature:

  • In a phenomenon known as the “sleeper effect,” low-credibility sources manifest greater persuasive impact with the passage of time. While people make initial assessments of the credibility of a source, in remembering, information is often dissociated from its source. Thus, information from a questionable source may be remembered as true, with the source forgotten.
  • Information that is initially assumed valid but is later retracted or proven false can continue to shape people's memory and influence their reasoning.
  • Even when people are aware that some sources (such as political campaign rhetoric) have the potential to contain misinformation, they still show a poor ability to discriminate between information that is false and information that is correct.21

Familiar themes or messages can be appealing even if these themes and messages are false. Information that connects with group identities or familiar narratives—or that arouses emotion—can be particularly persuasive. The literature describes the effects of this approach:

  • Someone is more likely to accept information when it is consistent with other messages that the person believes to be true.
  • People suffer from “confirmation bias”: They view news and opinions that confirm existing beliefs as more credible than other news and opinions, regardless of the quality of the arguments.
  • Someone who is already misinformed (that is, believes something that is not true) is less likely to accept evidence that goes against those misinformed beliefs.
  • People whose peer group is affected by an event are much more likely to accept conspiracy theories about that event.
  • Stories or accounts that create emotional arousal in the recipient (e.g., disgust, fear, happiness) are much more likely to be passed on, whether they are true or not.
  • Angry messages are more persuasive to angry audiences.22

False statements are more likely to be accepted if backed by evidence, even if that evidence is false:

  • The presence of evidence can override the effects of source credibility on perceived veracity of statements.
  • In courtroom simulations, witnesses who provide more details—even trivial details—are judged to be more credible.23

Finally, source credibility is often assessed based on “peripheral cues,” which may or may not conform to the reality of the situation.24 A broadcast that looks like a news broadcast, even if it is actually a propaganda broadcast, may be accorded the same degree of credibility as an actual news broadcast.25 Findings from the field of psychology show how peripheral cues can increase the credibility of propaganda:

  • Peripheral cues, such as the appearance of expertise or the format of information, lead people to accept—with little reflection—that the information comes from a credible source.
  • Expertise and trustworthiness are the two primary dimensions of credibility, and these qualities may be evaluated based on visual cues, such as format, appearance, or simple claims of expertise.
  • Online news sites are perceived as more credible than other online formats, regardless of the veracity of the content.26

The Russian firehose of falsehood takes advantage of all five of these factors. A certain proportion of falsehood in Russian propaganda may just be accepted by audiences because they do not recognize it as false or because various cues lead them to assign it greater credibility than they should. This proportion actually increases over time, with people forgetting that they have rejected certain offered “facts.” The proportion of falsehoods accepted increases even more when the disinformation is consistent with narratives or preconceptions held by various audiences. Where evidence is presented or seemingly credible sources disseminate the falsehoods, the messages are even more likely to be accepted. This is why Russian faux-news propaganda channels, such as RT and Sputnik, are so insidious. Visually, they look like news programs, and the persons appearing on them are represented as journalists and experts, making audience members much more likely to ascribe credibility to the misinformation these sources are disseminating.

The logo of state-controlled broadcaster Russia Today (RT) is seen in front of the State Historical Museum at Red Square in central Moscow, March 18, 2018.

Photo by Gleb Garanich/Reuters

Russian Propaganda Is Not Committed to Consistency

The final distinctive characteristic of Russian propaganda is that it is not committed to consistency. First, different propaganda media do not necessarily broadcast the exact same themes or messages. Second, different channels do not necessarily broadcast the same account of contested events. Third, different channels or representatives show no fear of “changing their tune.” If one falsehood or misrepresentation is exposed or is not well received, the propagandists will discard it and move on to a new (though not necessarily more plausible) explanation. One example of such behavior is the string of accounts offered for the downing of Malaysia Airlines Flight 17. Russian sources have offered numerous theories about how the aircraft came to be shot down and by whom, very few of which are plausible.27 Lack of commitment to consistency is also apparent in statements from Russian President Vladimir Putin. For example, he first denied that the “little green men” in Crimea were Russian soldiers but later admitted that they were. Similarly, he at first denied any desire to see Crimea join Russia, but then he admitted that that had been his plan all along.28

Again, this flies in the face of the conventional wisdom on influence and persuasion. If sources are not consistent, how can they be credible? If they are not credible, how can they be influential? Research suggests that inconsistency can have deleterious effects on persuasion—for example, when recipients make an effort to scrutinize inconsistent messages from the same source.29 However, the literature in experimental psychology also shows that audiences can overlook contradictions under certain circumstances:

  • Contradictions can prompt a desire to understand why a shift in opinion or messages occurred. When a seemingly strong argument for a shift is provided or assumed (e.g., more thought is given or more information is obtained), the new message can have a greater persuasive impact.
  • When a source appears to have considered different perspectives, consumer attitudinal confidence is greater. A source who changes his or her opinion or message may be perceived as having given greater consideration to the topic, thereby influencing recipient confidence in the newest message.30

Losses in credibility due to inconsistency are potentially offset by synergies with other characteristics of contemporary propaganda. As noted earlier in the discussion of multiple channels, the presentation of multiple arguments by multiple sources is more persuasive than either the presentation of multiple arguments by one source or the presentation of one argument by multiple sources.31 These losses can also be offset by peripheral cues that reinforce perceptions of credibility, trustworthiness, or legitimacy.32 Even if a channel or individual propagandist changes accounts of events from one day to the next, viewers are likely to evaluate the credibility of the new account without giving too much weight to the prior, “mistaken” account, provided that there are peripheral cues suggesting the source is credible.

While the psychology literature suggests that the Russian propaganda enterprise suffers little when channels are inconsistent with each other, or when a single channel is internally inconsistent, it is unclear how inconsistency accumulates for a single prominent figure. While inconsistent accounts by different propagandists on RT, for example, might be excused as the views of different journalists or as changes due to updated information, the fabrications of Vladimir Putin have been unambiguously attributed to him, which cannot be good for his personal credibility. Of course, perhaps many people have a low baseline expectation of the veracity of statements by politicians and world leaders.33 To the extent that this is the case, Putin's fabrications, though more egregious than the routine, might be perceived as just more of what is expected from politicians in general and might not constrain his future influence potential.

What Can Be Done to Counter the Firehose of Falsehood?

Experimental research in psychology suggests that the features of the contemporary Russian propaganda model have the potential to be highly effective. Even those features that run counter to conventional wisdom on effective influence (e.g., the importance of veracity and consistency) receive some support in the literature.

If the Russian approach to propaganda is effective, then what can be done about it? We conclude with a few thoughts about how NATO, the United States, or other opponents of the firehose of falsehood might better compete. The first step is to recognize that this is a nontrivial challenge. Indeed, the very factors that make the firehose of falsehood effective also make it quite difficult to counter: For example, the high volume and multitude of channels for Russian propaganda offer proportionately limited yield if one channel is taken off the air (or offline) or if a single misleading voice is discredited. The persuasive benefits that Russian propagandists gain from presenting the first version of events (which then must be dislodged by true accounts at much greater effort) could be removed if the true accounts were instead presented first. But while credible and professional journalists are still checking their facts, the Russian firehose of falsehood is already flowing: It takes less time to make up facts than it does to verify them.

We are not optimistic about the effectiveness of traditional counterpropaganda efforts. Certainly, some effort must be made to point out falsehoods and inconsistencies, but the same psychological evidence that shows how falsehood and inconsistency gain traction also tells us that retractions and refutations are seldom effective. Especially after a significant amount of time has passed, people will have trouble recalling which information they have received is the disinformation and which is the truth. Put simply, our first suggestion is don't expect to counter the firehose of falsehood with the squirt gun of truth.

To the extent that efforts to directly counter or refute Russian propaganda are necessary, there are some best practices available—also drawn from the field of psychology—that can and should be employed. Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false “facts” are removed.34

Forewarning is perhaps more effective than retractions or refutation of propaganda that has already been received. The research suggests two possible avenues:

  • Propagandists gain advantage by offering the first impression, which is hard to overcome. If, however, potential audiences have already been primed with correct information, the disinformation finds itself in the same role as a retraction or refutation: disadvantaged relative to what is already known.35
  • When people resist persuasion or influence, that act reinforces their preexisting beliefs.36 It may be more productive to highlight the ways in which Russian propagandists attempt to manipulate audiences, rather than fighting the specific manipulations.

In practice, getting in front of misinformation and raising awareness of misinformation might involve more robust and more widely publicized efforts to “out” Russian propaganda sources and the nature of their efforts. Alternatively, it could take the form of sanctions, fines, or other barriers against the practice of propaganda under the guise of journalism. The UK communications regulator, Ofcom, has sanctioned RT for biased or misleading programs, but more is needed.37 Our second suggestion is to find ways to help put raincoats on those at whom the firehose of falsehood is being directed.


Another possibility is to focus on countering the effects of Russian propaganda, rather than the propaganda itself. The propagandists are working to accomplish something. The goal may be a change in attitudes, behaviors, or both. Identify those desired effects and then work to counter the effects that run contrary to your goals. For example, suppose the goal of a set of Russian propaganda products is to undermine the willingness of citizens in NATO countries to respond to Russian aggression. Rather than trying to block, refute, or undermine the propaganda, focus instead on countering its objective. This could be accomplished through efforts to, for example, boost support for a response to Russian aggression, promote solidarity and identity with threatened NATO partners, or reaffirm international commitments.

Thinking about the problem in this way leads to several positive developments. It encourages prioritization: Do not worry so much about countering propaganda that contributes to effects that are not of concern. This view also opens up the aperture. Rather than just trying to counter disinformation with other information, it might be possible to thwart desired effects with other capabilities—or to simply apply information efforts to redirecting behaviors or attitudes without ever directly engaging with the propaganda. That leads to our third suggestion: Don't direct your flow of information directly back at the firehose of falsehood; instead, point your stream at whatever the firehose is aimed at, and try to push that audience in more productive directions.

That metaphor and mindset leads us to our fourth suggestion for responding to Russian propaganda: Compete! If Russian propaganda aims to achieve certain effects, it can be countered by preventing or diminishing those effects. Yet, the tools of the Russian propagandists may not be available due to resource constraints or policy, legal, or ethical barriers. Although it may be difficult or impossible to directly refute Russian propaganda, both NATO and the United States have a range of capabilities to inform, influence, and persuade selected target audiences. Increase the flow of persuasive information and start to compete, seeking to generate effects that support U.S. and NATO objectives.

Our fifth and final suggestion for addressing the challenge of Russian propaganda is to use various technical means to turn off (or turn down) the flow. If the firehose of falsehood is being employed as part of active hostilities, or if counterpropaganda efforts escalate to include the use of a wider range of information warfare capabilities, then jamming, corrupting, degrading, destroying, usurping, or otherwise interfering with the ability of the propagandists to broadcast and disseminate their messages could diminish the impact of their efforts. Anything from aggressive enforcement of terms of service agreements with Internet providers and social media services to electronic warfare or cyberspace operations could lower the volume—and the impact—of Russian propaganda.

Notes

  • [1] Olga Oliker, “Russia's New Military Doctrine: Same as the Old Doctrine, Mostly,” Washington Post, January 15, 2015.
  • [2] Giorgio Bertolin, “Conceptualizing Russian Information Operations: Info-War and Infiltration in the Context of Hybrid Warfare,” IO Sphere, Summer 2015, p. 10.
  • [3] See, for example, U.S. Department of Defense, Defense Science Board, Report of the Defense Science Board Task Force on Strategic Communication, Washington, D.C., January 2008; Christopher Paul, Strategic Communication: Origins, Concepts, and Current Debates, Santa Barbara, Calif.: Praeger Security International, 2011; Arturo Muñoz, U.S. Military Information Operations in Afghanistan: Effectiveness of Psychological Operations 2001–2010, Santa Monica, Calif.: RAND Corporation, MG-1060, 2012.
  • [4] See Adrian Chen, “The Agency,” New York Times Magazine, June 2, 2015, and Peter Pomerantsev and Michael Weiss, The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money, New York: Institute of Modern Russia and The Interpreter, 2014.
  • [5] Dmitry Volchek and Daisy Sindelar, “One Professional Russian Troll Tells All,” Radio Free Europe/Radio Liberty, March 25, 2015.
  • [6] Pomerantsev and Weiss, 2014.
  • [7] Joel Harding, “Russian News and Russian Proxy News Sites,” To Inform Is to Influence, November 15, 2015.
  • [8] The first two points on sources are from Stephen G. Harkins and Richard E. Petty, “The Multiple Source Effect in Persuasion: The Effects of Distraction,” Personality and Social Psychology Bulletin, Vol. 7, No. 4, December 1981; the third is from Harkins and Petty, “Information Utility and the Multiple Source Effect,” Journal of Personality and Social Psychology, Vol. 52, No. 2, 1987.
  • [9] The first point on the number and volume of sources is from Andrew J. Flanagin and Miriam J. Metzger, “Trusting Expert- Versus User-Generated Ratings Online: The Role of Information Volume, Valence, and Consumer Characteristics,” Computers in Human Behavior, Vol. 29, No. 4, July 2013; the second is from Joseph W. Alba and Howard Marmorstein, “The Effects of Frequency Knowledge on Consumer Decision Making,” Journal of Consumer Research, Vol. 14, No. 1, June 1987.
  • [10] The points on the views of others are, respectively, from Chanthika Pornpitakpan, “The Persuasiveness of Source Credibility: A Critical Review of Five Decades' Evidence,” Journal of Applied Social Psychology, Vol. 34, No. 2, February 2004; Michael G. Hughes, Jennifer A. Griffith, Thomas A. Zeni, Matthew L. Arsenault, Olivia D. Copper, Genevieve Johnson, Jay H. Hardy, Shane Connelly, and Michael D. Mumford, “Discrediting in a Message Board Forum: The Effects of Social Support and Attacks on Expertise and Trustworthiness,” Journal of Computer-Mediated Communication, Vol. 19, No. 3, April 2014; Flanagin and Metzger, 2013; and Hughes et al., 2014.
  • [11] Disinformation, “Weekly Disinformation Review,” Disinfo, January 14, 2016.
  • [12] Examples of the propagation of Russian disinformation from, respectively, Milan Lelich, “Victims of Russian Propaganda,” New Eastern Europe, July 25, 2014, and Paul A. Goble, “Top 10 Fakes of Russian Propaganda About Ukraine in 2015,” Euromaidan Press, December 26, 2015.
  • [13] Richard E. Petty, John T. Cacioppo, Alan J. Strathman, and Joseph R. Priester, “To Think or Not To Think: Exploring Two Routes to Persuasion,” in Timothy C. Brock and Melanie C. Green, eds., Persuasion: Psychological Insights and Perspectives, 2nd ed., Thousand Oaks, Calif.: Sage Publications, 2005.
  • [14] Points on repetition and familiarity from, respectively, Stephan Lewandowsky, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook, “Misinformation and Its Correction: Continued Influence and Successful Debiasing,” Psychological Science in the Public Interest, Vol. 13, No. 3, December 2012; Linda A. Henkel and Mark E. Mattson, “Reading Is Believing: The Truth Effect and Source Credibility,” Consciousness and Cognition, Vol. 20, No. 4, December 2011; Heather M. Claypool, Diane M. Mackie, Teresa Garcia-Marques, Ashley McIntosh, and Ashton Udall, “The Effects of Personal Relevance and Repetition on Persuasive Processing,” Social Cognition, Vol. 22, No. 3, June 2004; Alba and Marmorstein, 1987; Jean E. Fox Tree and Mary Susan Eldon, “Retelling Urban Legends,” American Journal of Psychology, Vol. 120, No. 3, Fall 2007; and Teresa Garcia-Marques and Diane M. Mackie, “The Feeling of Familiarity as a Regulator of Persuasive Processing,” Social Cognition, Vol. 19, No. 1, 2001.
  • [15] Chen, 2015.
  • [16] Julia Davis, “Russia's Top 100 Lies About Ukraine,” The Examiner, August 11, 2014.
  • [17] Examples of Russian propagandists using actors to spoof actual news events from, respectively, Balmforth, 2016, and Oli Smith, “Watch: Russia's Fake Ukraine War Report Exposed in Putin PR Disaster,” Express, August 24, 2015.
  • [18] James Miller, “Russian Media: Conspiracy Theories and Reading Comprehension Issues,” The Interpreter, September 18, 2013.
  • [19] Edward Lucas, “Russia Has Published Books I Didn't Write!” Daily Beast, August 20, 2015.
  • [20] Miriam J. Metzger and Andrew J. Flanagin, “Credibility and Trust of Information in Online Environments: The Use of Cognitive Heuristics,” Journal of Pragmatics, Vol. 59, Part B, December 2013.
  • [21] The point on the sleeper effect and credibility is from Pornpitakpan, 2004, and Henkel and Mattson, 2011. See also Lewandowsky et al., 2012, and Ullrich K. H. Ecker, Stephan Lewandowsky, Olivia Fenton, and Kelsey Martin, “Do People Keep Believing Because They Want to? Preexisting Attitudes and Continued Influence of Misinformation,” Memory and Cognition, Vol. 42, No. 2, 2014. The point on information that is later retracted or proven false is from Ecker et al., 2014. See also Lewandowsky et al., 2012. The point on awareness of potential misinformation is from Lewandowsky et al., 2012.
  • [22] These points on messages, familiarity, and emotions are from, respectively, Lewandowsky et al., 2012; Pornpitakpan, 2004; Ecker et al., 2014; Jan-Willem Van Prooijen and Eric van Dijk, “When Consequence Size Predicts Belief in Conspiracy Theories: The Moderating Role of Perspective Taking,” Journal of Experimental Social Psychology, Vol. 44, November 2014; Lewandowsky et al., 2012; and David DeSteno, Richard E. Petty, Derek D. Rucker, Duane T. Wegener, and Julia Braverman, “Discrete Emotions and Persuasion: The Role of Emotion-Induced Expectancies,” Journal of Personality and Social Psychology, Vol. 86, No. 1, January 2004.
  • [23] These points on evidence and credibility are from, respectively, Pornpitakpan, 2004, and Brad E. Bell and Elizabeth F. Loftus, “Trivial Persuasion in the Courtroom: The Power of (a Few) Minor Details,” Journal of Personality and Social Psychology, Vol. 56, No. 5, May 1989.
  • [24] Petty et al., 2005.
  • [25] Metzger and Flanagin, 2013.
  • [26] These points on peripheral cues and trustworthiness are from, respectively, Petty et al., 2005; James C. McCroskey and Thomas J. Young, “Ethos and Credibility: The Construct and Its Measurement After Three Decades,” Central States Speech Journal, Vol. 32, No. 1, 1981, and Pornpitakpan, 2004; and Andrew J. Flanagin and Miriam J. Metzger, “The Role of Site Features, User Attributes, and Information Verification Behaviors on the Perceived Credibility of Web-Based Information,” New Media and Society, Vol. 9, No. 2, April 2007.
  • [27] Michael B. Kelley and Brett LoGiurato, “Russia's Military Tells a Very Different Story About What Happened to MH17,” Business Insider, July 21, 2014.
  • [28] Steven Pifer, “Putin, Lies and His 'Little Green Men,'” CNN, March 20, 2015.
  • [29] René Ziegler, Michael Diehl, Raffael Zigon, and Torsten Fett, “Source Consistency, Distinctiveness, and Consensus: The Three Dimensions of the Kelley ANOVA Model of Persuasion,” Personality and Social Psychology Bulletin, Vol. 30, No. 3, March 2004.
  • [30] The point on contradiction prompting a desire to understand the reason for a shift in opinion is from Taly Reich and Zakary L. Tormala, “When Contradictions Foster Persuasion: An Attributional Perspective,” Journal of Experimental Social Psychology, Vol. 49, No. 3, May 2013. The point about confidence in a source who has changed perspectives is from Derek D. Rucker, Richard E. Petty, and Pablo Briñol, “What's in a Frame Anyway? A Meta-Cognitive Analysis of the Impact of One Versus Two Sided Message Framing on Attitude Certainty,” Journal of Consumer Psychology, Vol. 18, No. 2, April 2008.
  • [31] Stephen G. Harkins and Richard E. Petty, “Information Utility and the Multiple Source Effect,” Journal of Personality and Social Psychology, Vol. 52, No. 2, February 1987.
  • [32] Petty et al., 2005.
  • [33] Richard R. Lau, “Negativity in Political Perception,” Political Behavior, Vol. 4, No. 4, December 1982.
  • [34] Lewandowsky et al., 2012.
  • [35] Ecker et al., 2014.
  • [36] Zakary L. Tormala and Richard E. Petty, “Source Credibility and Attitude Certainty: A Metacognitive Analysis of Resistance to Persuasion,” Journal of Consumer Psychology, Vol. 14, No. 4, 2004.
  • [37] Jasper Jackson, “RT Sanctioned by Ofcom over Series of Misleading and Biased Articles,” The Guardian, September 21, 2015.

This research was conducted in the International Security and Defense Policy Center of the RAND National Defense Research Institute.

This report is part of the RAND Corporation perspective series. RAND perspectives present informed perspectives on timely topics that address the challenges facing the public and private sectors. All RAND perspectives undergo rigorous peer review to ensure high standards for research quality and objectivity.

Permission is given to duplicate this electronic document for personal use only, as long as it is unaltered and complete. Copies may not be duplicated for commercial purposes. Unauthorized posting of RAND PDFs to a non-RAND Web site is prohibited. RAND PDFs are protected under copyright law. For information on reprint and linking permissions, please visit the RAND Permissions page.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.