#KeepingItReal: Improving Social Media Users' Resistance to False Information during Elections

Published Nov 9, 2021

by Hilary Reininger

Download eBook for Free

Format: PDF file, 4.7 MB


This dissertation addresses the spread of false information shared by users on social media during U.S. election seasons. I characterize false information on social media in three stages: Production, Dissemination, and Consumption. Social media platforms have responded by developing Remove, Reduce, and, recently, Inform policies that broadly address each stage, respectively. This dissertation investigates potential Inform policies and how social media platforms can apply them; the most applicable policy options are summarized for Facebook, Twitter, and YouTube in short industry white papers in the appendices. After clarifying terms used in discussing disinformation (Chapter 2), I make the case for social media platforms enacting or expanding their Inform policies to help users resist false information (Chapter 3). I then review the literature of other fields with similar Inform prerogatives (such as public health, advertising, and social psychology) and distill principles social media policymakers could use in constructing their Inform policies (Chapter 4). I then offer two proofs of concept for applying Inform policies. First, I build a network of 7 million tweets using the Louvain community detection algorithm and conduct a text analysis to demonstrate how a platform could find the users most exposed to false information; I recommend a framework platforms could use to locate those users in a least-biased way and so avoid accusations of political bias (Chapter 5). Second, I apply the most promising finding from the literature review, inoculation or "pre-bunking," in a survey experiment to help respondents (n = 634) resist behavioral reactions to disinformation memes from the 2016 U.S. presidential campaign. I test how inoculations interact with emotions and how that interaction relates to resistance to persuasion by disinformation. I find that inoculations never hurt, and sometimes improved, resistance to disinformation memes (Chapter 6).
Finally, I suggest that platforms like Facebook, Twitter, and YouTube apply or expand Inform policies by building trust through transparency and communication with users; by focusing on evidence-based, direct and indirect approaches to helping users resist false information; and by applying principles of influence, together with what they know of their users, to build attractive products that help users resist false information.
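The community-detection step described above can be sketched in a few lines. This is a minimal, hypothetical illustration using the networkx library's implementation of the Louvain algorithm on a toy interaction graph; the dissertation's actual pipeline operated on roughly 7 million tweets, and the edges below are illustrative placeholders, not real data.

```python
import networkx as nx

# Toy interaction network: an edge ("a", "b") stands for one user
# interacting with another (e.g., a retweet). Placeholder data only.
edges = [
    ("a", "b"), ("a", "c"), ("b", "c"),   # one tightly connected cluster
    ("d", "e"), ("d", "f"), ("e", "f"),   # a second tightly connected cluster
    ("c", "d"),                           # a single bridge between them
]
G = nx.Graph(edges)

# Louvain community detection: greedy modularity optimization that
# partitions users into densely interacting communities. Grouping users
# this way is one step toward estimating which communities are most
# exposed to false information.
communities = nx.community.louvain_communities(G, seed=42)

for i, community in enumerate(communities):
    print(f"community {i}: {sorted(community)}")
```

Because Louvain is a heuristic, the partition can vary between runs; fixing `seed` makes the result reproducible, which matters if a platform must defend its user-selection method as unbiased.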


This document was submitted as a dissertation in August 2021 in partial fulfillment of the requirements of the doctoral degree in public policy analysis at the Pardee RAND Graduate School. The faculty committee that supervised and approved the dissertation consisted of Christopher Paul (Chair), Marek Posard, Bill Marcellino, and Yoel Roth.

This publication is part of the RAND dissertation series. Pardee RAND dissertations are produced by graduate fellows of the Pardee RAND Graduate School, the world's leading producer of Ph.D.s in policy analysis. Each dissertation is supervised, reviewed, and approved by a Pardee RAND faculty committee.

This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit

RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.