Extremist groups have been trolling the internet for decades, and they have learned to temper their words and disguise their intentions. A new scorecard can help users—or parents, or advertisers, or the social media companies themselves—understand when they might be interacting with extremist content.
Findings from a survey of U.S. teachers reveal how limited home internet access has been a barrier to providing instruction amid pandemic-related school closures. The problem is particularly acute among high-poverty schools.
China's quest to become the global center for artificial intelligence starts with mastering big data analytics. Its aggressive strategy encompasses economic, military, police, and intelligence functions. Beijing is already using big data to surveil its domestic population and enhance its military.
Today's self-selecting solo terrorists answer only to themselves, whether seeking to destroy all government, pursuing racial separation or genocidal goals, expressing sexual dissatisfaction, or simply wanting to leave their mark. Military operations are irrelevant to what is a deeper societal problem.
Disinformation has become a central feature of the COVID-19 crisis. This type of malign information and high-tech “deepfake” imagery poses a risk to democratic societies worldwide by increasing public mistrust in governments and public authorities. New research highlights ways to detect and dispel disinformation online.
As social media increasingly serves as a primary source of news, the spread of malign and false information poses a rising threat. A new machine learning model identified differences between authentic political supporters and Russian trolls shaping online debates about the 2016 U.S. election. How could the model be applied in the future?
Digital platforms that let users interact virtually and often anonymously have given rise to harassment and other criminal behaviors. Tech-facilitated abuse—such as nonconsensual pornography, doxing, and swatting—compromises privacy and safety. How can law enforcement respond?
RAND researchers asked a nationally representative sample of adults about their news-consumption habits. The answers reveal clues about what it might take to address Truth Decay—the decline of facts in U.S. public life.
Digital materials for lesson planning and instruction are becoming an increasingly important resource for teachers. A survey of English language arts, mathematics, and science teachers across the United States provides insights on which materials they use and what they consider barriers to use.
Quantum computers are expected to be powerful enough to break the current cryptography that protects all digital communications. But this scenario is preventable if policymakers take actions now to minimize the harm that quantum computers may cause.
Like COVID-19, disinformation spreads only if we help it spread. While we have all been asked to stay at home as responsible citizens to contain the virus, we should also feel responsible for making it harder for disinformation to spread.
Quantum computers that are exponentially faster than today's classical computers and capable of code-breaking applications could be available in 12 to 15 years, posing major risks to the security of current communications systems.
Quantum computers are expected to revolutionize computing. But hackers may be able to use them to crack the encryption system that protects all digital communications. How soon could this scenario become a reality? And what can be done to prevent it?
Jennifer Kavanagh, who wrote the RAND book Truth Decay about the diminishing role that facts play in making important public policy decisions, calls the unfolding situation with the novel coronavirus and COVID-19 a worst-case scenario.