1. Kozyreva A, Lorenz-Spreen P, Herzog SM, Ecker UKH, Lewandowsky S, Hertwig R, Ali A, Bak-Coleman J, Barzilai S, Basol M, Berinsky AJ, Betsch C, Cook J, Fazio LK, Geers M, Guess AM, Huang H, Larreguy H, Maertens R, Panizza F, Pennycook G, Rand DG, Rathje S, Reifler J, Schmid P, Smith M, Swire-Thompson B, Szewach P, van der Linden S, Wineburg S. Toolbox of individual-level interventions against online misinformation. Nat Hum Behav 2024. doi:10.1038/s41562-024-01881-0. PMID: 38740990. Received 1 February 2023; accepted 5 April 2024.
Abstract
The spread of misinformation through media and social networks threatens many aspects of society, including public health and the state of democracies. One approach to mitigating the effect of misinformation focuses on individual-level interventions, equipping policymakers and the public with essential tools to curb the spread and influence of falsehoods. Here we introduce a toolbox of individual-level interventions for reducing harm from online misinformation. Comprising an up-to-date account of interventions featured in 81 scientific papers from across the globe, the toolbox provides both a conceptual overview of nine main types of interventions, including their target, scope and examples, and a summary of the empirical evidence supporting the interventions, including the methods and experimental paradigms used to test them. The nine types of interventions covered are accuracy prompts, debunking and rebuttals, friction, inoculation, lateral reading and verification strategies, media-literacy tips, social norms, source-credibility labels, and warning and fact-checking labels.
Affiliation(s)
- Anastasia Kozyreva: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Philipp Lorenz-Spreen: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Stefan M Herzog: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Ullrich K H Ecker: School of Psychological Science & Public Policy Institute, University of Western Australia, Perth, Western Australia, Australia
- Stephan Lewandowsky: School of Psychological Science, University of Bristol, Bristol, UK; Department of Psychology, University of Potsdam, Potsdam, Germany
- Ralph Hertwig: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Ayesha Ali: Department of Economics, Lahore University of Management Sciences, Lahore, Pakistan
- Joe Bak-Coleman: Craig Newmark Center, School of Journalism, Columbia University, New York, NY, USA
- Sarit Barzilai: Department of Learning and Instructional Sciences, University of Haifa, Haifa, Israel
- Melisa Basol: Department of Psychology, University of Cambridge, Cambridge, UK
- Adam J Berinsky: Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA, USA
- Cornelia Betsch: Institute for Planetary Health Behaviour, University of Erfurt, Erfurt, Germany; Bernhard Nocht Institute for Tropical Medicine, Hamburg, Germany
- John Cook: Melbourne Centre for Behaviour Change, University of Melbourne, Melbourne, Victoria, Australia
- Lisa K Fazio: Department of Psychology and Human Development, Vanderbilt University, Nashville, TN, USA
- Michael Geers: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany; Department of Psychology, Humboldt University of Berlin, Berlin, Germany
- Andrew M Guess: Department of Politics and School of Public and International Affairs, Princeton University, Princeton, NJ, USA
- Haifeng Huang: Department of Political Science, Ohio State University, Columbus, OH, USA
- Horacio Larreguy: Departments of Economics and Political Science, Instituto Tecnológico Autónomo de México, Mexico City, Mexico
- Rakoen Maertens: Department of Experimental Psychology, University of Oxford, Oxford, UK
- Gordon Pennycook: Department of Psychology, Cornell University, Ithaca, NY, USA; Department of Psychology, University of Regina, Regina, Saskatchewan, Canada
- David G Rand: Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Steve Rathje: Department of Psychology, New York University, New York, NY, USA
- Jason Reifler: Department of Politics, University of Exeter, Exeter, UK
- Philipp Schmid: Institute for Planetary Health Behaviour, University of Erfurt, Erfurt, Germany; Bernhard Nocht Institute for Tropical Medicine, Hamburg, Germany; Centre for Language Studies, Radboud University Nijmegen, Nijmegen, the Netherlands
- Mark Smith: Graduate School of Education, Stanford University, Stanford, CA, USA
- Paula Szewach: Department of Politics, University of Exeter, Exeter, UK; Barcelona Supercomputing Center, Barcelona, Spain
- Sam Wineburg: Graduate School of Education, Stanford University, Stanford, CA, USA
2. Kozyreva A, Smillie L, Lewandowsky S. Incorporating Psychological Science Into Policy Making: The Case of Misinformation. European Psychologist 2023;28:a000493. doi:10.1027/1016-9040/a000493. PMID: 37994309. PMCID: PMC7615323.
Abstract
The spread of false and misleading information in online social networks is a global problem in need of urgent solutions. It is also a policy problem because misinformation can harm both the public and democracies. To address the spread of misinformation, policymakers require a successful interface between science and policy, as well as a range of evidence-based solutions that respect fundamental rights while efficiently mitigating the harms of misinformation online. In this article, we discuss how regulatory and nonregulatory instruments can be informed by scientific research and used to reach EU policy objectives. First, we consider what it means to approach misinformation as a policy problem. We then outline four building blocks for cooperation between scientists and policymakers who wish to address the problem of misinformation: understanding the misinformation problem, understanding the psychological drivers and public perceptions of misinformation, finding evidence-based solutions, and co-developing appropriate policy measures. Finally, through the lens of psychological science, we examine policy instruments that have been proposed in the EU, focusing on the strengthened Code of Practice on Disinformation 2022.
Affiliation(s)
- Anastasia Kozyreva: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Laura Smillie: Joint Research Centre, European Commission, Brussels, Belgium
- Stephan Lewandowsky: School of Psychological Science, University of Bristol, UK; School of Psychological Sciences, University of Western Australia, Australia; Department of Psychology, University of Potsdam, Germany
3. Moore RC, Dahlke R, Hancock JT. Exposure to untrustworthy websites in the 2020 US election. Nat Hum Behav 2023. doi:10.1038/s41562-023-01564-2. PMID: 37055575. Received 9 February 2022; accepted 20 February 2023.
Abstract
Research using large-scale data on individuals' internet use has provided vital information about the scope and nature of exposure to misinformation online. However, most prior work relies on data collected during the 2016 US election. Here we examine exposure to untrustworthy websites during the 2020 US election, using over 7.5 million website visits from 1,151 American adults. We find that 26.2% (95% confidence interval 22.5% to 29.8%) of Americans were exposed to untrustworthy websites in 2020, down from 44.3% (95% confidence interval 40.8% to 47.7%) in 2016. Older adults and conservatives continued to be the most exposed in 2020 as in 2016, albeit at lower rates. The role of online platforms in exposing people to untrustworthy websites changed, with Facebook playing a smaller role in 2020 than in 2016. Our findings do not minimize misinformation as a key social problem, but instead highlight important changes in its consumption, suggesting directions for future research and practice.
Affiliation(s)
- Ryan C Moore: Department of Communication, Stanford University, Stanford, CA, USA
- Ross Dahlke: Department of Communication, Stanford University, Stanford, CA, USA
4. Kozyreva A, Wineburg S, Lewandowsky S, Hertwig R. Critical Ignoring as a Core Competence for Digital Citizens. Current Directions in Psychological Science 2023;32:81-88. doi:10.1177/09637214221121570. PMID: 37994317. PMCID: PMC7615324.
Abstract
Low-quality and misleading information online can hijack people's attention, often by evoking curiosity, outrage, or anger. Resisting certain types of information and actors online requires people to adopt new mental habits that help them avoid being tempted by attention-grabbing and potentially harmful content. We argue that digital information literacy must include the competence of critical ignoring: choosing what to ignore and where to invest one's limited attentional capacities. We review three types of cognitive strategies for implementing critical ignoring: self-nudging, in which one ignores temptations by removing them from one's digital environments; lateral reading, in which one vets information by leaving the source and verifying its credibility elsewhere online; and the do-not-feed-the-trolls heuristic, which advises one not to reward malicious actors with attention. We argue that these strategies for implementing critical ignoring should be part of school curricula on digital information literacy. Teaching the competence of critical ignoring requires a paradigm shift in educators' thinking, from a sole focus on the power and promise of paying close attention to an additional emphasis on the power of ignoring. Encouraging students and other online users to embrace critical ignoring can empower them to shield themselves from the excesses, traps, and information disorders of today's attention economy.
Affiliation(s)
- Anastasia Kozyreva: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Sam Wineburg: Graduate School of Education, Stanford University
- Stephan Lewandowsky: School of Psychological Science, University of Bristol; School of Psychological Science, University of Western Australia; Department of Psychology, University of Potsdam
- Ralph Hertwig: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany