251
Quandt T, Klapproth J, Frischlich L. Dark social media participation and well-being. Curr Opin Psychol 2021; 45:101284. PMID: 35016088. DOI: 10.1016/j.copsyc.2021.11.004.
Abstract
In recent years, there have been increasing global concerns about the abuse of digital technologies for malicious 'dark participation', the spreading of digital offenses, hate speech, fake news, and conspiracy theories. Clearly, dark participation can have severe effects on the victims and on society at large. However, less is known about the impact of dark participation on the perpetrators' well-being. Preliminary research on the perpetrators indicates positive emotions and specific gratifications resulting from their behavior, in particular when it is fully consistent with their ideologies. Uncovering these gratifications-and the positive effects dark participation may have on perpetrators' well-being-could be the key to a better understanding of the dark side of social media.
Affiliation(s)
- Thorsten Quandt
- Westfälische Wilhelms-Universität Münster, Münster, Germany.
252
Mosleh M, Pennycook G, Rand DG. Field Experiments on Social Media. Current Directions in Psychological Science 2021. DOI: 10.1177/09637214211054761.
Abstract
Online behavioral data, such as digital traces from social media, have the potential to offer researchers an unprecedented window into human behavior in ecologically valid everyday contexts. However, research using such data is often purely observational, which limits its usefulness for identifying causal relationships. Here we review recent innovations in experimental approaches to studying online behavior, with a particular focus on research related to misinformation and political psychology. In hybrid lab-field studies, exposure to social-media content can be randomized and its impact on attitudes and beliefs measured using surveys; alternatively, exposure to treatments can be randomized within survey experiments and their impact on subsequent online behavior observed. In field experiments conducted on social media, randomized treatments can be administered directly to users in the online environment (e.g., via social-tie invitations, private messages, or public posts) without revealing that they are part of an experiment, and the effects on subsequent online behavior can then be observed. We discuss the strengths and weaknesses of each approach, along with practical advice and central ethical constraints on such studies.
Affiliation(s)
- Mohsen Mosleh
- Science, Innovation, Technology, and Entrepreneurship Department, University of Exeter Business School
- Sloan School of Management, Massachusetts Institute of Technology
- Gordon Pennycook
- Hill/Levene Schools of Business, University of Regina
- Department of Psychology, University of Regina
- David G. Rand
- Sloan School of Management, Massachusetts Institute of Technology
- Institute for Data, Systems, and Society, Massachusetts Institute of Technology
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology
253
Escolà-Gascón Á, Dagnall N, Gallifa J. Critical thinking predicts reductions in Spanish physicians' stress levels and promotes fake news detection. Thinking Skills and Creativity 2021; 42:100934. PMID: 35154504; PMCID: PMC8818444. DOI: 10.1016/j.tsc.2021.100934.
Abstract
The prevalence of pseudoscientific beliefs and fake news increased during the coronavirus crisis. Misinformation streams such as these potentially pose risks to people's health. Thus, knowing how these pseudoscientific beliefs and fake news impact the community of internists may be useful for improving primary care services. In this research, analyses of stress levels, effectiveness in detecting fake news, use of critical thinking (CT), and attitudes toward pseudosciences in internists during the COVID-19 crisis were performed. A total of 1129 internists participated. Several multiple regression models were applied using the forward stepwise method to determine the weight of CT and physicians' attitudes toward pseudosciences in predicting reductions in stress levels and facilitating the detection of fake news. The use of critical thinking predicted 46.9% of the reduction in stress levels. Similarly, skeptical attitudes and critical thinking predicted 56.1% of the hits on fake news detection tests. The stress levels of physicians during the coronavirus pandemic were clinically significant. The efficacy of fake news detection increases by 30.7% when the individual is a physician. Study outcomes indicate that the use of critical thinking and skeptical attitudes reduces stress levels and allows better detection of fake news. The importance of how to promote critical and skeptical attitudes in the field of medicine is discussed.
Affiliation(s)
- Álex Escolà-Gascón
- Ramon Llull University, School of Psychology, Education and Sport Sciences, Blanquerna, 34 Císter St, Barcelona, 08022, Spain
- Neil Dagnall
- Faculty of Health, Psychology and Social Care, Manchester Metropolitan University, Manchester, United Kingdom
- Josep Gallifa
- Ramon Llull University, School of Psychology, Education and Sport Sciences, Blanquerna, 34 Císter St, Barcelona, 08022, Spain
254
Köbis NC, Doležalová B, Soraperra I. Fooled twice: People cannot detect deepfakes but think they can. iScience 2021; 24:103364. PMID: 34820608; PMCID: PMC8602050. DOI: 10.1016/j.isci.2021.103364.
Abstract
Hyper-realistic manipulations of audio-visual content, i.e., deepfakes, present new challenges for establishing the veracity of online content. Research on the human impact of deepfakes remains sparse. In a pre-registered behavioral experiment (N = 210), we show that (1) people cannot reliably detect deepfakes and (2) neither raising awareness nor introducing financial incentives improves their detection accuracy. Zeroing in on the underlying cognitive processes, we find that (3) people are biased toward mistaking deepfakes for authentic videos (rather than vice versa) and (4) they overestimate their own detection abilities. Together, these results suggest that people adopt a "seeing-is-believing" heuristic for deepfake detection while being overconfident in their (low) detection abilities. The combination renders people particularly susceptible to being influenced by deepfake content.
Highlights
- People cannot reliably detect deepfakes
- Raising awareness and financial incentives do not improve people's detection accuracy
- People tend to mistake deepfakes for authentic videos (rather than vice versa)
- People overestimate their own deepfake detection abilities
Affiliation(s)
- Nils C Köbis
- Center for Humans and Machines, Max Planck Institute for Human Development, 14195 Berlin, Germany
- Barbora Doležalová
- Amsterdam School of Economics, University of Amsterdam, 1001 NJ Amsterdam, The Netherlands
- Ivan Soraperra
- Amsterdam School of Economics, University of Amsterdam, 1001 NJ Amsterdam, The Netherlands
255
Conspiracy theory beliefs, scientific reasoning and the analytical thinking paradox. Applied Cognitive Psychology 2021. DOI: 10.1002/acp.3885.
256
Agley J, Xiao Y, Thompson EE, Chen X, Golzarri-Arroyo L. Intervening on Trust in Science to Reduce Belief in COVID-19 Misinformation and Increase COVID-19 Preventive Behavioral Intentions: Randomized Controlled Trial. J Med Internet Res 2021; 23:e32425. PMID: 34581678; PMCID: PMC8519341. DOI: 10.2196/32425.
Abstract
BACKGROUND Trust in science meaningfully contributes to our understanding of people's belief in misinformation and their intentions to take actions to prevent COVID-19. However, no experimental research has sought to intervene on this variable to develop a scalable response to the COVID-19 infodemic. OBJECTIVE Our study examined whether brief exposure to an infographic about the scientific process might increase trust in science and thereby affect belief in misinformation and intention to take preventive actions for COVID-19. METHODS This two-arm, parallel-group, randomized controlled trial aimed to recruit a US representative sample of 1000 adults by age, race/ethnicity, and gender using the Prolific platform. Participants were randomly assigned to view either an intervention infographic about the scientific process or a control infographic. The intervention infographic was designed through a separate pilot study. Primary outcomes were trust in science, COVID-19 narrative belief profile, and COVID-19 preventive behavioral intentions. We also collected 12 covariates and incorporated them into all analyses. All outcomes were collected using web-based assessment. RESULTS From January 22, 2021 to January 24, 2021, 1017 participants completed the study. The intervention slightly improved trust in science (difference-in-difference 0.03, SE 0.01, t1000=2.16, P=.031). No direct intervention effect was observed on belief profile membership, but there was some evidence of an indirect intervention effect mediated by trust in science (adjusted odds ratio 1.06, SE 0.03, 95% CI 1.00-1.12, z=2.01, P=.045) on membership in the "scientific" profile compared with the others. No direct or indirect effects on preventive behaviors were observed. CONCLUSIONS Briefly viewing an infographic about science appeared to cause a small aggregate increase in trust in science, which may have, in turn, reduced the believability of COVID-19 misinformation.
The effect sizes were small but commensurate with our 60-second, highly scalable intervention approach. Researchers should study the potential for truthful messaging about how science works to serve as misinformation inoculation and test how best to do so. TRIAL REGISTRATION NCT04557241; https://clinicaltrials.gov/ct2/show/NCT04557241. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID) RR2-10.2196/24383.
Affiliation(s)
- Jon Agley
- Prevention Insights, Department of Applied Health Science, School of Public Health Bloomington, Indiana University Bloomington, Bloomington, IN, United States
- Yunyu Xiao
- Department of Population Health Sciences, Weill Cornell Medicine, New York, NY, United States
- Esi E Thompson
- Indiana University Media School, Indiana University Bloomington, Bloomington, IN, United States
- Xiwei Chen
- Biostatistics Consulting Center, School of Public Health Bloomington, Indiana University Bloomington, Bloomington, IN, United States
- Lilian Golzarri-Arroyo
- Biostatistics Consulting Center, School of Public Health Bloomington, Indiana University Bloomington, Bloomington, IN, United States
257
Mermelstein S, German TC. Counterintuitive Pseudoscience Propagates by Exploiting the Mind's Communication Evaluation Mechanisms. Front Psychol 2021; 12:739070. PMID: 34675845; PMCID: PMC8523830. DOI: 10.3389/fpsyg.2021.739070.
Abstract
Epidemiological models of culture posit that the prevalence of a belief depends in part on the fit between that belief and intuitions generated by the mind's reliably developing architecture. Application of such models to pseudoscience suggests that one route via which these beliefs gain widespread appeal stems from their compatibility with these intuitions. For example, anti-vaccination beliefs are readily adopted because they cohere with intuitions about the threat of contagion. However, other varieties of popular pseudoscience such as astrology and parapsychology contain content that violates intuitions held about objects and people. Here, we propose a pathway by which "counterintuitive pseudoscience" may spread and receive endorsement. Drawing on recent empirical evidence, we suggest that counterintuitive pseudoscience triggers the mind's communication evaluation mechanisms. These mechanisms are hypothesized to quarantine epistemically-suspect information including counterintuitive pseudoscientific concepts. As a consequence, these beliefs may not immediately update conflicting intuitions and may be largely restricted from influencing behavior. Nonetheless, counterintuitive pseudoscientific concepts, when in combination with intuitively appealing content, may differentially draw attention and memory. People may also be motivated to seek further information about these concepts, including by asking others, in an attempt to reconcile them with prior beliefs. This in turn promotes the re-transmission of these ideas. We discuss how, during this information-search, support for counterintuitive pseudoscience may come from deference to apparently authoritative sources, reasoned arguments, and the functional outcomes of these beliefs. Ultimately, these factors promote the cultural success of counterintuitive pseudoscience but explicit endorsement of these concepts may not entail tacit commitment.
Affiliation(s)
- Spencer Mermelstein
- Department of Psychological and Brain Sciences, University of California, Santa Barbara, Santa Barbara, CA, United States
258
Allen J, Arechar AA, Pennycook G, Rand DG. Scaling up fact-checking using the wisdom of crowds. Science Advances 2021; 7:eabf4393. PMID: 34516925; PMCID: PMC8442902. DOI: 10.1126/sciadv.abf4393.
Abstract
Professional fact-checking, a prominent approach to combating misinformation, does not scale easily. Furthermore, some distrust fact-checkers because of alleged liberal bias. We explore a solution to these problems: using politically balanced groups of laypeople to identify misinformation at scale. Examining 207 news articles flagged for fact-checking by Facebook algorithms, we compare accuracy ratings of three professional fact-checkers who researched each article to those of 1128 Americans from Amazon Mechanical Turk who rated each article’s headline and lede. The average ratings of small, politically balanced crowds of laypeople (i) correlate with the average fact-checker ratings as well as the fact-checkers’ ratings correlate with each other and (ii) predict whether the majority of fact-checkers rated a headline as “true” with high accuracy. Furthermore, cognitive reflection, political knowledge, and Democratic Party preference are positively related to agreement with fact-checkers, and identifying each headline’s publisher leads to a small increase in agreement with fact-checkers.
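The crowd-averaging approach in this abstract lends itself to a short sketch: pool lay accuracy ratings, equal-weight the two partisan sub-crowds, and correlate the crowd means with fact-checker means. The ratings, article identifiers, and helper names below are made-up illustrations, not the study's data or exact procedure:

```python
from statistics import mean

def pearson(xs, ys):
    # Pearson correlation between two equal-length rating vectors.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

def balanced_crowd_mean(ratings):
    # Equal-weight the Democrat ("D") and Republican ("R") sub-crowd
    # means so neither side dominates, whatever the sample sizes.
    dem = [r for party, r in ratings if party == "D"]
    rep = [r for party, r in ratings if party == "R"]
    return 0.5 * (mean(dem) + mean(rep))

# Hypothetical 1-7 accuracy ratings for three articles.
crowd = {
    "article_1": [("D", 6), ("D", 7), ("R", 5), ("R", 6)],
    "article_2": [("D", 2), ("D", 3), ("R", 3), ("R", 2)],
    "article_3": [("D", 5), ("D", 4), ("R", 2), ("R", 3)],
}
checker_means = {"article_1": 6.5, "article_2": 2.0, "article_3": 4.0}

articles = sorted(crowd)
crowd_scores = [balanced_crowd_mean(crowd[a]) for a in articles]
checker_scores = [checker_means[a] for a in articles]
r = pearson(crowd_scores, checker_scores)  # crowd-checker agreement
```

In the study itself, such balanced crowd means were compared against the ratings of three professional fact-checkers across 207 articles; the sketch only shows the mechanics of the aggregation.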
Affiliation(s)
- Jennifer Allen
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Antonio A. Arechar
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Center for Research and Teaching in Economics, CIDE, Aguascalientes, Mexico
- Centre for Decision Research and Experimental Economics, CeDEx, Nottingham, UK
- Gordon Pennycook
- Hill/Levene Schools of Business, University of Regina, Regina, Canada
- David G. Rand
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Institute for Data, Systems, and Society, Massachusetts Institute of Technology, Cambridge, MA, USA
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
259
Abstract
Human memory is prone to error and distortion. Schacter (1999, 2001) proposed that memory's misdeeds can be classified into seven categories or "sins". This article discusses the impact of media and technology on four memory sins: transience (forgetting over time), absent-mindedness (lapses in attention that produce forgetting), misattribution (attributing a memory to the wrong source), and suggestibility (implanted memories). Growing concerns have been expressed about the negative impact of media and technology on memory. With respect to transience, I review research regarding the impact of the Internet (i.e., Google), GPS, and photographs. Studies have documented impaired memory following specific tasks on which people rely on media/technology (e.g., poor memory for a route after using GPS), but have revealed little evidence for broader impairments (e.g., generally impaired memory in GPS users), and have also documented some mnemonic benefits (e.g., reviewing photos of past experiences). For absent-mindedness, there is strong evidence that media multitasking is associated with poor memory for a target task (e.g., a lecture) because of attentional lapses, suggestive evidence that chronic media multitasking could be associated with broader memory problems, and emerging evidence that technology can help to reduce certain kinds of absent-minded errors. Regarding misattribution and suggestibility, there is clear evidence that manipulated or misleading photos are associated with false memories for personal events and fake news, but no evidence of broader effects on susceptibility to memory distortion. Further study of the impact of media and technology on the memory sins is a fruitful pursuit for interdisciplinary studies.
260
Kauk J, Kreysa H, Schweinberger SR. Understanding and countering the spread of conspiracy theories in social networks: Evidence from epidemiological models of Twitter data. PLoS One 2021; 16:e0256179. PMID: 34383860; PMCID: PMC8360523. DOI: 10.1371/journal.pone.0256179.
Abstract
Conspiracy theories in social networks are considered to have adverse effects on individuals' compliance with public health measures in the context of a pandemic situation. A deeper understanding of how conspiracy theories propagate through social networks is critical for the development of countermeasures. The present work focuses on a novel approach to characterize the propagation of conspiracy theories through social networks by applying epidemiological models to Twitter data. A Twitter dataset was searched for tweets containing hashtags indicating belief in the "5GCoronavirus" conspiracy theory, which states that the COVID-19 pandemic is a result of, or enhanced by, the rollout of the 5G mobile network. Despite the absence of any scientific evidence, the "5GCoronavirus" conspiracy theory propagated rapidly through Twitter, beginning at the end of January 2020, peaking at the beginning of April, and ceasing approximately at the end of June 2020. An epidemic SIR (Susceptible-Infected-Removed) model was fitted to this time series with acceptable model fit, indicating parallels between the propagation of conspiracy theories in social networks and infectious diseases. Extended SIR models were used to simulate the effects that two specific countermeasures, fact-checking and tweet-deletion, could have had on the propagation of the conspiracy theory. Our simulations indicate that fact-checking is an effective mechanism in an early stage of conspiracy theory diffusion, while tweet-deletion shows only moderate efficacy but is less time-sensitive. More generally, an early response is critical to gain control over the spread of conspiracy theories through social networks. We conclude that an early response combined with strong fact-checking and a moderate level of deletion of problematic posts is a promising strategy to fight conspiracy theories in social networks. Results are discussed with respect to their theoretical validity and generalizability.
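The SIR framing in this abstract can be sketched in a few lines: "susceptible" users have not yet seen the hashtag, "infected" users are spreading it, and "removed" users have stopped. Below is a minimal Euler integration of the standard SIR equations; the parameter values and population size are illustrative assumptions, not the estimates the authors fitted to the Twitter time series:

```python
def simulate_sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    # Euler-integrate dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I,
    # dR/dt = gamma*I, recording one (S, I, R) sample per day.
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r
    trajectory = [(s, i, r)]
    steps_per_day = int(round(1 / dt))
    for _ in range(days):
        for _ in range(steps_per_day):
            new_spreaders = beta * s * i / n * dt   # hashtag adoptions
            new_removals = gamma * i * dt           # spreaders losing interest
            s -= new_spreaders
            i += new_spreaders - new_removals
            r += new_removals
        trajectory.append((s, i, r))
    return trajectory

# Illustrative run: 10,000 susceptible users, one initial spreader.
# beta (transmission rate) and gamma (removal rate) are assumed values
# chosen so the basic reproduction number beta/gamma exceeds 1.
traj = simulate_sir(beta=0.5, gamma=0.2, s0=9999, i0=1, r0=0, days=120)
peak_day = max(range(len(traj)), key=lambda d: traj[d][1])
```

With beta/gamma > 1 the simulated hashtag spreads, peaks, and dies out, mirroring the rise-and-fall pattern the paper reports for the "5GCoronavirus" time series.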
Affiliation(s)
- Julian Kauk
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Jena, Germany
- Helene Kreysa
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Jena, Germany
- Stefan R. Schweinberger
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Jena, Germany
- DFG Research Unit Person Perception, Friedrich Schiller University Jena, Jena, Germany
261
Van Lange PAM, Rand DG. Human Cooperation and the Crises of Climate Change, COVID-19, and Misinformation. Annu Rev Psychol 2021; 73:379-402. PMID: 34339612. DOI: 10.1146/annurev-psych-020821-110044.
Abstract
Contemporary society is facing many social dilemmas-including climate change, COVID-19, and misinformation-characterized by a conflict between short-term self-interest and longer-term collective interest. The climate crisis requires paying costs today to benefit distant others (and oneself) in the future. The COVID-19 crisis requires the less vulnerable to pay costs to benefit the more vulnerable in the face of great uncertainty. The misinformation crisis requires investing effort to assess truth and abstain from spreading attractive falsehoods. Addressing these crises requires an understanding of human cooperation. To that end, we present (a) an overview of mechanisms for the evolution of cooperation, including mechanisms based on similarity and interaction; (b) a discussion of how reputation can incentivize cooperation via conditional cooperation and signaling; and (c) a review of social preferences that undergird the proximate psychology of cooperation, including positive regard for others, parochialism, and egalitarianism. We discuss the three focal crises facing our society through the lens of cooperation, emphasizing how cooperation research can inform our efforts to address them.
Affiliation(s)
- Paul A M Van Lange
- Department of Experimental and Applied Psychology, and Institute for Brain and Behavior Amsterdam (iBBA), Vrije Universiteit Amsterdam, 1081 BT Amsterdam, The Netherlands;
- David G Rand
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02138, USA;
262
Gawronski B. Partisan bias in the identification of fake news. Trends Cogn Sci 2021; 25:723-724. PMID: 34226126. DOI: 10.1016/j.tics.2021.05.001.
263
Pennycook G, Rand DG. Lack of partisan bias in the identification of fake (versus real) news. Trends Cogn Sci 2021; 25:725-726. PMID: 34226127. DOI: 10.1016/j.tics.2021.06.003.
Affiliation(s)
- Gordon Pennycook
- Hill/Levene Schools of Business, University of Regina, Saskatchewan, Canada; Department of Psychology, University of Regina, Saskatchewan, Canada.
- David G Rand
- Sloan School of Management, Massachusetts Institute of Technology, MA, USA; Institute for Data, Systems, and Society, Massachusetts Institute of Technology, MA, USA; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, MA, USA
264
Abstract
In recent years, we have witnessed a rise in fake news, i.e., provably false pieces of information created with the intention of deception. The dissemination of this type of news poses a serious threat to social cohesion and well-being, since it fosters political polarization and people's distrust of their leaders. The huge amount of news disseminated through social media makes manual verification unfeasible, which has promoted the design and implementation of automatic systems for fake news detection. The creators of fake news use various stylistic tricks to promote the success of their creations, one of which is to excite the sentiments of the recipients. This has led sentiment analysis, the part of text analytics in charge of determining the polarity and strength of sentiments expressed in a text, to be used in fake news detection approaches, either as a basis of the system or as a complementary element. In this article, we study the different uses of sentiment analysis in the detection of fake news, discuss the most relevant elements and shortcomings, and outline requirements that should be met in the near future, such as multilingualism, explainability, mitigation of biases, and treatment of multimedia elements.
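The polarity-and-strength computation this abstract attributes to sentiment analysis can be illustrated with a deliberately tiny lexicon-based scorer. The word list and scoring scheme below are toy assumptions made up for the sketch, not any specific system the survey covers:

```python
# Toy sentiment lexicon: negative words score below zero, positive above.
LEXICON = {
    "outrage": -2, "disaster": -2, "corrupt": -2, "fear": -1,
    "shocking": -1, "hope": 1, "good": 1, "wonderful": 2, "calm": 1,
}

def sentiment(text):
    # Return (polarity, strength): polarity is the sign (-1, 0, 1) of the
    # summed word scores; strength is the mean absolute score of the
    # sentiment-bearing words found in the text.
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    if not scores:
        return 0, 0.0
    total = sum(scores)
    polarity = (total > 0) - (total < 0)
    strength = sum(abs(s) for s in scores) / len(scores)
    return polarity, strength

headline = "shocking disaster exposes corrupt officials"
pol, strong = sentiment(headline)  # a strongly negative headline
```

In a full detection pipeline, such scores would typically enter a classifier as features alongside other stylistic signals rather than decide veracity on their own.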
265
Beyond “fake news”: Analytic thinking and the detection of false and hyperpartisan news headlines. Judgment and Decision Making 2021. DOI: 10.1017/s1930297500008640.
Abstract
Why is misleading partisan content believed and shared? An influential account posits that political partisanship pervasively biases reasoning, such that engaging in analytic thinking exacerbates motivated reasoning and, in turn, the acceptance of hyperpartisan content. Alternatively, it may be that susceptibility to hyperpartisan content is explained by a lack of reasoning. Across two studies using different participant pools (total N = 1,973 Americans), we had participants assess true, false, and hyperpartisan news headlines taken from social media. We found no evidence that analytic thinking was associated with judging politically consistent hyperpartisan or false headlines to be accurate and unbiased. Instead, analytic thinking was, in most cases, associated with an increased tendency to distinguish true headlines from both false and hyperpartisan headlines (and was never associated with decreased discernment). These results suggest that reasoning typically helps people differentiate between low and high quality political news, rather than facilitate belief in misleading content. Because social media play an important role in the dissemination of misinformation, we also investigated willingness to share headlines on social media. We found a similar pattern whereby analytic thinking was not generally associated with increased willingness to share hyperpartisan or false headlines. Together, these results suggest a positive role for reasoning in resisting misinformation.