101
Obadă DR, Dabija DC, Fârte GI. Consumer perception towards electronic products from recycled components in the current geopolitical context: A structural equation modelling approach. Heliyon 2024; 10:e26475. PMID: 38420469; PMCID: PMC10900790; DOI: 10.1016/j.heliyon.2024.e26475.
Abstract
Russia's invasion of Ukraine had a negative impact worldwide, causing a severe energy crisis that also affected EU countries, which are now combating rising energy prices while also speeding up the deployment of renewable energy sources. This armed conflict impacted the electronic components supply chain, causing high prices and disruption for different raw materials, resulting in material shortages for electronic components and affecting electronic product (EP) manufacturing. Since the start of the geopolitical crisis due to the Russo-Ukrainian War, (dis)information has been disseminated via social media, affecting users' cognition, attitudes, and behavioral intentions. Therefore, this paper aims to assess the impact of social media usage, Russo-Ukrainian war fear, consumers' green values, perceived quality, usage enjoyment, and product image on consumers' purchase intention toward recycled electronic products. Based on the Stimulus-Organism-Response (SOR) approach, the authors propose a conceptual model highlighting the factors that enhance consumers' purchase intentions towards recycled electronic products. The model is tested empirically via quantitative research, with data collected from Romania, a close neighbor of the armed conflict, and assessed employing structural equation modeling via SmartPLS 3.0. Results confirm that social media usage, consumers' green values, and Russo-Ukrainian war fear do enhance consumers' image of recycled electronic products, thus leading to their increased purchase intention. The novelty of this paper consists in extending SOR-based research regarding consumers' behavioral intentions toward buying recycled electronic products in the context of the Russo-Ukrainian war. The study highlights important managerial implications for both the electronic industry and retailers selling such goods.
Affiliation(s)
- Daniel-Rareș Obadă
- Alexandru Ioan Cuza University of Iași, Faculty of Philosophy and Socio-Political Sciences, Department of Communication Sciences and Public Relations, Romania
- Dan-Cristian Dabija
- Babeș-Bolyai University Cluj-Napoca, Faculty of Economics and Business Administration, Department of Marketing, Romania
- Gheorghe-Ilie Fârte
- Alexandru Ioan Cuza University of Iași, Faculty of Philosophy and Socio-Political Sciences, Department of Communication Sciences and Public Relations, Romania
102
Beaudoin ME, Jones KM, Jerome B, Martinez D, George T, Pandža NB. Systematic research is needed on the potential effects of lifelong technology experience on cognition: a mini-review and recommendations. Front Psychol 2024; 15:1335864. PMID: 38434954; PMCID: PMC10904591; DOI: 10.3389/fpsyg.2024.1335864.
Abstract
Digital technology now occupies a fundamental space in human life. Increasingly sophisticated access to information and social interactions has enabled a sort of offloading of many aspects of cognition, and for many people, this technology use has been lifelong. While the global development of technologies advances exponentially as part of the Fourth Industrial Revolution, researchers have not kept pace in characterizing the human effects of this technology-centric revolution. In this mini-review, we consider three important higher-level cognitive functions: creativity, adaptability, and decision-making, and discuss their potential relationship to lifelong digital technology experience, which here includes both passive exposure and active use of electronic devices. We then articulate the gaps in related literature and knowledge, and outline general considerations, suggestions, and challenges for future research avenues. In general, we found that prior research has investigated uses of specific technology products on lower-level cognition (e.g., how does the use of online search engines affect memory?), but there is a lack of research assessing the overall effects of technology experience on cognitive functioning, particularly complex cognition.
Affiliation(s)
- Monique E. Beaudoin
- Applied Research Laboratory for Intelligence and Security, University of Maryland, College Park, MD, United States
103
Müller B. Bad social norms rather than bad believers: examining the role of social norms in bad beliefs. Synthese 2024; 203:63. PMID: 38356922; PMCID: PMC10861743; DOI: 10.1007/s11229-024-04483-5.
Abstract
People with bad beliefs (roughly, beliefs that conflict with those of the relevant experts and are maintained regardless of counter-evidence) are often cast as bad believers. Such beliefs are seen to be the result of, e.g., motivated or biased cognition, and believers are judged to be epistemically irrational and blameworthy in holding them. Here I develop a novel framework to explain why people form bad beliefs. People with bad beliefs follow the social epistemic norms guiding how agents are supposed to form and share beliefs within their respective communities. Beliefs go bad because these norms aren't reliably knowledge-conducive. In other words, bad beliefs aren't due to bad believers but to bad social epistemic norms. The framework unifies different explanations of bad beliefs, is testable, and provides distinct interventions to combat such beliefs. It also helps to capture more adequately the complex and often contextual normative landscape surrounding bad beliefs. On this picture, it's primarily groups that are to be blamed for bad beliefs. I also suggest that some individuals will be blameless for forming their beliefs in line with their group's norms, whereas others won't be. And I draw attention to the factors that influence blameworthiness-judgements in these contexts.
Affiliation(s)
- Basil Müller
- Institute of Philosophy, University of Bern, Bern, Switzerland
104
Geers M, Swire-Thompson B, Lorenz-Spreen P, Herzog SM, Kozyreva A, Hertwig R. The Online Misinformation Engagement Framework. Curr Opin Psychol 2024; 55:101739. PMID: 38091666; DOI: 10.1016/j.copsyc.2023.101739.
Abstract
Research on online misinformation has evolved rapidly, but organizing its results and identifying open research questions is difficult without a systematic approach. We present the Online Misinformation Engagement Framework, which classifies people's engagement with online misinformation into four stages: selecting information sources, choosing what information to consume or ignore, evaluating the accuracy of the information and/or the credibility of the source, and judging whether and how to react to the information (e.g., liking or sharing). We outline entry points for interventions at each stage and pinpoint the two early stages, source and information selection, as relatively neglected processes that should be addressed to further improve people's ability to contend with misinformation.
Affiliation(s)
- Michael Geers
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany; Department of Psychology, Humboldt University of Berlin, Rudower Ch. 18, 12489 Berlin, Germany
- Briony Swire-Thompson
- Network Science Institute, Northeastern University, 177 Huntington Ave., Boston, MA, 02115, USA; Institute for Quantitative Social Science, Harvard University, 1737 Cambridge St., Cambridge, MA, 02138, USA
- Philipp Lorenz-Spreen
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
- Stefan M Herzog
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
- Anastasia Kozyreva
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
- Ralph Hertwig
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
105
Calvillo DP, León A, Rutchick AM. Personality and misinformation. Curr Opin Psychol 2024; 55:101752. PMID: 38065004; PMCID: PMC11193381; DOI: 10.1016/j.copsyc.2023.101752.
Abstract
Misinformation poses a significant concern, promoting false beliefs and eroding trust in media. People differ in their susceptibility to believe and to share misinformation. In this article, we reviewed recent research on relationships between personality traits and belief in and sharing of misinformation. Findings show that more extroverted and less conscientious and agreeable people tend to be more susceptible to believing in and sharing misinformation. Additionally, the Dark Triad personality traits of narcissism, psychopathy, and Machiavellianism tend to be positively associated with sharing of misinformation, and narcissism and psychopathy are associated with greater belief in misinformation. Understanding these individual differences can inform interventions to reduce the effects of misinformation.
Affiliation(s)
- Alex León
- Psychology Department, California State University San Marcos, USA
106
Oh J. The "Angry (Digital) Silver" in South Korea: The Rhetoric Around Older Adults' Digital Media Literacy. The Gerontologist 2024; 64:gnad092. PMID: 37439700; DOI: 10.1093/geront/gnad092.
Abstract
Naïve yet threatening is how the South Korean news media have characterized some older adults who have taken to social media to air their political views. Labeled as "angry (digital) silver," these older adults using YouTube and other social media platforms for political activity are portrayed as digitally illiterate and aggressive. This paper examines the rhetoric surrounding older adults' digital media literacy in scholarship and popular news media with a focus on the news media's portrayal of older adults' digital political activity. By analyzing the use of language and various rhetorical strategies, I argue that a specific rhetoric of caution, which warns against older adults' so-called lower digital media literacy, is used to invalidate their digital political activity. I draw upon the case of the "Taegukgi squad" (a political group mainly composed of older adults in South Korea) and the evolution of their digital presence. Addressing the media's biased portrayal of older adults' digital media literacy, this paper further invites reflection on controversies around the role of age in digital political activities around the globe.
Affiliation(s)
- June Oh
- Department of Literature and Languages, University of Texas at Tyler, Tyler, Texas, USA
107
Spampatti T, Hahnel UJJ, Trutnevyte E, Brosch T. Psychological inoculation strategies to fight climate disinformation across 12 countries. Nat Hum Behav 2024; 8:380-398. PMID: 38036655; PMCID: PMC10896732; DOI: 10.1038/s41562-023-01736-0.
Abstract
Decades after the scientific debate about the anthropogenic causes of climate change was settled, climate disinformation still challenges the scientific evidence in public discourse. Here we present a comprehensive theoretical framework of (anti)science belief formation and updating to account for the psychological factors that influence the acceptance or rejection of scientific messages. We experimentally investigated, across 12 countries (N = 6,816), the effectiveness of six inoculation strategies targeting these factors (scientific consensus, trust in scientists, transparent communication, moralization of climate action, accuracy, and positive emotions) to fight real-world disinformation about climate science and mitigation actions. While exposure to disinformation had strong detrimental effects on participants' climate change beliefs (δ = -0.16), affect towards climate mitigation action (δ = -0.33), ability to detect disinformation (δ = -0.14) and pro-environmental behaviour (δ = -0.24), we found almost no evidence for protective effects of the inoculations (all δ < 0.20). We discuss the implications of these findings and propose ways forward to fight climate disinformation.
Affiliation(s)
- Tobia Spampatti
- Swiss Centre for Affective Sciences, University of Geneva, Geneva, Switzerland.
- Faculty of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland.
- Ulf J J Hahnel
- Swiss Centre for Affective Sciences, University of Geneva, Geneva, Switzerland
- Faculty of Psychology, University of Basel, Basel, Switzerland
- Tobias Brosch
- Swiss Centre for Affective Sciences, University of Geneva, Geneva, Switzerland
- Faculty of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
108
Blair RA, Gottlieb J, Nyhan B, Paler L, Argote P, Stainfield CJ. Interventions to counter misinformation: Lessons from the Global North and applications to the Global South. Curr Opin Psychol 2024; 55:101732. PMID: 38070207; DOI: 10.1016/j.copsyc.2023.101732.
Abstract
We synthesize evidence from 176 experimental estimates of 11 interventions intended to combat misinformation in the Global North and Global South, which we classify as informational, educational, sociopsychological, or institutional. Among these, we find the most consistent positive evidence for two informational interventions in both Global North and Global South contexts: inoculation/prebunking and debunking. In a complementary survey of 138 misinformation scholars and practitioners, we find that experts tend to be most optimistic about interventions that have been least widely studied or that have been shown to be mostly ineffective. We provide a searchable database of misinformation randomized controlled trials and suggest avenues for future research to close the gap between expert opinion and academic research.
Affiliation(s)
- Robert A Blair
- Department of Political Science and Watson Institute for International and Public Affairs, Brown University, United States
- Jessica Gottlieb
- Hobby School of Public Affairs, University of Houston, United States
- Brendan Nyhan
- Department of Government, Dartmouth College, United States
- Laura Paler
- Department of Government, School of Public Affairs, American University, United States
- Pablo Argote
- Department of Political Science and International Relations, University of Southern California, United States
109
Riesthuis P, Woods J. "That's just like, your opinion, man": the illusory truth effect on opinions. Psychological Research 2024; 88:284-306. PMID: 37300704; PMCID: PMC10257371; DOI: 10.1007/s00426-023-01845-5.
Abstract
With the expansion of technology, people are constantly exposed to an abundance of information. It is therefore vital to understand how people assess the truthfulness of such information. One indicator of perceived truthfulness seems to be whether it is repeated. That is, people tend to perceive repeated information, regardless of its veracity, as more truthful than new information, a phenomenon known as the illusory truth effect. In the present study, we examined whether such an effect is also observed for opinions and whether the manner in which the information is encoded influences the illusory truth effect. Across three experiments, participants (n = 552) were presented with a list of true information, misinformation, general opinion, and/or social-political opinion statements. First, participants were either instructed to indicate whether the presented statement was a fact or opinion based on its syntax structure (Exp. 1 & 2) or to assign each statement to a topic category (Exp. 3). Subsequently, participants rated the truthfulness of various new and repeated statements. Results showed that repeated information, regardless of the type of information, received higher subjective truth ratings when participants simply encoded the statements by assigning each to a topic. However, when general and social-political opinions were encoded as opinions, we found no evidence of such an effect. Moreover, we found a reversed illusory truth effect for general opinion statements when only considering information that was encoded as an opinion. These findings suggest that how information is encoded plays a crucial role in evaluating truth.
Affiliation(s)
- Paul Riesthuis
- Leuven Institute of Criminology, KU Leuven, Leuven, Belgium.
- Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands.
- Josh Woods
- Faculty of Psychology, Grand View University, Des Moines, IA, USA
110
Rydz E, Telfer J, Quinn EK, Fazel SS, Holmes E, Pennycook G, Peters CE. Canadians' knowledge of cancer risk factors and belief in cancer myths. BMC Public Health 2024; 24:329. PMID: 38291409; PMCID: PMC10829248; DOI: 10.1186/s12889-024-17832-3.
Abstract
BACKGROUND Many untrue statements about cancer prevention and risks are circulating. The objective of this study was to assess Canadians' awareness of known cancer risk factors and cancer myths (untruths or statements that are not completely true), and to explore how awareness may vary by sociodemographic and cognitive factors. METHODS Cancer myths were identified by conducting scans of published and grey literature and social media. Intuitive-analytic thinking disposition scores included actively open-minded and close-minded thinking, as well as preference for intuitive and effortful thinking. A survey was administered online to participants aged 18 years and older through Prolific. Results were summarized descriptively and analyzed using chi-square tests, as well as Spearman rank and Pearson correlations. RESULTS Responses from 734 Canadians were received. Participants were better at identifying known cancer risk factors (70% of known risks) than cancer myths (49%). Bivariate analyses showed differential awareness of known cancer risk factors (p < 0.05) by population density and income, of cancer myths by province, and of both by ethnicity, age, and all thinking disposition scores. Actively open-minded thinking and preference for effortful thinking were associated with greater discernment. Tobacco-related risk factors were well identified (> 90% correctly identified), but recognition of other known risk factors was poor (as low as 23% for low vegetable and fruit intake). Mythical cancer risk factors with high support were consuming additives (61%), feeling stressed (52%), and consuming artificial sweeteners (49%). High uncertainty of causation was observed for glyphosate (66% neither agreed nor disagreed). For factors that reduce cancer risk, reasonable awareness was observed for HPV vaccination (60%), but belief in cancer myths was highly prevalent, particularly that consuming antioxidants (65%) and organic foods (45%) is protective, and there was some uncertainty about whether drinking red wine (41%), consuming vitamins (32%), and smoking cannabis (30%) reduce cancer risk. CONCLUSIONS While Canadians were able to identify tobacco-related cancer risk factors, many myths were believed and numerous risk factors were not recognized. Cancer myths can be harmful in themselves and can divert the public's attention and action away from established risk factors.
Affiliation(s)
- E Rydz
- School of Population and Public Health, CAREX Canada, University of British Columbia, Vancouver, Canada
- Department of Oncology, Cumming School of Medicine, University of Calgary, Calgary, Canada
- J Telfer
- School of Population and Public Health, CAREX Canada, University of British Columbia, Vancouver, Canada
- E K Quinn
- School of Population and Public Health, CAREX Canada, University of British Columbia, Vancouver, Canada
- Department of Oncology, Cumming School of Medicine, University of Calgary, Calgary, Canada
- S S Fazel
- Department of Oncology, Cumming School of Medicine, University of Calgary, Calgary, Canada
- E Holmes
- Canadian Cancer Society, Toronto, Canada
- G Pennycook
- Department of Psychology, Cornell University, New York, USA
- C E Peters
- School of Population and Public Health, CAREX Canada, University of British Columbia, Vancouver, Canada
- Department of Oncology, Cumming School of Medicine, University of Calgary, Calgary, Canada
- BC Centre for Disease Control, Vancouver, BC, Canada
- BC Cancer, Vancouver, BC, Canada
111
DeVerna MR, Guess AM, Berinsky AJ, Tucker JA, Jost JT. Rumors in Retweet: Ideological Asymmetry in the Failure to Correct Misinformation. Personality and Social Psychology Bulletin 2024; 50:3-17. PMID: 36047663; DOI: 10.1177/01461672221114222.
Abstract
We used supervised machine-learning techniques to examine ideological asymmetries in online rumor transmission. Although liberals were more likely than conservatives to communicate in general about the 2013 Boston Marathon bombings (Study 1, N = 26,422) and the 2020 death of the sex trafficker Jeffrey Epstein (Study 2, N = 141,670), conservatives were more likely to share rumors. Rumor-spreading decreased among liberals following official correction, but it increased among conservatives. Marathon rumors were spread twice as often by conservatives pre-correction, and nearly 10 times more often post-correction. Epstein rumors were spread twice as often by conservatives pre-correction, and nearly eight times more often post-correction. With respect to ideologically congenial rumors, conservatives circulated the rumor that the Clinton family was involved in Epstein's death 18.6 times more often than liberals circulated the rumor that the Trump family was involved. More than 96% of all fake news domains were shared by conservative Twitter users.
112
Olschewski S, Luckman A, Mason A, Ludvig EA, Konstantinidis E. The Future of Decisions From Experience: Connecting Real-World Decision Problems to Cognitive Processes. Perspectives on Psychological Science 2024; 19:82-102. PMID: 37390328; PMCID: PMC10790535; DOI: 10.1177/17456916231179138.
Abstract
In many important real-world decision domains, such as finance, the environment, and health, behavior is strongly influenced by experience. Renewed interest in studying this influence led to important advancements in the understanding of these decisions from experience (DfE) in the last 20 years. Building on this literature, we suggest ways the standard experimental design should be extended to better approach important real-world DfE. These extensions include, for example, introducing more complex choice situations, delaying feedback, and including social interactions. When acting upon experiences in these richer and more complicated environments, extensive cognitive processes go into making a decision. Therefore, we argue for integrating cognitive processes more explicitly into experimental research in DfE. These cognitive processes include attention to and perception of numeric and nonnumeric experiences, the influence of episodic and semantic memory, and the mental models involved in learning processes. Understanding these basic cognitive processes can advance the modeling, understanding and prediction of DfE in the laboratory and in the real world. We highlight the potential of experimental research in DfE for theory integration across the behavioral, decision, and cognitive sciences. Furthermore, this research could lead to new methodology that better informs decision-making and policy interventions.
Affiliation(s)
- Sebastian Olschewski
- Department of Psychology, University of Basel
- Warwick Business School, University of Warwick
- Ashley Luckman
- Warwick Business School, University of Warwick
- University of Exeter Business School, University of Exeter
- Alice Mason
- Department of Psychology, University of Bath
- Department of Psychology, University of Warwick
113
Brauner F, Fonagy P, Campbell C, Griem J, Storck T, Nolte T. "Trust me, do not trust anyone": how epistemic mistrust and credulity are associated with conspiracy mentality. Research in Psychotherapy (Milano) 2023; 26:705. PMID: 38156557; PMCID: PMC10782893; DOI: 10.4081/ripppo.2023.705.
Abstract
Previous research shows that the propensity to endorse conspiracy theories is associated with disrupted forms of epistemic trust, i.e., the appropriate openness towards interpersonally communicated information. There are associations, first, with an increased mistrust in several actors and institutions responsible for the communication of information in society, and second, with a pronounced credulity in unreliable sources and implausible phenomena (e.g., superstition, astrology). This study aims to investigate whether these phenomena are associated with specific personality-related disruptions of epistemic trust. Based on self-reported data from 417 individuals (mean = 33.28; standard deviation = 11.11) from a UK population sampled online, the potential relationships between disruptions in epistemic trust and the endorsement of a conspiracy mentality are explored. The epistemic stances characterized by mistrust and credulity (independent variables) are measured with the epistemic trust, mistrust, and credulity questionnaire (ETMCQ), and conspiracy mentality (dependent variable) is measured with the conspiracy mentality questionnaire. In a multiple linear regression model, mistrust is associated with the endorsement of a conspiracy mentality, even when accounting for other contributing factors (e.g., individual narcissism, attachment avoidance and anxiety, authoritarianism, loneliness). In a bootstrapped mediation model controlling for other relevant predictors, the association between credulity and conspiracy mentality is fully mediated by mistrust. In future research, the impact of disrupted epistemic trust on conspiracy beliefs should be investigated in terms of the specific epistemic stances of mistrust and credulity. In this respect, the ETMCQ represents a highly promising instrument to assess individual differences in factors underpinning aspects of conspiracy endorsement.
Affiliation(s)
- Felix Brauner
- Department of Clinical Psychology and Psychotherapy, Psychologische Hochschule Berlin
- Peter Fonagy
- Research Department of Clinical, Educational and Health Psychology, University College London
- Chloe Campbell
- Research Department of Clinical, Educational and Health Psychology, University College London
- Julia Griem
- Institute of Psychiatry, Psychology and Neuroscience, King's College London
- Timo Storck
- Department of Clinical Psychology and Psychotherapy, Psychologische Hochschule Berlin
- Tobias Nolte
- Research Department of Clinical, Educational and Health Psychology, University College London
114
Hwang Y, So J, Jeong SH. Does COVID-19 Message Fatigue Lead to Misinformation Acceptance? An Extension of the Risk Information Seeking and Processing Model. Health Communication 2023; 38:2742-2749. PMID: 35968837; DOI: 10.1080/10410236.2022.2111636.
Abstract
Based on the Risk Information Seeking and Processing Model, the present study examines whether COVID-19 message fatigue leads to greater information avoidance and heuristic processing, and consequently greater acceptance of misinformation. We conducted a survey of 821 Korean adults regarding their information seeking and processing regarding COVID-19 vaccination. Results of SEM analyses showed that COVID-19 message fatigue was (a) negatively related to information insufficiency and (b) positively related to information avoidance and heuristic processing. Information avoidance and heuristic processing were subsequently related to greater levels of misinformation acceptance. Theoretical and practical implications are discussed.
Affiliation(s)
- Yoori Hwang
- Department of Digital Media, Myongji University
- Jiyeon So
- Department of Communication, Yonsei University
115
Wilson A, Wilkes S, Teramoto Y, Hale S. Multimodal analysis of disinformation and misinformation. Royal Society Open Science 2023; 10:230964. PMID: 38126058; PMCID: PMC10731323; DOI: 10.1098/rsos.230964.
Abstract
The use of disinformation and misinformation campaigns in the media has attracted much attention from academics and policy-makers. Multimodal analysis, the analysis of two or more semiotic systems (language, gestures, images, sounds, among others) in their interrelation and interaction, is essential to understanding dis-/misinformation efforts because most human communication goes beyond just words. Many disciplines (e.g. computer science, linguistics, political science, communication studies) are developing methods and analytical models of multimodal communication. This literature review brings research strands from these disciplines together, providing a map of the multi- and interdisciplinary landscape for multimodal analysis of dis-/misinformation. It records the substantial growth, starting from the second quarter of 2020 (the start of the COVID-19 epidemic in Western Europe), in the number of studies on multimodal dis-/misinformation coming from the field of computer science, and examines that category of studies in more detail. Finally, the review identifies gaps in multimodal research on dis-/misinformation and suggests ways to bridge them, including future cross-disciplinary research directions. Our review provides scholars from different disciplines working on dis-/misinformation with a much-needed bird's-eye view of this rapidly emerging research.
Affiliation(s)
- Anna Wilson
- Oxford School of Global and Area Studies, University of Oxford, Oxford OX1 2JD, UK
- Seb Wilkes
- Department of Physics, University of Oxford, Oxford, UK
- Scott Hale
- Oxford Internet Institute, University of Oxford, Oxford, UK
116
Capraro V, Celadin T. "I Think This News Is Accurate": Endorsing Accuracy Decreases the Sharing of Fake News and Increases the Sharing of Real News. Personality and Social Psychology Bulletin 2023; 49:1635-1645. [PMID: 35993352] [PMCID: PMC10637098] [DOI: 10.1177/01461672221117691]
Abstract
Accuracy prompts, nudges that make accuracy salient, typically decrease the sharing of fake news while having little effect on real news. Here, we introduce a new accuracy prompt that is more effective than previous prompts because it not only reduces fake news sharing but also increases real news sharing. We report four preregistered studies showing that an "endorsing accuracy" prompt ("I think this news is accurate"), placed into the sharing button, decreases fake news sharing, increases real news sharing, and keeps overall engagement constant. We also explore the mechanism through which the intervention works. The key results are specific to endorsing accuracy rather than accuracy salience, and endorsing accuracy does not simply make participants apply a "source heuristic." Finally, we use Pennycook et al.'s limited-attention model to argue that endorsing accuracy may work by making people consider their sharing decisions more carefully.
117
Lee SJ, Lee CJ, Hwang H. The Role of Deliberative Cognitive Styles in Preventing Belief in Politicized COVID-19 Misinformation. Health Communication 2023; 38:2904-2914. [PMID: 36134653] [DOI: 10.1080/10410236.2022.2125119]
Abstract
Misinformation related to COVID-19 is a threat to public health. The present study examined the potential of deliberative cognitive styles, such as actively open-minded thinking and the need for evidence, to deter belief in misinformation and promote belief in true information related to COVID-19. In addition, given how responses to the pandemic have been politicized, the roles of political orientation and motivated reasoning were also examined. We conducted a survey in South Korea (N = 1466) during May 2020. Participants answered measures related to demographics, open-minded thinking, need for evidence, and accuracy perceptions of COVID-19 misinformation and true information items. Multi-level analyses of the survey data found that while motivated reasoning was present, deliberative cognitive styles (actively open-minded thinking and need for evidence) decreased belief in misinformation without intensifying motivated reasoning tendencies. Findings also showed a political asymmetry, with conservatives detecting COVID-19 misinformation at a lower rate. Overall, results suggest that health communication related to COVID-19 misinformation should pay attention to conservative populations. Results also imply that interventions that activate deliberative cognitive styles hold promise for reducing belief in COVID-19 misinformation.
Affiliation(s)
- Chul-Joo Lee
- Department of Communication, Seoul National University
- Hyunjung Hwang
- Department of Broadcasting Regulation Research, Korea Information Society Development Institute
118
Martel C, Rand DG. Misinformation warning labels are widely effective: A review of warning effects and their moderating features. Curr Opin Psychol 2023; 54:101710. [PMID: 37972523] [DOI: 10.1016/j.copsyc.2023.101710]
Abstract
There is growing concern over the spread of misinformation online. One intervention widely adopted by platforms for addressing falsehoods is applying "warning labels" to posts deemed inaccurate by fact-checkers. Despite a rich literature on correcting misinformation after exposure, much less work has examined the effectiveness of warning labels presented concurrently with exposure. Promisingly, existing research suggests that warning labels effectively reduce belief in and spread of misinformation. The size of these beneficial effects depends on how the labels are implemented and the characteristics of the content being labeled. Despite some individual differences, recent evidence indicates that warning labels are generally effective across party lines and other demographic characteristics. We discuss potential implications and limitations of labeling policies for addressing online misinformation.
Affiliation(s)
- Cameron Martel
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA 02142, USA
- David G Rand
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA 02142, USA; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
119
Di Domenico G, Ding Y. Between brand attacks and broader narratives: How direct and indirect misinformation erode consumer trust. Curr Opin Psychol 2023; 54:101716. [PMID: 37952396] [DOI: 10.1016/j.copsyc.2023.101716]
Abstract
Misinformation can take various forms, from political propaganda and health-related fake news to conspiracy theories. This review investigates the consequences of both direct and indirect misinformation for brands and consumers. We review the marketing literature focused on the consequences of misinformation spread and propose a framework that acknowledges the relationship between brands and consumers in a misinformation environment. We argue that the primary consequence of misinformation is the erosion of trust among the various actors in the marketplace. Additionally, we highlight that a comprehensive understanding of the consequences of misinformation should also consider the effects of indirect misinformation on the marketplace.
Affiliation(s)
- Giandomenico Di Domenico
- Cardiff Business School, Cardiff University, Aberconway Building, Colum Rd, CF10 3EU, Cardiff, UK
- Yu Ding
- Stanford Graduate School of Business, Stanford University, 655 Knight Way, Stanford, CA 94305, USA
120
Leon CS, Bonilla M, Brusco LI, Forcato C, Benítez FU. Fake news and false memory formation in the psychology debate. IBRO Neurosci Rep 2023; 15:24-30. [PMID: 37359499] [PMCID: PMC10285207] [DOI: 10.1016/j.ibneur.2023.06.002]
Abstract
Fake news can generate memory distortions and influence people's behavior. In the context of major public debates, the tendency to form false memories from fake news appears to be modulated by each individual's ideological alignment. This effect has been observed mainly around issues involving large sectors of society, but little is known about its impact on smaller-scale discussions focused on more specific populations. In this work we examine the formation of false memories from fake news in the debate between psychological currents in Argentina. To this end, 326 individuals aligned with psychoanalysis (PSA) or Evidence-Based Practices (EBP) viewed a series of news items (12 true and 8 fabricated). The EBP group remembered or believed more of the fake news that damaged PSA. They also remembered statements from news items that harmed their own school more precisely than statements referring to others. These results could reflect an imbalance in commitment between the two parties: the group proposing a paradigm shift (EBP) exhibited a congruence effect, while the group whose orientation is hegemonic in this field (PSA) showed no effect of ideological alignment. That the congruence effect manifests to some extent in settings as consequential as the education of mental health professionals highlights the need for more careful practices in the consumption and production of media.
Affiliation(s)
- Candela S. Leon
- Laboratorio de Sueño y Memoria, Departamento de Ciencias de la Vida, Instituto Tecnológico de Buenos Aires (ITBA), Buenos Aires, Argentina
- Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Buenos Aires, Argentina
- Matías Bonilla
- Laboratorio de Sueño y Memoria, Departamento de Ciencias de la Vida, Instituto Tecnológico de Buenos Aires (ITBA), Buenos Aires, Argentina
- Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Buenos Aires, Argentina
- Luis I. Brusco
- Centro de Neuropsiquiatría y Neurología de la Conducta (CENECON), Buenos Aires, Argentina
- Cecilia Forcato
- Laboratorio de Sueño y Memoria, Departamento de Ciencias de la Vida, Instituto Tecnológico de Buenos Aires (ITBA), Buenos Aires, Argentina
- Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Buenos Aires, Argentina
- Facundo Urreta Benítez
- Laboratorio de Sueño y Memoria, Departamento de Ciencias de la Vida, Instituto Tecnológico de Buenos Aires (ITBA), Buenos Aires, Argentina
- Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Buenos Aires, Argentina
- Innocence Project Argentina, Buenos Aires, Argentina
121
Béna J, Rihet M, Carreras O, Terrier P. Repetition could increase the perceived truth of conspiracy theories. Psychon Bull Rev 2023; 30:2397-2406. [PMID: 37219761] [PMCID: PMC10204694] [DOI: 10.3758/s13423-023-02276-4]
Abstract
Conspiracy theories can be encountered repeatedly, which raises the issue of the effect of their repeated exposure on beliefs. Earlier studies found that repetition increases truth judgments of factual statements, whether they are uncertain, highly implausible, or fake news, for instance. Would this "truth effect" be observed with conspiracy statements? If so, is the effect size smaller than the typical truth effect, and is it associated with individual differences such as cognitive style and conspiracy mentality? In the present preregistered study, we addressed these three issues. We asked participants to provide binary truth judgments to conspiracy and factual statements already displayed in an exposure phase (an interest judgment task) or that were new (displayed only in the truth judgment task). We measured participants' cognitive style with the three-item Cognitive Reflection Test (CRT), and conspiracy mentality with the Conspiracy Mentality Questionnaire (CMQ). Importantly, we found that repetition increased truth judgments of conspiracy theories, unmoderated by cognitive style and conspiracy mentality. Additionally, we found that the truth effect was smaller with conspiracy theories than with uncertain factual statements, and suggest explanations for this difference. The results suggest that repetition may be a simple way to increase belief in conspiracy theories. Whether repetition increases conspiracy beliefs in natural settings and how it contributes to conspiracism compared to other factors are important questions for future research.
Affiliation(s)
- Jérémy Béna
- UCLouvain, PSP IPSY, 10 Place du Cardinal Mercier, 1348 Louvain-la-Neuve, Belgium
- Mathias Rihet
- CLLE, Université de Toulouse, CNRS, Toulouse, France
122
Perez Santangelo A, Solovey G. Understanding belief in political statements using a model-driven experimental approach: a registered report. Sci Rep 2023; 13:21205. [PMID: 38040761] [PMCID: PMC10692149] [DOI: 10.1038/s41598-023-47939-3]
Abstract
Misinformation harms society by affecting citizens' beliefs and behaviour. Recent research has shown that partisanship and cognitive reflection (i.e. engaging in analytical thinking) play key roles in the acceptance of misinformation. However, the relative importance of these factors remains a topic of ongoing debate. In this registered study, we tested four hypotheses on the relationship between each factor and the belief in statements made by Argentine politicians. Participants (N = 1353) classified fact-checked political statements as true or false, completed a cognitive reflection test, and reported their voting preferences. Using Signal Detection Theory and Bayesian modeling, we found a reliable positive association between political concordance and overall belief in a statement (median = 0.663, CI95 = [0.640, 0.685]), a reliable positive association between cognitive reflection and scepticism (median = 0.039, CI95 = [0.006, 0.072]), a positive but unreliable association between cognitive reflection and truth discernment (median = 0.016, CI95 = [-0.015, 0.046]), and a negative but unreliable association between cognitive reflection and partisan bias (median = -0.016, CI95 = [-0.037, 0.006]). Our results highlight the need to further investigate the relationship between cognitive reflection and partisanship in different contexts and formats. Protocol registration: The Stage 1 protocol for this Registered Report was accepted in principle on 22 August 2022. The protocol, as accepted by the journal, can be found at https://doi.org/10.17605/OSF.IO/EBRGC.
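The signal-detection quantities named above (truth discernment, response bias) have a simple textbook form. A minimal sketch with hypothetical hit and false-alarm rates (the study itself fit a Bayesian hierarchical model to participants' judgments; these numbers are made up for illustration):

```python
from statistics import NormalDist

# Hypothetical rates for politically concordant statements:
# "hit" = classifying a true statement as true,
# "false alarm" = classifying a false statement as true.
hit_rate, fa_rate = 0.80, 0.45

z = NormalDist().inv_cdf  # probit (inverse-normal) transform

d_prime = z(hit_rate) - z(fa_rate)             # discernment: sensitivity to truth
criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias: lower = more "true" responses

print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")
```

In the study's framing, partisan bias corresponds to how the criterion shifts between politically concordant and discordant statements, while discernment corresponds to d'.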
Affiliation(s)
- Agustín Perez Santangelo
- Instituto de Investigación en Ciencias de la Computación, Universidad de Buenos Aires (UBA), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), C1428EGA, Buenos Aires, Argentina
- Laboratorio de Neurociencia, CONICET, Universidad Torcuato Di Tella, C1428BIJ, Buenos Aires, Argentina
- Guillermo Solovey
- Instituto de Cálculo, Facultad de Ciencias Exactas y Naturales, UBA-CONICET, Buenos Aires, Argentina
123
Koller WN, Thompson H, Cannon TD. Conspiracy mentality, subclinical paranoia, and political conservatism are associated with perceived status threat. PLoS One 2023; 18:e0293930. [PMID: 37992025] [PMCID: PMC10664880] [DOI: 10.1371/journal.pone.0293930]
Abstract
Status threat (i.e., concern that one's dominant social group will be undermined by outsiders) is a significant factor in current United States politics. While demographic factors such as race (e.g., Whiteness) and political affiliation (e.g., conservatism) tend to be associated with heightened levels of status threat, its psychological facets have yet to be fully characterized. Informed by a "paranoid" model of American politics, we explored a suite of possible psychological and demographic associates of perceived status threat, including race/ethnicity, political conservatism, analytic thinking, magical ideation, subclinical paranoia, and conspiracy mentality. In a small, quota sample drawn from the United States (N = 300), we found that conspiracy mentality, subclinical paranoia, conservatism, and age were each positively (and uniquely) associated with status threat. In addition to replicating past work linking conservatism to status threat, this study identifies subclinical paranoia and conspiracy mentality as novel psychological associates of status threat. These findings pave the way for future research regarding how and why status threat concerns may become exaggerated in certain individuals, possibly to the detriment of personal and societal wellbeing.
Affiliation(s)
- William N. Koller
- Department of Psychology, Yale University, New Haven, Connecticut, United States of America
- Honor Thompson
- Department of Psychology, Yale University, New Haven, Connecticut, United States of America
- Tyrone D. Cannon
- Department of Psychology, Yale University, New Haven, Connecticut, United States of America
- Department of Psychiatry, Yale University, New Haven, Connecticut, United States of America
124
Kotz J, Giese H, König LM. How to debunk misinformation? An experimental online study investigating text structures and headline formats. Br J Health Psychol 2023; 28:1097-1112. [PMID: 37263771] [DOI: 10.1111/bjhp.12670]
Abstract
Objectives: Misinformation is a crucial problem, particularly online, and the success of debunking messages has so far been limited. In this study, we experimentally test how debunking text structure (truth sandwich vs. bottom-heavy) and headline format (statement vs. question) affect belief in misinformation across the topics of COVID vaccine safety and GMO foods. Design: Experimental online study. Methods: A representative German sample of 4906 participants was randomly assigned to read one of eight debunking messages in the experimentally varied formats, then rated the acceptance of the message and their agreement with misinformation statements about the mentioned topics and an unrefuted control myth. Results: While the debunking messages specifically decreased belief in the targeted myth, these beliefs and the acceptance of the debunking message were unaffected by text structure and headline format. The messages were, however, less successful with individuals holding strong pre-existing, incongruent attitudes and distrust in science. Conclusions: The risk of backfire effects in debunking misinformation is low. Text structure and headline format matter relatively little for the effectiveness of debunking messages. Instead, writers may need to focus on making the text comprehensive, trustworthy, and persuasive to maximize effectiveness.
Affiliation(s)
- Johannes Kotz
- Department of Psychology, University of Konstanz, Konstanz, Germany
- Faculty of Life Sciences: Food, Nutrition and Health, University of Bayreuth, Bayreuth, Germany
- Helge Giese
- Department of Psychology, University of Konstanz, Konstanz, Germany
- Heisenberg Chair for Medical Risk Literacy and Evidence-based Decisions, Charité - Universitätsmedizin Berlin, Berlin, Germany
- Laura M König
- Faculty of Life Sciences: Food, Nutrition and Health, University of Bayreuth, Bayreuth, Germany
125
Abstract
In the last decade there has been a proliferation of research on misinformation. One important aspect of this work that receives less attention than it should is exactly why misinformation is a problem. To adequately address this question, we must first look to its speculated causes and effects. We examined different disciplines (computer science, economics, history, information science, journalism, law, media, politics, philosophy, psychology, sociology) that investigate misinformation. The consensus view points to advancements in information technology (e.g., the Internet, social media) as a main cause of the proliferation and increasing impact of misinformation, with a variety of illustrations of the effects. We critically analyzed both issues. As to the effects, misbehaviors have not yet been reliably demonstrated empirically to be the outcome of misinformation; treating correlation as causation may have a hand in that perception. As to the cause, advancements in information technologies enable, as well as reveal, multitudes of interactions that represent significant deviations from ground truths through people's new way of knowing (intersubjectivity). This, we argue, is illusory when understood in light of historical epistemology. Both doubts we raise are used to consider the cost to established norms of liberal democracy that comes from efforts to target the problem of misinformation.
Affiliation(s)
- Zoë Adams
- Department of Linguistics, School of Languages, Linguistics and Film, Queen Mary University of London
- Magda Osman
- Centre for Science and Policy, University of Cambridge
- Judge Business School, University of Cambridge
- Leeds Business School, University of Leeds
- Björn Meder
- Department of Psychology, Health and Medical University, Potsdam, Germany
- Max Planck Research Group iSearch, Max Planck Institute for Human Development, Berlin, Germany
126
Skipper Y, Jolley D, Reddington J. 'But wait, that isn't real': A proof-of-concept study evaluating 'Project Real', a co-created intervention that helps young people to spot fake news online. British Journal of Developmental Psychology 2023; 41:371-384. [PMID: 37386791] [DOI: 10.1111/bjdp.12456]
Abstract
As misinformation is one of the top risks facing the world today, it is vital to ensure that young people have the confidence and skills to recognize fake news. Therefore, we used co-creation to develop an intervention (called 'Project Real') and tested its efficacy in a proof-of-concept study. One hundred and twenty-six pupils aged 11-13 completed questionnaires before and after the intervention that measured confidence and ability to recognize fake news and the number of checks they would make before sharing news. Twenty-seven pupils and three teachers participated in follow-up discussions to evaluate Project Real. Quantitative data indicated that Project Real increased participants' confidence in recognizing fake news and the number of checks they intended to make before sharing news. However, there was no change in their ability to recognize fake news. Qualitative data indicated that participants felt that they had improved their skills and confidence in recognizing fake news, supporting the quantitative data.
127
Kožuh I, Čakš P. Social Media Fact-Checking: The Effects of News Literacy and News Trust on the Intent to Verify Health-Related Information. Healthcare (Basel) 2023; 11:2796. [PMID: 37893870] [PMCID: PMC10606871] [DOI: 10.3390/healthcare11202796]
Abstract
The recent health crisis and the rapid development of artificial intelligence have caused misinformation on social media to flourish, becoming more sophisticated and harder to detect. This calls for fact-checking and raises questions about users' competencies and attitudes when assessing social media news. Our study provides a model of how fact-checking intent is explained by news literacy and news trust, to examine how users behave in the misinformation-prone social media environment. Structural equation modeling was used to examine survey data gathered from social media users. The findings revealed that users' intent to fact-check information in social media news is explained by (1) news literacy, such as awareness of the various techniques creators use to depict situations about COVID-19; (2) news trust, in terms of the conviction that the news contains all the essential facts; and (3) intent, such as an aim to check information in multiple pieces of news. These findings may aid policymakers and practitioners in developing efficient communication strategies for reaching users less prone to fact-checking. Our contribution offers a new understanding of news literacy as a tool for combating misinformation, one that actively equips users with knowledge and attitudes for fact-checking social media news.
Affiliation(s)
- Ines Kožuh
- Faculty of Electrical Engineering and Computer Science, University of Maribor, 2000 Maribor, Slovenia
128
Buder J, Zimmermann A, Buttliere B, Rabl L, Vogel M, Huff M. Online Interaction Turns the Congeniality Bias Into an Uncongeniality Bias. Psychol Sci 2023; 34:1055-1068. [PMID: 37722137] [DOI: 10.1177/09567976231194590]
Abstract
Online phenomena like echo chambers and polarization are believed to be driven by humans' penchant for selectively exposing themselves to attitudinally congenial content. However, if like-minded content were the only predictor of online behavior, heated debate and flaming on the Internet would hardly occur. Research has overlooked how online behavior changes when people are given an opportunity to reply to dissenters. Three experiments (total N = 320; convenience student samples from Germany) and an internal meta-analysis show that in a discussion-forum setting where participants can reply to earlier comments, greater cognitive conflict between participant attitude and comment attitude predicts a higher likelihood of responding (an uncongeniality bias). When the discussion climate was friendly (vs. oppositional) to the views of participants, the uncongeniality bias was more pronounced and was also associated with attitude polarization. These results suggest that belief polarization on social media may be driven not only by congeniality but also by conflict.
Affiliation(s)
- Jürgen Buder
- Perception and Action Lab, Leibniz-Institut für Wissensmedien, Tübingen
- Anja Zimmermann
- Perception and Action Lab, Leibniz-Institut für Wissensmedien, Tübingen
- Department of Research and Transfer, Technical University of Darmstadt
- Brett Buttliere
- Perception and Action Lab, Leibniz-Institut für Wissensmedien, Tübingen
- Faculty of Humanities, Nicolaus Copernicus University
- Lisa Rabl
- Perception and Action Lab, Leibniz-Institut für Wissensmedien, Tübingen
- Moritz Vogel
- Perception and Action Lab, Leibniz-Institut für Wissensmedien, Tübingen
- Markus Huff
- Perception and Action Lab, Leibniz-Institut für Wissensmedien, Tübingen
- Department of Psychology, University of Tübingen
129
Sude DJ, Sharon G, Dvir-Gvirsman S. True, justified, belief? Partisanship weakens the positive effect of news media literacy on fake news detection. Front Psychol 2023; 14:1242865. [PMID: 37823073] [PMCID: PMC10562704] [DOI: 10.3389/fpsyg.2023.1242865]
Abstract
To investigate how people assess whether politically consistent news is real or fake, two studies (N = 1,008; N = 1,397) with adult American participants, conducted in 2020 and 2022, used a within-subjects experimental design to examine perceptions of news accuracy. When a mock Facebook post with either fake (Study 1) or real (Study 2) news content was attributed to an alternative (vs. a mainstream) news outlet, it was, on average, perceived to be less accurate. Those with beliefs reflecting News Media Literacy demonstrated greater sensitivity to the outlet's status. This relationship was itself contingent on the strength of the participant's partisan identity: strong partisans high in News Media Literacy defended the accuracy of politically consistent content even while recognizing that an outlet was unfamiliar. These results highlight the fundamental importance of examining the interaction between user traits and features of social media news posts when studying learning from political news on social media.
Affiliation(s)
- Daniel Jeffrey Sude
- Department of Organizational Sciences and Communication, George Washington University, Washington, DC, United States
- Gil Sharon
- DAN Department of Communication, Tel Aviv University, Tel Aviv, Israel
130
Ahmed S, Rasul ME. Examining the association between social media fatigue, cognitive ability, narcissism and misinformation sharing: cross-national evidence from eight countries. Sci Rep 2023; 13:15416. [PMID: 37723265] [PMCID: PMC10507063] [DOI: 10.1038/s41598-023-42614-z]
Abstract
Several studies have explored the causes and consequences of public engagement with misinformation and, more recently, COVID-19 misinformation. However, there is still a need to understand the mechanisms that drive misinformation propagation on social media. In addition, evidence from non-Western societies remains rare. This study reports on survey evidence from eight countries, examining whether social media fatigue can lead users to believe misinformation, which in turn shapes their sharing intentions. Our insights also build on prior cognitive and personality literature by exploring how this mechanism is conditional on users' cognitive ability and narcissism traits. The results suggest that social media fatigue can foster false beliefs in misinformation, which translate into sharing on social media. We also find that those with high levels of cognitive ability are less likely to believe and share misinformation. However, those with low cognitive ability and high levels of narcissism are the most likely to share misinformation on social media as a result of social media fatigue. This study is one of the first to provide cross-national comparative evidence highlighting the adverse effects of social media fatigue on misinformation propagation and establishing that the relationship is not universal but depends on both cognitive and dark personality traits of individuals.
Affiliation(s)
- Saifuddin Ahmed
- Wee Kim Wee School of Communication and Information, Nanyang Technological University, Singapore, Singapore.
131. Lisi M. Navigating the COVID-19 infodemic: the influence of metacognitive efficiency on health behaviours and policy attitudes. R Soc Open Sci 2023; 10:230417. PMID: 37680503; PMCID: PMC10480698; DOI: 10.1098/rsos.230417.
Abstract
The COVID-19 pandemic has been accompanied by an infodemic of misinformation and increasing polarization around public health measures, such as social distancing and national lockdowns. In this study, I examined metacognitive efficiency-the extent to which the subjective feeling of knowing predicts the objective accuracy of knowledge-as a tool to understand and measure the assimilation of misleading misinformation in a balanced sample of Great Britain's population (N = 1689), surveyed at the end of the third national lockdown. Using a signal-detection theory approach to quantify metacognitive efficiency, I found that at the population level, metacognitive efficiency for COVID-19 knowledge was impaired compared with general knowledge, indicating a worse alignment between confidence levels and the actual ability to discern true and false statements. Crucially, individual differences in metacognitive efficiency related to COVID-19 knowledge predicted health-protective behaviours, vaccination intentions and attitudes towards public health measures, even after accounting for the level of knowledge itself and demographic covariates, such as education, income and political alignment. These results reveal the significant impact of misinformation on public beliefs and suggest that fostering confidence in accurate knowledge should be a key target for science communication efforts aimed at promoting compliance with public health and social measures.
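The metacognition result above rests on standard signal-detection quantities. As a generic illustration only (not the paper's actual meta-d′ analysis, and with invented trial counts), type-1 sensitivity d′ for discriminating true from false statements can be computed like this:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Type-1 sensitivity: d' = z(hit rate) - z(false-alarm rate).
    A log-linear correction keeps rates of 0 or 1 finite."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical respondent judging 50 true and 50 false statements:
skilled = d_prime(hits=45, misses=5, false_alarms=5, correct_rejections=45)
guessing = d_prime(hits=25, misses=25, false_alarms=25, correct_rejections=25)
```

Metacognitive efficiency then compares a matching type-2 quantity (meta-d′, estimated from confidence ratings) against this type-1 d′; that estimation step is beyond this sketch.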
Affiliation(s)
- Matteo Lisi
- Department of Psychology, University of Essex, Essex, UK
- Department of Psychology, Royal Holloway, University of London, London, UK
132. Lin H, Lasser J, Lewandowsky S, Cole R, Gully A, Rand DG, Pennycook G. High level of correspondence across different news domain quality rating sets. PNAS Nexus 2023; 2:pgad286. PMID: 37719749; PMCID: PMC10500312; DOI: 10.1093/pnasnexus/pgad286.
Abstract
One widely used approach for quantifying misinformation consumption and sharing is to evaluate the quality of the news domains that a user interacts with. However, different media organizations and fact-checkers have produced different sets of news domain quality ratings, raising questions about the reliability of these ratings. In this study, we compared six sets of expert ratings and found that they generally correlated highly with one another. We then created a comprehensive set of domain ratings for use by the research community (github.com/hauselin/domain-quality-ratings), leveraging an ensemble "wisdom of experts" approach. To do so, we performed imputation together with principal component analysis to generate a set of aggregate ratings. The resulting rating set comprises 11,520 domains-the most extensive coverage to date-and correlates well with other rating sets that have more limited coverage. Together, these results suggest that experts generally agree on the relative quality of news domains, and the aggregate ratings that we generate offer a powerful research tool for evaluating the quality of news consumed or shared and the efficacy of misinformation interventions.
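The aggregation step described here (imputation together with principal component analysis) can be sketched in a few lines. This is a generic illustration of the idea, not the authors' exact pipeline: it assumes simple per-rater mean imputation and takes the first principal component as the aggregate score.

```python
import numpy as np

def aggregate_ratings(ratings):
    """Combine several raters' domain-quality ratings (rows = news domains,
    columns = rating sets, NaN = domain not covered by that set) into one
    aggregate score per domain via imputation + first principal component."""
    R = np.asarray(ratings, dtype=float)
    col_means = np.nanmean(R, axis=0)
    filled = np.where(np.isnan(R), col_means, R)  # per-rater mean imputation
    centered = filled - filled.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ vt[0]  # projection onto the first principal component
    # Orient the component so higher scores mean higher-rated domains.
    if np.corrcoef(scores, filled.mean(axis=1))[0, 1] < 0:
        scores = -scores
    return scores
```

When the rating sets correlate highly, as the study reports, the first component captures the shared "quality" dimension the experts agree on.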
Affiliation(s)
- Hause Lin
- Hill/Levene Schools of Business, University of Regina, 3737 Wascana Parkway, Regina, SK, S4S 0A2, Canada
- Sloan School of Management, Massachusetts Institute of Technology, 100 Main St, Cambridge, MA 02142, USA
- Department of Psychology, Cornell University, Uris Hall, 211, Tower Rd, Ithaca, NY 14853, USA
- Jana Lasser
- Institute for Interactive Systems and Data Science, Graz University of Technology, Inffeldgasse 16C, 8010 Graz, Austria
- Complexity Science Hub Vienna, Josefstädterstraße 39, 1080 Vienna, Austria
- Stephan Lewandowsky
- School of Psychological Science, University of Bristol, 12a, Priory Road, Bristol BS8 1TU, UK
- School of Psychology, University of Western Australia, 35 Stirling Hwy, Crawley, WA 6009, Australia
- Rocky Cole
- Jigsaw (Google LLC), 1600 Amphitheatre Parkway, Mountain View, CA 94043, USA
- Andrew Gully
- Jigsaw (Google LLC), 1600 Amphitheatre Parkway, Mountain View, CA 94043, USA
- David G Rand
- Sloan School of Management, Massachusetts Institute of Technology, 100 Main St, Cambridge, MA 02142, USA
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar St, Cambridge, MA 02139, USA
- Gordon Pennycook
- Hill/Levene Schools of Business, University of Regina, 3737 Wascana Parkway, Regina, SK, S4S 0A2, Canada
- Department of Psychology, Cornell University, Uris Hall, 211, Tower Rd, Ithaca, NY 14853, USA
- Department of Psychology, University of Regina, 3737 Wascana Parkway, Regina, SK S4S 0A2, Canada
133. Meng Y, Broom M, Li A. Impact of misinformation in the evolution of collective cooperation on networks. J R Soc Interface 2023; 20:20230295. PMID: 37751874; PMCID: PMC10522409; DOI: 10.1098/rsif.2023.0295.
Abstract
Human societies are organized and developed through collective cooperative behaviours. Based on the information in their environment, individuals can form collective cooperation by strategically changing unfavourable surroundings and imitating superior behaviours. However, facing the rampant proliferation and spread of misinformation, we still lack systematic investigations into the impact of misinformation on the evolution of collective cooperation. Here, we study this problem using classical evolutionary game theory. We find that the existence of misinformation generally impedes the emergence of collective cooperation on networks, although the level of cooperation is slightly higher for weak social cooperative dilemmas below a proven threshold. We further show that this possible advantage diminishes as social connections become denser, suggesting that the detrimental effect of misinformation increases further when 'social viscosity' is low. Our results uncover the quantitative effect of misinformation on suppressing collective cooperation, and pave the way for designing possible mechanisms to improve collective cooperation.
Affiliation(s)
- Yao Meng
- Center for Systems and Control, College of Engineering, Peking University, Beijing 100871, People’s Republic of China
- Mark Broom
- Department of Mathematics, City, University of London, Northampton Square, London EC1V 0HB, UK
- Aming Li
- Center for Systems and Control, College of Engineering, Peking University, Beijing 100871, People’s Republic of China
- Center for Multi-Agent Research, Institute for Artificial Intelligence, Peking University, Beijing 100871, People’s Republic of China
134. Arechar AA, Allen J, Berinsky AJ, Cole R, Epstein Z, Garimella K, Gully A, Lu JG, Ross RM, Stagnaro MN, Zhang Y, Pennycook G, Rand DG. Understanding and combatting misinformation across 16 countries on six continents. Nat Hum Behav 2023; 7:1502-1513. PMID: 37386111; DOI: 10.1038/s41562-023-01641-6.
Abstract
The spread of misinformation online is a global problem that requires global solutions. To that end, we conducted an experiment in 16 countries across 6 continents (N = 34,286; 676,605 observations) to investigate predictors of susceptibility to misinformation about COVID-19, and interventions to combat the spread of this misinformation. In every country, participants with a more analytic cognitive style and stronger accuracy-related motivations were better at discerning truth from falsehood; valuing democracy was also associated with greater truth discernment, whereas endorsement of individual responsibility over government support was negatively associated with truth discernment in most countries. Subtly prompting people to think about accuracy had a generally positive effect on the veracity of news that people were willing to share across countries, as did minimal digital literacy tips. Finally, aggregating the ratings of our non-expert participants was able to differentiate true from false headlines with high accuracy in all countries via the 'wisdom of crowds'. The consistent patterns we observe suggest that the psychological factors underlying the misinformation challenge are similar across different regional settings, and that similar solutions may be broadly effective.
Affiliation(s)
- Antonio A Arechar
- Center for Research and Teaching in Economics (CIDE), Aguascalientes, Mexico
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Centre for Decision Research and Experimental Economics, University of Nottingham, Nottingham, UK
- Jennifer Allen
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Adam J Berinsky
- Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA, USA
- Ziv Epstein
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Media Lab, Massachusetts Institute of Technology, Cambridge, MA, USA
- Jackson G Lu
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Robert M Ross
- Department of Philosophy, Macquarie University, Sydney, New South Wales, Australia
- Michael N Stagnaro
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Yunhao Zhang
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Gordon Pennycook
- Hill/Levene Schools of Business, University of Regina, Regina, Saskatchewan, Canada.
- Department of Psychology, University of Regina, Regina, Saskatchewan, Canada.
- David G Rand
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA.
- Institute for Data, Systems, and Society, Massachusetts Institute of Technology, Cambridge, MA, USA.
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA.
135. Jones CM, Diethei D, Schöning J, Shrestha R, Jahnel T, Schüz B. Impact of Social Reference Cues on Misinformation Sharing on Social Media: Series of Experimental Studies. J Med Internet Res 2023; 25:e45583. PMID: 37616030; PMCID: PMC10485706; DOI: 10.2196/45583.
Abstract
BACKGROUND: Health-related misinformation on social media is a key challenge to effective and timely public health responses. Existing mitigation measures include flagging misinformation or providing links to correct information, but they have not yet targeted social processes. Current approaches focus on increasing scrutiny, providing corrections to misinformation (debunking), or alerting users prospectively about future misinformation (prebunking and inoculation). Here, we provide a test of a complementary strategy that focuses on the social processes inherent in social media use, in particular, social reinforcement, social identity, and injunctive norms. OBJECTIVE: This study aimed to examine whether providing balanced social reference cues (ie, cues that provide information on users sharing and, more importantly, not sharing specific content) in addition to flagging COVID-19-related misinformation leads to reductions in sharing behavior and improvement in overall sharing quality. METHODS: A total of 3 field experiments were conducted on Twitter's native social media feed (via a newly developed browser extension). Participants' feed was augmented to include misleading and control information, resulting in 4 groups: no-information control, Twitter's own misinformation warning (misinformation flag), social cue only, and combined misinformation flag and social cue. We tracked the content shared or liked by participants. Participants were provided with social information by referencing either their personal network on Twitter or all Twitter users. RESULTS: A total of 1424 Twitter users participated in 3 studies (n=824, n=322, and n=278). Across all 3 studies, we found that social cues that reference users' personal network combined with a misinformation flag reduced the sharing of misleading but not control information and improved overall sharing quality. We show that this improvement could be driven by a change in injunctive social norms (study 2) but not social identity (study 3). CONCLUSIONS: Social reference cues combined with misinformation flags can significantly and meaningfully reduce the amount of COVID-19-related misinformation shared and improve overall sharing quality. They are a feasible and scalable way to effectively curb the sharing of COVID-19-related misinformation on social media.
Affiliation(s)
- Christopher M Jones
- Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany
- Leibniz ScienceCampus Digital Public Health, Bremen, Germany
- Daniel Diethei
- Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany
- Leibniz ScienceCampus Digital Public Health, Bremen, Germany
- Johannes Schöning
- Leibniz ScienceCampus Digital Public Health, Bremen, Germany
- School of Computer Science, University of St Gallen, St Gallen, Switzerland
- Rehana Shrestha
- Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany
- Leibniz ScienceCampus Digital Public Health, Bremen, Germany
- Tina Jahnel
- Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany
- Leibniz ScienceCampus Digital Public Health, Bremen, Germany
- Benjamin Schüz
- Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany
- Leibniz ScienceCampus Digital Public Health, Bremen, Germany
136. Piksa M, Noworyta K, Gundersen AB, Kunst J, Morzy M, Piasecki J, Rygula R. Are we willing to share what we believe is true? Factors influencing susceptibility to fake news. Front Psychiatry 2023; 14:1165103. PMID: 37654985; PMCID: PMC10467258; DOI: 10.3389/fpsyt.2023.1165103.
Abstract
Background: The contemporary media landscape is saturated with the ubiquitous presence of misinformation. Several factors amplify the spread and dissemination of false information, such as the blurring of the line between expert and lay opinions, economic incentives promoting the publication of sensational information, the zero cost of sharing false information, and many more. In this study, we investigate some of the mechanisms of fake news dissemination that have eluded scientific scrutiny: the evaluation of veracity and behavioral engagement with information in light of its factual truthfulness (either true or false), cognitive utility (either enforcing or questioning participants' beliefs), and presentation style (either sober or populistic). Results: Two main results emerge from our experiment. We find that the evaluation of veracity is mostly related to the objective truthfulness of a news item. However, the probability of engagement is more related to the congruence of the information with the participants' preconceived beliefs than to objective truthfulness or presentation style. Conclusion: We conclude that the common notion that the spread of fake news can be limited by fact-checking and educating people might not be entirely true, as people will share fake information as long as it reduces the entropy of their mental models of the world. We also find support for the Trojan Horse hypothesis of fake news dissemination.
Affiliation(s)
- Michal Piksa
- Department of Pharmacology, Affective Cognitive Neuroscience Laboratory, Maj Institute of Pharmacology Polish Academy of Sciences, Krakow, Poland
- Karolina Noworyta
- Department of Pharmacology, Affective Cognitive Neuroscience Laboratory, Maj Institute of Pharmacology Polish Academy of Sciences, Krakow, Poland
- Jonas Kunst
- Department of Psychology, University of Oslo, Oslo, Norway
- Mikolaj Morzy
- Faculty of Computing and Telecommunications, Poznan University of Technology, Poznan, Poland
- Jan Piasecki
- Faculty of Health Sciences, Department of Philosophy and Bioethics, Jagiellonian University Medical College, Krakow, Poland
- Rafal Rygula
- Department of Pharmacology, Affective Cognitive Neuroscience Laboratory, Maj Institute of Pharmacology Polish Academy of Sciences, Krakow, Poland
137. Pillai RM, Fazio LK, Effron DA. Repeatedly Encountered Descriptions of Wrongdoing Seem More True but Less Unethical: Evidence in a Naturalistic Setting. Psychol Sci 2023; 34:863-874. PMID: 37428445; DOI: 10.1177/09567976231180578.
Abstract
When news about moral transgressions goes viral on social media, the same person may repeatedly encounter identical reports about a wrongdoing. In a longitudinal experiment (N = 607 U.S. adults from Mechanical Turk), we found that these repeated encounters can affect moral judgments. As participants went about their lives, we text-messaged them news headlines describing corporate wrongdoings (e.g., a cosmetics company harming animals). After 15 days, they rated these wrongdoings as less unethical than new wrongdoings. Extending prior laboratory research, these findings reveal that repetition can have a lasting effect on moral judgments in naturalistic settings, that affect plays a key role, and that increasing the number of repetitions generally makes moral judgments more lenient. Repetition also made fictitious descriptions of wrongdoing seem truer, connecting this moral-repetition effect with past work on the illusory-truth effect. The more times we hear about a wrongdoing, the more we may believe it-but the less we may care.
Affiliation(s)
- Raunak M Pillai
- Department of Psychology and Human Development, Vanderbilt University
- Lisa K Fazio
- Department of Psychology and Human Development, Vanderbilt University
- Daniel A Effron
- Organisational Behaviour Subject Area, London Business School
138. Nerino V. Overcome the fragmentation in online propaganda literature: the role of cultural and cognitive sociology. Front Sociol 2023; 8:1170447. PMID: 37497101; PMCID: PMC10366602; DOI: 10.3389/fsoc.2023.1170447.
Abstract
Evidence concerning the proliferation of propaganda on social media has renewed scientific interest in persuasive communication practices, resulting in a thriving yet quite disconnected scholarship. This fragmentation poses a significant challenge, as the absence of a structured and comprehensive organization of this extensive literature hampers the interpretation of findings, thus jeopardizing the understanding of online propaganda functioning. To address this fragmentation, I propose a systematization approach that involves utilizing Druckman's Generalizing Persuasion Framework as a unified interpretative tool to organize this scholarly work. By means of this approach, it is possible to systematically identify the various strands within the field, detect their respective shortcomings, and formulate new strategies to bridge these research strands and advance our knowledge of how online propaganda operates. I conclude by arguing that these strategies should involve the sociocultural perspectives offered by cognitive and cultural sociology, as these provide important insights and research tools to disentangle and evaluate the role played by supra-individual factors in the production, distribution, consumption, and evaluation of online propaganda.
Affiliation(s)
- Valentina Nerino
- Interdisciplinary Centre for Gender Studies (ICFG), University of Bern, Bern, Switzerland
- Department of Sociology, University of Trento, Trento, Italy
139. Anagnostou A, Lieberman J, Greenhawt M, Mack DP, Santos AF, Venter C, Stukus D, Turner PJ, Brough HA. The future of food allergy: Challenging existing paradigms of clinical practice. Allergy 2023; 78:1847-1865. PMID: 37129472; DOI: 10.1111/all.15757.
Abstract
The field of food allergy has seen tremendous change over the past 5-10 years, with seminal studies redefining our approach to prevention and management and novel testing modalities on the horizon. Early introduction of allergenic foods is now recommended, challenging the previous paradigm of restrictive avoidance. The management of food allergy has shifted from a passive avoidance approach to active interventions that aim to provide protection from accidental exposures, decrease allergic reaction severity and improve the quality of life of food-allergic patients and their families. Additionally, novel diagnostic tools are making their way into clinical practice with the goal of reducing the need for food challenges and assisting physicians in the often complex diagnostic process. With all the new developments and available choices for diagnosis, prevention and therapy, shared decision-making has become a key part of medical consultation, enabling patients to make the right choice for them, based on their values and preferences. Communication with patients has also become more complex over time, as patients are seeking advice online and through social media, but the information found online may be outdated, incorrect, or lacking in context. The role of the allergist has evolved to embrace all the above exciting developments and provide patients with the optimal care that fits their needs. In this review, we discuss recent developments as well as the evolution of the field of food allergy in the next decade.
Affiliation(s)
- Aikaterini Anagnostou
- Department of Pediatrics, Section of Immunology, Allergy and Retrovirology, Texas Children's Hospital, Houston, Texas, USA
- Section of Allergy, Immunology & Retrovirology, Baylor College of Medicine, Houston, Texas, USA
- Jay Lieberman
- Department of Pediatrics, The University of Tennessee Health Science Center, LeBonheur Children's Hospital, Memphis, Tennessee, USA
- Matthew Greenhawt
- Section of Allergy and Immunology, Food Challenge and Research Unit, Children's Hospital Colorado, Department of Pediatrics, University of Colorado School of Medicine, Denver, Colorado, USA
- Douglas Paul Mack
- Department of Pediatrics, McMaster University, Hamilton, Ontario, Canada
- Alexandra F Santos
- Department of Women and Children's Health (Pediatric Allergy), School of Life Courses Sciences, Faculty of Life Sciences and Medicine, King's College London, London, UK
- Peter Gorer Department of Immunobiology, School of Immunology and Microbial Sciences, Faculty of Life Sciences and Medicine, King's College London, London, UK
- Children's Allergy Service and Children's Allergy Service, Evelina Children's Hospital, Guy's and St. Thomas's NHS Foundation Trust, London, UK
- Carina Venter
- Section of Allergy and Immunology, Children's Hospital Colorado, Department of Pediatrics, University of Colorado, Denver, Colorado, USA
- David Stukus
- Section of Allergy, Immunology & Retrovirology, Baylor College of Medicine, Houston, Texas, USA
- Nationwide Children's Hospital, The Ohio State University College of Medicine, Ohio, USA
- Paul J Turner
- National Heart & Lung Institute, Imperial College London, London, UK
- Helen A Brough
- Department of Women and Children's Health (Pediatric Allergy), School of Life Courses Sciences, Faculty of Life Sciences and Medicine, King's College London, London, UK
- Children's Allergy Service and Children's Allergy Service, Evelina Children's Hospital, Guy's and St. Thomas's NHS Foundation Trust, London, UK
140. Kozyreva A, Smillie L, Lewandowsky S. Incorporating Psychological Science Into Policy Making: The Case of Misinformation. Eur Psychol 2023; 28:a000493. PMID: 37994309; PMCID: PMC7615323; DOI: 10.1027/1016-9040/a000493.
Abstract
The spread of false and misleading information in online social networks is a global problem in need of urgent solutions. It is also a policy problem because misinformation can harm both the public and democracies. To address the spread of misinformation, policymakers require a successful interface between science and policy, as well as a range of evidence-based solutions that respect fundamental rights while efficiently mitigating the harms of misinformation online. In this article, we discuss how regulatory and nonregulatory instruments can be informed by scientific research and used to reach EU policy objectives. First, we consider what it means to approach misinformation as a policy problem. We then outline four building blocks for cooperation between scientists and policymakers who wish to address the problem of misinformation: understanding the misinformation problem, understanding the psychological drivers and public perceptions of misinformation, finding evidence-based solutions, and co-developing appropriate policy measures. Finally, through the lens of psychological science, we examine policy instruments that have been proposed in the EU, focusing on the strengthened Code of Practice on Disinformation 2022.
Affiliation(s)
- Anastasia Kozyreva
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Laura Smillie
- Joint Research Center, European Commission, Brussels, Belgium
- Stephan Lewandowsky
- School of Psychological Science, University of Bristol, UK
- School of Psychological Sciences, University of Western Australia, Australia
- Department of Psychology, University of Potsdam, Germany
141. Agley J, Xiao Y. Low trust in science may foster belief in misinformation by aligning scientifically supported and unsupported statements. Perspect Public Health 2023; 143:199-201. PMID: 37589323; DOI: 10.1177/17579139221136722.
Affiliation(s)
- J Agley
- Prevention Insights, Department of Applied Health Science, School of Public Health Bloomington, Indiana University Bloomington, 809 E. 9 St., Bloomington, IN 47405, USA
- Y Xiao
- Department of Population Health Sciences, Weill Cornell Medicine, New York, NY, USA
142. Frąszczak D. Detecting rumor outbreaks in online social networks. Soc Netw Anal Min 2023; 13:91. PMID: 37274600; PMCID: PMC10233536; DOI: 10.1007/s13278-023-01092-x.
Abstract
Social media platforms are broadly used to exchange information by billions of people worldwide. Each day, people share many updates and opinions on a wide range of topics. Moreover, politicians use them to share their postulates and programs, shops to advertise their products, and so on. Social media owe their popularity to several critical factors, including quick, accessible, and always-available Internet communication. These conditions make it easy to spread information from one user to another, both within close neighborhoods and across the whole social network hosted on a given platform. Unfortunately, social media have recently been used increasingly for malicious purposes, e.g., rumor propagation. In most cases, the process starts from multiple nodes (users). Numerous papers address detecting the real source when there is only one initiator, but solutions dedicated to problems with multiple sources are lacking. Most solutions that do meet these criteria need the exact number of origins to detect them correctly, which is impossible to obtain in real-life usage. This paper analyzes methods for detecting rumor outbreaks in online social networks that can be used as an initial guess for the number of real propagation initiators.
143. Rathje S, Roozenbeek J, Van Bavel JJ, van der Linden S. Accuracy and social motivations shape judgements of (mis)information. Nat Hum Behav 2023; 7:892-903. PMID: 36879042; PMCID: PMC10289897; DOI: 10.1038/s41562-023-01540-w.
Abstract
The extent to which belief in (mis)information reflects lack of knowledge versus a lack of motivation to be accurate is unclear. Here, across four experiments (n = 3,364), we motivated US participants to be accurate by providing financial incentives for correct responses about the veracity of true and false political news headlines. Financial incentives improved accuracy and reduced partisan bias in judgements of headlines by about 30%, primarily by increasing the perceived accuracy of true news from the opposing party (d = 0.47). Incentivizing people to identify news that would be liked by their political allies, however, decreased accuracy. Replicating prior work, conservatives were less accurate at discerning true from false headlines than liberals, yet incentives closed the gap in accuracy between conservatives and liberals by 52%. A non-financial accuracy motivation intervention was also effective, suggesting that motivation-based interventions are scalable. Altogether, these results suggest that a substantial portion of people's judgements of the accuracy of news reflects motivational factors.
Affiliation(s)
- Steve Rathje
- Department of Psychology, University of Cambridge, Cambridge, UK.
- Jon Roozenbeek
- Department of Psychology, University of Cambridge, Cambridge, UK
- Jay J Van Bavel
- Department of Psychology and Center for Neural Science, New York University, New York, NY, USA
144
Israel-Turim V, Laferrara V, Rego AR, Micó-Sanz JL. Misinformation about the COVID-19 Vaccine in Online Catholic Media. Vaccines (Basel) 2023; 11:1054. [PMID: 37376443 DOI: 10.3390/vaccines11061054] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2023] [Revised: 05/22/2023] [Accepted: 05/25/2023] [Indexed: 06/29/2023] Open
Abstract
During the COVID-19 pandemic, online media were the most widely used, and often the only, sources of scientific information. Research has shown that much of the information available on the Internet about the health crisis lacked scientific rigor, and that misinformation about health issues can pose a threat to public health. At the same time, millions of Catholics demonstrated against COVID-19 vaccination on the basis of "false" and misleading religious arguments. This research analyses publications about the vaccine in Catholic online media with the aim of understanding the presence of information (and misinformation) in this community. An algorithm designed for each media outlet collected COVID-19 vaccine-related publications from 109 Catholic media outlets in five languages. In total, 970 publications were analysed by journalistic genre, headline type, and information source. The results show that most publications are informative and most of their headlines are neutral; opinion articles, however, have mostly negative headlines. Furthermore, a higher percentage of the opinion authors come from the religious sphere, and most of the sources cited are religious. Finally, 35% of the publications relate the vaccine to the framing issue of abortion.
Affiliation(s)
- Verónica Israel-Turim
- Blanquerna School of Communications and International Relations, Ramon Llull University, 08022 Barcelona, Spain
- Valentina Laferrara
- Blanquerna School of Communications and International Relations, Ramon Llull University, 08022 Barcelona, Spain
- Ana Regina Rego
- Rede Nacional de Combate à Desinformação RNCd, Rio de Janeiro 21941-853, Brazil
- Josep Lluís Micó-Sanz
- Blanquerna School of Communications and International Relations, Ramon Llull University, 08022 Barcelona, Spain
145
Paciello M, Corbelli G, D'Errico F. The role of self-efficacy beliefs in dealing with misinformation among adolescents. Front Psychol 2023; 14:1155280. [PMID: 37275715 PMCID: PMC10233930 DOI: 10.3389/fpsyg.2023.1155280] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2023] [Accepted: 05/02/2023] [Indexed: 06/07/2023] Open
Abstract
The present study aims to understand the processes involved in misinformation among adolescents by examining the role of self-efficacy beliefs in dealing with misleading news. Specifically, we argue that the perceived capability to analyze and reflect critically on the reliability of online information sources should be paired with the perceived self-regulatory capability to resist online social pressure to share unverifiable news. Moreover, we posited that these specific online self-efficacy beliefs can be promoted by the capabilities of regulating emotions and reflecting on new problems. We tested a path analysis model in a sample of 273 adolescents. The results show that self-efficacy beliefs in dealing with online misinformation involve two specific capabilities: an active one, checking the sources of news in order to validate its content, and an inhibitory one, refraining from sharing news that seems unreliable. Moreover, self-efficacy beliefs in self-control during online interactions that spread misleading news are supported by cognitive reflective capability and by self-efficacy in regulating negative emotions. The relationship between active self-efficacy related to fact-checking and sharing misleading news is not significant. The role of self-regulation in sharing misinformation during online dynamics is discussed.
Affiliation(s)
- Francesca D'Errico
- Department of Education, Psychology and Communication Studies, University of Bari "Aldo Moro", Bari, Italy
146
Fu L, Peng H, Liu S. KG-MFEND: an efficient knowledge graph-based model for multi-domain fake news detection. J Supercomput 2023; 79:1-28. [PMID: 37359329 PMCID: PMC10184086 DOI: 10.1007/s11227-023-05381-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Accepted: 05/03/2023] [Indexed: 06/28/2023]
Abstract
The widespread dissemination of fake news on social media has adverse effects on the public and on social development. Most existing detection techniques are limited to a single domain (e.g., medicine or politics). However, domains commonly differ in ways such as word usage, which causes these methods to perform poorly in other domains. In the real world, social media release millions of news pieces in diverse domains every day, so a fake news detection model that can be applied to multiple domains is of significant practical importance. In this paper, we propose a novel knowledge graph (KG)-based framework for multi-domain fake news detection, named KG-MFEND. The model's performance is enhanced by improving BERT and integrating external knowledge to alleviate domain differences at the word level. Specifically, we construct a new KG that encompasses multi-domain knowledge and injects entity triples to build a sentence tree that enriches the background knowledge of each news item. To address embedding-space mismatch and knowledge noise, we use soft position and a visible matrix during knowledge embedding, and to reduce the influence of label noise we add label smoothing to training. Extensive experiments on real Chinese datasets show that KG-MFEND has strong generalization capability in single, mixed, and multiple domains and outperforms the current state-of-the-art methods for multi-domain fake news detection.
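Label smoothing, which the abstract mentions as a guard against noisy labels, softens one-hot targets toward the uniform distribution. A minimal sketch of the standard formulation (not KG-MFEND's exact implementation):

```python
def smooth_labels(one_hot, epsilon=0.1):
    """Replace hard 0/1 targets with (1 - eps) * y + eps / K,
    so the model is never pushed toward fully confident predictions
    on possibly mislabeled examples."""
    k = len(one_hot)
    return [(1 - epsilon) * y + epsilon / k for y in one_hot]
```

For binary fake/real labels with epsilon = 0.1, the hard target [0, 1] becomes the softened target [0.05, 0.95].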
Affiliation(s)
- Lifang Fu
- Northeast Agricultural University, Harbin, 150030 China
- Huanxin Peng
- School of Engineering, Northeast Agricultural University, Harbin, 150030 China
- Shuai Liu
- School of Engineering, Northeast Agricultural University, Harbin, 150030 China
147
Handley-Miner IJ, Pope M, Atkins RK, Jones-Jang SM, McKaughan DJ, Phillips J, Young L. The intentions of information sources can affect what information people think qualifies as true. Sci Rep 2023; 13:7718. [PMID: 37173351 PMCID: PMC10182088 DOI: 10.1038/s41598-023-34806-4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2023] [Accepted: 05/08/2023] [Indexed: 05/15/2023] Open
Abstract
The concept of truth is at the core of science, journalism, law, and many other pillars of modern society. Yet, given the imprecision of natural language, deciding what information should count as true is no easy task, even with access to the ground truth. How do people decide whether a given claim of fact qualifies as true or false? Across two studies (N = 1181; 16,248 observations), participants saw claims of fact alongside the ground truth about those claims. Participants classified each claim as true or false. Although participants knew precisely how accurate the claims were, participants classified claims as false more often when they judged the information source to be intending to deceive (versus inform) their audience, and classified claims as true more often when they judged the information source to be intending to provide an approximate (versus precise) account. These results suggest that, even if people have access to the same set of facts, they might disagree about the truth of claims if they attribute discrepant intentions to information sources. Such findings may shed light on the robust and persistent disagreements over claims of fact that have arisen in the "post-truth era".
Affiliation(s)
- Isaac J Handley-Miner
- Department of Psychology and Neuroscience, Boston College, Chestnut Hill, MA, 02467, USA.
- Michael Pope
- Department of Philosophy, Boston College, Chestnut Hill, MA, 02467, USA
- S Mo Jones-Jang
- Department of Communication, Boston College, Chestnut Hill, MA, 02467, USA
- Jonathan Phillips
- Program in Cognitive Science, Dartmouth College, Hanover, NH, 03755, USA
- Liane Young
- Department of Psychology and Neuroscience, Boston College, Chestnut Hill, MA, 02467, USA
148
Wang X, Li J, Srivatsavaya E, Rajtmajer S. Evidence of inter-state coordination amongst state-backed information operations. Sci Rep 2023; 13:7716. [PMID: 37173357 PMCID: PMC10182002 DOI: 10.1038/s41598-023-34245-1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/28/2023] [Accepted: 04/26/2023] [Indexed: 05/15/2023] Open
Abstract
Since 2018, Twitter has steadily released into the public domain content discovered on the platform and believed to be associated with information operations originating from more than a dozen state-backed organizations. Leveraging this dataset, we explore inter-state coordination amongst state-backed information operations and find evidence of intentional, strategic interaction amongst thirteen different states, separate and distinct from within-state operations. We find that coordinated, inter-state information operations attract greater engagement than baseline information operations and appear to come online in service to specific aims. We explore these ideas in depth through two case studies on the coordination between Cuba and Venezuela, and between Russia and Iran.
Affiliation(s)
- Xinyu Wang
- College of Information Sciences and Technology, The Pennsylvania State University, University Park, PA, USA.
- Jiayi Li
- College of Information Sciences and Technology, The Pennsylvania State University, University Park, PA, USA
- Eesha Srivatsavaya
- College of Information Sciences and Technology, The Pennsylvania State University, University Park, PA, USA
- Sarah Rajtmajer
- College of Information Sciences and Technology, The Pennsylvania State University, University Park, PA, USA
149
Vintilă M, Lăzărescu GM, Kalaitzaki A, Tudorel OI, Goian C. Fake news during the war in Ukraine: coping strategies and fear of war in the general population of Romania and in aid workers. Front Psychol 2023; 14:1151794. [PMID: 37251050 PMCID: PMC10213335 DOI: 10.3389/fpsyg.2023.1151794] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2023] [Accepted: 04/26/2023] [Indexed: 05/31/2023] Open
Abstract
Introduction: In addition to the health crisis that erupted during the COVID-19 pandemic, the war between Russia and Ukraine is negatively impacting the mental health and wellbeing of the Romanian population.
Objectives: This study investigates the impact that social media consumption and an overload of information related to the armed conflict between Russia and Ukraine have on the distribution of fake news among Romanians. In addition, it explores how several psychological features, including resilience, general health, perceived stress, coping strategies, and fear of war, change as a function of exposure to traumatic events or interaction with victims of war.
Methods: Participants (N = 633) completed the General Health Questionnaire (GHQ), the CERQ scale with its nine subscales, the Perceived Stress Scale (PSS), and the Brief Resilience Scale (BRS), the last of which measures resilience. Information overload, information strain, and the likelihood of spreading fake news were assessed by adapting items related to these variables.
Findings: Our results suggest that information strain partially moderates the relationship between information overload and the tendency to spread false information, and that it also partially moderates the relationship between time spent online and that tendency. Furthermore, our findings imply that there are differences of high and moderate significance between those who worked with refugees and those who did not as regards fear of war and coping strategies. We found no practical differences between the two groups as regards general health, level of resilience, and perceived stress.
Conclusion and recommendations: The importance of discovering the reasons why people share false information is discussed, as is the need to adopt strategies to combat this behavior, including infographics and games designed to teach people how to detect fake news. At the same time, aid workers need further support to maintain a high level of psychological wellbeing.
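The moderation effects described in the Findings can be pictured with a simple-slopes check: if information strain moderates the overload-to-sharing link, the slope of sharing on overload should differ between low-strain and high-strain respondents. A toy illustration using a median split for clarity (illustrative only; the study itself used regression-based moderation analysis):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def simple_slopes(overload, strain, sharing):
    """Split respondents at the median of the moderator (strain) and
    compare the overload -> sharing slope in each half; a clear gap
    between the two slopes is the signature of moderation."""
    median = sorted(strain)[len(strain) // 2]
    low = [(o, s) for o, st, s in zip(overload, strain, sharing) if st < median]
    high = [(o, s) for o, st, s in zip(overload, strain, sharing) if st >= median]
    return slope(*zip(*low)), slope(*zip(*high))
```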
Affiliation(s)
- Mona Vintilă
- Faculty of Sociology and Psychology, West University of Timișoara, Timișoara, Romania
- Argyroula Kalaitzaki
- Department of Social Work, Faculty of Health Sciences, Hellenic Mediterranean University, Heraklion, Greece
- Otilia Ioana Tudorel
- Faculty of Sociology and Psychology, West University of Timișoara, Timișoara, Romania
- Cosmin Goian
- Faculty of Sociology and Psychology, West University of Timișoara, Timișoara, Romania
150
Peren Arin K, Mazrekaj D, Thum M. Ability of detecting and willingness to share fake news. Sci Rep 2023; 13:7298. [PMID: 37147456 PMCID: PMC10160725 DOI: 10.1038/s41598-023-34402-6] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2022] [Accepted: 04/28/2023] [Indexed: 05/07/2023] Open
Abstract
By conducting large-scale surveys in Germany and the United Kingdom, we investigate the individual-level determinants of the ability to detect fake news and the inclination to share it. We distinguish between deliberate and accidental sharing of fake news. We document that accidental sharing is much more common than deliberate sharing. Furthermore, our results indicate that older, male, high-income, and politically left-leaning respondents better detect fake news. We also find that accidental sharing decreases with age and is more prevalent among right-leaning respondents. Deliberate sharing of fake news is more prevalent among younger respondents in the United Kingdom. Finally, our results imply that respondents have a good assessment of their ability to detect fake news: those we identified as accidental sharers were also more likely to have admitted to having shared fake news.
Affiliation(s)
- K Peren Arin
- Zayed University, Abu Dhabi, United Arab Emirates
- Centre for Applied Macroeconomic Analysis, Canberra, Australia
- Deni Mazrekaj
- Utrecht University, Utrecht, The Netherlands
- University of Oxford, Oxford, UK
- KU Leuven, Leuven, Belgium
- Marcel Thum
- TU Dresden, Dresden, Germany
- ifo Dresden, Dresden, Germany
- CESifo, Munich, Germany