1
Nie Y, Zuo L, Mao J, He X, Xiao H. The Effect of Positive Emotions on Prosocial Behavior During Ego-Depletion: Evidence From fNIRS. Psychol Res Behav Manag 2025; 18:641-655. PMID: 40129961; PMCID: PMC11930627; DOI: 10.2147/prbm.s502161.
Abstract
Purpose The psychological and neural mechanisms linking relief from ego depletion to prosocial behavior have yet to be clearly characterized. To address this, we combined behavioral experiments and fNIRS to explore how positive emotions promote prosocial tendencies under ego depletion. Methods In Experiment 1, 119 college participants (Mage = 19.7 ± 1.46) completed a dual-task self-control paradigm, confirming that ego depletion negatively impacts prosocial behavior. Experiment 2 involved 48 college participants (Mage = 20.26 ± 2.06) and combined behavioral tasks with functional near-infrared spectroscopy (fNIRS) to examine how positive emotions mitigate ego depletion and enhance prosocial behavior. Results Experiment 1 showed that participants in the low ego depletion group allocated a significantly higher bonus amount than those in the high ego depletion group (t(62) = -2.24, p < 0.05). Experiment 2 showed that, after both groups completed the ego depletion task, participants in the positive emotion group allocated significantly higher bonus amounts than those in the neutral emotion group (t(46) = 2.06, p < 0.05). Moreover, the β values for channel ch15 (right dorsolateral superior frontal gyrus) and channel ch20 (right medial superior frontal gyrus) were significantly higher in the positive emotion group than in the neutral emotion group (p < 0.05). The β value for channel ch7 (left medial superior frontal gyrus) was also higher in the positive emotion group, approaching statistical significance (p = 0.068). Conclusion These findings reveal that high ego depletion reduced prosocial behavior, and that positive emotions alleviated ego depletion and promoted prosocial behavior by activating the medial superior frontal gyrus (SFGmed) and right dorsolateral superior frontal gyrus (SFGdor).
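The group comparisons reported above are standard independent-samples t-tests. As an illustrative sketch only (the bonus-allocation numbers below are hypothetical, not the study's data), the pooled-variance statistic can be computed with the Python standard library:

```python
import math
from statistics import mean, variance

def independent_t(a, b):
    """Student's independent-samples t-test (pooled variance).

    Returns (t, df). `a` and `b` are lists of scores, one per participant.
    """
    na, nb = len(a), len(b)
    df = na + nb - 2
    # Pooled variance weights each group's sample variance by its df.
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / df
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    t = (mean(a) - mean(b)) / se
    return t, df

# Hypothetical bonus allocations for two depletion groups.
low_depletion = [6, 7, 5, 8, 6, 7, 9, 6]
high_depletion = [4, 5, 3, 5, 4, 6, 4, 5]
t, df = independent_t(low_depletion, high_depletion)
```

The reported t(62) implies 64 participants contributed to that comparison (df = n1 + n2 - 2); the same formula applies regardless of group size.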
Affiliation(s)
- Yangang Nie
- Research Center of Adolescent Psychology and Behavior, School of Education, Guangzhou University, Guangzhou, Guangdong, People’s Republic of China
- Lihua Zuo
- Research Center of Adolescent Psychology and Behavior, School of Education, Guangzhou University, Guangzhou, Guangdong, People’s Republic of China
- Mental Health and Counseling Center, Guangdong Ocean University, Zhanjiang, Guangdong, People’s Republic of China
- Jian Mao
- Research Center of Adolescent Psychology and Behavior, School of Education, Guangzhou University, Guangzhou, Guangdong, People’s Republic of China
- Xiaoqing He
- Research Center of Adolescent Psychology and Behavior, School of Education, Guangzhou University, Guangzhou, Guangdong, People’s Republic of China
- He Xiao
- Research Center of Adolescent Psychology and Behavior, School of Education, Guangzhou University, Guangzhou, Guangdong, People’s Republic of China
2
Scarpazza C, Gramegna C, Costa C, Pezzetta R, Saetti MC, Preti AN, Difonzo T, Zago S, Bolognini N. The Emotion Authenticity Recognition (EAR) test: normative data of an innovative test using dynamic emotional stimuli to evaluate the ability to recognize the authenticity of emotions expressed by faces. Neurol Sci 2025; 46:133-145. PMID: 39023709; PMCID: PMC11698814; DOI: 10.1007/s10072-024-07689-0.
Abstract
Although research has focused extensively on how emotions conveyed by faces are perceived, the perception of emotions' authenticity has been surprisingly overlooked. Here, we present the Emotion Authenticity Recognition (EAR) test, developed specifically with dynamic stimuli depicting authentic and posed emotions to evaluate individuals' ability to correctly identify an emotion (emotion recognition index, ER Index) and to classify its authenticity (authenticity recognition index, EA Index). The EAR test has been validated on 522 healthy participants, and normative values are provided. Correlations with demographic characteristics, empathy, and general cognitive status reveal that both indices are negatively correlated with age, and positively correlated with education, cognitive status, and different facets of empathy. The EAR test offers a new ecological instrument for assessing the ability to detect emotion authenticity, allowing exploration of possible social-cognitive deficits even in patients who are otherwise cognitively intact.
Affiliation(s)
- Cristina Scarpazza
- Department of General Psychology, University of Padova, Via Venezia 8, Padova, PD, Italy
- IRCCS S Camillo Hospital, Venezia, Italy
- Chiara Gramegna
- Ph.D. Program in Neuroscience, School of Medicine and Surgery, University of Milano-Bicocca, Monza, Italy
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Cristiano Costa
- Padova Neuroscience Center, University of Padova, Padova, Italy
- Maria Cristina Saetti
- Neurology Unit, IRCCS Fondazione Ca' Granda Ospedale Maggiore Policlinico, Milan, Italy
- Department of Pathophysiology and Transplantation, University of Milan, Milan, Italy
- Alice Naomi Preti
- Ph.D. Program in Neuroscience, School of Medicine and Surgery, University of Milano-Bicocca, Monza, Italy
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Teresa Difonzo
- Neurology Unit, Foundation IRCCS Ca' Granda Hospital Maggiore Policlinico, Milano, Italy
- Stefano Zago
- Neurology Unit, Foundation IRCCS Ca' Granda Hospital Maggiore Policlinico, Milano, Italy
- Nadia Bolognini
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Laboratory of Neuropsychology, Department of Neurorehabilitation Sciences, IRCCS Istituto Auxologico Italiano, Milano, Italy
3
Buckland CB, Taubert J. A database of naturalistic expressive faces for studying high arousal states. J Pain 2025; 26:104728. PMID: 39515655; DOI: 10.1016/j.jpain.2024.104728.
Abstract
Recent studies comparing behavior across different sets of facial stimuli have highlighted a need to employ more naturalistic, genuine facial expressions in cognitive research. To address this need, we identified and selected a large set of highly expressive face stimuli from the public domain, and used these stimuli to test whether participants can recognise when others are experiencing pain from their facial behaviour. After identifying 315 expressive faces representing the kinds of facial behaviours often seen in three distinct contexts (i.e., injury-related, loss-related, and victory-related), we ran six behavioural rating tasks to characterise these faces along six dimensions: level of arousal, emotional valence, level of physical pain, attractiveness, familiarity, and perceived gender. The results indicate that injury-related expressions are recognised as lower in emotional valence than victory-related expressions, and higher in psychological arousal than both victory- and loss-related expressions. Overall, these findings suggest that the intense, energetic expressions of people in competitive situations are not rendered ambiguous to third parties by increased arousal. These results validate the use of naturalistic facial expressions in studies of non-verbal, injury-related behaviours and their recognition in forensic and clinical settings. PERSPECTIVE: Here we created and validated a large set of visual stimuli, which have been made available to the scientific community. Our results demonstrate that among high-arousal states, expressions related to feelings of intense pain and injury are visually distinct from expressions related to loss or triumph. Thus, the Wild Faces Database - High Arousal States (WFD-HAS) extension provides an important tool for understanding how we recognise injury-related facial expressions in the real world.
Affiliation(s)
- Christopher B Buckland
- School of Psychology, The University of Queensland, Brisbane 4072, Queensland, Australia
- Jessica Taubert
- School of Psychology, The University of Queensland, Brisbane 4072, Queensland, Australia
4
Obayashi Y, Uehara S, Yuasa A, Otaka Y. The other person's smiling amount affects one's smiling response during face-to-face conversations. Front Behav Neurosci 2024; 18:1420361. PMID: 39184933; PMCID: PMC11341491; DOI: 10.3389/fnbeh.2024.1420361.
Abstract
Introduction Smiling during conversation occurs interactively between people and is known to build good interpersonal relationships. However, whether and to what extent an individual's smiling is influenced by the other person's smile has remained unclear. This study aimed to quantify the amount of two individuals' smiles during conversations and investigate the dependency of one person's smile amount (i.e., intensity and frequency) on that of the other. Method Forty participants (20 females) engaged in three-minute face-to-face conversations as speakers with a listener (male or female) under three conditions in which the amount of the listener's smiling response was controlled to be "less," "moderate," or "greater." The amount of smiling was quantified from facial movements through automated facial expression analysis. Results The results showed that the amount of smiling by the speaker changed significantly depending on the listener's smile amount: when listeners smiled more, speakers tended to smile more, especially in same-gender pairs (i.e., male-male and female-female). Further analysis revealed that the smiling intensities of the two individuals changed in a temporally synchronized manner. Discussion These results provide quantitative evidence that one person's smile depends on the other's smile, and that this effect differs between gender pairs.
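Temporal synchrony between two intensity traces, as described above, is commonly assessed with a lagged cross-correlation. A minimal sketch (the smile-intensity traces below are hypothetical, not the study's data; the listener's trace simply mirrors the speaker's with a one-sample delay):

```python
def xcorr_at_lag(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] (positive lag:
    y follows x), computed over the overlapping samples."""
    if lag >= 0:
        pairs = list(zip(x[:len(x) - lag], y[lag:]))
    else:
        pairs = list(zip(x[-lag:], y[:len(y) + lag]))
    xs, ys = zip(*pairs)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = (sum((a - mx) ** 2 for a in xs)
           * sum((b - my) ** 2 for b in ys)) ** 0.5
    return num / den

# Hypothetical smile-intensity traces (arbitrary units per video frame).
speaker = [0, 1, 3, 5, 4, 2, 1, 0, 2, 4]
listener = [0, 0, 1, 3, 5, 4, 2, 1, 0, 2]  # speaker shifted by one frame

# The lag with the strongest correlation estimates who follows whom.
best = max(range(-3, 4), key=lambda k: xcorr_at_lag(speaker, listener, k))
```

Here the correlation peaks at lag +1, consistent with the listener responding to the speaker one frame later.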
Affiliation(s)
- Yota Obayashi
- Department of Rehabilitation, Fujita Health University Hospital, Aichi, Japan
- Shintaro Uehara
- Faculty of Rehabilitation, Fujita Health University School of Health Sciences, Aichi, Japan
- Akiko Yuasa
- Department of Rehabilitation Medicine, Fujita Health University School of Medicine, Aichi, Japan
- Japan Society for the Promotion of Science, Tokyo, Japan
- Yohei Otaka
- Department of Rehabilitation Medicine, Fujita Health University School of Medicine, Aichi, Japan
5
Miolla A, Cardaioli M, Scarpazza C. Padova Emotional Dataset of Facial Expressions (PEDFE): A unique dataset of genuine and posed emotional facial expressions. Behav Res Methods 2023; 55:2559-2574. PMID: 36002622; PMCID: PMC10439033; DOI: 10.3758/s13428-022-01914-4.
Abstract
Facial expressions are among the most powerful signals human beings use to convey their emotional states. Indeed, emotional facial datasets represent the most effective and controlled method of examining humans' interpretation of and reaction to various emotions. However, scientific research on emotion has mainly relied on static pictures of facial expressions posed (i.e., simulated) by actors, creating a significant bias in the emotion literature. This dataset aims to fill that gap, providing a considerable number (N = 1458) of dynamic genuine (N = 707) and posed (N = 751) clips of the six universal emotions from 56 participants. The dataset is available in two versions: original clips, including participants' body and background, and modified clips, in which only the participants' faces are visible. Notably, the original dataset has been validated by 122 human raters, and the modified dataset by 280 human raters. Hit rates for emotion and genuineness, as well as the mean and standard deviation of perceived genuineness and intensity, are provided for each clip to allow future users to select the clips most appropriate to their scientific questions.
Affiliation(s)
- A. Miolla
- Department of General Psychology, University of Padua, Padua, Italy
- M. Cardaioli
- Department of Mathematics, University of Padua, Padua, Italy
- GFT Italy, Milan, Italy
- C. Scarpazza
- Department of General Psychology, University of Padua, Padua, Italy
6
Wang R, Lu X, Jiang Y. Distributed and hierarchical neural encoding of multidimensional biological motion attributes in the human brain. Cereb Cortex 2023; 33:8510-8522. PMID: 37118887; PMCID: PMC10786095; DOI: 10.1093/cercor/bhad136.
Abstract
The human visual system can efficiently extract distinct physical, biological, and social attributes (e.g., facing direction, gender, and emotional state) from biological motion (BM), but how these attributes are encoded in the brain remains largely unknown. In the current study, we used functional magnetic resonance imaging to investigate this issue while participants viewed multidimensional BM stimuli. Using multiple regression representational similarity analysis, we identified distributed brain areas related, respectively, to the processing of facing direction, gender, and emotional state conveyed by BM. These brain areas are governed by a hierarchical structure in which the respective neural encoding of facing direction, gender, and emotional state is modulated by one another in descending order. We further revealed that a portion of the brain areas identified in the representational similarity analysis was specific to the neural encoding of each attribute and correlated with the corresponding behavioral results. These findings unravel the brain networks for encoding BM attributes in consideration of their interactions, and highlight that the processing of multidimensional BM attributes is recurrently interactive.
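The core move in representational similarity analysis (RSA) is to compare a neural representational dissimilarity matrix (RDM) against model RDMs built from stimulus attributes. A simplified single-predictor sketch (the study used a multiple-regression variant; the conditions, patterns, and distances below are hypothetical):

```python
from itertools import combinations

def rdm(items, dissim):
    """Vectorized RDM: upper triangle of the pairwise dissimilarity
    matrix over conditions, in a fixed (combinations) order."""
    return [dissim(a, b) for a, b in combinations(items, 2)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) ** 0.5
           * sum((b - my) ** 2 for b in y) ** 0.5)
    return num / den

# Four hypothetical conditions labeled (facing direction, gender).
conditions = [("left", "F"), ("left", "M"), ("right", "F"), ("right", "M")]
facing_model = rdm(conditions, lambda a, b: 0 if a[0] == b[0] else 1)
gender_model = rdm(conditions, lambda a, b: 0 if a[1] == b[1] else 1)

# Hypothetical multivoxel response pattern for each condition; here the
# patterns were constructed so that facing direction dominates.
patterns = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
neural = rdm(patterns,
             lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5)

r_facing = pearson(neural, facing_model)
r_gender = pearson(neural, gender_model)
```

A region whose neural RDM correlates more strongly with the facing-direction model than the gender model would be interpreted as preferentially encoding facing direction.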
Affiliation(s)
- Ruidi Wang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, 19A Yuquan Road, Beijing 100049, China
- Chinese Institute for Brain Research, 26 Science Park Road, Beijing 102206, China
- Xiqian Lu
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, 19A Yuquan Road, Beijing 100049, China
- Chinese Institute for Brain Research, 26 Science Park Road, Beijing 102206, China
- Yi Jiang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, 19A Yuquan Road, Beijing 100049, China
- Chinese Institute for Brain Research, 26 Science Park Road, Beijing 102206, China
7
Long H, Peluso N, Baker CI, Japee S, Taubert J. A database of heterogeneous faces for studying naturalistic expressions. Sci Rep 2023; 13:5383. PMID: 37012369; PMCID: PMC10070342; DOI: 10.1038/s41598-023-32659-5.
Abstract
Facial expressions are thought to be complex visual signals, critical for communication between social agents. Most prior work aimed at understanding how facial expressions are recognized has relied on stimulus databases featuring posed facial expressions, designed to represent putative emotional categories (such as 'happy' and 'angry'). Here we use an alternative selection strategy to develop the Wild Faces Database (WFD): a set of one thousand images capturing a diverse range of ambient facial behaviors from outside the laboratory. We characterized the perceived emotional content in these images using a standard categorization task in which participants were asked to classify the apparent facial expression in each image. In addition, participants were asked to indicate the intensity and genuineness of each expression. While modal scores indicate that the WFD captures a range of different emotional expressions, in comparing the WFD to images taken from other, more conventional databases, we found that participants responded more variably and less specifically to the wild-type faces, perhaps indicating that natural expressions are more multiplexed than a categorical model would predict. We argue that this variability can be employed to explore latent dimensions in our mental representation of facial expressions. Further, images in the WFD were rated as less intense and more genuine than images taken from other databases, suggesting a greater degree of authenticity among WFD images. The strong positive correlation between intensity and genuineness scores demonstrates that even the high-arousal states captured in the WFD were perceived as authentic. Collectively, these findings highlight the potential utility of the WFD as a new resource for bridging the gap between the laboratory and the real world in studies of expression recognition.
Affiliation(s)
- Houqiu Long
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Natalie Peluso
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Chris I Baker
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Jessica Taubert
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
8
Lee M, Lori A, Langford NA, Rilling JK. The neural basis of smile authenticity judgments and the potential modulatory role of the oxytocin receptor gene (OXTR). Behav Brain Res 2023; 437:114144. PMID: 36216140; DOI: 10.1016/j.bbr.2022.114144.
Abstract
Accurate perception of genuine vs. posed smiles is crucial for successful social navigation in humans. While people vary in their ability to assess the authenticity of smiles, little is known about the specific biological mechanisms underlying this variation. We investigated the neural substrates of smile authenticity judgments using functional magnetic resonance imaging (fMRI). We also tested a preliminary hypothesis that a common polymorphism in the oxytocin receptor gene (OXTR), rs53576, would modulate the behavioral and neural indices of accurate smile authenticity judgments. A total of 185 healthy adult participants (neuroimaging arm: N = 44; behavioral arm: N = 141) determined the authenticity of dynamic facial expressions of genuine and posed smiles either with or without fMRI scanning. Correctly identified genuine vs. posed smiles activated brain areas involved in reward processing, facial mimicry, and mentalizing. Activation within the inferior frontal gyrus and dorsomedial prefrontal cortex correlated with individual differences in sensitivity (d') and response criterion (C), respectively. Our exploratory genetic analysis revealed that rs53576 G homozygotes in the neuroimaging arm had a stronger tendency than A allele carriers to judge posed smiles as genuine, and showed decreased activation in the medial prefrontal cortex when viewing genuine vs. posed smiles. Yet OXTR rs53576 did not modulate task performance in the behavioral arm, which calls for further studies to evaluate the robustness of this result. Our findings extend previous literature on the biological foundations of smile authenticity judgments, particularly emphasizing the involvement of brain regions implicated in reward, facial mimicry, and mentalizing.
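The sensitivity (d') and criterion (C) indices mentioned above come from signal detection theory. A minimal sketch (the trial counts are hypothetical; treating "genuine smile" as the signal, so a false alarm is a posed smile judged genuine):

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, fas, crs):
    """Signal-detection sensitivity (d') and response criterion (C).

    d' = z(hit rate) - z(false-alarm rate); C = -(z(H) + z(FA)) / 2.
    A negative C indicates a liberal bias toward answering "genuine".
    """
    z = NormalDist().inv_cdf
    # Log-linear correction keeps rates strictly inside (0, 1),
    # avoiding infinite z-scores for perfect performance.
    h = (hits + 0.5) / (hits + misses + 1)
    f = (fas + 0.5) / (fas + crs + 1)
    return z(h) - z(f), -0.5 * (z(h) + z(f))

# Hypothetical counts for one participant (20 genuine, 20 posed trials).
d, c = dprime_criterion(hits=15, misses=5, fas=6, crs=14)
```

In this framework, a group that tends to judge posed smiles as genuine (as reported for the G homozygotes) would show a more negative C, independent of d'.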
Affiliation(s)
- Adriana Lori
- Department of Psychiatry and Behavioral Science, USA
- Nicole A Langford
- Department of Psychiatry and Behavioral Science, USA; Nell Hodgson Woodruff School of Nursing, USA
- James K Rilling
- Department of Anthropology, USA; Department of Psychiatry and Behavioral Science, USA; Center for Behavioral Neuroscience, USA; Emory National Primate Research Center, USA; Center for Translational Social Neuroscience, USA
9
Straulino E, Scarpazza C, Sartori L. What is missing in the study of emotion expression? Front Psychol 2023; 14:1158136. PMID: 37179857; PMCID: PMC10173880; DOI: 10.3389/fpsyg.2023.1158136.
Abstract
As celebrations approach for the 150th anniversary of "The Expression of the Emotions in Man and Animals", scientists' conclusions on emotion expression are still debated. Emotion expression has traditionally been anchored to prototypical and mutually exclusive facial expressions (e.g., anger, disgust, fear, happiness, sadness, and surprise). However, people express emotions in nuanced patterns and - crucially - not everything is in the face. In recent decades, considerable work has critiqued this classical view, calling for a more fluid and flexible approach that considers how humans dynamically perform genuine expressions with their bodies in context. A growing body of evidence suggests that each emotional display is a complex, multi-component, motoric event. The human face is never static, but continuously acts and reacts to internal and environmental stimuli, with the coordinated action of muscles throughout the body. Moreover, two anatomically and functionally different neural pathways subserve voluntary and involuntary expressions. An interesting implication is that we have distinct and independent pathways for genuine and posed facial expressions, and different combinations may occur across the vertical facial axis. Investigating the time course of these facial blends, which can be controlled consciously only in part, has recently provided a useful operational test for comparing the predictions of various models on the lateralization of emotions. This concise review identifies shortcomings and new challenges in the study of emotion expression at the face, body, and contextual levels, eventually calling for a theoretical and methodological shift in the study of emotions. We contend that the most feasible way to address the complex world of emotion expression is to define a completely new and more complete approach to emotional investigation. This approach can potentially lead us to the roots of emotional display, and to the individual mechanisms underlying their expression (i.e., individual emotional signatures).
Affiliation(s)
- Elisa Straulino
- Department of General Psychology, University of Padova, Padova, Italy
- Cristina Scarpazza
- Department of General Psychology, University of Padova, Padova, Italy
- IRCCS San Camillo Hospital, Venice, Italy
- Luisa Sartori
- Department of General Psychology, University of Padova, Padova, Italy
- Padova Neuroscience Center, University of Padova, Padova, Italy
10
Moon H, Nam G, Hur JW. Neural correlates of affective theory of mind in medication-free nonsuicidal self-injury: An fMRI study. Front Psychiatry 2022; 13:850794. PMID: 35935406; PMCID: PMC9354394; DOI: 10.3389/fpsyt.2022.850794.
Abstract
Emerging evidence indicates that emotion processing deficits are associated with nonsuicidal self-injury (NSSI). However, limited attention has been paid to the socio-affective functions of NSSI. In this study, we aimed to investigate the affective theory of mind (ToM) in medication-free individuals engaging in NSSI at both behavioral and neural levels. Twenty-eight individuals (mean age = 22.96 years) who engaged in NSSI and 38 age-, sex-, and IQ-matched controls (mean age = 22.79 years) underwent functional magnetic resonance imaging while performing the "Reading the Mind in the Eyes Test" (RMET). All participants also completed the Difficulties in Emotion Regulation Scale (DERS), Toronto Alexithymia Scale (TAS-20), and Beck Scale for Suicide Ideation (BSI). Although we did not find significant group differences in RMET performance, the NSSI group exhibited significantly greater left medial superior frontal lobe activation and decreased right angular gyrus activation relative to the control group. Reduced right angular gyrus activity was related to higher DERS and TAS-20 scores across all participants. Our findings provide new evidence for aberrant neural processing of affective ToM in self-injurers. Future studies developing intervention protocols for NSSI should focus on the multifaceted phases of socio-affective processing.
Affiliation(s)
- Hyeri Moon
- School of Psychology, Korea University, Seoul, South Korea
- Gieun Nam
- Department of Psychology, Chung-Ang University, Seoul, South Korea
- Ji-Won Hur
- School of Psychology, Korea University, Seoul, South Korea
11
Menting-Henry S, Hidalgo-Lopez E, Aichhorn M, Kronbichler M, Kerschbaum H, Pletzer B. Oral Contraceptives Modulate the Relationship Between Resting Brain Activity, Amygdala Connectivity and Emotion Recognition - A Resting State fMRI Study. Front Behav Neurosci 2022; 16:775796. PMID: 35368304; PMCID: PMC8967165; DOI: 10.3389/fnbeh.2022.775796.
Abstract
Recent research into the effects of hormonal contraceptives on emotion processing and brain function suggests that hormonal contraceptive users show (a) reduced accuracy in recognizing emotions compared to naturally cycling women, and (b) alterations in amygdala volume and connectivity at rest. To date, these observations have not been linked, although the amygdala has certainly been identified as a core region activated during emotion recognition. To assess whether volume, oscillatory activity, and connectivity of emotion-related brain areas at rest are predictive of participants' ability to recognize facial emotional expressions, 72 participants (20 men, 20 naturally cycling women, 16 users of androgenic contraceptives, 16 users of anti-androgenic contraceptives) completed a brain structural and resting-state fMRI scan, as well as an emotion recognition task. Our results showed that resting brain characteristics did not mediate oral contraceptive effects on emotion recognition performance. However, sex and oral contraceptive use emerged as moderators of brain-behavior associations. Sex differences emerged in the prediction of emotion recognition performance by the left amygdala amplitude of low-frequency fluctuations (ALFF) for anger, as well as by left and right amygdala connectivity for fear. Users of anti-androgenic oral contraceptives (OCs) stood out in that they showed strong brain-behavior associations, usually in the opposite direction to naturally cycling women, while androgenic OC users showed a pattern similar to, but weaker than, that of naturally cycling women. This result suggests that amygdala ALFF and connectivity have predictive value for facial emotion recognition, and that the importance of the different connections depends heavily on sex hormones and oral contraceptive use.
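ALFF, as used above, is the mean spectral amplitude of a resting-state time series within a low-frequency band (conventionally 0.01-0.08 Hz). A self-contained sketch on synthetic signals (the TR, band limits, and signals are illustrative, not the study's data or pipeline):

```python
import math

def alff(ts, tr, low=0.01, high=0.08):
    """Amplitude of low-frequency fluctuations: mean spectral amplitude
    of the demeaned time series within [low, high] Hz.
    `tr` is the sampling interval (repetition time) in seconds."""
    n = len(ts)
    m = sum(ts) / n
    x = [v - m for v in ts]
    amps = []
    for k in range(1, n // 2 + 1):          # positive-frequency DFT bins
        freq = k / (n * tr)
        if low <= freq <= high:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(x[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            amps.append(2 * math.sqrt(re * re + im * im) / n)
    return sum(amps) / len(amps)

# Hypothetical BOLD-like signals: 200 volumes at TR = 2 s, one slow
# (0.05 Hz, inside the band) and one fast (0.2 Hz, outside the band).
tr, n = 2.0, 200
slow = [math.sin(2 * math.pi * 0.05 * t * tr) for t in range(n)]
fast = [math.sin(2 * math.pi * 0.2 * t * tr) for t in range(n)]
alff_slow = alff(slow, tr)
alff_fast = alff(fast, tr)
```

Only power falling inside the low-frequency band contributes, so the slow oscillation yields a much larger ALFF than the equally strong fast one.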
Affiliation(s)
- Shanice Menting-Henry
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Esmeralda Hidalgo-Lopez
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Markus Aichhorn
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Martin Kronbichler
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Neuroscience Institute, Paracelsus Medical University, Salzburg, Austria
- Hubert Kerschbaum
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Biosciences, University of Salzburg, Salzburg, Austria
- Belinda Pletzer
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
12
Yi J, Pärnamets P, Olsson A. The face value of feedback: facial behaviour is shaped by goals and punishments during interaction with dynamic faces. R Soc Open Sci 2021; 8:202159. PMID: 34295516; PMCID: PMC8278067; DOI: 10.1098/rsos.202159.
Abstract
Responding appropriately to others' facial expressions is key to successful social functioning. Despite the large body of work on face perception and spontaneous responses to static faces, little is known about responses to faces in dynamic, naturalistic situations, and no study has investigated how goal-directed responses to faces are influenced by learning during dyadic interactions. To experimentally model such situations, we developed a novel method based on online integration of electromyography signals from the participants' face (corrugator supercilii and zygomaticus major) during facial expression exchange with dynamic faces displaying happy and angry facial expressions. Fifty-eight participants learned by trial and error to avoid aversive stimulation by either reciprocating (congruent condition) or responding opposite (incongruent condition) to the expression of the target face. Our results validated our method, showing that participants learned to optimize their facial behaviour, and replicated earlier findings of faster and more accurate responses in congruent versus incongruent conditions. Moreover, participants performed better on trials with smiling, compared with frowning, faces, suggesting it may be easier to adapt facial responses to positively associated expressions. Finally, we applied drift diffusion and reinforcement learning models to provide a mechanistic explanation for our findings, which helped clarify the decision-making processes underlying our experimental manipulation. Our results introduce a new method to study learning and decision-making in facial expression exchange, in which facial expression selection must be gradually adapted to both social and non-social reinforcements.
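The trial-and-error avoidance learning described above is the kind of behaviour a simple delta-rule (Q-learning-style) model captures. A hedged sketch, not the authors' model: two face "states," two facial "actions," an arbitrary incongruent rule, and hypothetical learning parameters:

```python
import random

random.seed(1)
ALPHA, EPS = 0.3, 0.1            # learning rate, exploration probability
states = ["happy", "angry"]      # expression of the target face
actions = ["smile", "frown"]     # participant's facial response
# Hypothetical incongruent rule: respond opposite to the target face.
correct = {"happy": "frown", "angry": "smile"}

# Action values, updated trial by trial with a delta rule.
Q = {s: {a: 0.0 for a in actions} for s in states}
for trial in range(200):
    s = random.choice(states)
    if random.random() < EPS:                      # explore occasionally
        a = random.choice(actions)
    else:                                          # otherwise act greedily
        a = max(actions, key=lambda x: Q[s][x])
    # Reward 0 if aversive stimulation is avoided, -1 otherwise.
    r = 0.0 if a == correct[s] else -1.0
    Q[s][a] += ALPHA * (r - Q[s][a])               # delta-rule update
```

After training, each state's punished action carries a lower value than the avoidance action, i.e., the model has learned the response mapping from feedback alone.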
Affiliation(s)
- Jonathan Yi
- Department of Clinical Neuroscience, Division of Psychology, Karolinska Institutet, Solna, Sweden
- Philip Pärnamets
- Department of Clinical Neuroscience, Division of Psychology, Karolinska Institutet, Solna, Sweden
- Department of Psychology, New York University, New York, NY, USA
- Andreas Olsson
- Department of Clinical Neuroscience, Division of Psychology, Karolinska Institutet, Solna, Sweden
13
Facial expression recognition: A meta-analytic review of theoretical models and neuroimaging evidence. Neurosci Biobehav Rev 2021; 127:820-836. [PMID: 34052280] [DOI: 10.1016/j.neubiorev.2021.05.023]
Abstract
Discrimination of facial expressions is an elementary function of the human brain. While the way emotions are represented in the brain has long been debated, the common and specific neural representations involved in recognizing facial expressions also remain unclear. To examine brain organization and asymmetry for discrete and dimensional facial emotions, we conducted an activation likelihood estimation meta-analysis and meta-analytic connectivity modelling on 141 studies with a total of 3138 participants. We found consistent engagement of the amygdala and a common set of brain networks across discrete and dimensional emotions. We also found left-hemisphere dominance of the amygdala and anterior insula (AI) across categories of facial expression, but category-specific lateralization of the vmPFC, suggesting flexibly asymmetrical neural representations of facial expression recognition. These results converge to characteristic activation and connectivity patterns across discrete and dimensional emotion categories in recognition of facial expressions. Our findings provide the first quantitatively meta-analytic brain network-based evidence supportive of the psychological constructionist hypothesis in facial expression recognition.
14
Levine SM, Kumpf M, Rupprecht R, Schwarzbach JV. Supracategorical fear information revealed by aversively conditioning multiple categories. Cogn Neurosci 2020; 12:28-39. [PMID: 33135598] [DOI: 10.1080/17588928.2020.1839039]
Abstract
Fear-generalization is a critical function for survival, in which an organism extracts information from a specific instantiation of a threat (e.g., the western diamondback rattlesnake in my front yard on Sunday) and learns to fear - and accordingly respond to - pertinent higher-order information (e.g., snakes live in my yard). Previous work investigating fear-conditioning in humans has used functional magnetic resonance imaging (fMRI) to demonstrate that activity patterns representing stimuli from an aversively-conditioned category (CS+) are more similar to each other than those of a neutral category (CS-). Here we used fMRI and multiple aversively-conditioned categories to ask whether we would find only similarity increases within the CS+ categories or also similarity increases between the CS+ categories. Using representational similarity analysis, we correlated several models to activity patterns underlying different brain regions and found that, following fear-conditioning, between-category and within-category similarity increased for the CS+ categories in the insula, superior frontal gyrus (SFG), and the right temporal pole. When specifically investigating fear-generalization, these between- and within-category effects were detected in the SFG. These results advance prior pattern-based neuroimaging work by exploring the effect of aversively-conditioning multiple categories and indicate an extended role for such regions in potentially representing supracategorical information during fear-learning.
Affiliation(s)
- Seth M Levine
- Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Miriam Kumpf
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Rainer Rupprecht
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Jens V Schwarzbach
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
15
Ranjbar S, Mazidi M, Sharpe L, Dehghani M, Khatibi A. Attentional control moderates the relationship between pain catastrophizing and selective attention to pain faces on the antisaccade task. Sci Rep 2020; 10:12885. [PMID: 32732895] [PMCID: PMC7393078] [DOI: 10.1038/s41598-020-69910-2]
Abstract
Cognitive models of chronic pain emphasize the critical role of pain catastrophizing in attentional bias to pain-related stimuli. The aims of this study were (a) to investigate the relationship between pain catastrophizing and the ability to inhibit selective attention to pain-related faces (attentional bias); and (b) to determine whether attentional control moderated this relationship. One hundred and ten pain-free participants completed the antisaccade task with dynamic painful, angry, happy, and neutral facial expressions, along with questionnaires including a measure of pain catastrophizing. As predicted, participants with high pain catastrophizing had significantly higher error rates on antisaccade trials with pain faces relative to other facial expressions, indicating difficulty inhibiting attention towards painful faces. In moderation analyses, attentional control moderated the relationship between attentional bias to pain faces and pain catastrophizing. Post-hoc analyses demonstrated that it was shifting attention (not focusing) that accounted for this effect: only for those with high self-reported ability to shift attention was there a significant relationship between catastrophizing and attentional bias to pain. These findings confirm that attentional control is necessary for an association between attentional bias and catastrophizing to be observed, which may explain the lack of relationships between attentional bias and individual characteristics, such as catastrophizing, in prior research.
Affiliation(s)
- Seyran Ranjbar
- Psychology Department, Shahid Beheshti University, Tehran, Iran
- Mahdi Mazidi
- Centre for the Advancement of Research on Emotion, The University of Western Australia, Crawley, WA, Australia
- Louise Sharpe
- School of Psychology, The University of Sydney, Sydney, NSW, Australia
- Mohsen Dehghani
- Psychology Department, Shahid Beheshti University, Tehran, Iran
- Ali Khatibi
- Centre of Precision Rehabilitation for Spinal Pain (CPR Spine), School of Sport, Exercise and Rehabilitation Sciences, College of Life and Environmental Sciences, University of Birmingham, Birmingham, B15 2TT, UK.
- Centre for Human Brain Health, University of Birmingham, Birmingham, UK.
16
Lander K, Butcher NL. Recognizing Genuine From Posed Facial Expressions: Exploring the Role of Dynamic Information and Face Familiarity. Front Psychol 2020; 11:1378. [PMID: 32719634] [PMCID: PMC7347903] [DOI: 10.3389/fpsyg.2020.01378]
Abstract
The accurate recognition of emotion is important for interpersonal interaction and when navigating our social world. However, not all facial displays reflect the emotional experience currently being felt by the expresser. Indeed, faces express both genuine and posed displays of emotion. In this article, we summarize the importance of motion for the recognition of face identity before critically outlining the role of dynamic information in determining facial expressions and distinguishing between genuine and posed expressions of emotion. We propose that both dynamic information and face familiarity may modulate our ability to determine whether an expression is genuine or not. Finally, we consider the shared role for dynamic information across different face recognition tasks and the wider impact of face familiarity on determining genuine from posed expressions during real-world interactions.
Affiliation(s)
- Karen Lander
- Division of Neuroscience and Experimental Psychology, University of Manchester, Manchester, United Kingdom
- Natalie L Butcher
- School of Social Sciences, Humanities and Law, Teesside University, Middlesbrough, United Kingdom
17
Dricu M, Frühholz S. A neurocognitive model of perceptual decision-making on emotional signals. Hum Brain Mapp 2020; 41:1532-1556. [PMID: 31868310] [PMCID: PMC7267943] [DOI: 10.1002/hbm.24893]
Abstract
Humans make various kinds of decisions about which emotions they perceive from others. Although it might seem like a split-second phenomenon, deliberating over which emotions we perceive unfolds across several stages of decisional processing. Neurocognitive models of general perception postulate that our brain first extracts sensory information about the world, then integrates these data into a percept, and lastly interprets it. The aim of the present study was to build an evidence-based neurocognitive model of perceptual decision-making on others' emotions. We conducted a series of meta-analyses of neuroimaging data spanning 30 years on the explicit evaluation of others' emotional expressions. We find that emotion perception is rather an umbrella term for various perception paradigms, each with distinct neural structures that underline task-related cognitive demands. Furthermore, the left amygdala was responsive across all classes of decisional paradigms, regardless of task-related demands. Based on these observations, we propose a neurocognitive model that outlines the information flow in the brain needed for a successful evaluation of, and decisions on, other individuals' emotions. HIGHLIGHTS:
- Emotion classification involves heterogeneous perception and decision-making tasks.
- Decision-making processes on emotions are rarely covered by existing emotion theories.
- We propose an evidence-based neurocognitive model of decision-making on emotions.
- Bilateral brain processes support nonverbal decisions; left-hemisphere processes support verbal decisions.
- The left amygdala is involved in any kind of decision on emotions.
Affiliation(s)
- Mihai Dricu
- Department of Psychology, University of Bern, Bern, Switzerland
- Sascha Frühholz
- Cognitive and Affective Neuroscience Unit, Department of Psychology, University of Zurich, Zurich, Switzerland
- Neuroscience Center Zurich (ZNZ), University of Zurich and ETH Zurich, Zurich, Switzerland
- Center for Integrative Human Physiology (ZIHP), University of Zurich, Zurich, Switzerland
18
Krivan SJ, Thomas NA. A Call for the Empirical Investigation of Tear Stimuli. Front Psychol 2020; 11:52. [PMID: 32082220] [PMCID: PMC7005069] [DOI: 10.3389/fpsyg.2020.00052]
Abstract
Emotional crying is a uniquely human behavior, which typically elicits helping and empathic responses from observers. However, tears can also be used to deceive. "Crocodile tears" are insincere tears used to manipulate the observer and foster prosocial responses. The ability to discriminate between genuine and fabricated emotional displays is critical to social functioning. When insincere emotional displays are detected, they are most often met with backlash. Conversely, genuine displays foster prosocial responses. However, the majority of crying research conducted to date has used posed stimuli featuring artificial tears. As such it is yet to be determined how the artificial nature of these displays impacts person perception. Throughout this article, we discuss the necessity for empirical investigation of the differences (or similarities) in responses to posed and genuine tearful expressions. We will explore the recent adoption of genuine stimuli in emotion research and review the existing research using tear stimuli. We conclude by offering suggestions and considerations for future advancement of the emotional crying field through investigation of both posed and genuine tear stimuli.
Affiliation(s)
- Sarah J. Krivan
- Department of Psychology, Applied Attention and Perceptual Processing Laboratory, College of Healthcare Sciences, James Cook University, Cairns, QLD, Australia
- Applied Attention and Perceptual Processing Laboratory, School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Melbourne, VIC, Australia
- Nicole A. Thomas
- Applied Attention and Perceptual Processing Laboratory, School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Melbourne, VIC, Australia
19
Discrimination between smiling faces: Human observers vs. automated face analysis. Acta Psychol (Amst) 2018; 187:19-29. [PMID: 29758397] [DOI: 10.1016/j.actpsy.2018.04.019]
Abstract
This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes).
20
Namba S, Kabir RS, Miyatani M, Nakao T. Dynamic Displays Enhance the Ability to Discriminate Genuine and Posed Facial Expressions of Emotion. Front Psychol 2018; 9:672. [PMID: 29896135] [PMCID: PMC5987704] [DOI: 10.3389/fpsyg.2018.00672]
Abstract
Accurately gauging the emotional experience of another person is important for navigating interpersonal interactions. This study investigated whether perceivers are capable of distinguishing between unintentionally expressed (genuine) and intentionally manipulated (posed) facial expressions attributed to four major emotions: amusement, disgust, sadness, and surprise. Sensitivity to this discrimination was explored by comparing unstaged dynamic and static facial stimuli and analyzing the results with signal detection theory. Participants indicated whether facial stimuli presented on a screen depicted a person showing a given emotion and whether that person was feeling a given emotion. The results showed that genuine displays were evaluated more as felt expressions than posed displays for all target emotions presented. In addition, sensitivity to the perception of emotional experience, or discriminability, was enhanced in dynamic facial displays, but was less pronounced in the case of static displays. This finding indicates that dynamic information in facial displays contributes to the ability to accurately infer the emotional experiences of another person.
Affiliation(s)
- Shushi Namba
- Graduate School of Education, Hiroshima University, Hiroshima, Japan
- Russell S Kabir
- Graduate School of Education, Hiroshima University, Hiroshima, Japan
- Makoto Miyatani
- Department of Psychology, Hiroshima University, Hiroshima, Japan
- Takashi Nakao
- Department of Psychology, Hiroshima University, Hiroshima, Japan
21
Sachs M, Habibi A, Damasio H. Reflections on music, affect, and sociality. Prog Brain Res 2018; 237:153-172. [PMID: 29779733] [DOI: 10.1016/bs.pbr.2018.03.009]
Abstract
Music is an important facet of and practice in human cultures, significantly related to its capacity to induce a range of intense and complex emotions. Studying the psychological and neurophysiological responses to music allows us to examine and uncover the neural mechanisms underlying the emotional impact of music. We provide an overview of different aspects of current research on how music listening produces emotions and the corresponding feelings, and consider the underlying neurophysiological mechanisms. We conclude with evidence to suggest that musical training may influence the ability to recognize the emotions of others.
Affiliation(s)
- Matthew Sachs
- Brain and Creativity Institute, University of Southern California, Los Angeles, CA, United States
- Assal Habibi
- Brain and Creativity Institute, University of Southern California, Los Angeles, CA, United States
- Hanna Damasio
- Brain and Creativity Institute, University of Southern California, Los Angeles, CA, United States.
22
Fang J, Xu C, Zille P, Lin D, Deng HW, Calhoun VD, Wang YP. Fast and Accurate Detection of Complex Imaging Genetics Associations Based on Greedy Projected Distance Correlation. IEEE Trans Med Imaging 2018; 37:860-870. [PMID: 29990017] [PMCID: PMC6043419] [DOI: 10.1109/tmi.2017.2783244]
Abstract
Recent advances in imaging genetics produce large amounts of data including functional MRI images, single nucleotide polymorphisms (SNPs), and cognitive assessments. Understanding the complex interactions among these heterogeneous and complementary data has the potential to help with diagnosis and prevention of mental disorders. However, limited efforts have been made due to the high dimensionality, group structure, and mixed type of these data. In this paper, we present a novel method to detect conditional associations between imaging genetics data. We use projected distance correlation to build a conditional dependency graph among high-dimensional mixed data, then use multiple testing to detect significant group-level associations (e.g., ROI-gene). In addition, we introduce a scalable algorithm based on the orthogonal greedy algorithm, yielding the greedy projected distance correlation (G-PDC). This reduces the computational cost, which is critical for analyzing large volumes of imaging genomics data. The results from our simulations demonstrate a higher degree of accuracy with G-PDC than distance correlation, Pearson's correlation, and partial correlation, especially when the correlation is nonlinear. Finally, we apply our method to the Philadelphia Neurodevelopmental Cohort with 866 samples including fMRI images and SNP profiles. The results uncover several statistically significant and biologically interesting interactions, which are further validated against many existing studies. The Matlab code is available at https://sites.google.com/site/jianfang86/gPDC.
23
Perceived emotion genuineness: normative ratings for popular facial expression stimuli and the development of perceived-as-genuine and perceived-as-fake sets. Behav Res Methods 2018; 49:1539-1562. [PMID: 27928745] [DOI: 10.3758/s13428-016-0813-2]
Abstract
In everyday social interactions, people's facial expressions sometimes reflect genuine emotion (e.g., anger in response to a misbehaving child) and sometimes do not (e.g., smiling for a school photo). There is increasing theoretical interest in this distinction, but little is known about perceived emotion genuineness for existing facial expression databases. We present a new method for rating perceived genuineness using a neutral-midpoint scale (-7 = completely fake; 0 = don't know; +7 = completely genuine) that, unlike previous methods, provides data on both relative and absolute perceptions. Normative ratings from typically developing adults for five emotions (anger, disgust, fear, sadness, and happiness) provide three key contributions. First, the widely used Pictures of Facial Affect (PoFA; i.e., "the Ekman faces") and the Radboud Faces Database (RaFD) are typically perceived as not showing genuine emotion. Also, in the only published set for which the actual emotional states of the displayers are known (via self-report; the McLellan faces), percepts of emotion genuineness often do not match actual emotion genuineness. Second, we provide genuine/fake norms for 558 faces from several sources (PoFA, RaFD, KDEF, Gur, FacePlace, McLellan, News media), including a list of 143 stimuli that are event-elicited (rather than posed) and, congruently, perceived as reflecting genuine emotion. Third, using the norms we develop sets of perceived-as-genuine (from event-elicited sources) and perceived-as-fake (from posed sources) stimuli, matched on sex, viewpoint, eye-gaze direction, and rated intensity. We also outline the many types of research questions that these norms and stimulus sets could be used to answer.
24
Groves SJ, Pitcher TL, Melzer TR, Jordan J, Carter JD, Malhi GS, Johnston LC, Porter RJ. Brain activation during processing of genuine facial emotion in depression: Preliminary findings. J Affect Disord 2018; 225:91-96. [PMID: 28802727] [DOI: 10.1016/j.jad.2017.07.049]
Abstract
OBJECTIVE: The current study aimed to examine the neural correlates of processing genuine compared with posed emotional expressions in depressed and healthy subjects, using a novel functional magnetic resonance imaging (fMRI) paradigm. METHOD: During fMRI scanning, sixteen depressed patients and ten healthy controls performed an Emotion Categorisation Task, whereby participants were asked to distinguish between genuine and non-genuine (posed or neutral) facial displays of happiness and sadness. RESULTS: Compared to controls, the depressed group showed greater activation whilst processing genuine versus posed facial displays of sadness, in the left medial orbitofrontal cortex, caudate and putamen. The depressed group also showed greater activation whilst processing genuine facial displays of sadness relative to neutral displays, in the bilateral medial frontal/orbitofrontal cortex, left dorsolateral prefrontal cortex, right dorsal anterior cingulate, bilateral posterior cingulate, right superior parietal lobe, left lingual gyrus and cuneus. No differences were found between the two groups for happy facial displays. LIMITATIONS: Sample sizes were relatively small and, given the exploratory nature of the study, no correction was made for multiple comparisons. CONCLUSION: The findings of this exploratory study suggest that depressed individuals may show a different pattern of brain activation in response to genuine versus posed facial displays of sadness, compared to healthy individuals. This may have important implications for future studies that wish to examine the neural correlates of facial emotion processing in depression.
Affiliation(s)
- Samantha J Groves
- Department of Psychological Medicine, University of Otago, Christchurch, New Zealand
- Toni L Pitcher
- Department of Medicine, University of Otago, Christchurch, New Zealand; New Zealand Brain Research Institute, Christchurch, New Zealand
- Tracy R Melzer
- Department of Medicine, University of Otago, Christchurch, New Zealand; New Zealand Brain Research Institute, Christchurch, New Zealand
- Jennifer Jordan
- Department of Psychological Medicine, University of Otago, Christchurch, New Zealand; Canterbury District Health Board, New Zealand
- Janet D Carter
- Psychology Department, University of Canterbury, New Zealand
- Gin S Malhi
- Sydney Medical School, University of Sydney, Australia
- Richard J Porter
- Department of Psychological Medicine, University of Otago, Christchurch, New Zealand.
25
Eddy CM, Rickards HE, Hansen PC. Through your eyes or mine? The neural correlates of mental state recognition in Huntington's disease. Hum Brain Mapp 2017; 39:1354-1366. [PMID: 29250867] [DOI: 10.1002/hbm.23923]
Abstract
Huntington's disease (HD) can impair social cognition. This study investigated whether patients with HD exhibit neural differences to healthy controls when they are considering mental and physical states relating to the static expressions of human eyes. Thirty-two patients with HD and 28 age-matched controls were scanned with fMRI during two versions of the Reading the Mind in the Eyes Task: The standard version requiring mental state judgments, and a comparison version requiring judgments about age. HD was associated with behavioral deficits on only the mental state eyes task. Contrasting the two versions of the eyes task (mental state > age judgment) revealed hypoactivation within left middle frontal gyrus and supramarginal gyrus in HD. Subgroup analyses comparing premanifest HD patients to age-matched controls revealed reduced activity in right supramarginal gyrus and increased activity in anterior cingulate during mental state recognition in these patients, while manifest HD was associated with hypoactivity in left insula and left supramarginal gyrus. When controlling for the effects of healthy aging, manifest patients exhibited declining activation within areas including right temporal pole. Our findings provide compelling evidence for a selective impairment of internal emotional status when patients with HD appraise facial features in order to make social judgements. Differential activity in temporal and anterior cingulate cortices may suggest that poor emotion regulation and emotional egocentricity underlie impaired mental state recognition in premanifest patients, while more extensive mental state recognition impairments in manifest disease reflect dysfunction in neural substrates underlying executive functions, and the experience and interpretation of emotion.
Affiliation(s)
- Clare M Eddy
- BSMHFT National Centre for Mental Health, Birmingham, United Kingdom; College of Medical and Dental Sciences, University of Birmingham, Birmingham, United Kingdom
- Hugh E Rickards
- BSMHFT National Centre for Mental Health, Birmingham, United Kingdom; College of Medical and Dental Sciences, University of Birmingham, Birmingham, United Kingdom
- Peter C Hansen
- Birmingham University Imaging Centre and School of Psychology, College of Life and Environmental Sciences, University of Birmingham, Birmingham, United Kingdom
26
Qiu R, Wang H, Fu S. N170 Reveals the Categorical Perception Effect of Emotional Valence. Front Psychol 2017; 8:2056. [PMID: 29225590] [PMCID: PMC5705631] [DOI: 10.3389/fpsyg.2017.02056]
Abstract
As an important attribute of facial expression, emotional valence has been well explored, but its processing mechanisms remain ambiguous. Investigating the categorical perception (CP) of emotional valence might help uncover the objective basis of the subjective dichotomy of emotional valence and identify the stage at which this processing of valence information might occur. A judgment task was used in the current study with stimuli from the within- or between-category condition, in which participants were required to decide whether two presented faces showed the same emotion. The results of the behavioral experiment revealed a significant CP effect of emotional valence, with faster RTs and greater accuracy for the between- than for the within-category stimuli. In the ERP experiment, the N170 (peaking at approximately 150-170 ms) was found to reflect the CP effect of emotional valence, with a larger amplitude for the within- than for the between-category condition. In contrast, the P1 component (peaking at approximately 100-130 ms) was insensitive to the CP effect of emotional valence. These results reveal the existence of the CP of emotional valence and indicate that the N170 is its earliest electrophysiological index. Therefore, the categorization of emotional valence not only has an objective neural basis but occurs at a relatively early stage of processing.
Affiliation(s)
- Ruyi Qiu
- Department of Psychology, School of Social Sciences, Tsinghua University, Beijing, China
- Hailing Wang
- School of Psychology, Shandong Normal University, Jinan, China
- Shimin Fu
- Department of Psychology and Center for Brain and Cognitive Sciences, School of Education, Guangzhou University, Guangzhou, China
27
Smiles as Multipurpose Social Signals. Trends Cogn Sci 2017; 21:864-877. [DOI: 10.1016/j.tics.2017.08.007]
28
Perceiving emotional expressions in others: Activation likelihood estimation meta-analyses of explicit evaluation, passive perception and incidental perception of emotions. Neurosci Biobehav Rev 2016; 71:810-828. [DOI: 10.1016/j.neubiorev.2016.10.020]
29
Calvo MG, Gutiérrez-García A, Del Líbano M. What makes a smiling face look happy? Visual saliency, distinctiveness, and affect. Psychol Res 2016; 82:296-309. [PMID: 27900467 DOI: 10.1007/s00426-016-0829-3]
Abstract
We investigated the relative contribution of (a) perceptual (eyes and mouth visual saliency), (b) conceptual or categorical (eye expression distinctiveness), and (c) affective (rated valence and arousal) factors, and (d) specific morphological facial features (Action Units; AUs), to the recognition of facial happiness. The face stimuli conveyed truly happy expressions with a smiling mouth and happy eyes, or blended expressions with a smile but non-happy eyes (neutral, sad, fearful, disgusted, surprised, or angry). Saliency, distinctiveness, affect, and AUs served as predictors; the probability of judging a face as happy was the criterion. Both for truly happy and for blended expressions, the probability of perceiving happiness increased mainly as a function of positive valence of the facial configuration. In addition, for blended expressions, the probability of being (wrongly) perceived as happy increased as a function of (a) delayed saliency and (b) reduced distinctiveness of the non-happy eyes, and (c) enhanced AU 6 (cheek raiser) or (d) reduced AUs 4, 5, and 9 (brow lowerer, upper lid raiser, and nose wrinkler, respectively). Importantly, the later the eyes become visually salient relative to the smiling mouth, the more likely it is that faces will look happy.
Affiliation(s)
- Manuel G Calvo
- Department of Cognitive Psychology, Universidad de La Laguna, 38205, Tenerife, Spain.
30
Gutiérrez-García A, Calvo MG. Discrimination thresholds for smiles in genuine versus blended facial expressions. Cogent Psychol 2015. [DOI: 10.1080/23311908.2015.1064586]
Affiliation(s)
- Manuel G. Calvo
- Department of Cognitive Psychology, University of La Laguna, Tenerife 38205, Spain
31
Dawel A, Palermo R, O'Kearney R, McKone E. Children can discriminate the authenticity of happy but not sad or fearful facial expressions, and use an immature intensity-only strategy. Front Psychol 2015; 6:462. [PMID: 25999868 PMCID: PMC4419677 DOI: 10.3389/fpsyg.2015.00462]
Abstract
Much is known about development of the ability to label facial expressions of emotion (e.g., as happy or sad), but rather less is known about the emergence of more complex emotional face processing skills. The present study investigates one such advanced skill: the ability to tell if someone is genuinely feeling an emotion or just pretending (i.e., authenticity discrimination). Previous studies have shown that children can discriminate authenticity of happy faces, using expression intensity as an important cue, but have not tested the negative emotions of sadness or fear. Here, children aged 8–12 years (n = 85) and adults (n = 57) viewed pairs of faces in which one face showed a genuinely-felt emotional expression (happy, sad, or scared) and the other face showed a pretend version. For happy faces, children discriminated authenticity above chance, although they performed more poorly than adults. For sad faces, for which our pretend and genuine images were equal in intensity, adults could discriminate authenticity, but children could not. Neither age group could discriminate authenticity of the fear faces. Results also showed that children judged authenticity based on intensity information alone for all three expressions tested, while adults used a combination of intensity and other factor/s. In addition, novel results show that individual differences in empathy (both cognitive and affective) correlated with authenticity discrimination for happy faces in adults, but not children. Overall, our results indicate late maturity of skills needed to accurately determine the authenticity of emotions from facial information alone, and raise questions about how this might affect social interactions in late childhood and the teenage years.
Affiliation(s)
- Amy Dawel
- Research School of Psychology and ARC Centre of Excellence in Cognition and its Disorders, The Australian National University, Canberra, ACT, Australia
- Romina Palermo
- Research School of Psychology, The Australian National University, Canberra, ACT, Australia; ARC Centre of Excellence in Cognition and its Disorders, and School of Psychology, University of Western Australia, Perth, WA, Australia
- Richard O'Kearney
- Research School of Psychology, The Australian National University, Canberra, ACT, Australia
- Elinor McKone
- Research School of Psychology and ARC Centre of Excellence in Cognition and its Disorders, The Australian National University, Canberra, ACT, Australia
32
Olszanowski M, Pochwatko G, Kuklinski K, Scibor-Rylski M, Lewinski P, Ohme RK. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs. Front Psychol 2015; 5:1516. [PMID: 25601846 PMCID: PMC4283518 DOI: 10.3389/fpsyg.2014.01516]
Abstract
Emotional facial expressions play a critical role in theories of emotion and figure prominently in research on almost every aspect of emotion. This article provides a background for a new database of basic emotional expressions. The goal in creating this set was to provide high quality photographs of genuine facial expressions. Thus, after proper training, participants were inclined to express "felt" emotions. The novel approach taken in this study was also used to establish whether a given expression was perceived as intended by untrained judges. The judgment task for perceivers was designed to be sensitive to subtle changes in meaning caused by the way an emotional display was evoked and expressed. Consequently, this allowed us to measure the purity and intensity of emotional displays, which are parameters that validation methods used by other researchers do not capture. The final set comprises those pictures that received the highest recognition marks (e.g., accuracy with intended display) from independent judges, totaling 210 high quality photographs of 30 individuals. Descriptions of the accuracy, intensity, and purity of displayed emotion as well as FACS AU's codes are provided for each picture. Given the unique methodology applied to gathering and validating this set of pictures, it may be a useful tool for research using face stimuli. The Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) is freely accessible to the scientific community for non-commercial use by request at http://www.emotional-face.org.
Affiliation(s)
- Michal Olszanowski
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Krzysztof Kuklinski
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Michal Scibor-Rylski
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Peter Lewinski
- Department of Communication, University of Amsterdam, Amsterdam, Netherlands
- Rafal K. Ohme
- Faculty in Wroclaw, University of Social Sciences and Humanities, Wroclaw, Poland
33
Carr EW, Korb S, Niedenthal PM, Winkielman P. The two sides of spontaneity: Movement onset asymmetries in facial expressions influence social judgments. J Exp Soc Psychol 2014. [DOI: 10.1016/j.jesp.2014.05.008]
34
Korb S, With S, Niedenthal P, Kaiser S, Grandjean D. The perception and mimicry of facial movements predict judgments of smile authenticity. PLoS One 2014; 9:e99194. [PMID: 24918939 PMCID: PMC4053432 DOI: 10.1371/journal.pone.0099194]
Abstract
The mechanisms through which people perceive different types of smiles and judge their authenticity remain unclear. Here, 19 different types of smiles were created based on the Facial Action Coding System (FACS), using highly controlled, dynamic avatar faces. Participants observed short videos of smiles while their facial mimicry was measured with electromyography (EMG) over four facial muscles. Smile authenticity was judged after each trial. Avatar attractiveness was judged once in response to each avatar’s neutral face. Results suggest that, in contrast to most earlier work using static pictures as stimuli, participants relied less on the Duchenne marker (the presence of crow’s feet wrinkles around the eyes) in their judgments of authenticity. Furthermore, mimicry of smiles occurred in the Zygomaticus Major, Orbicularis Oculi, and Corrugator muscles. Consistent with theories of embodied cognition, activity in these muscles predicted authenticity judgments, suggesting that facial mimicry influences the perception of smiles. However, no significant mediation effect of facial mimicry was found. Avatar attractiveness did not predict authenticity judgments or mimicry patterns.
Affiliation(s)
- Sebastian Korb
- Department of Psychology, University of Wisconsin-Madison, Madison, Wisconsin, United States of America
- Stéphane With
- Department of Psychology, University of Geneva, Geneva, Switzerland
- Paula Niedenthal
- Department of Psychology, University of Wisconsin-Madison, Madison, Wisconsin, United States of America
- Susanne Kaiser
- Department of Psychology, University of Geneva, Geneva, Switzerland
- Didier Grandjean
- Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
35
Prenatal hormonal exposure (2D:4D ratio) and strength of lateralisation for processing facial emotion. Pers Individ Dif 2014. [DOI: 10.1016/j.paid.2013.09.031]
36
Liao Y, Miao D, Huan Y, Yin H, Xi Y, Liu X. Altered regional homogeneity with short-term simulated microgravity and its relationship with changed performance in mental transformation. PLoS One 2013; 8:e64931. [PMID: 23755162 PMCID: PMC3670926 DOI: 10.1371/journal.pone.0064931]
Abstract
To gain further insight into how microgravity changes performance in mental transformation, we examined mental transformation performance and its relationship with altered regional homogeneity (ReHo) in the resting-state brain using a simulated weightlessness model. Twelve male subjects aged 24 to 31 underwent a resting-state fMRI scan and a mental transformation test both in the normal condition and immediately after 72 hours of -6° head-down tilt (HDT). A paired-sample t-test was used to assess differences in behavioral performance and brain activity between the two conditions. Compared with the normal condition, subjects showed declined performance in mental transformation after short-term simulated microgravity. Meanwhile, decreased ReHo was found in the right inferior frontal gyrus (IFG) and left inferior parietal lobule (IPL) after 72 hours of -6° HDT, while increased ReHo was found in the bilateral medial frontal gyrus (MFG) and left superior frontal gyrus (SFG) (P<0.05, corrected). In particular, there was a significant correlation between ReHo values in the left IPL and the velocity index of mental transformation. Our findings indicate that gravity change may disrupt the function of the right IFG and left IPL in the resting state, among which the functional change in the left IPL may contribute to the changed mental transformation ability. In addition, the enhanced activity of the bilateral MFG and decreased activity of the right IFG found in the current study may reflect a compensatory effect on the inhibitory control process.
Affiliation(s)
- Yang Liao
- Department of Psychology, Fourth Military Medical University, Xi'an, Shaanxi, China
- Danmin Miao
- Department of Psychology, Fourth Military Medical University, Xi'an, Shaanxi, China
- Yi Huan
- Department of Radiology, Xijing Hospital, Fourth Military Medical University, Xi'an, Shaanxi, China
- Hong Yin
- Department of Radiology, Xijing Hospital, Fourth Military Medical University, Xi'an, Shaanxi, China
- Yibin Xi
- Department of Radiology, Xijing Hospital, Fourth Military Medical University, Xi'an, Shaanxi, China
- Xufeng Liu
- Department of Psychology, Fourth Military Medical University, Xi'an, Shaanxi, China