1
Mauersberger H, Blaison C, Hess U. Task-irrelevant emotional expressions are not mimicked, but may modulate the mimicry of task-relevant emotional expressions. Front Psychol 2025; 15:1491832. PMID: 39839930; PMCID: PMC11748183; DOI: 10.3389/fpsyg.2024.1491832.
Abstract
Emotional mimicry, the imitation of others' emotions, is an empathic response that helps to navigate social interactions. Mimicry is absent when participants' task does not involve engaging with the expressers' emotions. This may be because task-irrelevant faces (i.e., faces that participants were instructed to ignore) are not processed. To assess whether processed task-irrelevant faces are also not mimicked, we conducted three studies [Study 1: N = 74 participants (27 men; Mage = 26.9 years); Study 2: N = 53 participants (20 men; Mage = 25.8 years); Study 3: N = 51 participants (7 men; Mage = 26.8 years)] using an affective priming paradigm in which one face was task-relevant and one was to be ignored, as a framework to explore the impact of disregarded yet still perceptually processed faces on mimicry. We found that even though both faces were processed, only task-relevant faces were mimicked. Hence, our studies suggest that emotional mimicry depends not only on emotional processing as such but also on the way participants prioritize one piece of information over another. Further, task-irrelevant faces interfered with the mimicry of task-relevant faces. This suggests that even though incongruent task-irrelevant faces do not elicit an empathic (mimicry) response, they may still provide a context that can change the meaning of task-relevant faces and thus impact the mimicry response.
Affiliation(s)
- Heidi Mauersberger
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Ursula Hess
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
2
Becker C, Conduit R, Chouinard PA, Laycock R. Can deepfakes be used to study emotion perception? A comparison of dynamic face stimuli. Behav Res Methods 2024; 56:7674-7690. PMID: 38834812; PMCID: PMC11362322; DOI: 10.3758/s13428-024-02443-y.
Abstract
Video recordings accurately capture facial expression movements; however, they are difficult for face perception researchers to standardise and manipulate. For this reason, dynamic morphs of photographs are often used, despite their lack of naturalistic facial motion. This study aimed to investigate how humans perceive emotions from faces using real videos and two different approaches to artificially generating dynamic expressions - dynamic morphs, and AI-synthesised deepfakes. Our participants perceived dynamic morphed expressions as less intense when compared with videos (all emotions) and deepfakes (fearful, happy, sad). Videos and deepfakes were perceived similarly. Additionally, they perceived morphed happiness and sadness, but not morphed anger or fear, as less genuine than other formats. Our findings support previous research indicating that social responses to morphed emotions are not representative of those to video recordings. The findings also suggest that deepfakes may offer a more suitable standardized stimulus type compared to morphs. Additionally, qualitative data were collected from participants and analysed using ChatGPT, a large language model. ChatGPT successfully identified themes in the data consistent with those identified by an independent human researcher. According to this analysis, our participants perceived dynamic morphs as less natural compared with videos and deepfakes. That participants perceived deepfakes and videos similarly suggests that deepfakes effectively replicate natural facial movements, making them a promising alternative for face perception research. The study contributes to the growing body of research exploring the usefulness of generative artificial intelligence for advancing the study of human perception.
3
Mirabella G, Grassi M, Bernardis P. The role of task relevance in saccadic responses to facial expressions. Ann N Y Acad Sci 2024; 1540:324-337. PMID: 39316839; DOI: 10.1111/nyas.15221.
Abstract
Recent research on healthy individuals suggests that the valence of emotional stimuli influences behavioral reactions only when relevant to ongoing tasks, as such stimuli impact reaching arm movements and gait only when the emotional content cued the responses. However, it has been suggested that emotional expressions elicit automatic gaze shifting, indicating that oculomotor behavior might differ from that of the upper and lower limbs. To investigate this, 40 participants underwent two Go/No-go tasks, an emotion discrimination task (EDT) and a gender discrimination task (GDT). In the EDT, participants had to perform a saccade to a peripheral target upon the presentation of angry or happy faces and refrain from moving with neutral ones. In the GDT, the same images were shown, but participants responded based on the posers' gender. Participants displayed two behavioral strategies: a single saccade to the target (92.7%) or two saccades (7.3%), with the first directed at a task-salient feature, that is, the mouth in the EDT and the nose-eyes regions in the GDT. In both cases, the valence of the facial expression impacted the saccades only when relevant to the response. Such evidence indicates that the same principles govern the interplay between emotional stimuli and motor reactions regardless of the effector employed.
Affiliation(s)
- Giovanni Mirabella
- Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- IRCCS Neuromed, Pozzilli, Italy
- Michele Grassi
- Department of Life Sciences, University of Trieste, Trieste, Italy
- Paolo Bernardis
- Department of Life Sciences, University of Trieste, Trieste, Italy
4
Lacroix A, Harquel S, Barbosa LS, Kovarski K, Garrido MI, Vercueil L, Kauffmann L, Dutheil F, Gomot M, Mermillod M. Reduced spatial frequency differentiation and sex-related specificities in fearful face detection in autism: Insights from EEG and the predictive brain model. Autism Res 2024; 17:1778-1795. PMID: 39092565; DOI: 10.1002/aur.3209.
Abstract
Face processing relies on predictive processes driven by low spatial frequencies (LSF) that convey coarse information prior to fine information conveyed by high spatial frequencies. However, autistic individuals might have atypical predictive processes, contributing to facial processing difficulties. This may be more normalized in autistic females, who often exhibit better socio-communicational abilities than males. We hypothesized that autistic females would display a more typical coarse-to-fine processing for socio-emotional stimuli compared to autistic males. To test this hypothesis, we asked adult participants (44 autistic, 51 non-autistic) to detect fearful faces among neutral faces, filtered in two orders: from coarse-to-fine (CtF) and from fine-to-coarse (FtC). Results show lower d' values and longer reaction times for fearful detection in autism compared to non-autistic (NA) individuals, regardless of the filtering order. Both groups presented shorter P100 latency after CtF compared to FtC, and larger amplitude for N170 after FtC compared to CtF. However, autistic participants presented a reduced difference in source activity between CtF and FtC in the fusiform. There was also a more spatially spread activation pattern in autistic females compared to NA females. Finally, females had faster P100 and N170 latencies, as well as larger occipital activation for FtC sequences than males, irrespective of the group. Overall, the results do not suggest impaired predictive processes from LSF in autism despite behavioral differences in fear detection. However, they do indicate reduced brain modulation by spatial frequency in autism. In addition, the findings highlight sex differences that warrant consideration in understanding autistic females.
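As a note on the sensitivity measure reported above: d' is the standard signal detection index contrasting hit and false-alarm rates. A minimal sketch of how it is typically computed is given below; the function and the example counts are illustrative assumptions, not data or code from the study.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell) keeps the z-transform
    finite when a rate would otherwise be exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for one participant detecting fearful among neutral faces
print(d_prime(hits=38, misses=10, false_alarms=6, correct_rejections=42))
```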
Affiliation(s)
- Adeline Lacroix
- Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- Sylvain Harquel
- Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- Defitech Chair in Clinical Neuroengineering, Center for Neuroprosthetics and Brain Mind Institute, EPFL, Geneva, Switzerland
- Leonardo S Barbosa
- Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- Fralin Biomedical Research Institute at VTC, Virginia Tech, Roanoke, Virginia, USA
- Klara Kovarski
- Sorbonne Université, Faculté des Lettres, INSPE, Paris, France
- LaPsyDÉ, Université Paris-Cité, CNRS, Paris, France
- Marta I Garrido
- Cognitive Neuroscience and Computational Psychiatry Lab, Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, Victoria, Australia
- Graeme Clark Institute for Biomedical Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Laurent Vercueil
- Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- Louise Kauffmann
- Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- Frédéric Dutheil
- Université Clermont Auvergne, CNRS, LaPSCo, CHU Clermont-Ferrand, Clermont-Ferrand, France
- Marie Gomot
- Université de Tours, INSERM, Imaging Brain and Neuropsychiatry iBraiN U1253, Tours, France
- Martial Mermillod
- Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
5
Nowling D, Crum KI, Joseph J. Sex differences in development of functional connections in the face processing network. J Neuroimaging 2024; 34:280-290. PMID: 38169075; PMCID: PMC10939922; DOI: 10.1111/jon.13185.
Abstract
BACKGROUND AND PURPOSE: Understanding sex differences in typical development of the face processing network is important for elucidating disruptions during atypical development in sex-linked developmental disorders like autism spectrum disorder. Based on prior sex difference studies in other cognitive domains, this study examined whether females show increased integration of core and extended face regions with age for face viewing, while males show increased segregation. METHODS: This study used a cross-sectional design with typically developing children and adults (n = 133) and a functional MRI face localizer task. Psychophysiological interaction (PPI) analysis examined functional connectivity between canonical and extended face processing network regions with age, with greater segregation indexed by decreased core-extended region connectivity with age and greater integration indexed by increased core-extended region connectivity with age. RESULTS: PPI analysis confirmed increased segregation for males: coupling of the right fusiform face area (FFA) to the right inferior frontal gyrus (IFG) opercular region when viewing faces, and to the left amygdala when viewing objects, decreased with age. Females showed increased integration with age (increased coupling of the right FFA to the right IFG opercular region and of the right occipital face area [OFA] to the right IFG orbital region when viewing faces and objects, respectively) as well as increased segregation (decreased coupling with age of the right OFA with the IFG opercular region when viewing faces). CONCLUSIONS: Development of core and extended face processing network connectivity follows sexually dimorphic paths. These differential changes mostly occur across childhood and adolescence, with males showing segregation changes and females showing both segregation and integration changes in connectivity.
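Psychophysiological interaction (PPI) analysis, named in the abstract above, tests whether coupling between a seed region and a target region changes with task condition. The sketch below is a minimal, simplified illustration of the core regression (ignoring HRF convolution/deconvolution and nuisance regressors); all variable names and data are made up for illustration and are not the study's analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 200

# Hypothetical inputs: seed (e.g., right FFA) time series, a task regressor
# (faces = 1, objects = -1), and a target-region (e.g., right IFG) time series.
seed = rng.standard_normal(n_scans)
task = np.where(rng.random(n_scans) > 0.5, 1.0, -1.0)
target = 0.4 * seed * (task > 0) + rng.standard_normal(n_scans)

# PPI term = element-wise product of the (mean-centred) seed and task regressors.
ppi = (seed - seed.mean()) * task

# GLM: target ~ seed + task + PPI interaction + intercept.
X = np.column_stack([seed, task, ppi, np.ones(n_scans)])
betas, *_ = np.linalg.lstsq(X, target, rcond=None)
print("PPI (interaction) beta:", betas[2])  # nonzero => task-modulated connectivity
```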
Affiliation(s)
- Duncan Nowling
- Department of Neuroscience, Medical University of South Carolina, Charleston, SC
- Kathleen I. Crum
- Department of Neuroscience, Medical University of South Carolina, Charleston, SC
- Department of Psychiatry, Indiana University School of Medicine, Indianapolis, IN
- Jane Joseph
- Department of Neuroscience, Medical University of South Carolina, Charleston, SC
6
Zou Z, Mubin O, Alnajjar F, Ali L. A pilot study of measuring emotional response and perception of LLM-generated questionnaire and human-generated questionnaires. Sci Rep 2024; 14:2781. PMID: 38308014; PMCID: PMC10837442; DOI: 10.1038/s41598-024-53255-1.
Abstract
The advent of ChatGPT has sparked a heated debate surrounding natural language processing technology and AI-powered chatbots, leading to extensive research and applications across various disciplines. This pilot study aims to investigate the impact of ChatGPT on users' experiences by administering two distinct questionnaires, one generated by humans and the other by ChatGPT, along with an Emotion Detecting Model. A total of 14 participants (7 female and 7 male) aged between 18 and 35 years were recruited, resulting in the collection of 8672 ChatGPT-associated data points and 8797 human-associated data points. Data analysis was conducted using Analysis of Variance (ANOVA). The results indicate that the utilization of ChatGPT enhances participants' happiness levels and reduces their sadness levels. While no significant gender influences were observed, variations were found for specific emotions. It is important to note that the limited sample size, narrow age range, and potential cultural impacts restrict the generalizability of the findings to a broader population. Future research directions should explore the impact of incorporating additional language models or chatbots on user emotions, particularly among specific age groups such as older individuals and teenagers. As one of the pioneering works evaluating human perception of ChatGPT text and communication, it is noteworthy that ChatGPT received positive evaluations and demonstrated effectiveness in generating extensive questionnaires.
Affiliation(s)
- Zhao Zou
- School of Computer, Data and Mathematical Science, Western Sydney University, Sydney, Australia
- Omar Mubin
- School of Computer, Data and Mathematical Science, Western Sydney University, Sydney, Australia
- Fady Alnajjar
- College of Information Technology, United Arab Emirates University, Al Ain, United Arab Emirates
- Luqman Ali
- College of Information Technology, United Arab Emirates University, Al Ain, United Arab Emirates
7
Penney D, Pruessner M, Malla AK, Joober R, Lepage M. Severe childhood trauma and emotion recognition in males and females with first-episode psychosis. Early Interv Psychiatry 2023; 17:149-158. PMID: 35384318; DOI: 10.1111/eip.13299.
Abstract
AIM: Childhood trauma increases social functioning deficits in first-episode psychosis (FEP) and is negatively associated with higher-order social cognitive processes such as emotion recognition (ER). We investigated the relationship between childhood trauma severity and ER capacity, and explored sex as a potential factor given sex differences in childhood trauma exposure. METHODS: Eighty-three FEP participants (52 males, 31 females) and 69 nonclinical controls (49 males, 20 females) completed the CogState Research Battery. FEP participants completed the Childhood Trauma Questionnaire. A sex × group (FEP, controls) ANOVA examined ER differences and was followed by two-way ANCOVAs investigating sex and childhood trauma severity (none, low, moderate, and severe) on ER and global cognition in FEP. RESULTS: FEP participants had significantly lower ER scores than controls (p = .035). No significant sex × group interaction emerged for ER, F(3, 147) = 0.496, p = .438, 95% CI [-1.20, 0.57], partial η2 = .003. When controlling for age at psychosis onset, a significant interaction emerged in FEP between sex and childhood trauma severity, F(3, 71) = 3.173, p = .029, partial η2 = .118. Males (n = 9) with severe trauma showed ER deficits compared to females (n = 8) (p = .011, 95% CI [-2.90, -0.39]). No significant interaction was observed for global cognition, F(3, 69) = 2.410, p = .074, partial η2 = .095. CONCLUSIONS: These preliminary findings provide support for longitudinal investigations examining whether trauma severity differentially impacts ER in males and females with FEP.
Affiliation(s)
- Danielle Penney
- Douglas Mental Health University Institute, Montréal, Canada
- Department of Psychology, Université du Québec à Montréal, Montréal, Canada
- Marita Pruessner
- Douglas Mental Health University Institute, Montréal, Canada
- Department of Clinical Psychology, University of Konstanz, Konstanz, Germany
- Department of Psychiatry, McGill University, Montréal, Canada
- Ashok K Malla
- Douglas Mental Health University Institute, Montréal, Canada
- Department of Psychiatry, McGill University, Montréal, Canada
- Ridha Joober
- Douglas Mental Health University Institute, Montréal, Canada
- Department of Psychiatry, McGill University, Montréal, Canada
- Martin Lepage
- Douglas Mental Health University Institute, Montréal, Canada
- Department of Psychiatry, McGill University, Montréal, Canada
8
Emotional prosody recognition enhances and progressively complexifies from childhood to adolescence. Sci Rep 2022; 12:17144. PMID: 36229474; PMCID: PMC9561714; DOI: 10.1038/s41598-022-21554-0.
Abstract
Emotional prosody results from the dynamic variation of language's acoustic non-verbal aspects that allow people to convey and recognize emotions. The goal of this paper is to understand how this recognition develops from childhood to adolescence. We also aim to investigate how the ability to perceive multiple emotions in the voice matures over time. We tested 133 children and adolescents, aged between 6 and 17 years old, exposed to 4 kinds of linguistically meaningless emotional (anger, fear, happiness, and sadness) and neutral stimuli. Participants were asked to judge the type and intensity of perceived emotion on continuous scales, without a forced choice task. As predicted, a general linear mixed model analysis revealed a significant interaction effect between age and emotion. The ability to recognize emotions significantly increased with age for both emotional and neutral vocalizations. Girls recognized anger better than boys, who instead confused fear with neutral prosody more than girls. Across all ages, only marginally significant differences were found between anger, happiness, and neutral compared to sadness, which was more difficult to recognize. Finally, as age increased, participants were significantly more likely to attribute multiple emotions to emotional prosody, showing that the representation of emotional content becomes increasingly complex. The ability to identify basic emotions in prosody from linguistically meaningless stimuli develops from childhood to adolescence. Interestingly, this maturation was not only evidenced in the accuracy of emotion detection, but also in a complexification of emotion attribution in prosody.
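The general linear mixed model mentioned above can be illustrated with a random intercept per participant and an age × emotion interaction as fixed effects. The sketch below uses simulated data and hypothetical column names; it is an assumption-laden illustration, not the authors' analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
emotions = ["anger", "fear", "happiness", "sadness", "neutral"]

# Simulated long-format data: one recognition-accuracy score per participant x emotion.
df = pd.DataFrame([
    {"participant": p, "age": age, "emotion": emo,
     "accuracy": 0.4 + 0.02 * age + (0.05 if emo != "sadness" else 0.0)
                 + rng.normal(0, 0.1)}
    for p, age in enumerate(rng.integers(6, 18, size=60))
    for emo in emotions
])

# Random intercept per participant; fixed effects for age, emotion, and their interaction.
model = smf.mixedlm("accuracy ~ age * emotion", data=df, groups=df["participant"])
print(model.fit().summary())
```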
9
The Effect of Mouth-Opening on Recognition of Facial Expressions in the NimStim Set: An Evaluation from Chinese College Students. J Nonverbal Behav 2022. DOI: 10.1007/s10919-022-00417-2.
10
Sarauskyte L, Monciunskaite R, Griksiene R. The role of sex and emotion on emotion perception in artificial faces: An ERP study. Brain Cogn 2022; 159:105860. PMID: 35339916; DOI: 10.1016/j.bandc.2022.105860.
Abstract
Sex has a significant impact on the perception of emotional expressions. However, it remains unclear whether sex influences the perception of emotions in artificial faces, which are becoming popular in emotion research. We used an emotion recognition task with FaceGen faces portraying six basic emotions to investigate the effect of sex and emotion on behavioural and electrophysiological parameters. Seventy-one participants performed the task while EEG was recorded. The recognition of sadness was the poorest; however, females recognized sadness better than males. ERP results indicated that fear, disgust, and anger evoked higher amplitudes of the late positive potential over the left parietal region compared to the neutral expression. Females demonstrated higher values of global field power than males. The interaction between sex and emotion on ERPs was not significant. The results of our study may be valuable for future therapies and research, as they emphasize possibly distinct processing of emotions and potential sex differences in the recognition of emotional expressions in FaceGen faces.
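Global field power (GFP), which the abstract reports as higher in females, is conventionally computed as the standard deviation across electrodes at each time point of the evoked response. A minimal sketch on a simulated EEG array is shown below; the array shapes and values are illustrative only, not the study's data or pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)
n_epochs, n_channels, n_times = 40, 64, 500

# Simulated epoched EEG: epochs x channels x time samples (illustrative only).
eeg = rng.standard_normal((n_epochs, n_channels, n_times))

# Evoked response = average over epochs; GFP = spatial SD across channels per time point.
evoked = eeg.mean(axis=0)        # channels x time
gfp = evoked.std(axis=0)         # one GFP value per time sample
print("Peak GFP:", gfp.max(), "at sample", int(gfp.argmax()))
```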
Affiliation(s)
- Livija Sarauskyte
- Vilnius University, Life Sciences Center, Institute of Biosciences, Vilnius, Lithuania.
- Rasa Monciunskaite
- Vilnius University, Life Sciences Center, Institute of Biosciences, Vilnius, Lithuania
- Ramune Griksiene
- Vilnius University, Life Sciences Center, Institute of Biosciences, Vilnius, Lithuania
11
Zhang Y, Li D, Yang T, Chen C, Li H, Zhu C. Characteristics of emotional gaze on threatening faces in children with autism spectrum disorders. Front Psychiatry 2022; 13:920821. PMID: 36072450; PMCID: PMC9441573; DOI: 10.3389/fpsyt.2022.920821.
Abstract
Most evidence suggests that individuals with autism spectrum disorder (ASD) show gaze avoidance when looking at the eyes compared to typically developing (TD) individuals. Children with ASD show heightened fear when receiving threatening stimuli, resulting in a reduced duration of eye contact. Few studies have explored the gaze characteristics of children with ASD by dividing emotional faces into threatening and non-threatening pairs. In addition, although dynamic videos are more helpful in understanding the gaze characteristics of children with ASD, the experimental stimuli in some of the previous studies were still static emotional pictures. We explored the viewing of dynamic threatening and non-threatening faces by children with ASD in different areas of interest (AOIs). In this study, 6- to 10-year-old children with and without ASD viewed faces with threatening (fearful and angry) and non-threatening (sad and happy) expressions, respectively, with their eye movements recorded. The results showed that when confronted with threatening faces, children with ASD, but not TD children, showed substantial eye avoidance, in particular non-specific reductions in fixation time on the mouth and significantly less time gazing at the mouth across all emotions, which was not observed for non-threatening faces. No correlations were found between symptom severity and gaze characteristics for the eyes and mouth in children with ASD. These results further the understanding of the gaze characteristics of children with ASD for threatening and non-threatening faces and may provide additional evidence to guide improvements in their social interaction.
Affiliation(s)
- Yifan Zhang
- The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
- Dandan Li
- The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
- Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, China
- Department of Neurology, First Affiliated Hospital, Anhui Medical University, Hefei, China
- Tingting Yang
- The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
- Chuanao Chen
- Anhui Province Hefei Kang Hua Rehabilitation Hospital, Hefei, China
- Hong Li
- Anhui Hospital Affiliated to the Pediatric Hospital of Fudan University, Hefei, China
- Chunyan Zhu
- The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
- Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, China
12
Abstract
Many male traits are well explained by sexual selection theory as adaptations to mating competition and mate choice, whereas no unifying theory explains traits expressed more in females. Anne Campbell's "staying alive" theory proposed that human females produce stronger self-protective reactions than males to aggressive threats because self-protection tends to have higher fitness value for females than males. We examined whether Campbell's theory has more general applicability by considering whether human females respond with greater self-protectiveness than males to other threats beyond aggression. We searched the literature for physiological, behavioral, and emotional responses to major physical and social threats, and found consistent support for females' responding with greater self-protectiveness than males. Females mount stronger immune responses to many pathogens; experience a lower threshold to detect, and lesser tolerance of, pain; awaken more frequently at night; express greater concern about physically dangerous stimuli; exert more effort to avoid social conflicts; exhibit a personality style more focused on life's dangers; react to threats with greater fear, disgust and sadness; and develop more threat-based clinical conditions than males. Our findings suggest that, in relation to threat, human females have relatively heightened protective reactions compared to males. The pervasiveness of this result across multiple domains suggests that general mechanisms might exist underlying females' unique adaptations. An understanding of such processes would enhance knowledge of female health and well-being.
13
Tommasi V, Prete G, Tommasi L. The role of low spatial frequencies in facial emotion processing: A study on anorthoscopic perception. Visual Cognition 2021. DOI: 10.1080/13506285.2021.1966150.
Affiliation(s)
- Vincenza Tommasi
- Department of Neuroscience, Imaging and Clinical Sciences, “G. d’Annunzio” University of Chieti-Pescara, Chieti, Italy
- Giulia Prete
- Department of Psychological, Health and Territorial Sciences, “G. d’Annunzio” University of Chieti-Pescara, Chieti, Italy
- Luca Tommasi
- Department of Psychological, Health and Territorial Sciences, “G. d’Annunzio” University of Chieti-Pescara, Chieti, Italy
14
Lin Y, Ding H, Zhang Y. Gender Differences in Identifying Facial, Prosodic, and Semantic Emotions Show Category- and Channel-Specific Effects Mediated by Encoder's Gender. J Speech Lang Hear Res 2021; 64:2941-2955. PMID: 34310173; DOI: 10.1044/2021_jslhr-20-00553.
Abstract
Purpose: The nature of gender differences in emotion processing has remained unclear due to the discrepancies in existing literature. This study examined the modulatory effects of emotion categories and communication channels on gender differences in verbal and nonverbal emotion perception. Method: Eighty-eight participants (43 females and 45 males) were asked to identify three basic emotions (i.e., happiness, sadness, and anger) and neutrality encoded by female or male actors from verbal (i.e., semantic) or nonverbal (i.e., facial and prosodic) channels. Results: While women showed an overall advantage in performance, their superiority was dependent on specific types of emotion and channel. Specifically, women outperformed men for two basic emotions (happiness and sadness) in the nonverbal channels and only for the anger category with verbal content. Conversely, men did better for the anger category in the nonverbal channels and for the other two emotions (happiness and sadness) in verbal content. There was an emotion- and channel-specific interaction effect between the two types of gender differences, with male subjects showing higher sensitivity to sad faces and prosody portrayed by the female encoders. Conclusion: These findings reveal explicit emotion processing as a highly dynamic complex process with significant gender differences tied to specific emotion categories and communication channels. Supplemental Material: https://doi.org/10.23641/asha.15032583
Affiliation(s)
- Yi Lin
- Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai, China
- Hongwei Ding
- Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai, China
- Yang Zhang
- Department of Speech-Language-Hearing Sciences & Center for Neurobehavioral Development, University of Minnesota Twin Cities, Minneapolis
15
Cavieres A, Maldonado R, Bland A, Elliott R. Relationship Between Gender and Performance on Emotion Perception Tasks in a Latino Population. Int J Psychol Res (Medellin) 2021; 14:106-114. PMID: 34306583; PMCID: PMC8297575; DOI: 10.21500/20112084.5032.
Abstract
Basic emotions are universally recognized, although differences across cultures and between genders have been described. We report results from two emotion recognition tasks in a sample of healthy adults from Chile. Methods: 192 volunteers (mean age 31.58 years, SD 8.36; 106 women) completed the Emotional Recognition Task, in which they were asked to identify a briefly displayed emotion, and the Emotional Intensity Morphing Task, in which they viewed faces with increasing or decreasing emotional intensity and indicated when they either detected or no longer detected the emotion. Results: All emotions were recognized at above-chance levels. The only sex differences were that men performed better at identifying anger (p = .0485) and responded more slowly to fear (p = .0057) than women. Discussion: These findings are consistent with some, though not all, prior literature on emotion perception. Crucially, we report data on emotion perception in a healthy adult Latino population for the first time, which contributes to the emerging literature on cultural differences in affective processing.
Affiliation(s)
- Alvaro Cavieres
- Departamento de Psiquiatría, Universidad de Valparaíso, Chile
- Rocío Maldonado
- Departamento de Psiquiatría, Universidad de Valparaíso, Chile
- Amy Bland
- Department of Psychology, Manchester Metropolitan University, UK
- Rebecca Elliott
- Neuroscience and Psychiatry Unit, Division of Neuroscience and Experimental Psychology, University of Manchester, UK
16
Smith KE, Norman GJ, Decety J. Increases in loneliness during medical school are associated with increases in individuals' likelihood of mislabeling emotions as negative. Emotion 2020; 22:740-750. PMID: 32597671; DOI: 10.1037/emo0000773.
Abstract
Expressions of emotion represent an important and unique source of information about the states of others. Being able to effectively understand expressions of emotions to make inferences about others' internal mental states and use these inferences to guide decision-making and behavior is critical to navigating social relationships. Loneliness, the perception that one lacks social connection, has important functional consequences for how individuals attend to signals of emotions in others. However, it is less clear whether loneliness changes how individuals recognize emotions in others. In medical practitioners, being able to accurately recognize emotional cues from patients is critical to effectively diagnosing and reacting with care to those patients. The current study examines the relationship between changes in loneliness during medical school and students' recognition of emotion in others. Measures of loneliness and emotion recognition were collected from 122 medical students during their first 3 years of medical school at the beginning and end of each academic year. Changes in loneliness were related to changes in emotion detection, with increases in loneliness being associated with decreases in the probability of accurately discriminating sad and angry faces from other expressions, decreases in the probability of mislabeling emotion expressions as happy, and increases in the probability of mislabeling other emotional expressions as pained and angry. This study suggests that changes in loneliness during medical school are associated with increases in students' labeling emotional expressions as negative, possibly by shifting attention to cues of negative emotion and away from cues of positive emotion. (PsycInfo Database Record (c) 2020 APA, all rights reserved).
Affiliation(s)
- Karen E Smith
- Department of Psychology, Integrative Neuroscience Area, University of Chicago
- Jean Decety
- Department of Psychology, University of Chicago
17
Lan X, Moscardino U. Sensitivity to facial emotional expressions and peer relationship problems in Chinese rural-to-urban migrant early adolescents: An exploratory study. Social Development 2020. DOI: 10.1111/sode.12456.
Affiliation(s)
- Xiaoyu Lan
- Department of Developmental Psychology and Socialization, University of Padova, Padova, Italy
- Ughetta Moscardino
- Department of Developmental Psychology and Socialization, University of Padova, Padova, Italy
18
Hyniewska S, Sato W, Kaiser S, Pelachaud C. Naturalistic Emotion Decoding From Facial Action Sets. Front Psychol 2019; 9:2678. PMID: 30713515; PMCID: PMC6345715; DOI: 10.3389/fpsyg.2018.02678.
Abstract
Researchers have theoretically proposed that humans decode other individuals' emotions or elementary cognitive appraisals from particular sets of facial action units (AUs). However, only a few empirical studies have systematically tested the relationships between the decoding of emotions/appraisals and sets of AUs, and the results are mixed. Furthermore, the previous studies relied on facial expressions of actors, and no study used spontaneous and dynamic facial expressions in naturalistic settings. We investigated this issue using video recordings of facial expressions filmed unobtrusively in a real-life emotional situation, specifically loss of luggage at an airport. The AUs observed in the videos were annotated using the Facial Action Coding System. Male participants (n = 98) were asked to decode emotions (e.g., anger) and appraisals (e.g., suddenness) from the facial expressions. We explored the relationships between emotion/appraisal decoding and AUs using stepwise multiple regression analyses. The results revealed that all the rated emotions and appraisals were associated with sets of AUs. The profiles of the regression equations showed AUs both consistent and inconsistent with those in theoretical proposals. The results suggest that (1) the decoding of emotions and appraisals in facial expressions is implemented through the perception of sets of AUs, and (2) the profiles of such AU sets can differ from previous theoretical proposals.
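Stepwise multiple regression, as used above to relate each rated emotion or appraisal to a subset of AUs, can be illustrated with a simple forward-selection loop. The sketch below uses made-up AU intensities and ratings rather than the study's annotations, and a basic p-value entry criterion as an assumption.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 120
aus = [f"AU{i}" for i in (1, 4, 5, 7, 12, 15, 20, 25)]

# Hypothetical data: AU intensities (predictors) and an anger rating (outcome).
X = pd.DataFrame(rng.random((n, len(aus))), columns=aus)
y = 2.0 * X["AU4"] + 1.5 * X["AU7"] + rng.normal(0, 0.3, n)

def forward_select(X, y, alpha=0.05):
    """Greedy forward selection: add the predictor with the smallest p-value < alpha."""
    selected = []
    while True:
        remaining = [c for c in X.columns if c not in selected]
        if not remaining:
            return selected
        pvals = {c: sm.OLS(y, sm.add_constant(X[selected + [c]])).fit().pvalues[c]
                 for c in remaining}
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            return selected
        selected.append(best)

print(forward_select(X, y))  # expected to pick up AU4 and AU7 in this toy example
```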
Affiliation(s)
- Sylwia Hyniewska
- Kokoro Research Center, Kyoto University, Kyoto, Japan
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Human Behaviour Analysis Laboratory, Department of Psychology, University of Geneva, Geneva, Switzerland
- Wataru Sato
- Kokoro Research Center, Kyoto University, Kyoto, Japan
- Susanne Kaiser
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Human Behaviour Analysis Laboratory, Department of Psychology, University of Geneva, Geneva, Switzerland
- Catherine Pelachaud
- Institut des Systèmes Intelligents et de Robotique (ISIR), Université Pierre et Marie Curie/Centre National de la Recherche Scientifique (CNRS), Paris, France
19
Donadon MF, Martin-Santos R, Osório FL. Baby Faces: Development and psychometric study of a stimuli set based on babies' emotions. J Neurosci Methods 2018; 311:178-185. PMID: 30347221; DOI: 10.1016/j.jneumeth.2018.10.021.
Abstract
BACKGROUND: Sets of stimuli based on babies' facial emotions provide a good instrument for assessing the recognition of facial emotion (RFE) in clinical and nonclinical groups. However, the specific properties of such stimuli have not been widely explored and validated in previous studies. NEW METHOD: We present a new set of facial stimuli from infants aged 6-12 months, of both sexes and different races, representing five basic emotions. We also present the psychometric properties of validity/reliability for each stimulus and assess whether the sociodemographic characteristics of the stimuli and the subjects affect RFE. RESULTS: The stimuli were obtained using a standardized protocol of activities to elicit emotions, and 72 stimuli were developed. A total of 119 subjects from the community were selected for the psychometric analysis of the stimuli. The set produced indicators of validity (mean 62.5%) and reliability. Stimuli were evaluated using the Rasch model, and 15 stimuli showed indicators of unpredictability and unmodeled residuals. The difficulty index of each stimulus was calculated, showing that the set was normally distributed. COMPARISON WITH EXISTING METHOD: Previously published sets are limited in terms of racial diversity, standardization of emotion elicitation, stimulus extraction procedures, and psychometric evidence. CONCLUSIONS: The findings reinforce Differential Emotion Theory regarding the expression of basic emotions in infants and show an effect of education level on emotion recognition, in contrast to other sociocultural characteristics (sex and race). This set is freely accessible by email request.
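The Rasch model used above to evaluate the stimuli expresses the probability of correct recognition as a function of a person ability and an item (stimulus) difficulty on a common logit scale. The sketch below shows the item characteristic function with made-up ability and difficulty values; it is an illustration, not the authors' estimation procedure.

```python
import numpy as np

def rasch_probability(ability, difficulty):
    """Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + np.exp(-(ability - difficulty)))

# Hypothetical values: one rater's ability vs. three baby-face stimuli of
# increasing difficulty (logit scale); harder items yield lower probabilities.
for b in (-1.0, 0.0, 1.5):
    print(f"difficulty {b:+.1f}: P(correct) = {rasch_probability(0.5, b):.2f}")
```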
Affiliation(s)
- Mariana Fortunata Donadon
- Department of Neuroscience and Behavior, Medical School of Ribeirão Preto, University of São Paulo, Brazil.
- Rocio Martin-Santos
- Hospital Clínic, IDIBAPS, CIBERSAM, Spain
- Department of Medicine, University of Barcelona, Barcelona, Spain
- National Institute for Science and Technology (INCT-TM, CNPq, Brazil), Brazil
- Flávia L Osório
- Department of Neuroscience and Behavior, Medical School of Ribeirão Preto, University of São Paulo, Brazil
- Hospital Clínic, IDIBAPS, CIBERSAM, Spain
- Department of Medicine, University of Barcelona, Barcelona, Spain
20
Lausen A, Schacht A. Gender Differences in the Recognition of Vocal Emotions. Front Psychol 2018; 9:882. PMID: 29922202; PMCID: PMC5996252; DOI: 10.3389/fpsyg.2018.00882.
Abstract
The conflicting findings from the few studies conducted with regard to gender differences in the recognition of vocal expressions of emotion have left the exact nature of these differences unclear. Several investigators have argued that a comprehensive understanding of gender differences in vocal emotion recognition can only be achieved by replicating these studies while accounting for influential factors such as stimulus type, gender-balanced samples, and the number of encoders, decoders, and emotional categories. This study aimed to account for these factors by investigating whether emotion recognition from vocal expressions differs as a function of both listeners' and speakers' gender. A total of N = 290 participants were randomly and equally allocated to two groups. One group listened to words and pseudo-words, while the other group listened to sentences and affect bursts. Participants were asked to categorize the stimuli with respect to the expressed emotions in a fixed-choice response format. Overall, females were more accurate than males when decoding vocal emotions; however, when testing for specific emotions these differences were small in magnitude. Speakers' gender had a significant impact on how listeners judged emotions from the voice. The group listening to words and pseudo-words had higher identification rates for emotions spoken by male than by female actors, whereas in the group listening to sentences and affect bursts the identification rates were higher when emotions were uttered by female than male actors. The mixed pattern of emotion-specific effects, however, indicates that, in the vocal channel, the reliability of emotion judgments is not systematically influenced by speakers' gender and the related stereotypes of emotional expressivity. Together, these results extend previous findings by showing effects of listeners' and speakers' gender on the recognition of vocal emotions. They stress the importance of distinguishing these factors to explain recognition ability in the processing of emotional prosody.
Affiliation(s)
- Adi Lausen
- Department of Affective Neuroscience and Psychophysiology, Institute for Psychology, University of Goettingen, Goettingen, Germany
- Leibniz ScienceCampus Primate Cognition, Goettingen, Germany
- Annekathrin Schacht
- Department of Affective Neuroscience and Psychophysiology, Institute for Psychology, University of Goettingen, Goettingen, Germany
- Leibniz ScienceCampus Primate Cognition, Goettingen, Germany
21
Yep R, Soncin S, Brien DC, Coe BC, Marin A, Munoz DP. Using an emotional saccade task to characterize executive functioning and emotion processing in attention-deficit hyperactivity disorder and bipolar disorder. Brain Cogn 2018; 124:1-13. PMID: 29698907; DOI: 10.1016/j.bandc.2018.04.002.
Abstract
Despite distinct diagnostic criteria, attention-deficit hyperactivity disorder (ADHD) and bipolar disorder (BD) share cognitive and emotion processing deficits that complicate diagnoses. The goal of this study was to use an emotional saccade task to characterize executive functioning and emotion processing in adult ADHD and BD. Participants (21 control, 20 ADHD, 20 BD) performed an interleaved pro/antisaccade task (look toward vs. look away from a visual target, respectively) in which the sex of emotional face stimuli acted as the cue to perform either the pro- or antisaccade. Both patient groups made more direction (erroneous prosaccades on antisaccade trials) and anticipatory (saccades made before cue processing) errors than controls. Controls exhibited lower microsaccade rates preceding correct anti- vs. prosaccade initiation, but this task-related modulation was absent in both patient groups. Regarding emotion processing, the ADHD group performed worse than controls on neutral face trials, while the BD group performed worse than controls on trials presenting faces of all valence. These findings support the role of fronto-striatal circuitry in mediating response inhibition deficits in both ADHD and BD, and suggest that such deficits are exacerbated in BD during emotion processing, presumably via dysregulated limbic system circuitry involving the anterior cingulate and orbitofrontal cortex.
Affiliation(s)
- Rachel Yep
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada.
- Stephen Soncin
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada
- Donald C Brien
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada
- Brian C Coe
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada
- Alina Marin
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada
- Department of Psychiatry, Hotel Dieu Hospital, Kingston, ON, Canada
- Douglas P Munoz
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada
- Department of Biomedical and Molecular Sciences, Queen's University, Kingston, ON, Canada
22
van Rooijen R, Junge CMM, Kemner C. The Interplay between Gaze Following, Emotion Recognition, and Empathy across Adolescence; a Pubertal Dip in Performance? Front Psychol 2018; 9:127. PMID: 29487555; PMCID: PMC5816800; DOI: 10.3389/fpsyg.2018.00127.
Abstract
During puberty a dip in face recognition is often observed, possibly caused by heightened levels of gonadal hormones, which in turn affect the re-organization of relevant cortical circuitry. In the current study we investigated whether a pubertal dip could be observed in three other abilities related to social information processing: gaze following, emotion recognition from the eyes, and empathizing abilities. Across these abilities we further explored whether these measurements revealed sex differences, as another way to understand how gonadal hormones affect the processing of social information. Results show that across adolescence there are improvements in emotion recognition from the eyes and in empathizing abilities. These improvements did not show a dip but were more plateau-like. The gaze cueing effect did not change over adolescence. We only observed sex differences in empathizing abilities, with girls showing higher scores than boys. Based on these results, it appears that gonadal hormones do not exert a unified influence on higher levels of social information processing. Further research should also explore changes in (visual) information processing around puberty onset to find a better-fitting explanation for changes in social behavior across adolescence.
Affiliation(s)
- Rianne van Rooijen
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Department of Developmental Psychology, Utrecht University, Utrecht, Netherlands
- Caroline M M Junge
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Department of Developmental Psychology, Utrecht University, Utrecht, Netherlands
- Chantal Kemner
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Department of Developmental Psychology, Utrecht University, Utrecht, Netherlands
- Brain Centre Rudolf Magnus, University Medical Center, Utrecht, Netherlands
23
Sellaro R, de Gelder B, Finisguerra A, Colzato LS. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies. Cortex 2017; 99:213-223. PMID: 29275193; DOI: 10.1016/j.cortex.2017.11.007.
Abstract
The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes.
Affiliation(s)
- Roberta Sellaro
- Leiden University, Cognitive Psychology Unit & Leiden Institute for Brain and Cognition, Leiden, The Netherlands.
- Beatrice de Gelder
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Alessandra Finisguerra
- Leiden University, Cognitive Psychology Unit & Leiden Institute for Brain and Cognition, Leiden, The Netherlands
- Lorenza S Colzato
- Leiden University, Cognitive Psychology Unit & Leiden Institute for Brain and Cognition, Leiden, The Netherlands
- Department of Cognitive Psychology, Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr University Bochum, Bochum, Germany
- Institute for Sports and Sport Science, University of Kassel, Kassel, Germany
24
Oakley BFM, Brewer R, Bird G, Catmur C. Theory of mind is not theory of emotion: A cautionary note on the Reading the Mind in the Eyes Test. J Abnorm Psychol 2017; 125:818-823. PMID: 27505409; PMCID: PMC4976760; DOI: 10.1037/abn0000182.
Abstract
The ability to represent mental states (theory of mind [ToM]) is crucial in understanding individual differences in social ability and social impairments evident in conditions such as autism spectrum disorder (ASD). The Reading the Mind in the Eyes Test (RMET) is a popular measure of ToM ability, validated in part by the poor performance of those with ASD. However, the RMET requires recognition of facial emotion, which is impaired in those with alexithymia, which frequently co-occurs with ASD. Thus, it is unclear whether the RMET indexes emotion recognition, associated with alexithymia, or ToM, associated with ASD. We therefore investigated the independent contributions of ASD and alexithymia to performance on the RMET. ASD and alexithymia-matched control participants did not differ on RMET performance, whereas ASD participants demonstrated impaired performance on an alternative test of ToM, the Movie for Assessment of Social Cognition (MASC). Furthermore, alexithymia, but not ASD diagnosis, significantly influenced RMET performance but did not affect MASC performance. These results suggest that the RMET measures emotion recognition rather than ToM ability and support the alexithymia hypothesis of emotion-related deficits in ASD. This study suggests that a highly popular test of the ability to detect what someone else is thinking—the Reading the Mind in the Eyes Test—is instead a test of the ability to recognize another person’s emotional expression. This is important because it suggests that patients who perform badly on this test may still be able to understand another person’s mental state and that, conversely, patients who perform well on this test may still have difficulties in mental state understanding.
Affiliation(s)
- Rebecca Brewer
- Medical Research Council Social, Genetic, & Developmental Psychiatry Centre
- Geoffrey Bird
- Medical Research Council Social, Genetic, & Developmental Psychiatry Centre
25
Donadon MF, Osório FDL. Current alcohol dependence and emotional facial expression recognition: a cross-sectional study. Arch Clin Psychiatry 2017. DOI: 10.1590/0101-60830000000120.
26
Kokinous J, Tavano A, Kotz SA, Schröger E. Perceptual integration of faces and voices depends on the interaction of emotional content and spatial frequency. Biol Psychol 2017; 123:155-165. DOI: 10.1016/j.biopsycho.2016.12.007.
27
|
Vieira JB, Tavares TP, Marsh AA, Mitchell DGV. Emotion and personal space: Neural correlates of approach-avoidance tendencies to different facial expressions as a function of coldhearted psychopathic traits. Hum Brain Mapp 2016; 38:1492-1506. [PMID: 27859920 DOI: 10.1002/hbm.23467] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2016] [Revised: 10/25/2016] [Accepted: 11/06/2016] [Indexed: 12/30/2022] Open
Abstract
In social interactions, humans are expected to regulate interpersonal distance in response to the emotion displayed by others. Yet, the neural mechanisms implicated in approach-avoidance tendencies to distinct emotional expressions have not been fully described. Here, we investigated the neural systems implicated in regulating the distance to different emotions, and how they vary as a function of empathy. Twenty-three healthy participants assessed for psychopathic traits underwent fMRI scanning while they viewed approaching and withdrawing angry, fearful, happy, sad and neutral faces. Participants were also asked to set the distance to those faces on a computer screen, and to adjust the physical distance from the experimenter outside the scanner. Participants kept the greatest distances from angry faces, and shortest from happy expressions. This was accompanied by increased activation in the dorsomedial prefrontal and orbitofrontal cortices, inferior frontal gyrus, and temporoparietal junction for angry and happy expressions relative to the other emotions. Irrespective of emotion, longer distances were kept from approaching faces, which was associated with increased activation in the amygdala and insula, as well as parietal and prefrontal regions. Amygdala activation was positively correlated with greater preferred distances to angry, fearful and sad expressions. Moreover, participants scoring higher on coldhearted psychopathic traits (lower empathy) showed reduced amygdala activation to sad expressions. These findings elucidate the neural mechanisms underlying social approach-avoidance, and how they are related to variations in empathy. Hum Brain Mapp 38:1492-1506, 2017. © 2016 Wiley Periodicals, Inc.
Collapse
Affiliation(s)
- Joana B Vieira
- Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada.,Department of Anatomy and Cell Biology, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada
| | - Tamara P Tavares
- Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada
| | - Abigail A Marsh
- Department of Psychology, Georgetown University, Washington, DC, USA
| | - Derek G V Mitchell
- Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada.,Department of Anatomy and Cell Biology, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada.,Department of Psychiatry, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada.,Department of Psychology, The University of Western Ontario, London, Ontario, Canada
| |
Collapse
|
28
|
Ávila RFD, Morais DD, Bomfim AJ, Chagas MHN. Empatia e reconhecimento de expressões faciais de emoções básicas e complexas em estudantes de Medicina [Empathy and recognition of facial expressions of basic and complex emotions in medical students]. J Bras Psiquiatr 2016. [DOI: 10.1590/0047-2085000000126] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/30/2022]
Abstract
ABSTRACT Objective To assess empathy and the ability to recognize basic and complex emotions, and their correlations, in medical students. Methods The study had a cross-sectional design. Eighty-six third- and fourth-year medical students from a medical school in the interior of the state of São Paulo were assessed with the following instruments: (i) the Jefferson Scale of Empathy, (ii) a facial expression recognition task for basic emotions (REF), and (iii) the Reading the Mind in the Eyes Test (RMET). Results The overall mean number of correct responses on the REF was 15.6 (SD: ±2.3). There was a statistically significant difference in the number of correct responses for the emotion sadness, with female students outperforming male students (t84 = 2.30; p = 0.02). On the RMET, the overall mean number of correct responses was 26.5 (SD: ±3.3), again with a statistically significant sex difference favoring female students (t84 = 3.43; p < 0.01). The mean total score on the empathy scale was 121.3 (SD: ±9.8). There was a weak positive correlation between the total empathy score and the number of correct responses for sadness (r = 0.29; p < 0.01). Conclusion The number of correct responses for sadness on the REF and the total RMET score were higher in women than in men. In addition, empathy appears to be directly related to the ability to recognize sadness. Further studies are warranted to examine empathy and facial emotion recognition in medical students in greater depth.
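The group comparisons and the empathy-recognition correlation reported above rest on standard procedures: independent-samples t-tests with 84 degrees of freedom and a Pearson correlation. The following is a minimal sketch on simulated data; the 43/43 sex split and all score distributions are assumptions for illustration only, not the study's data.

# Illustrative sketch (simulated, hypothetical data): an independent-samples
# t-test for the sex difference in sadness recognition and a Pearson
# correlation between total empathy score and sadness recognition.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical sadness-recognition scores; a 43/43 split of the 86 students
# is assumed here purely for illustration.
female_sadness = rng.normal(loc=3.6, scale=0.8, size=43)
male_sadness = rng.normal(loc=3.2, scale=0.8, size=43)

# Independent-samples t-test; df = 43 + 43 - 2 = 84, matching "t84" above.
t_stat, p_val = stats.ttest_ind(female_sadness, male_sadness)
print(f"t(84) = {t_stat:.2f}, p = {p_val:.3f}")

# Pearson correlation between total empathy score and sadness recognition.
empathy_total = rng.normal(loc=121.3, scale=9.8, size=86)
sadness_all = np.concatenate([female_sadness, male_sadness])
r, p = stats.pearsonr(empathy_total, sadness_all)
print(f"r = {r:.2f}, p = {p:.3f}")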
Collapse
|
29
|
Kogler L, Müller VI, Seidel EM, Boubela R, Kalcher K, Moser E, Habel U, Gur RC, Eickhoff SB, Derntl B. Sex differences in the functional connectivity of the amygdalae in association with cortisol. Neuroimage 2016; 134:410-423. [PMID: 27039701 PMCID: PMC6594554 DOI: 10.1016/j.neuroimage.2016.03.064] [Citation(s) in RCA: 56] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2015] [Revised: 03/24/2016] [Accepted: 03/24/2016] [Indexed: 12/23/2022] Open
Abstract
Human amygdalae are involved in various behavioral functions such as affective and stress processing. For these behavioral functions, as well as for psychophysiological arousal including cortisol release, sex differences are reported. Here, we assessed cortisol levels and resting-state functional connectivity (rsFC) of left and right amygdalae in 81 healthy participants (42 women) to investigate potential modulation of amygdala rsFC by sex and cortisol concentration. Our analyses revealed that rsFC of the left amygdala significantly differed between women and men: Women showed stronger rsFC than men between the left amygdala and left middle temporal gyrus, inferior frontal gyrus, postcentral gyrus and hippocampus, regions involved in face processing, inner speech, fear and pain processing. No stronger connections were detected for men, and no sex difference emerged for right amygdala rsFC. An interaction of sex and cortisol also emerged: In women, cortisol was negatively associated with rsFC of the amygdalae with striatal regions, mid-orbital frontal gyrus, anterior cingulate gyrus, middle and superior frontal gyri, supplementary motor area and the parietal-occipital sulcus. In men, by contrast, positive associations of cortisol with rsFC of the left amygdala and these structures were observed. Functional decoding analyses revealed an association of the amygdalae and these regions with emotion, reward and memory processing, as well as action execution. Our results suggest that functional connectivity of the amygdalae, as well as the regulatory effect of cortisol on brain networks, differs between women and men. These sex differences and the mediating, sex-dependent effect of cortisol on brain communication systems should be taken into account in affective and stress-related neuroimaging research. Thus, more studies including both sexes are required.
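For readers less familiar with the method, seed-based resting-state functional connectivity of the kind described above amounts to correlating a seed region's time series with every other voxel's time series. The sketch below is conceptual and runs on simulated data; the array sizes and the seed definition are assumptions, not the study's actual pipeline.

# Conceptual sketch of seed-based resting-state functional connectivity on
# simulated data; real analyses use preprocessed fMRI time series.
import numpy as np

rng = np.random.default_rng(1)
n_timepoints, n_voxels = 200, 5000

voxel_ts = rng.standard_normal((n_timepoints, n_voxels))  # time x voxels
seed_ts = voxel_ts[:, :50].mean(axis=1)                    # mean signal of an assumed "amygdala" seed

# Pearson correlation of the seed with every voxel; Fisher z-transformed maps
# are what typically enter group-level comparisons (e.g., women vs. men).
seed_z = (seed_ts - seed_ts.mean()) / seed_ts.std()
voxel_z = (voxel_ts - voxel_ts.mean(axis=0)) / voxel_ts.std(axis=0)
r_map = (voxel_z * seed_z[:, None]).mean(axis=0)
z_map = np.arctanh(np.clip(r_map, -0.999, 0.999))
print(z_map.shape)  # one connectivity value per voxel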
Collapse
Affiliation(s)
- Lydia Kogler
- Department of Psychiatry and Psychotherapy, Medical School, University of Tübingen, Germany; Jülich-Aachen-Research Alliance, Translational Brain Medicine, Germany; Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Aachen, Germany.
| | - Veronika I Müller
- Institute of Neuroscience and Medicine, INM-1, Research Centre Jülich, Jülich, Germany; Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, Heinrich Heine University, Düsseldorf, Germany
| | - Eva-Maria Seidel
- Department of Basic Psychological Research and Research Methods, Faculty of Psychology, University of Vienna, Vienna, Austria
| | - Roland Boubela
- MR Centre of Excellence, Medical University of Vienna, Vienna, Austria; Centre for Medical Physics and Biomedical Engineering, Medical University, Vienna, Austria
| | - Klaudius Kalcher
- MR Centre of Excellence, Medical University of Vienna, Vienna, Austria; Centre for Medical Physics and Biomedical Engineering, Medical University, Vienna, Austria
| | - Ewald Moser
- MR Centre of Excellence, Medical University of Vienna, Vienna, Austria; Centre for Medical Physics and Biomedical Engineering, Medical University, Vienna, Austria; Neuropsychiatry Division, Department of Psychiatry, Medical School, University of Pennsylvania, Philadelphia, USA
| | - Ute Habel
- Jülich-Aachen-Research Alliance, Translational Brain Medicine, Germany; Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Aachen, Germany; Institute of Neuroscience and Medicine, INM-6, Research Centre Jülich, Jülich, Germany; JARA BRAIN Institute 1: Structure Function Relationship
| | - Ruben C Gur
- Neuropsychiatry Division, Department of Psychiatry, Medical School, University of Pennsylvania, Philadelphia, USA
| | - Simon B Eickhoff
- Institute of Neuroscience and Medicine, INM-1, Research Centre Jülich, Jülich, Germany; Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, Heinrich Heine University, Düsseldorf, Germany
| | - Birgit Derntl
- Department of Psychiatry and Psychotherapy, Medical School, University of Tübingen, Germany; Institute of Neuroscience and Medicine, INM-1, Research Centre Jülich, Jülich, Germany; Jülich-Aachen-Research Alliance, Translational Brain Medicine, Germany; Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
30
|
Pietschnig J, Schröder L, Ratheiser I, Kryspin-Exner I, Pflüger M, Moser D, Auff E, Pirker W, Pusswald G, Lehrner J. Facial emotion recognition and its relationship to cognition and depressive symptoms in patients with Parkinson's disease. Int Psychogeriatr 2016; 28:1165-79. [PMID: 26987816 DOI: 10.1017/s104161021600034x] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
BACKGROUND Impairments in facial emotion recognition (FER) have been detected in patients with Parkinson disease (PD). Presently, we aim at assessing differences in emotion recognition performance in PD patient groups with and without mild forms of cognitive impairment (MCI) compared to healthy controls. METHODS Performance on a concise emotion recognition test battery (VERT-K) of three groups of 97 PD patients was compared with an age-equivalent sample of 168 healthy controls. Patients were categorized into groups according to two well-established classifications of MCI according to Petersen's (cognitively intact vs. amnestic MCI, aMCI, vs. non-amnestic MCI, non-aMCI) and Litvan's (cognitively intact vs. single-domain MCI, sMCI, vs. multi-domain MCI, mMCI) criteria. Patients and controls underwent individual assessments using a comprehensive neuropsychological test battery examining attention, executive functioning, language, and memory (Neuropsychological Test Battery Vienna, NTBV), the Beck Depression Inventory, and a measure of premorbid IQ (WST). RESULTS Cognitively intact PD patients and patients with MCI in PD (PD-MCI) showed significantly worse emotion recognition performance when compared to healthy controls. Between-groups effect sizes were substantial, showing non-trivial effects in all comparisons (Cohen's ds from 0.31 to 1.22). Moreover, emotion recognition performance was higher in women, positively associated with premorbid IQ and negatively associated with age. Depressive symptoms were not related to FER. CONCLUSIONS The present investigation yields further evidence for impaired FER in PD. Interestingly, our data suggest FER deficits even in cognitively intact PD patients indicating FER dysfunction prior to the development of overt cognitive dysfunction. Age showed a negative association whereas IQ showed a positive association with FER.
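The between-group effect sizes quoted above (Cohen's d from 0.31 to 1.22) follow the standard pooled-standard-deviation formula. A minimal sketch with hypothetical group summaries is shown below; the means, SDs, and group sizes are illustrative placeholders, not values taken from the study.

# Cohen's d with a pooled standard deviation; means, SDs and group sizes
# below are hypothetical placeholders, not the study's values.
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Example: healthy controls vs. a PD-MCI subgroup on an emotion recognition score.
print(round(cohens_d(28.5, 4.0, 168, 24.0, 4.5, 30), 2))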
Collapse
Affiliation(s)
- J Pietschnig
- Department of Applied Psychology: Health, Development, Enhancement and Intervention, Faculty of Psychology, University of Vienna, Vienna, Austria
| | - L Schröder
- Department of Applied Psychology: Health, Development, Enhancement and Intervention, Faculty of Psychology, University of Vienna, Vienna, Austria
| | - I Ratheiser
- Department of Applied Psychology: Health, Development, Enhancement and Intervention, Faculty of Psychology, University of Vienna, Vienna, Austria
| | - I Kryspin-Exner
- Department of Applied Psychology: Health, Development, Enhancement and Intervention, Faculty of Psychology, University of Vienna, Vienna, Austria
| | - M Pflüger
- Department of Neurology, Medical University of Vienna, Vienna, Austria
| | - D Moser
- Department of Neurology, Medical University of Vienna, Vienna, Austria
| | - E Auff
- Department of Neurology, Medical University of Vienna, Vienna, Austria
| | - W Pirker
- Department of Neurology, Medical University of Vienna, Vienna, Austria
| | - G Pusswald
- Department of Neurology, Medical University of Vienna, Vienna, Austria
| | - J Lehrner
- Department of Neurology, Medical University of Vienna, Vienna, Austria
| |
Collapse
|