1. Yi K, Wang X, Wu Y, Zhang L. Teacher-student empathic relationship shaping: an elimination mechanism for psychological segmentations in entrepreneurship education. BMC Psychol 2025; 13:292. PMID: 40128920; PMCID: PMC11931819; DOI: 10.1186/s40359-025-02607-1.
Abstract
OBJECTIVE: From the perspective of empathy theory, this study focuses on the process of entrepreneurship education to explore the mechanism linking the teacher-student empathic relationship and psychological connection.
BACKGROUND: Entrepreneurship education aims to provide talent to support the innovative development of society. Previous studies have focused on the educational significance of promoting entrepreneurial intention, and few have paid attention to the psychological differentiation caused by incomprehension between teachers and students.
METHOD: Three experiments were conducted, on cognitive empathy shaping to eliminate confidence segmentation, affective empathy shaping to eliminate confidence segmentation, and the non-negligible affective bias. A total of 424 college undergraduates were invited to participate (405 valid samples were collected). Within the theoretical framework of empathy, this study analyzed the impact of cognitive empathy, affective empathy, and affective bias on confidence segmentations in entrepreneurship, as well as the moderating role of objective environmental perception.
RESULTS: Cognitive empathy and affective empathy have significant positive effects on eliminating the entrepreneurial confidence segmentation between teachers and students, whereas affective bias aggravates it. On this basis, objective factors such as individual conditions and the supporting environment serve as important moderators, and the significance of the rational relationship is higher than that of the perceptual relationship.
CONCLUSIONS: This study indicates that the teacher-student empathic relationship in entrepreneurship education is a dual-process mechanism of connection, and that the rational relationship is more vital than the emotional one. We confirm the psychological significance of the teacher-student empathic relationship in entrepreneurship education and demonstrate the framework of empathy theory in a new context.
Affiliation(s)
- Kui Yi
- Nanchang Institute of Science and Technology, Nanchang, China
- School of Economics and Management, East China Jiaotong University, Nanchang, China
- Xinyu Wang
- School of Economics and Management, East China Jiaotong University, Nanchang, China
- Yingqi Wu
- School of International Economics and Trade, Jiangxi University of Finance and Economics, Nanchang, China
- Le Zhang
- School of Economics and Management, East China Jiaotong University, Nanchang, China
2. Diel A, Sato W, Hsu CT, Bäuerle A, Teufel M, Minato T. An android can show the facial expressions of complex emotions. Sci Rep 2025; 15:2433. PMID: 39828769; PMCID: PMC11743596; DOI: 10.1038/s41598-024-84224-3.
Abstract
Trust and rapport are essential for human-robot interaction, and producing emotional expressions on a robot's face is an effective way to foster them. Androids can show human-like facial expressions of basic emotions; however, whether androids can show facial expressions of complex emotions remains unknown. In this experiment, we investigated the android Nikola's ability to produce 22 dynamic facial expressions of complex emotions. For each video, 240 international participants (120 Japanese, 120 German) rated the emotions expressed by Nikola. For 13 complex emotions (i.e., amusement, appal, awe, boredom, contentment, coyness, hatred, hesitation, moral disgust, not face, pain, sleepiness, suspicion), participants in both samples rated the target emotion above the mean of the other, non-target emotions. Four emotions (bitterness, confusion, pride, relief) were rated above the mean by only one sample. For twelve of these emotions, the target emotion was among the highest ranked. The results suggest that androids can produce the facial expressions of a wide range of complex emotions, which can facilitate human-robot interaction.
Affiliation(s)
- Alexander Diel
- Clinic for Psychosomatic Medicine and Psychotherapy, LVR-University Hospital Essen, University of Duisburg-Essen, 45147, Essen, Germany
- Guardian Robot Project, RIKEN, Kyoto, Japan
- Center for Translational Neuro- and Behavioral Sciences (C-TNBS), University of Duisburg-Essen, 45147, Essen, Germany
- Alexander Bäuerle
- Clinic for Psychosomatic Medicine and Psychotherapy, LVR-University Hospital Essen, University of Duisburg-Essen, 45147, Essen, Germany
- Center for Translational Neuro- and Behavioral Sciences (C-TNBS), University of Duisburg-Essen, 45147, Essen, Germany
- Martin Teufel
- Clinic for Psychosomatic Medicine and Psychotherapy, LVR-University Hospital Essen, University of Duisburg-Essen, 45147, Essen, Germany
- Center for Translational Neuro- and Behavioral Sciences (C-TNBS), University of Duisburg-Essen, 45147, Essen, Germany
3. Wohltjen S, Colón YI, Zhu Z, Miller K, Huang WC, Mutlu B, Li Y, Niedenthal PM. Uniting theory and data: the promise and challenge of creating an honest model of facial expression. Cogn Emot 2025:1-15. PMID: 39746011; DOI: 10.1080/02699931.2024.2446945.
Abstract
People routinely use facial expressions to communicate successfully and to regulate others' behaviour, yet modelling the form and meaning of these facial behaviours has proven surprisingly complex. One reason for this difficulty may lie in an over-reliance on the assumptions inherent in existing theories of facial expression, specifically that (1) there is a putative set of facial expressions that signal an internal state of emotion, (2) patterns of facial movement have been empirically linked to the prototypical emotions in this set, and (3) static, non-social, posed images from convenience samples are adequate to validate the first two assumptions. These assumptions have guided the creation of datasets, which are then used to train unrepresentative computational models of facial expression. In this article, we discuss existing theories of facial expression and review how they have shaped current facial expression recognition tools. We then discuss the resources available to help researchers build a more ecologically valid model of facial expressions.
Affiliation(s)
- Sophie Wohltjen
- Department of Psychology, University of Wisconsin - Madison, Madison, WI, USA
- Yolanda Ivette Colón
- Department of Psychology, University of Wisconsin - Madison, Madison, WI, USA
- Wisconsin Institute for Discovery, University of Wisconsin - Madison, Madison, WI, USA
- Zihao Zhu
- Department of Computer Sciences, University of Wisconsin - Madison, Madison, WI, USA
- Karina Miller
- Department of Psychology, University of Wisconsin - Madison, Madison, WI, USA
- Wei-Chun Huang
- Department of Computer Sciences, University of Wisconsin - Madison, Madison, WI, USA
- Bilge Mutlu
- Department of Computer Sciences, University of Wisconsin - Madison, Madison, WI, USA
- Yin Li
- Department of Computer Sciences, University of Wisconsin - Madison, Madison, WI, USA
- Department of Biostatistics and Medical Informatics, University of Wisconsin - Madison, Madison, WI, USA
- Paula M Niedenthal
- Department of Psychology, University of Wisconsin - Madison, Madison, WI, USA
4. Poublan-Couzardot A, Talmi D. Pain perception as hierarchical Bayesian inference: A test case for the theory of constructed emotion. Ann N Y Acad Sci 2024; 1536:42-59. PMID: 38837401; DOI: 10.1111/nyas.15141.
Abstract
An intriguing perspective on human emotion, the theory of constructed emotion considers emotions to be generative models, in line with the Bayesian brain hypothesis. This theory brings fresh insight to existing findings, but its complexity renders it challenging to test experimentally. We argue that laboratory studies of pain could support the theory: although some may not consider pain a genuine emotion, the theory must at minimum be able to explain pain perception and its dysfunction in pathology. We review emerging evidence that bears on this question, covering behavioral and neural laboratory findings, computational models, placebo hyperalgesia, and chronic pain. We conclude that there is substantial evidence for a predictive processing account of painful experience, paving the way for a better understanding of the neuronal and computational mechanisms of other emotions.
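As an illustration of the Bayesian inference framing the abstract above describes, here is a minimal single-level sketch of a precision-weighted update for perceived pain intensity. All quantities (the prior expectation, the sensory input, and their variances) are hypothetical stand-ins chosen for illustration; they are not taken from the paper:

```python
# Combining a Gaussian prior with a Gaussian likelihood: the posterior mean
# is a precision-weighted average of the prior expectation and the input.
def bayes_update(prior_mean, prior_var, obs_mean, obs_var):
    prior_prec = 1.0 / prior_var  # precision = inverse variance
    obs_prec = 1.0 / obs_var
    post_var = 1.0 / (prior_prec + obs_prec)
    post_mean = post_var * (prior_prec * prior_mean + obs_prec * obs_mean)
    return post_mean, post_var

# Hypothetical numbers: a strong expectation of high pain (mean 7, low
# variance) pulls a moderate nociceptive input (mean 4, high variance)
# upward -- the expectation-driven effect invoked for placebo/nocebo.
mean, var = bayes_update(prior_mean=7.0, prior_var=1.0, obs_mean=4.0, obs_var=4.0)
print(mean, var)  # -> 6.4 0.8
```

The posterior lands between input and expectation but closer to the more precise prior; a hierarchy of such updates, each level supplying the prior for the one below, is the basic move of the predictive processing account.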
Affiliation(s)
- Arnaud Poublan-Couzardot
- Université Claude Bernard Lyon 1, INSERM, Centre de Recherche en Neurosciences de Lyon CRNL, Bron, France
- Deborah Talmi
- Department of Psychology, University of Cambridge, Cambridge, UK
5. Zhang Z, Peng Y, Jiang Y, Chen T. The pictorial set of Emotional Social Interactive Scenarios between Chinese Adults (ESISCA): Development and validation. Behav Res Methods 2024; 56:2581-2594. PMID: 37528294; DOI: 10.3758/s13428-023-02168-4.
Abstract
Affective picture databases with a single facial expression or body posture per image have been widely used to investigate emotion. To date, however, there has been no standardized database of stimuli involving multiple emotional signals in social interactive scenarios. The current study therefore developed a pictorial set comprising 274 images depicting interactive scenarios between two Chinese adults conveying happiness, anger, sadness, fear, disgust, and neutral emotion. We provide valence and arousal ratings of the scenes, along with emotional-category data for both the scenes and the faces in the images. Analyses of data collected from 70 undergraduate students indicated high reliability of the valence and arousal ratings of the scenes and high judgmental agreement in categorizing the scene and facial emotions. The findings suggest that the dataset is well constructed and could be useful for future studies investigating emotion recognition or empathy in social interactions in both healthy and clinical (e.g., ASD) populations.
Affiliation(s)
- Ziyu Zhang
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, 215123, China
- Yanqin Peng
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, 215123, China
- Yiyao Jiang
- College of Arts and Sciences, Syracuse University, Syracuse, NY, 13244, USA
- Tingji Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, 215123, China
6. Bianchini E, Rinaldi D, Alborghetti M, Simonelli M, D'Audino F, Onelli C, Pegolo E, Pontieri FE. The Story behind the Mask: A Narrative Review on Hypomimia in Parkinson's Disease. Brain Sci 2024; 14:109. PMID: 38275529; PMCID: PMC10814039; DOI: 10.3390/brainsci14010109.
Abstract
Facial movements are crucial for social and emotional interaction and well-being. Reduced facial expression (i.e., hypomimia) is a common feature in patients with Parkinson's disease (PD), and previous studies have linked this manifestation both to motor symptoms of the disease and to altered emotion recognition and processing. Nevertheless, research on facial motor impairment in PD has been rather scarce, and only a limited number of clinical evaluation tools are available, often suffering from poor validation and high inter- and intra-rater variability. In recent years, the availability of technology-enhanced methods for quantifying facial movements, such as automated video analysis and machine learning, has led to increasing interest in studying hypomimia in PD. In this narrative review, we summarize current knowledge on the pathophysiological hypotheses underlying hypomimia in PD, with particular focus on the association between reduced facial expression and emotional processing, and we analyze current evaluation tools and management strategies for this symptom, as well as future research perspectives.
Affiliation(s)
- Edoardo Bianchini
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy
- AGEIS, Université Grenoble Alpes, 38000 Grenoble, France
- Sant’Andrea University Hospital, 00189 Rome, Italy
- Domiziana Rinaldi
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy
- Sant’Andrea University Hospital, 00189 Rome, Italy
- Marika Alborghetti
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy
- Sant’Andrea University Hospital, 00189 Rome, Italy
- Marta Simonelli
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy
- Ospedale dei Castelli, ASL Rome 6, 00040 Ariccia, Italy
- Camilla Onelli
- Department of Molecular Medicine, University of Padova, 35121 Padova, Italy
- Elena Pegolo
- Department of Information Engineering, University of Padova, 35131 Padova, Italy
- Francesco E. Pontieri
- Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy
- Sant’Andrea University Hospital, 00189 Rome, Italy
- Fondazione Santa Lucia IRCCS, 00179 Rome, Italy
7. Chen C, Messinger DS, Chen C, Yan H, Duan Y, Ince RAA, Garrod OGB, Schyns PG, Jack RE. Cultural facial expressions dynamically convey emotion category and intensity information. Curr Biol 2024; 34:213-223.e5. PMID: 38141619; PMCID: PMC10831323; DOI: 10.1016/j.cub.2023.12.001.
Abstract
Communicating emotional intensity plays a vital ecological role because it provides valuable information about the nature and likelihood of the sender's behavior.1,2,3 For example, attack often follows signals of intense aggression if receivers fail to retreat.4,5 Humans regularly use facial expressions to communicate such information.6,7,8,9,10,11 Yet how this complex signaling task is achieved remains unknown. We addressed this question using a perception-based, data-driven method to mathematically model the specific facial movements that receivers use to classify the six basic emotions-"happy," "surprise," "fear," "disgust," "anger," and "sad"-and judge their intensity in two distinct cultures (East Asian, Western European; total n = 120). In both cultures, receivers expected facial expressions to dynamically represent emotion category and intensity information over time, using a multi-component compositional signaling structure. Specifically, emotion intensifiers peaked earlier or later than emotion classifiers and represented intensity using amplitude variations. Emotion intensifiers were also more similar across emotions than classifiers were, suggesting a latent broad-plus-specific signaling structure. Cross-cultural analysis further revealed similarities and differences in expectations that could impact cross-cultural communication. Specifically, East Asian and Western European receivers had similar expectations about which facial movements represent high intensity for threat-related emotions, such as "anger," "disgust," and "fear," but differed on those that represent low-threat emotions, such as happiness and sadness. Together, our results provide new insights into the intricate processes by which facial expressions achieve complex dynamic signaling tasks, revealing the rich information embedded in them.
Affiliation(s)
- Chaona Chen
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Daniel S Messinger
- Departments of Psychology, Pediatrics, and Electrical & Computer Engineering, University of Miami, 5665 Ponce De Leon Blvd, Coral Gables, FL 33146, USA
- Cheng Chen
- Foreign Language Department, Teaching Centre for General Courses, Chengdu Medical College, 601 Tianhui Street, Chengdu 610083, China
- Hongmei Yan
- The MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, North Jianshe Road, Chengdu 611731, China
- Yaocong Duan
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Robin A A Ince
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Oliver G B Garrod
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Philippe G Schyns
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
- Rachael E Jack
- School of Psychology and Neuroscience, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, Scotland, UK
8. Zhang L, Liang H, Bjureberg J, Xiong F, Cai Z. The Association Between Emotion Recognition and Internalizing Problems in Children and Adolescents: A Three-Level Meta-Analysis. J Youth Adolesc 2024; 53:1-20. PMID: 37991601; DOI: 10.1007/s10964-023-01891-7.
Abstract
Numerous studies have explored the link between how well youth recognize emotions and their internalizing problems, but a consensus remains elusive. This study used a three-level meta-analysis model to quantitatively synthesize the findings of existing studies to assess the relationship. A moderation analysis was also conducted to explore the sources of research heterogeneity. Through a systematic literature search, a total of 42 studies with 201 effect sizes were retrieved for the current meta-analysis, and 7579 participants were included. Emotion recognition was negatively correlated with internalizing problems. Children and adolescents with weaker emotion recognition skills were more likely to have internalizing problems. In addition, this meta-analysis found that publication year had a significant moderating effect. The correlation between emotion recognition and internalizing problems decreased over time. The degree of internalizing problems was also found to be a significant moderator. The correlation between emotion recognition and internalizing disorders was higher than the correlation between emotion recognition and internalizing symptoms. Deficits in emotion recognition might be relevant for the development and/or maintenance of internalizing problems in children and adolescents. The overall effect was small and future research should explore the clinical relevance of the association.
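The three-level structure named in the title above refers to the standard decomposition of variance in a multilevel meta-analysis: sampling variation of each effect size, variation among effect sizes within a study, and variation between studies. In the conventional notation (which is the textbook formulation, not notation taken from this paper), effect size j in study i is modeled as:

```latex
y_{ij} = \beta_0 + u_i + v_{ij} + e_{ij}, \qquad
u_i \sim \mathcal{N}\!\left(0, \sigma^2_{\text{between}}\right), \quad
v_{ij} \sim \mathcal{N}\!\left(0, \sigma^2_{\text{within}}\right), \quad
e_{ij} \sim \mathcal{N}\!\left(0, \nu_{ij}\right)
```

where $\beta_0$ is the pooled effect, $\sigma^2_{\text{between}}$ and $\sigma^2_{\text{within}}$ are the between-study and within-study heterogeneity variances, and $\nu_{ij}$ is the known sampling variance of each effect size. This structure is what allows the 201 effect sizes to be nested within the 42 studies without assuming independence.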
Affiliation(s)
- Lin Zhang
- School of Psychology, Central China Normal University, Wuhan, China
- Key Laboratory of Adolescent Cyberpsychology and Behavior, Ministry of Education, Wuhan, China
- Key Laboratory of Human Development and Mental Health of Hubei Province, Wuhan, China
- Heting Liang
- School of Psychology, Central China Normal University, Wuhan, China
- Key Laboratory of Adolescent Cyberpsychology and Behavior, Ministry of Education, Wuhan, China
- Key Laboratory of Human Development and Mental Health of Hubei Province, Wuhan, China
- Johan Bjureberg
- Centre for Psychiatry Research, Karolinska Institutet and Stockholm Health Care Services, Stockholm County Council, Stockholm, Sweden
- Department of Psychology, Stanford University, Stanford, CA, USA
- Fen Xiong
- School of Psychology, Central China Normal University, Wuhan, China
- Key Laboratory of Adolescent Cyberpsychology and Behavior, Ministry of Education, Wuhan, China
- Key Laboratory of Human Development and Mental Health of Hubei Province, Wuhan, China
- Zhihui Cai
- School of Psychology, Central China Normal University, Wuhan, China
- Key Laboratory of Adolescent Cyberpsychology and Behavior, Ministry of Education, Wuhan, China
- Key Laboratory of Human Development and Mental Health of Hubei Province, Wuhan, China
9. Correia-Caeiro C, Guo K, Mills DS. Visual perception of emotion cues in dogs: a critical review of methodologies. Anim Cogn 2023; 26:727-754. PMID: 36870003; PMCID: PMC10066124; DOI: 10.1007/s10071-023-01762-5.
Abstract
Comparative studies of human-dog cognition have grown exponentially since the 2000s, but the focus on how dogs look at us (as well as at other dogs) as social partners is a more recent phenomenon, despite its importance to human-dog interactions. Here, we briefly summarise the current state of research in visual perception of emotion cues in dogs and why this area is important; we then critically review its most commonly used methods, discussing conceptual and methodological challenges and associated limitations in depth; finally, we suggest some possible solutions and recommend best practice for future research. Typically, most studies in this field have concentrated on facial emotional cues, with full-body information rarely considered. There are many challenges in the way studies are conceptually designed (e.g., the use of non-naturalistic stimuli) and in the way researchers incorporate biases (e.g., anthropomorphism) into experimental designs, which may lead to problematic conclusions. However, technological and scientific advances offer the opportunity to gather much more valid, objective, and systematic data in this rapidly expanding field of study. Solving conceptual and methodological challenges in emotion perception research in dogs will benefit not only research on dog-human interactions, but also comparative psychology more broadly, in which dogs are an important model species for studying evolutionary processes.
Affiliation(s)
- Catia Correia-Caeiro
- School of Psychology, University of Lincoln, Brayford Pool, Lincoln, LN6 7TS, UK
- Department of Life Sciences, University of Lincoln, Lincoln, LN6 7DL, UK
- Primate Research Institute, Kyoto University, Inuyama, 484-8506, Japan
- Center for the Evolutionary Origins of Human Behavior, Kyoto University, Inuyama, 484-8506, Japan
- Kun Guo
- School of Psychology, University of Lincoln, Brayford Pool, Lincoln, LN6 7TS, UK
- Daniel S Mills
- Department of Life Sciences, University of Lincoln, Lincoln, LN6 7DL, UK
10. Mitzkovitz C, Dowd SM, Cothran T, Musil S. The Eyes Have It: Psychotherapy in the Era of Masks. J Clin Psychol Med Settings 2022; 29:886-897. PMID: 35118604; PMCID: PMC8812949; DOI: 10.1007/s10880-022-09856-x.
Abstract
Nonverbal communication is integral to the success of psychotherapy, and facial expression is an important component of nonverbal communication. The SARS-CoV-2 pandemic has altered how psychotherapy services are provided. In this paper, we explore potential issues that may arise from conducting psychotherapy when both the patient and the therapist are wearing masks. These include a higher likelihood of misidentifying facial expressions, especially when an expression is incongruent with body language and when the lower face is more important for correct identification of the emotion. These issues may be particularly problematic for patient populations for whom emotion recognition is a problem at baseline, or for those more prone to biases in emotion recognition. Suggestions are offered for therapists to consider when seeing patients in person while masks are necessary.
Affiliation(s)
- Cayla Mitzkovitz
- Department of Psychology, Central Michigan University, Mount Pleasant, MI, USA
- Sheila M Dowd
- Department of Psychiatry and Behavioral Sciences, Rush University Medical Center, Chicago, IL, USA
- Thomas Cothran
- Office of Neuropsychology, Community Care Network, Inc., Munster, IN, USA
- Suzanne Musil
- Department of Psychiatry and Behavioral Sciences, Rush University Medical Center, 1645 W. Jackson Blvd, Ste. 400, Chicago, IL, 60612, USA
11. Frederick DA, Reynolds TA, Barrera CA, Murray SB. Demographic and sociocultural predictors of face image satisfaction: The U.S. Body Project I. Body Image 2022; 41:1-16. PMID: 35228101; DOI: 10.1016/j.bodyim.2022.01.016.
Abstract
Despite substantial literature surrounding how people process and perceive faces, there is very little research investigating how people evaluate their own faces. We examined how gender, body mass, race, age, and sexual orientation were linked to people's satisfaction with the appearance of their eyes, nose, facial shape, and face overall among 11,620 adults recruited via Mechanical Turk. Most people mostly or definitely agreed they were happy with their facial appearance. There were notable racial differences, with Asian participants tending to report greater dissatisfaction. For example, only 66% of Asian women and 60% of Asian men mostly or definitely agreed that they were happy with the appearance of their eyes, which was lower than other racial groups. BMI and age were not strongly associated with face satisfaction. Sexual minority men were less satisfied than heterosexual men. About one in four gay and bisexual men, compared to only one in seven heterosexual men, reported dissatisfaction with their overall facial appearance. Men and women with poorer face image engaged in more appearance surveillance, more strongly internalized the thin-ideal, and perceived stronger sociocultural pressures from peers, parents, and media. The current study highlights important sociocultural and demographic factors tied to poorer face image.
Affiliation(s)
- David A Frederick
- Crean College of Health and Behavioral Sciences, Chapman University, Orange, CA, USA
- Tania A Reynolds
- Psychology Department, University of New Mexico, Albuquerque, NM, USA
- The Kinsey Institute, Indiana University, Bloomington, IN, USA
- Carlos A Barrera
- Department of Psychology, University of California, San Diego, La Jolla, CA, USA
- Stuart B Murray
- Department of Psychiatry & Behavioral Sciences, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
12. Tejada J, Freitag RMK, Pinheiro BFM, Cardoso PB, Souza VRA, Silva LS. Building and validation of a set of facial expression images to detect emotions: a transcultural study. Psychol Res 2021; 86:1996-2006. PMID: 34652530; DOI: 10.1007/s00426-021-01605-3.
Abstract
Automatic emotion recognition from facial expressions has become an exceptional tool in research involving human subjects, making it possible to obtain objective measurements of participants' emotional states. Various software packages and commercial solutions are available to perform this task; however, adaptation to cultural context and recognition of complex expressions and/or emotions remain two of their main challenges. Here, we describe the construction and validation of a set of facial expression images suitable for training a recognition system. Our dataset consists of images of people with no acting experience who were recorded with a webcam as they performed a computer-assisted task in a room with a light background and overhead illumination. The six basic emotions and mockery were included, and a combination of the OpenCV, Dlib, and Scikit-learn Python libraries was used to develop a support vector machine classifier. The code is available on GitHub and the images will be provided upon request. Since this study used transcultural facial expressions to evaluate complex emotions, together with open-source solutions, we strongly believe that our dataset will be useful in different research contexts.
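The pipeline described above (facial-landmark-style features fed to a scikit-learn support vector machine) can be sketched roughly as follows. This is a toy illustration, not the authors' code: the feature values and labels are invented stand-ins for the geometric measurements that would come from Dlib landmark detection:

```python
from sklearn.svm import SVC

# Toy stand-ins for geometric features derived from facial landmarks
# (e.g., normalized mouth width, mouth-corner height, brow-eye distance).
X_train = [
    [0.80, 0.30, 0.50],   # smiling-like configuration
    [0.78, 0.28, 0.52],
    [0.40, -0.20, 0.30],  # frowning-like configuration
    [0.42, -0.22, 0.28],
]
y_train = ["happiness", "happiness", "anger", "anger"]

# A linear-kernel SVM is a common baseline for low-dimensional landmark features.
clf = SVC(kernel="linear")
clf.fit(X_train, y_train)

print(clf.predict([[0.79, 0.29, 0.51]])[0])  # -> happiness
```

In a real pipeline the features would be computed per frame from detected landmarks, and the classifier would be evaluated with held-out data rather than the training points shown here.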
Affiliation(s)
- Julian Tejada
- Departamento de Psicologia, Universidade Federal de Sergipe, São Cristóvão, Brazil
- Facultad de Psicología, Fundación Universitaria Konrad Lorenz, Bogotá, Colombia
- Lucas Santos Silva
- Departamento de Letras, Universidade Federal de Sergipe, São Cristóvão, Brazil
13. Carlisi CO, Reed K, Helmink FGL, Lachlan R, Cosker DP, Viding E, Mareschal I. Using genetic algorithms to uncover individual differences in how humans represent facial emotion. R Soc Open Sci 2021; 8:202251. PMID: 34659775; PMCID: PMC8511778; DOI: 10.1098/rsos.202251.
Abstract
Emotional facial expressions critically impact social interactions and cognition. However, emotion research to date has generally relied on the assumption that people represent categorical emotions in the same way, using standardized stimulus sets and overlooking important individual differences. To resolve this problem, we developed and tested a task using genetic algorithms to derive assumption-free, participant-generated emotional expressions. One hundred and five participants generated a subjective representation of happy, angry, fearful and sad faces. Population-level consistency was observed for happy faces, but fearful and sad faces showed a high degree of variability. High test-retest reliability was observed across all emotions. A separate group of 108 individuals accurately identified happy and angry faces from the first study, while fearful and sad faces were commonly misidentified. These findings are an important first step towards understanding individual differences in emotion representation, with the potential to reconceptualize the way we study atypical emotion processing in future research.
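The participant-guided genetic algorithm can be caricatured as follows: candidate parameter vectors evolve by selection, recombination, and mutation toward a fitness signal. In the study that signal came from participants' judgments of candidate faces; in this toy sketch a fixed target vector stands in for it, and all shapes and settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
target = rng.uniform(-1, 1, size=10)  # stand-in for one person's "ideal" face

def fitness(pop):
    # Higher fitness = candidate closer to the target vector.
    return -np.linalg.norm(pop - target, axis=1)

pop = rng.uniform(-1, 1, size=(30, 10))   # initial random population
initial_error = -fitness(pop).max()

for _ in range(200):
    scores = fitness(pop)
    parents = pop[np.argsort(scores)[-10:]]        # keep the 10 fittest
    pa = parents[rng.integers(0, 10, size=30)]     # draw random parent pairs
    pb = parents[rng.integers(0, 10, size=30)]
    pop = (pa + pb) / 2 + rng.normal(0, 0.05, (30, 10))  # recombine + mutate

final_error = -fitness(pop).max()
```

After a few hundred generations the best candidate sits close to the target, limited mainly by the mutation noise, which mirrors how participant choices gradually shape the generated expressions.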
Affiliation(s)
- Christina O. Carlisi
- Division of Psychology and Language Sciences, Developmental Risk and Resilience Unit, University College London, 26 Bedford Way, London WC1H 0AP, UK
- Kyle Reed
- Department of Computer Science, University of Bath, 1 West, Claverton Down, Bath BA2 7AY, UK
- Fleur G. L. Helmink
- Erasmus University Medical Center, s-Gravendijkwal 230, Rotterdam 3015 CE, The Netherlands
- Robert Lachlan
- Department of Psychology, Royal Holloway University of London, Wolfson Building, Egham TW20 0EX, UK
- Darren P. Cosker
- Department of Computer Science, University of Bath, 1 West, Claverton Down, Bath BA2 7AY, UK
- Essi Viding
- Division of Psychology and Language Sciences, Developmental Risk and Resilience Unit, University College London, 26 Bedford Way, London WC1H 0AP, UK
- Isabelle Mareschal
- School of Biological and Chemical Sciences, Department of Psychology, Queen Mary University of London, G. E. Fogg Building, Mile End Road, London E1 4DQ, UK
14
Chen PHA, Qu Y. Taking a Computational Cultural Neuroscience Approach to Study Parent-Child Similarities in Diverse Cultural Contexts. Front Hum Neurosci 2021; 15:703999. PMID: 34512293; PMCID: PMC8426574; DOI: 10.3389/fnhum.2021.703999.
Abstract
Parent-child similarities and discrepancies at multiple levels provide a window onto the cultural transmission process. Although prior research has examined parent-child similarities at the belief, behavioral, and physiological levels across cultures, little is known about parent-child similarities at the neural level. The current review introduces an interdisciplinary computational cultural neuroscience approach, which uses computational methods to understand the neural and psychological processes involved in parent-child interactions at the intra- and inter-personal levels. This review provides three examples: the application of intersubject representational similarity analysis to naturalistic neuroimaging data, the use of computer vision to capture non-verbal social signals during parent-child interactions, and the unraveling of the psychological complexities of real-time parent-child interaction from simultaneously recorded brain response patterns. We hope this computational cultural neuroscience approach gives researchers an alternative way to examine parent-child similarities and discrepancies across cultural contexts and a better understanding of cultural transmission processes.
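A minimal sketch of the intersubject representational-similarity idea mentioned above: compute each person's representational dissimilarity matrix (RDM) over a common stimulus set, then correlate the two RDMs. Random numbers stand in for neural response patterns here; the shapes and noise levels are arbitrary, not drawn from any study.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
signal = rng.normal(size=(12, 50))                # shared stimulus structure
parent = signal + rng.normal(0, 0.3, (12, 50))   # one person's noisy patterns
child = signal + rng.normal(0, 0.3, (12, 50))    # the other person's patterns

# Each condensed RDM holds pairwise correlation distances among 12 stimuli.
rdm_parent = pdist(parent, metric="correlation")
rdm_child = pdist(child, metric="correlation")

# Second-order similarity: do the two people represent the stimuli alike?
similarity, _ = spearmanr(rdm_parent, rdm_child)
```

Because the RDM comparison is made in a common stimulus space rather than a common anatomical space, it works even when the two brains' raw response patterns are not directly alignable, which is what makes it attractive for parent-child designs.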
Affiliation(s)
- Pin-Hao A. Chen
- Department of Psychology, National Taiwan University, Taipei, Taiwan
- Neurobiology and Cognitive Science Center, National Taiwan University, Taipei, Taiwan
- Center for Artificial Intelligence and Advanced Robotics, National Taiwan University, Taipei, Taiwan
- Yang Qu
- School of Education and Social Policy, Northwestern University, Evanston, IL, United States
15
Chakrabarty M, Dasgupta G, Acharya R, Chatterjee SS, Guha P, Belmonte MK, Bhattacharya K. Validation of revised reading the mind in the eyes test in the Indian (Bengali) population: A preliminary study. Indian J Psychiatry 2021; 63:74-79. PMID: 34083824; PMCID: PMC8106414; DOI: 10.4103/psychiatry.indianjpsychiatry_967_20.
Abstract
BACKGROUND Social cognition deficits are common in clinical populations, but there is a dearth of standardized social cognition assessment tools in India. Theory of mind (ToM) is an important aspect of social cognition that is often assessed with the revised Reading the Mind in the Eyes Test (RMET-R); however, no statistically validated version of the test exists for the Indian population. AIM This study aims to assess the acceptability, reliability, and validity of the Bengali version of the RMET-R. MATERIALS AND METHODS We administered the RMET-R to 23 patients with chronic schizophrenia (SCZ), 22 patients with bipolar disorder, and 104 healthy controls (HCs) to evaluate the reliability and validity of the instrument in the Indian (Bengali) population. RESULTS We obtained moderate internal consistency (Cronbach's alpha = 0.6) and test-retest reliability (intraclass correlation coefficient = 0.64, P < 0.001). Positive correlations were found between the RMET-R and the Wechsler picture arrangement (r = 0.60, P < 0.001), picture completion (r = 0.54, P < 0.001), and comprehension subtests (r = 0.48, P < 0.001). Patients with SCZ (M = 49.7, standard deviation [SD] = 16.5) scored significantly lower than HCs (M = 68.9, SD = 13.8) on the RMET-R (P = 0.008; Cohen's d = 1.3); thus, the tool discriminated patients reported to have ToM deficits from healthy controls. CONCLUSION The Bengali version of the RMET-R is a reliable and valid tool for assessing first-order ToM insofar as the original RMET-R measures this construct.
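For readers unfamiliar with the internal-consistency statistic reported here, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix. The data are purely synthetic (the RMET-R itself has 36 items, but these scores are simulated, not the study's).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(7)
trait = rng.normal(size=(100, 1))            # one latent ability per respondent
items = trait + rng.normal(size=(100, 36))   # 36 noisy items tapping that trait
alpha = float(cronbach_alpha(items))
```

With 36 items that each share half their variance with the latent trait, alpha comes out very high; the moderate 0.6 reported in the abstract corresponds to items that are much noisier or more heterogeneous.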
Affiliation(s)
- Gargi Dasgupta
- Department of Psychiatry, Medical College, Kolkata, West Bengal, India
- Seshadri Sekhar Chatterjee
- Department of Psychiatry, Diamond Harbour Government Medical College and Hospital, Diamond Harbour, West Bengal, India
- Prathama Guha
- Department of Psychiatry, Calcutta National Medical College, Kolkata, West Bengal, India
- Matthew K Belmonte
- The Com DEALL Trust, Bengaluru, Karnataka, India; Division of Psychology, Nottingham Trent University, Nottingham, England, UK
- Kaberi Bhattacharya
- Department of Psychiatry, Midnapore Medical College, Midnapore, West Bengal, India
16
Mohan SN, Mukhtar F, Jobson L. An Exploratory Study on Cross-Cultural Differences in Facial Emotion Recognition Between Adults From Malaysia and Australia. Front Psychiatry 2021; 12:622077. PMID: 34177636; PMCID: PMC8219914; DOI: 10.3389/fpsyt.2021.622077.
Abstract
While culture and depression both influence the way humans process emotion, these two areas of investigation are rarely combined. The aim of this study was therefore to investigate differences in facial emotion recognition between Malaysian Malays and Australians of European heritage, with and without depression. A total of 88 participants took part (Malays n = 47, Australians n = 41). All participants were screened with the Structured Clinical Interview for DSM-5 Clinician Version (SCID-5-CV) to assess Major Depressive Disorder (MDD) diagnosis, and all completed the Beck Depression Inventory (BDI). The study included a facial emotion recognition (FER) task in which participants viewed facial images and identified the emotion depicted by each expression. Depression status and cultural group did not significantly influence overall FER accuracy. However, Malaysian participants without MDD and Australian participants with MDD responded more quickly on the FER task than Australian participants without MDD, and Malaysian participants recognized fear more accurately than Australian participants did. Future studies could examine the extent of these influences and other aspects of culture and participant condition in facial emotion recognition.
Affiliation(s)
- Sindhu Nair Mohan
- Department of Psychiatry, School of Medicine and Health Sciences, Universiti Putra Malaysia, Seri Kembangan, Malaysia
- Firdaus Mukhtar
- Department of Psychiatry, School of Medicine and Health Sciences, Universiti Putra Malaysia, Seri Kembangan, Malaysia
- Laura Jobson
- School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Clayton, VIC, Australia
17
Mieronkoski R, Syrjälä E, Jiang M, Rahmani A, Pahikkala T, Liljeberg P, Salanterä S. Developing a pain intensity prediction model using facial expression: A feasibility study with electromyography. PLoS One 2020; 15:e0235545. PMID: 32645045; PMCID: PMC7347182; DOI: 10.1371/journal.pone.0235545.
Abstract
The automatic detection of facial expressions of pain is needed to ensure accurate pain assessment of patients who are unable to self-report pain. To overcome the challenges of automatic systems for determining pain levels from facial expressions in clinical patient monitoring, a surface electromyography method was tested for feasibility in healthy volunteers. Two types of gradually increasing experimental pain stimuli were induced in thirty-one healthy volunteers. We used surface electromyography to measure the activity of five facial muscles and detect facial expressions during pain induction. Statistical tests were used to analyze the continuous electromyography data, and supervised machine learning was applied to build a pain intensity prediction model. Muscle activation of the corrugator supercilii was most strongly associated with self-reported pain, and the levator labii superioris and orbicularis oculi showed statistically significant increases in muscle activation when the pain stimulus reached subjects' self-reported pain thresholds. The two features most strongly associated with pain, the waveform lengths of the corrugator supercilii and levator labii superioris signals, were selected for the prediction model, which achieved a c-index of 0.64. The most detectable differences in muscle activity during the pain experience were connected to eyebrow lowering, nose wrinkling, and upper lip raising. As the performance of the prediction model remains modest, albeit with statistically significant ordinal classification, we suggest testing with a larger sample size to further explore the variables that affect variation in expressiveness and subjective pain experience.
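The "waveform length" feature the authors selected is simply the summed absolute sample-to-sample change of the EMG signal over a window, so stronger muscle activity yields a larger value. A hedged sketch on synthetic signals (not the study's data; amplitudes and window length are arbitrary):

```python
import numpy as np

def waveform_length(x):
    """Summed absolute sample-to-sample change over the window."""
    return float(np.sum(np.abs(np.diff(x))))

rng = np.random.default_rng(3)
weak = 0.1 * rng.normal(size=1000)     # low-intensity "EMG" window
strong = 0.5 * rng.normal(size=1000)   # stronger activity -> larger feature

wl_weak = waveform_length(weak)
wl_strong = waveform_length(strong)
```

Features like this, computed per window per muscle, are what the supervised model would then map to self-reported pain intensity.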
Affiliation(s)
- Elise Syrjälä
- Department of Future Technologies, University of Turku, Turku, Finland
- Mingzhe Jiang
- Department of Future Technologies, University of Turku, Turku, Finland
- Amir Rahmani
- Department of Computer Science, University of California, Irvine, California, United States of America
- School of Nursing, University of California, Irvine, California, United States of America
- Tapio Pahikkala
- Department of Future Technologies, University of Turku, Turku, Finland
- Pasi Liljeberg
- Department of Future Technologies, University of Turku, Turku, Finland
- Sanna Salanterä
- Department of Nursing Science, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
18
Dorante MI, Kollar B, Obed D, Haug V, Fischer S, Pomahac B. Recognizing Emotional Expression as an Outcome Measure After Face Transplant. JAMA Netw Open 2020; 3:e1919247. PMID: 31940037; PMCID: PMC6991259; DOI: 10.1001/jamanetworkopen.2019.19247.
Abstract
IMPORTANCE Limited quantitative data exist on the restoration of nonverbal communication via facial emotional expression after face transplant. Objective and noninvasive methods for measuring outcomes and tracking rehabilitation after face transplant are lacking. OBJECTIVE To measure emotional expression as an indicator of functional outcomes and rehabilitation after face transplant via objective, noninvasive, and nonobtrusive software-based video analysis. DESIGN, SETTING, AND PARTICIPANTS This single-center case-control study analyzed videos with commercially available video analysis software capable of detecting emotional expression. The study participants were 6 patients who underwent face transplant at Brigham and Women's Hospital between April 2009 and March 2014. They were matched by age, race/ethnicity, culture, and sex to 6 healthy controls with no prior facial surgical procedures. Participants were asked to perform either emotional expressions (direct evaluation) or standardized facial movements (indirect evaluation). Videos were obtained in a clinical setting, except for direct evaluation videos of 3 patients that were recorded at the patients' residences. Data analysis was performed from June 2018 to November 2018. MAIN OUTCOMES AND MEASURES The possibility of detecting the emotional expressions of happiness, sadness, anger, fear, surprise, and disgust was evaluated using intensity score values between 0 and 1, representing expressions that are absent or fully present, respectively. RESULTS Six patients underwent face transplant (4 men; mean [SD] age, 42 [14] years). Four underwent full face transplants, and 2 underwent partial face transplants of the middle and lower two-thirds of the face. In healthy controls, happiness was the only emotion reliably recognized in both indirect (mean [SD] intensity score, 0.92 [0.05]) and direct (mean [SD] intensity score, 0.91 [0.04]) evaluation. Indirect evaluation showed that expression of happiness significantly improved 1 year after transplant (0.04 point per year; 95% CI, 0.02 to 0.06 point per year; P = .002). Expression of happiness was restored to a mean of 43% (range, 14% to 75%) of that of healthy controls after face transplant. The expression of sadness showed a significant change only during the first year after transplant (-0.53 point per year; 95% CI, -0.82 to -0.24 point per year; P = .005). All other emotions were detectable with no significant change after transplant. Nearly all emotions were detectable in long-term direct evaluation of 3 patients, with expression of happiness restored to a mean of 26% (range, 5% to 59%) of that of healthy controls. CONCLUSIONS AND RELEVANCE Partial restoration of facial emotional expression is possible after face transplant. Video analysis software may provide useful clinical information and aid rehabilitation after face transplant.
Affiliation(s)
- Miguel I. Dorante
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts
- Lahey Hospital and Medical Center, Department of Plastic and Reconstructive Surgery, Beth Israel Lahey Health, Burlington, Massachusetts
- Branislav Kollar
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts
- Doha Obed
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts
- Valentin Haug
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts
- Department of Hand, Plastic, and Reconstructive Surgery, Burn Trauma Center, BG Trauma Center Ludwigshafen, University of Heidelberg, Ludwigshafen, Germany
- Sebastian Fischer
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts
- Department of Hand, Plastic, and Reconstructive Surgery, Burn Trauma Center, BG Trauma Center Ludwigshafen, University of Heidelberg, Ludwigshafen, Germany
- Bohdan Pomahac
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts
19
van der Meulen A, de Ruyter D, Blokland A, Krabbendam L. Cross-Cultural Mental State Reading Ability in Antillean Dutch, Moroccan Dutch, and Dutch Young Adults. J Cross Cult Psychol 2019. DOI: 10.1177/0022022118823283.
Abstract
Understanding how bicultural and monocultural individuals are oriented toward the cultures they come into frequent contact with can increase insights into their adaptation and well-being. Previous research has shown a relation between culture and mental state reading in the form of the cultural in-group effect, which is defined as the advantage in reading mental states from the own cultural group compared with other groups. Thus, orientation toward cultures can be assessed not only in self-reported behavioral and psychological acculturation but also in the domain of social–cognitive abilities. The aim of the current research is to gain insight into acculturation in the social–cognitive ability of mental state reading. In addition, it explores how this facet of acculturation is related to the more traditionally studied behavioral and psychological acculturation. Cross-cultural mental state reading, language and possession of friends (behavioral acculturation), and cultural identification (psychological acculturation) were assessed in Antillean Dutch ( n = 128), Moroccan Dutch ( n = 204), and Dutch ( n = 349) young adults between 19 and 24 years old ( M = 21.57 years, SD = 1.38 years). For cross-cultural mental state reading, the in-group effect was confirmed for the Dutch but not for the Antillean Dutch and Moroccan Dutch participants. Furthermore, there were no consistent associations between mental state reading and behavioral and psychological acculturation in the three groups. The present results extend fundamental research on cross-cultural mental state reading and also help to further understand the orientation of these specific cultural groups.
Affiliation(s)
- Anna van der Meulen
- Section of Clinical Developmental Psychology and Research Institute LEARN!, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, The Netherlands
- Arjan Blokland
- Institute for Criminal Law and Criminology, Leiden University, The Netherlands
- Netherlands Institute for the Study of Crime and Law Enforcement, Amsterdam, The Netherlands
- Lydia Krabbendam
- Section of Clinical Developmental Psychology and Research Institute LEARN!, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, The Netherlands
20
Mayo LM, Heilig M. In the face of stress: Interpreting individual differences in stress-induced facial expressions. Neurobiol Stress 2019; 10:100166. PMID: 31193535; PMCID: PMC6535645; DOI: 10.1016/j.ynstr.2019.100166.
Abstract
Stress is an inevitable part of life that can profoundly impact social and emotional functioning, contributing to the development of psychiatric disease. One key component of emotion and social processing is facial expressions, which humans can readily detect and react to even without conscious awareness. Facial expressions have been the focus of philosophic and scientific interest for centuries. Historically, facial expressions were relegated to peripheral indices of fixed emotion states. More recently, affective neuroscience has undergone a conceptual revolution, resulting in novel interpretations of these muscle movements. Here, we review the role of facial expressions according to the leading affective neuroscience theories, including constructed-emotion and social-motivation accounts. We specifically highlight recent data (Mayo et al., 2018) demonstrating the way in which stress shapes facial expressions and how this is influenced by individual factors. In particular, we focus on the consequence of genetic variation within the endocannabinoid system, a neuromodulatory system implicated in stress and emotion, and its impact on stress-induced facial muscle activity. In a re-analysis of this dataset, we highlight how gender may also influence these processes, conceptualized as variation in the "fight-or-flight" or "tend-and-befriend" behavioral responses to stress. We speculate on how these interpretations may contribute to a broader understanding of facial expressions, discuss the potential use of facial expressions as a trans-diagnostic marker of psychiatric disease, and suggest future work necessary to resolve outstanding questions.
Affiliation(s)
- Leah M. Mayo
- Center for Social and Affective Neuroscience, Department of Clinical and Experimental Medicine, Linköping University, Sweden
21
Garrido MV, Prada M. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces. Front Psychol 2017; 8:2181. PMID: 29312053; PMCID: PMC5742208; DOI: 10.3389/fpsyg.2017.02181.
Abstract
The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures depicting 70 models (half female), each displaying an angry, a happy, and a neutral facial expression. Our main goals were to provide an additional, updated validation of this database, using a sample of a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants labeled each expression (forced-choice task) and rated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and the highest emotion-labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar, and positive. The sex of the model also moderated the accuracy of emotional labeling and the ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high on each dimension. Normative data for each stimulus (hit proportions, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material at https://osf.io/fvc4m/.
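The per-stimulus normative summaries reported here (means, standard deviations, confidence intervals) can be reproduced in a few lines. The sketch below computes them for one hypothetical picture's 7-point ratings, with synthetic data standing in for the 155 participants' responses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
ratings = rng.integers(1, 8, size=155).astype(float)  # 7-point-scale ratings

mean = ratings.mean()
sd = ratings.std(ddof=1)
sem = sd / np.sqrt(len(ratings))

# 95% confidence interval for the mean rating (t distribution, df = n - 1).
ci_low, ci_high = stats.t.interval(0.95, len(ratings) - 1, loc=mean, scale=sem)
```

Repeating this per picture and per evaluative dimension, plus a hit-proportion count for the forced-choice labels, yields a normative table of the kind the authors provide as supplementary material.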
Affiliation(s)
- Marília Prada
- Instituto Universitário de Lisboa (ISCTE-IUL), CIS - IUL, Lisboa, Portugal