1. Villa MC, Borriero A, Diano M, Ciorli T, Celeghin A, de Gelder B, Tamietto M. Dissociable neural networks for processing fearful bodily expressions at different spatial frequencies. Cereb Cortex 2025; 35:bhaf067. PMID: 40277422; DOI: 10.1093/cercor/bhaf067.
Abstract
The human brain processes visual input across various spatial frequency (SF) ranges to extract emotional cues. Prior studies have extensively explored SF processing in facial expressions, yielding partly conflicting results. However, bodily expressions, which provide complementary emotional and survival-relevant cues, remain unexplored. We investigated the neural mechanisms underlying the processing of low (LSF), high (HSF), and broad spatial frequency (BSF) components in fearful versus neutral bodily postures. Using functional magnetic resonance imaging (fMRI), we examined brain activity in 20 participants viewing SF-filtered images of bodily expressions in a semi-passive task. A multivariate "searchlight" analysis based on multi-voxel pattern analysis (MVPA) was employed to decode the non-linear activation patterns associated with each SF band. Our findings reveal that SF processing engages distinct neural networks in response to fearful bodily expressions. BSF stimuli activated a widespread network, including the amygdala, pulvinar, and frontal and temporal cortices, suggesting a general threat-detection system that integrates information across all SFs. HSF stimuli engaged cortical regions associated with detailed emotional evaluation and motor planning, such as the orbitofrontal cortex, anterior cingulate cortex, and premotor areas, suggesting that processing fine-grained fear cues involves computationally demanding networks related to emotional resonance and action preparation. In contrast, LSF stimuli primarily activated motor-preparatory regions linked to rapid, action-oriented responses, highlighting the brain's prioritization of quick readiness in response to low-detail threats. Notably, the amygdala showed no SF selectivity, supporting its role as a generalized "relevance detector" in emotional processing. The present study demonstrates that the brain flexibly adapts its SF processing strategy to the visual details available in fearful bodily expressions, underscoring the complexity and adaptability of emotional processing from bodily signals.
Affiliation(s)
- Maria-Chiara Villa
  - Department of Psychology, University of Torino, via G. Verdi 10, Torino 10124, Italy
- Alessio Borriero
  - Department of Psychology, University of Torino, via G. Verdi 10, Torino 10124, Italy
  - International School of Advanced Studies, University of Camerino, via Gentile III da Varano, Camerino (MC) 62032, Italy
  - Pegaso Telematic University, Via Porzio, Centro Direzionale, Isola F2, Naples 80143, Italy
- Matteo Diano
  - Department of Psychology, University of Torino, via G. Verdi 10, Torino 10124, Italy
  - Neuroscience Institute of Turin - NIT, via G. Verdi 10, Torino 10124, Italy
- Tommaso Ciorli
  - SAMBA (SpAtial, Motor and Bodily Awareness) Research Group, Department of Psychology, University of Torino, via G. Verdi 10, Torino 10124, Italy
- Alessia Celeghin
  - Department of Psychology, University of Torino, via G. Verdi 10, Torino 10124, Italy
  - Neuroscience Institute of Turin - NIT, via G. Verdi 10, Torino 10124, Italy
- Beatrice de Gelder
  - Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Oxfordlaan 55, EV 6229, Maastricht, The Netherlands
  - The Italian Academy for Advanced Studies at Columbia University, 1161 Amsterdam Avenue, New York, NY 10027, United States
- Marco Tamietto
  - Department of Psychology, University of Torino, via G. Verdi 10, Torino 10124, Italy
  - Neuroscience Institute of Turin - NIT, via G. Verdi 10, Torino 10124, Italy
  - Department of Medical and Clinical Psychology, and CoRPS-Center of Research on Psychology in Somatic diseases, Tilburg University, PO Box 90153, Tilburg, LE 5000, The Netherlands
2. Lundell-Creagh R, Monroy M, Ocampo J, Keltner D. Blocking lower facial features reduces emotion identification accuracy in static faces and full body dynamic expressions. Cogn Emot 2025:1-12. PMID: 40094937; DOI: 10.1080/02699931.2025.2477745.
Abstract
During the COVID pandemic, much of the world wore masks covering the lower face to prevent the spread of disease. But how vital are these lower facial features to the recognition of facial expressions of emotion? Going beyond the Ekman six emotions, in Study 1 (N = 372) we used multilevel logistic regression to examine how artificially rendered masks influence emotion recognition from static photos of facial muscle configurations for many commonly experienced positive and negative emotions. On average, masks reduced emotion recognition accuracy by 17% for negative emotions and 23% for positive emotions. In Study 2 (N = 338), we asked whether these results generalised to multimodal full-body expressions of emotion accompanied by vocal expressions. Participants viewed videos from a previously validated set in which the lower facial features were blurred from the nose down. Although the decreases in emotion recognition were noticeably less pronounced here, highlighting the power of multimodal information, we did observe meaningful decreases for certain specific emotions and for positive emotions overall. Results are discussed in the context of the social and emotional consequences of compromised emotion recognition, as well as the unique facial features that accompany certain emotions.
Affiliation(s)
- Ryan Lundell-Creagh
  - Department of Psychology, Kwantlen Polytechnic University, Surrey, BC, Canada
  - Department of Psychology, University of California, Berkeley, CA, USA
- Maria Monroy
  - Department of Psychology, Yale University, New Haven, CT, USA
- Joseph Ocampo
  - Department of Psychology, San Diego State University, San Diego, CA, USA
- Dacher Keltner
  - Department of Psychology, University of California, Berkeley, CA, USA
3. Zhang Z, Zerwas FK, Keltner D. Emotion specificity, coherence, and cultural variation in conceptualizations of positive emotions: a study of body sensations and emotion recognition. Cogn Emot 2024:1-14. PMID: 39586014; DOI: 10.1080/02699931.2024.2430400.
Abstract
The present study examines the association between people's interoceptive representation of physical sensations and the recognition of vocal and facial expressions of emotion. We used body maps to study the granularity of the interoceptive conceptualisation of 11 positive emotions (amusement, awe, compassion, contentment, desire, love, joy, interest, pride, relief, and triumph) and a new emotion recognition test (the Emotion Expression Understanding Test) to assess the ability to recognise emotions from vocal and facial behaviour. Overall, we found evidence for distinct interoceptive conceptualisations of the 11 positive emotions across Asian American, European American, and Latino/a American cultures, as well as reliable identification of emotion in facial and vocal expressions. Central to new theorising about emotion-related representation, the granularity of physical sensations did not covary with emotion recognition accuracy, suggesting that these two kinds of emotion conceptualisation processes may be distinct.
Affiliation(s)
- Zaiyao Zhang
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, USA
- Felicia K Zerwas
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, USA
- Dacher Keltner
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, USA
4. Kiyokawa H, Hayashi R. Commonalities and variations in emotion representation across modalities and brain regions. Sci Rep 2024; 14:20992. PMID: 39251743; PMCID: PMC11385795; DOI: 10.1038/s41598-024-71690-y.
Abstract
Humans express emotions through various modalities such as facial expressions and natural language. However, the relationships between emotions expressed through different modalities, and their correlations with neural activities, remain uncertain. Here, we aimed to address some of these uncertainties by investigating the similarity of emotion representations across modalities and brain regions. First, we represented various emotion categories as multi-dimensional vectors derived from visual (face), linguistic, and visio-linguistic data, and used representational similarity analysis to compare these modalities. Second, we examined the linear transferability of emotion representations from other modalities to the visual modality. Third, we compared the representational structure derived in the first step with those from brain activities across 360 regions. Our findings revealed that emotion representations share commonalities across modalities, with modality-dependent variations, and that they can be linearly mapped from other modalities to the visual modality. Additionally, uni-modal emotion representations showed relatively high similarity with specific brain regions, whereas the multi-modal emotion representation was most similar to representations across the brain as a whole. These findings suggest that emotional experiences are represented differently across brain regions, with varying degrees of similarity to different modality types, and that they may be conveyable multi-modally in the visual and linguistic domains.
Affiliation(s)
- Hiroaki Kiyokawa
  - Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki, Japan
  - Graduate School of Science and Engineering, Saitama University, Saitama, Japan
- Ryusuke Hayashi
  - Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki, Japan
5. Liu S, He W, Zhang M, Li Y, Ren J, Guan Y, Fan C, Li S, Gu R, Luo W. Emotional concepts shape the perceptual representation of body expressions. Hum Brain Mapp 2024; 45:e26789. PMID: 39185719; PMCID: PMC11345699; DOI: 10.1002/hbm.26789.
Abstract
Emotion perception interacts with how we think and speak, including our concepts of emotions. Body expression is an important channel of emotion communication, but it is unknown whether and how its perception is modulated by conceptual knowledge. In this study, we employed representational similarity analysis across three experiments combining semantic similarity ratings, a mouse-tracking task, and a one-back behavioral task with electroencephalography (EEG) and functional magnetic resonance imaging (fMRI). The results show that conceptual knowledge predicted the perceptual representation of body expressions, and that this prediction effect occurred at approximately 170 ms post-stimulus. The neural encoding of body expressions in the fusiform gyrus and lingual gyrus was affected by emotion concept knowledge. Taken together, our results indicate that conceptual knowledge of emotion categories shapes the configural representation of body expressions in the ventral visual cortex, offering compelling evidence for the theory of constructed emotion.
Affiliation(s)
- Shuaicheng Liu
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Weiqi He
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Mingming Zhang
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Yiwen Li
  - State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Jie Ren
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Yuanhao Guan
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Cong Fan
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Shuaixia Li
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
- Ruolei Gu
  - Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
  - Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Wenbo Luo
  - Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
  - Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, China
6. Hinojosa JA, Guasch M, Montoro PR, Albert J, Fraga I, Ferré P. The bright side of words: Norms for 9000 Spanish words in seven discrete positive emotions. Behav Res Methods 2024; 56:4909-4929. PMID: 37749425; PMCID: PMC11289151; DOI: 10.3758/s13428-023-02229-8.
Abstract
In recent years, the assumption that a single construct of happiness accounts for all positive emotions has been questioned. Instead, several discrete positive emotions, each with its own neurobiological and psychological mechanisms, have been proposed. Of note, the effects of positive emotions on language processing are not yet properly understood. Here we provide a database of 9000 Spanish words scored by 3437 participants on the positive emotions of awe, contentment, amusement, excitement, serenity, relief, and pleasure. We also report significant correlations between discrete positive emotions and several affective (e.g., valence, arousal, happiness, negative discrete emotions) and lexico-semantic (e.g., frequency of use, familiarity, concreteness, age of acquisition) characteristics of words. Finally, we analyze differences between words conveying a single emotion ("pure" emotion words) and those denoting more than one emotion ("mixed" emotion words). This database offers researchers a rich source of information for expanding current knowledge of the role of positive emotions in language. The norms are available at https://doi.org/10.6084/m9.figshare.21533571.v2.
Affiliation(s)
- José A Hinojosa
  - Instituto Pluridisciplinar, Universidad Complutense de Madrid, Madrid, Spain
  - Dpto. Psicología Experimental, Procesos Cognitivos y Logopedia, Universidad Complutense de Madrid, Madrid, Spain
  - Centro de Investigación Nebrija en Cognición (CINC), Universidad Nebrija, Madrid, Spain
- Marc Guasch
  - Department of Psychology and CRAMC, Universitat Rovira i Virgili, Tarragona, Spain
- Pedro R Montoro
  - Departamento de Psicología Básica 1, Facultad de Psicología, Universidad Nacional de Educación a Distancia (UNED), Madrid, Spain
- Jacobo Albert
  - Departamento de Psicología Biológica y de la Salud, Facultad de Psicología, Universidad Autónoma de Madrid, Madrid, Spain
- Isabel Fraga
  - Cognitive Processes & Behaviour Research Group, Department of Social Psychology, Basic Psychology & Methodology, University of Santiago de Compostela, Santiago de Compostela, Spain
- Pilar Ferré
  - Department of Psychology and CRAMC, Universitat Rovira i Virgili, Tarragona, Spain
7. Cowen AS, Brooks JA, Prasad G, Tanaka M, Kamitani Y, Kirilyuk V, Somandepalli K, Jou B, Schroff F, Adam H, Sauter D, Fang X, Manokara K, Tzirakis P, Oh M, Keltner D. How emotion is experienced and expressed in multiple cultures: a large-scale experiment across North America, Europe, and Japan. Front Psychol 2024; 15:1350631. PMID: 38966733; PMCID: PMC11223574; DOI: 10.3389/fpsyg.2024.1350631.
Abstract
Core to understanding emotion are subjective experiences and their expression in facial behavior. Past studies have largely focused on six emotions and prototypical facial poses, reflecting limitations in scale and narrow assumptions about the variety of emotions and their patterns of expression. We examine 45,231 facial reactions to 2,185 evocative videos, largely in North America, Europe, and Japan, collecting participants' self-reported experiences in English or Japanese and manual and automated annotations of facial movement. Guided by Semantic Space Theory, we uncover 21 dimensions of emotion in the self-reported experiences of participants in Japan, the United States, and Western Europe, and considerable cross-cultural similarities in experience. Facial expressions predict at least 12 dimensions of experience, despite massive individual differences in experience. We find considerable cross-cultural convergence in the facial actions involved in the expression of emotion, alongside culture-specific display tendencies: many facial movements differ in intensity in Japan compared with the U.S./Canada and Europe but represent similar experiences. These results quantitatively detail how people in dramatically different cultures experience and express emotion in a high-dimensional, categorical, and largely similar yet complex fashion.
Affiliation(s)
- Alan S. Cowen
  - Hume AI, New York, NY, United States
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
- Jeffrey A. Brooks
  - Hume AI, New York, NY, United States
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
- Misato Tanaka
  - Advanced Telecommunications Research Institute, Kyoto, Japan
  - Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Yukiyasu Kamitani
  - Advanced Telecommunications Research Institute, Kyoto, Japan
  - Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Krishna Somandepalli
  - Google Research, Mountain View, CA, United States
  - Department of Electrical Engineering, University of Southern California, Los Angeles, CA, United States
- Brendan Jou
  - Google Research, Mountain View, CA, United States
- Hartwig Adam
  - Google Research, Mountain View, CA, United States
- Disa Sauter
  - Faculty of Social and Behavioural Sciences, University of Amsterdam, Amsterdam, Netherlands
- Xia Fang
  - Zhejiang University, Zhejiang, China
- Kunalan Manokara
  - Faculty of Social and Behavioural Sciences, University of Amsterdam, Amsterdam, Netherlands
- Moses Oh
  - Hume AI, New York, NY, United States
- Dacher Keltner
  - Hume AI, New York, NY, United States
  - Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
8. Moura N, Fonseca P, Vilas-Boas JP, Serra S. Increased body movement equals better performance? Not always! Musical style determines motion degree perceived as optimal in music performance. Psychol Res 2024; 88:1314-1330. PMID: 38329559; PMCID: PMC11142955; DOI: 10.1007/s00426-024-01928-x.
Abstract
Musicians' body behaviour plays a central role in audience perception. We investigated how performers' motion is perceived depending on musical style and musical expertise. To further explore the effect of visual input, stimuli were presented in audio-only, audio-visual, and visual-only conditions. We used motion and audio recordings of expert saxophone players playing two contrasting excerpts (positively and negatively valenced). For each excerpt, stimuli represented five motion degrees with increasing quantity of motion (QoM) and distinct predominant gestures. In the experiment (online and in-person), 384 participants rated the performance recordings for expressiveness, professionalism, and overall quality. Results revealed that, for the positively valenced excerpt, ratings increased as a function of QoM, whilst for the negatively valenced excerpt, the recording with predominant flap motion was favoured. Musicianship did not have a significant effect on motion perception. Concerning multisensory integration, both musicians and non-musicians showed visual dominance for the positively valenced excerpt, whereas for the negatively valenced one, musicians shifted to auditory dominance. Our findings demonstrate that musical style not only determines how observers perceive musicians' movement as adequate but can also promote changes in multisensory integration.
Affiliation(s)
- Nádia Moura
  - Research Centre in Science and Technology of the Arts (CITAR), School of Arts, Universidade Católica Portuguesa, Porto, Portugal
  - Porto Biomechanics Laboratory (LABIOMEP), Faculty of Sport, University of Porto, Porto, Portugal
- Pedro Fonseca
  - Porto Biomechanics Laboratory (LABIOMEP), Faculty of Sport, University of Porto, Porto, Portugal
- João Paulo Vilas-Boas
  - Porto Biomechanics Laboratory (LABIOMEP), Faculty of Sport, University of Porto, Porto, Portugal
  - Centre of Research, Education, Innovation and Intervention in Sport (CIFI2D), Faculty of Sport, University of Porto, Porto, Portugal
- Sofia Serra
  - Research Centre in Science and Technology of the Arts (CITAR), School of Arts, Universidade Católica Portuguesa, Porto, Portugal
  - Instituto de Etnomusicologia-Centro de Estudos em Música e Dança (INET-MD), Departamento de Comunicação e Arte, Universidade de Aveiro, Aveiro, Portugal
9. Loshenko O, Palíšek P, Straka O, Jabůrek M, Portešová Š, Ševčíková A. Impact of the War in Ukraine on the Ability of Children to Recognize Basic Emotions. Int J Public Health 2024; 69:1607094. PMID: 38835807; PMCID: PMC11148555; DOI: 10.3389/ijph.2024.1607094.
Abstract
Objectives: This study assessed emotion recognition skills in school-age children in wartime conditions in Ukraine.
Methods: An online survey based on the concept of basic emotions was administered to a sample of 419 schoolchildren from Ukraine and a control group of 310 schoolchildren from the Czech Republic, aged 8 to 12.
Results: There was no difference between Ukrainian children and the control group in judging the intensity of anger and fear, and no evidence that the emotions of anger, fear, and sadness were better recognized in the Ukrainian group. Children from Ukraine were better at recognizing positive emotions than Czech children.
Conclusion: Increased threat exposure and wartime experience do not impair the accuracy with which children who experience war in Ukraine identify emotions such as fear, or their assessment of the intensity of basic emotions. Still, it is important to continue studying the long-term consequences of military conflicts to deepen our understanding of their impact on human mental functioning.
Affiliation(s)
- Oleksandra Loshenko
  - Faculty of Social Studies, Psychology Research Institute, Masaryk University, Brno, Czechia
- Petr Palíšek
  - Faculty of Social Studies, Psychology Research Institute, Masaryk University, Brno, Czechia
- Ondřej Straka
  - Faculty of Social Studies, Psychology Research Institute, Masaryk University, Brno, Czechia
- Michal Jabůrek
  - Faculty of Social Studies, Psychology Research Institute, Masaryk University, Brno, Czechia
- Šárka Portešová
  - Faculty of Social Studies, Psychology Research Institute, Masaryk University, Brno, Czechia
- Anna Ševčíková
  - Faculty of Social Studies, Psychology Research Institute, Masaryk University, Brno, Czechia
10. Brooks JA, Kim L, Opara M, Keltner D, Fang X, Monroy M, Corona R, Tzirakis P, Baird A, Metrick J, Taddesse N, Zegeye K, Cowen AS. Deep learning reveals what facial expressions mean to people in different cultures. iScience 2024; 27:109175. PMID: 38433918; PMCID: PMC10906517; DOI: 10.1016/j.isci.2024.109175.
Abstract
Cross-cultural studies of the meaning of facial expressions have largely focused on judgments of small sets of stereotypical images by small numbers of people. Here, we used large-scale data collection and machine learning to map what facial expressions convey in six countries. Using a mimicry paradigm, 5,833 participants formed facial expressions found in 4,659 naturalistic images, resulting in 423,193 participant-generated facial expressions. In their own language, participants also rated each expression in terms of 48 emotions and mental states. A deep neural network tasked with predicting the culture-specific meanings people attributed to facial movements while ignoring physical appearance and context discovered 28 distinct dimensions of facial expression, with 21 dimensions showing strong evidence of universality and the remainder showing varying degrees of cultural specificity. These results capture the underlying dimensions of the meanings of facial expressions within and across cultures in unprecedented detail.
Affiliation(s)
- Jeffrey A. Brooks
  - Research Division, Hume AI, New York, NY 10010, USA
  - Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Lauren Kim
  - Research Division, Hume AI, New York, NY 10010, USA
- Dacher Keltner
  - Research Division, Hume AI, New York, NY 10010, USA
  - Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Xia Fang
  - Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, Zhejiang, China
- Maria Monroy
  - Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Rebecca Corona
  - Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Alice Baird
  - Research Division, Hume AI, New York, NY 10010, USA
- Alan S. Cowen
  - Research Division, Hume AI, New York, NY 10010, USA
  - Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
11. Mortillaro M, Schlegel K. Embracing the Emotion in Emotional Intelligence Measurement: Insights from Emotion Theory and Research. J Intell 2023; 11:210. PMID: 37998709; PMCID: PMC10672494; DOI: 10.3390/jintelligence11110210.
Abstract
Emotional intelligence (EI) has gained significant popularity as a scientific construct over the past three decades, yet its conceptualization and measurement still face limitations. Applied EI research often overlooks its components, treating it as a global characteristic, and there are few widely used performance-based tests for assessing ability EI. The present paper proposes avenues for advancing ability EI measurement by connecting the main EI components to models and theories from the emotion science literature and related fields. For emotion understanding and emotion recognition, we discuss the implications of basic emotion theory, dimensional models, and appraisal models of emotion for creating stimuli, scenarios, and response options. For the regulation and management of one's own and others' emotions, we discuss how the process model of emotion regulation and its extensions to interpersonal processes can inform the creation of situational judgment items. In addition, we emphasize the importance of incorporating context, cross-cultural variability, and attentional and motivational factors into future models and measures of ability EI. We hope this article will foster exchange among scholars in the fields of ability EI, basic emotion science, social cognition, and emotion regulation, leading to an enhanced understanding of the individual differences in successful emotional functioning and communication.
Affiliation(s)
- Marcello Mortillaro
  - Swiss Center for Affective Sciences, University of Geneva, 1202 Geneva, Switzerland
- Katja Schlegel
  - Institute of Psychology, University of Bern, 3012 Bern, Switzerland
12. LaPalme ML, Barsade SG, Brackett MA, Floman JL. The Meso-Expression Test (MET): A Novel Assessment of Emotion Perception. J Intell 2023; 11:145. PMID: 37504788; PMCID: PMC10381771; DOI: 10.3390/jintelligence11070145.
Abstract
Emotion perception is a primary facet of Emotional Intelligence (EI) and the underpinning of interpersonal communication. In this study, we examined meso-expressions: the everyday, moderate-intensity emotions communicated through the face, voice, and body. We theoretically distinguished meso-expressions from other well-known emotion research paradigms (i.e., macro-expressions and micro-expressions). In Study 1, we demonstrated that people can reliably discriminate between meso-expressions, and we created a corpus of 914 unique video displays of meso-expressions across a race- and gender-diverse set of expressors. In Study 2, we developed a novel video-based assessment of emotion perception ability: the Meso-Expression Test (MET). In this study, we found that the MET is psychometrically valid and demonstrated measurement equivalence across Asian, Black, Hispanic, and White perceiver groups and across men and women. In Study 3, we examined the construct validity of the MET and showed that it converged with other well-known measures of emotion perception and diverged from cognitive ability. Finally, in Study 4, we showed that the MET is positively related to important psychosocial outcomes, including social well-being, social connectedness, and empathic concern, and is negatively related to alexithymia, stress, depression, anxiety, and adverse social interactions. We conclude with a discussion of the implications of our findings for EI ability research and the practical applications of the MET.
Affiliation(s)
- Matthew L LaPalme
- Yale Center for Emotional Intelligence, Yale University, New Haven, CT 06511, USA
- Sigal G Barsade
- Wharton, University of Pennsylvania, Philadelphia, PA 19104, USA
- Marc A Brackett
- Yale Center for Emotional Intelligence, Yale University, New Haven, CT 06511, USA
- James L Floman
- Yale Center for Emotional Intelligence, Yale University, New Haven, CT 06511, USA
13
Owners' Beliefs regarding the Emotional Capabilities of Their Dogs and Cats. Animals (Basel) 2023; 13:ani13050820. [PMID: 36899676] [PMCID: PMC10000035] [DOI: 10.3390/ani13050820]
Abstract
The correct interpretation of an animal's emotional state is crucial for successful human-animal interaction. When studying dog and cat emotional expressions, a key source of information is the pet owner, given the extensive interactions they have had with their pets. In this online survey we asked 438 owners whether their dogs and/or cats could express 22 different primary and secondary emotions, and to indicate the behavioral cues they relied upon to identify those expressed emotions. Overall, more emotions were reported in dogs compared to cats, both from owners that owned just one species and those that owned both. Although owners reported a comparable set of sources of behavioral cues (e.g., body posture, facial expression, and head posture) for dogs and cats in expressing the same emotion, distinct combinations tended to be associated with specific emotions in both cats and dogs. Furthermore, the number of emotions reported by dog owners was positively correlated with their personal experience with dogs but negatively correlated with their professional experience. The number of emotions reported in cats was higher in cat-only households compared to those that also owned dogs. These results provide a fertile ground for further empirical investigation of the emotional expressions of dogs and cats, aimed at validating specific emotions in these species.
14
Brooks JA, Tzirakis P, Baird A, Kim L, Opara M, Fang X, Keltner D, Monroy M, Corona R, Metrick J, Cowen AS. Deep learning reveals what vocal bursts express in different cultures. Nat Hum Behav 2023; 7:240-250. [PMID: 36577898] [DOI: 10.1038/s41562-022-01489-2]
Abstract
Human social life is rich with sighs, chuckles, shrieks and other emotional vocalizations, called 'vocal bursts'. Nevertheless, the meaning of vocal bursts across cultures is only beginning to be understood. Here, we combined large-scale experimental data collection with deep learning to reveal the shared and culture-specific meanings of vocal bursts. A total of n = 4,031 participants in China, India, South Africa, the USA and Venezuela mimicked vocal bursts drawn from 2,756 seed recordings. Participants also judged the emotional meaning of each vocal burst. A deep neural network tasked with predicting the culture-specific meanings people attributed to vocal bursts while disregarding context and speaker identity discovered 24 acoustic dimensions, or kinds, of vocal expression with distinct emotion-related meanings. The meanings attributed to these complex vocal modulations were 79% preserved across the five countries and three languages. These results reveal the underlying dimensions of human emotional vocalization in remarkable detail.
Affiliation(s)
- Jeffrey A Brooks
- Research Division, Hume AI, New York, NY, USA; University of California, Berkeley, Berkeley, CA, USA
- Alice Baird
- Research Division, Hume AI, New York, NY, USA
- Lauren Kim
- Research Division, Hume AI, New York, NY, USA
- Xia Fang
- Zhejiang University, Hangzhou, China
- Dacher Keltner
- Research Division, Hume AI, New York, NY, USA; University of California, Berkeley, Berkeley, CA, USA
- Maria Monroy
- University of California, Berkeley, Berkeley, CA, USA
- Alan S Cowen
- Research Division, Hume AI, New York, NY, USA; University of California, Berkeley, Berkeley, CA, USA
15
Abstract
Pride is a self-conscious emotion, comprised of two distinct facets known as authentic and hubristic pride, and associated with a cross-culturally recognized nonverbal expression. Authentic pride involves feelings of accomplishment and confidence and promotes prosocial behaviors, whereas hubristic pride involves feelings of arrogance and conceit and promotes antisociality. Each facet of pride, we argue, contributes to a distinct means of attaining social rank: Authentic pride seems to promote prestige-a rank based on earned respect-whereas hubristic pride seems to promote dominance-a rank based on aggression and coercion. Both prestige and dominance are effective routes to power and influence in human groups, so both facets of pride are likely to be functional adaptations. Overall, the reviewed research suggests that pride is likely to be a human universal, critical for social relationships and rank attainment across human societies.
Affiliation(s)
- Jessica L Tracy
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
- Eric Mercadante
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
- Ian Hohm
- Department of Psychology, University of British Columbia, Vancouver, British Columbia, Canada
16
Höfling TTA, Alpers GW. Automatic facial coding predicts self-report of emotion, advertisement and brand effects elicited by video commercials. Front Neurosci 2023; 17:1125983. [PMID: 37205049] [PMCID: PMC10185761] [DOI: 10.3389/fnins.2023.1125983]
Abstract
Introduction: Consumers' emotional responses are the prime target of marketing commercials. Facial expressions provide information about a person's emotional state, and technological advances have enabled machines to decode them automatically. Method: Using automatic facial coding, we investigated the relationships between facial movements (i.e., action unit activity) and self-reported emotion, advertisement effects, and brand effects elicited by video commercials. We recorded and analyzed the facial responses of 219 participants while they watched a broad array of video commercials. Results: Facial expressions significantly predicted self-reported emotion as well as advertisement and brand effects. Interestingly, facial expressions had incremental value beyond self-reported emotion in predicting advertisement and brand effects. Hence, automatic facial coding appears useful as a non-verbal quantification of advertisement effects beyond self-report. Discussion: This is the first study to measure a broad spectrum of automatically scored facial responses to video commercials. Automatic facial coding is a promising non-invasive, non-verbal method for measuring emotional responses in marketing.
17
Searching, Navigating, and Recommending Movies through Emotions: A Scoping Review. Hum Behav Emerg Technol 2022. [DOI: 10.1155/2022/7831013]
Abstract
Movies offer viewers a broad range of emotional experiences, providing both entertainment and meaning. Following the PRISMA-ScR guidelines, we reviewed the literature on digital systems designed to help users search and browse movie libraries and offer recommendations based on emotional content. Our search yielded 83 eligible documents (published between 2000 and 2021): 22 case studies, 34 empirical studies, 26 proofs of concept, and one theoretical paper. User transactions (e.g., ratings, tags) were the preferred source of information. The documents approached emotions from both categorical and dimensional perspectives, and nine documents combined the two approaches. Although several authors are mentioned, the references used are frequently dated, and 12 documents do not mention the author or model used. We identified 61 words related to emotion or affect; documents presented on average 1.36 positive terms and 2.64 negative terms. Sentiment analysis is frequently used for emotion identification, followed by subjective evaluations, movie low-level audio and visual features (n = 11), and face recognition technologies. We discuss limitations and offer a brief review of current emotion models and research.
18
Only the good cry: Investigating the relationship between crying proneness and moral judgments and behavior. Soc Psychol Bull 2022. [DOI: 10.32872/spb.6475]
Abstract
People cry for various reasons and in numerous situations, some involving highly moral aspects such as altruism or moral beauty. At the same time, criers have been found to be evaluated as more morally upright—they are perceived as more honest, reliable, and sincere than non-criers. The current project provides a first comprehensive investigation to test whether this perception is adequate. Across six studies sampling Dutch, Indian, and British adults (N = 2325), we explored the relationship between self-reported crying proneness and moral judgments and behavior, employing self-report measures and actual behavior assessments. Across all studies, we observed positive correlations of crying proneness with moral judgments (r = .27 [.17, .38]) and prosocial behavioral tendencies and behaviors (r = .20 [.12, .28]). These associations held in three (moral judgment) or two (prosocial tendencies and behaviors) out of five studies when controlling for other important variables. Thus, the current project provides first evidence that crying is related to moral evaluation and behavior, and we discuss its importance for the literature on human emotional crying.
19
Superior Communication of Positive Emotions Through Nonverbal Vocalisations Compared to Speech Prosody. J Nonverbal Behav 2021; 45:419-454. [PMID: 34744232] [PMCID: PMC8553689] [DOI: 10.1007/s10919-021-00375-1]
Abstract
The human voice communicates emotion through two different types of vocalizations: nonverbal vocalizations (brief non-linguistic sounds like laughs) and speech prosody (tone of voice). Research examining recognizability of emotions from the voice has mostly focused on either nonverbal vocalizations or speech prosody, and included few categories of positive emotions. In two preregistered experiments, we compare human listeners’ (total n = 400) recognition performance for 22 positive emotions from nonverbal vocalizations (n = 880) to that from speech prosody (n = 880). The results show that listeners were more accurate in recognizing most positive emotions from nonverbal vocalizations compared to prosodic expressions. Furthermore, acoustic classification experiments with machine learning models demonstrated that positive emotions are expressed with more distinctive acoustic patterns for nonverbal vocalizations as compared to speech prosody. Overall, the results suggest that vocal expressions of positive emotions are communicated more successfully when expressed as nonverbal vocalizations compared to speech prosody.
20
Cavieres A, Maldonado R, Bland A, Elliott R. Relationship Between Gender and Performance on Emotion Perception Tasks in a Latino Population. Int J Psychol Res (Medellin) 2021; 14:106-114. [PMID: 34306583] [PMCID: PMC8297575] [DOI: 10.21500/20112084.5032]
Abstract
Basic emotions are universally recognized, although differences across cultures and between genders have been described. We report results from two emotion recognition tasks in a sample of healthy adults from Chile. Methods: 192 volunteers (mean age 31.58 years, s.d. 8.36; 106 women) completed the Emotional Recognition Task, in which they were asked to identify a briefly displayed emotion, and the Emotional Intensity Morphing Task, in which they viewed faces with increasing or decreasing emotional intensity and indicated when they either detected or no longer detected the emotion. Results: All emotions were recognized at above-chance levels. The only sex differences observed were that men performed better at identifying anger (p = .0485) and responded more slowly to fear (p = .0057) than women. Discussion: These findings are consistent with some, though not all, prior literature on emotion perception. Crucially, we report data on emotion perception in a healthy adult Latino population for the first time, contributing to the emerging literature on cultural differences in affective processing.
Affiliation(s)
- Alvaro Cavieres
- Departamento de Psiquiatría, Universidad de Valparaíso, Chile
- Rocío Maldonado
- Departamento de Psiquiatría, Universidad de Valparaíso, Chile
- Amy Bland
- Department of Psychology, Manchester Metropolitan University, UK
- Rebecca Elliott
- Neuroscience and Psychiatry Unit, Division of Neuroscience and Experimental Psychology, University of Manchester, UK
21
Lange J, Heerdink MW, van Kleef GA. Reading emotions, reading people: Emotion perception and inferences drawn from perceived emotions. Curr Opin Psychol 2021; 43:85-90. [PMID: 34303128] [DOI: 10.1016/j.copsyc.2021.06.008]
Abstract
Emotional expressions play an important role in coordinating social interaction. We review research on two critical processes that underlie such coordination: (1) perceiving emotions from emotion expressions and (2) drawing inferences from perceived emotions. Broad evidence indicates that (a) observers can accurately perceive emotions from a person's facial, bodily, vocal, verbal, and symbolic expressions and that such emotion perception is further informed by contextual information. Moreover, (b) observers draw consequential and contextualized inferences from these perceived emotions about the expresser, the situation, and the self. Thus, emotion expressions enable coordinated action by providing information that facilitates adaptive behavioral responses. We recommend that future studies investigate how people integrate information from different expressive modalities and how this affects consequential inferences.
Affiliation(s)
- Jens Lange
- Department of Differential Psychology and Psychological Assessment, University of Hamburg, Von-Melle-Park 5, 20146 Hamburg, Germany
- Marc W Heerdink
- Department of Social Psychology, University of Amsterdam, PO Box 15900, 1001 NK Amsterdam, the Netherlands
- Gerben A van Kleef
- Department of Social Psychology, University of Amsterdam, PO Box 15900, 1001 NK Amsterdam, the Netherlands
22
An evaluation of the Reading the Mind in the Eyes test's psychometric properties and scores in South Africa: cultural implications. Psychol Res 2021; 86:2289-2300. [PMID: 34125281] [DOI: 10.1007/s00426-021-01539-w]
Abstract
The 'Reading the Mind in the Eyes' test (RMET) has been translated and tested in many cultural settings. Results indicate that items vary in meeting the original psychometric testing criteria, and individuals from non-Western cultures score differently on the RMET. As such, questions arise as to the cross-cultural validity of the RMET. This study tested the English version of the RMET, which consists almost exclusively of White faces, at a large South African university to determine its validity in a culturally diverse context. A total of 443 students from a range of demographic backgrounds, selected using simple random sampling, completed the instrument. Thirty of the 36 items continued to show satisfactory psychometric properties. Further evidence shows significant differences based on race and home language in both overall scores and item-level scores: Black and African home-language respondents show lower RMET scores and different item-level perspectives on certain mental states. The current RMET is not inclusive; it requires stimuli reflecting more races and cultures. This lack of diversity is likely to be influencing and biasing results and psychometric properties, and the continued exclusion of stimuli depicting other races perpetuates a systemically discriminatory instrument. These results have cultural implications for how we interpret and use the RMET.
23
Cowen AS, Keltner D, Schroff F, Jou B, Adam H, Prasad G. Sixteen facial expressions occur in similar contexts worldwide. Nature 2021; 589:251-257. [PMID: 33328631] [DOI: 10.1038/s41586-020-3037-7]
Abstract
Understanding the degree to which human facial expressions co-vary with specific social contexts across cultures is central to the theory that emotions enable adaptive responses to important challenges and opportunities. Concrete evidence linking social context to specific facial expressions is sparse and is largely based on survey-based approaches, which are often constrained by language and small sample sizes. Here, by applying machine-learning methods to real-world, dynamic behaviour, we ascertain whether naturalistic social contexts (for example, weddings or sporting competitions) are associated with specific facial expressions across different cultures. In two experiments using deep neural networks, we examined the extent to which 16 types of facial expression occurred systematically in thousands of contexts in 6 million videos from 144 countries. We found that each kind of facial expression had distinct associations with a set of contexts that were 70% preserved across 12 world regions. Consistent with these associations, regions varied in how frequently different facial expressions were produced as a function of which contexts were most salient. Our results reveal fine-grained patterns in human facial expressions that are preserved across the modern world.
Affiliation(s)
- Alan S Cowen
- Department of Psychology, University of California Berkeley, Berkeley, CA, USA; Google Research, Mountain View, CA, USA
- Dacher Keltner
- Department of Psychology, University of California Berkeley, Berkeley, CA, USA
24
Semantic Space Theory: A Computational Approach to Emotion. Trends Cogn Sci 2020; 25:124-136. [PMID: 33349547] [DOI: 10.1016/j.tics.2020.11.004]
Abstract
Within affective science, the central line of inquiry, animated by basic emotion theory and constructivist accounts, has been the search for one-to-one mappings between six emotions and their subjective experiences, prototypical expressions, and underlying brain states. We offer an alternative perspective: semantic space theory. This computational approach uses wide-ranging naturalistic stimuli and open-ended statistical techniques to capture systematic variation in emotion-related behaviors. Upwards of 25 distinct varieties of emotional experience have distinct profiles of associated antecedents and expressions. These emotions are high-dimensional, categorical, and often blended. This approach also reveals that specific emotions, more than valence, organize emotional experience, expression, and neural processing. Overall, moving beyond traditional models to study broader semantic spaces of emotion can enrich our understanding of human experience.
25
Abstract
Historically, research characterizing the development of emotion recognition has focused on identifying specific skills and the age periods, or milestones, at which these abilities emerge. However, advances in emotion research raise questions about whether this conceptualization accurately reflects how children learn about, understand, and respond to others’ emotions in everyday life. In this review, we propose a developmental framework for the emergence of emotion reasoning—that is, how children develop the ability to make reasonably accurate inferences and predictions about the emotion states of other people. We describe how this framework holds promise for building upon extant research. Our review suggests that use of the term emotion recognition can be misleading and imprecise, with the developmental processes of interest better characterized by the term emotion reasoning. We also highlight how the age at which children succeed on many tasks reflects myriad developmental processes. This new framing of emotional development can open new lines of inquiry about how humans learn to navigate their social worlds.
Affiliation(s)
- Ashley L. Ruba
- Department of Psychology, University of Wisconsin–Madison, Madison, Wisconsin 53706, USA
- Seth D. Pollak
- Department of Psychology, University of Wisconsin–Madison, Madison, Wisconsin 53706, USA
26
Reply to Bowling: How specific emotions are primary in subjective experience. Proc Natl Acad Sci U S A 2020; 117:9694-9695. [DOI: 10.1073/pnas.2003626117]
27
Cowen A, Sauter D, Tracy JL, Keltner D. Mapping the Passions: Toward a High-Dimensional Taxonomy of Emotional Experience and Expression. Psychol Sci Public Interest 2019; 20:69-90. [PMID: 31313637] [PMCID: PMC6675572] [DOI: 10.1177/1529100619850176]
Abstract
What would a comprehensive atlas of human emotions include? For 50 years, scientists have sought to map emotion-related experience, expression, physiology, and recognition in terms of the "basic six"-anger, disgust, fear, happiness, sadness, and surprise. Claims about the relationships between these six emotions and prototypical facial configurations have provided the basis for a long-standing debate over the diagnostic value of expression (for review and latest installment in this debate, see Barrett et al., p. 1). Building on recent empirical findings and methodologies, we offer an alternative conceptual and methodological approach that reveals a richer taxonomy of emotion. Dozens of distinct varieties of emotion are reliably distinguished by language, evoked in distinct circumstances, and perceived in distinct expressions of the face, body, and voice. Traditional models-both the basic six and affective-circumplex model (valence and arousal)-capture a fraction of the systematic variability in emotional response. In contrast, emotion-related responses (e.g., the smile of embarrassment, triumphant postures, sympathetic vocalizations, blends of distinct expressions) can be explained by richer models of emotion. Given these developments, we discuss why tests of a basic-six model of emotion are not tests of the diagnostic value of facial expression more generally. Determining the full extent of what facial expressions can tell us, marginally and in conjunction with other behavioral and contextual cues, will require mapping the high-dimensional, continuous space of facial, bodily, and vocal signals onto richly multifaceted experiences using large-scale statistical modeling and machine-learning methods.
Affiliation(s)
- Alan Cowen
- Department of Psychology, University of California, Berkeley
- Disa Sauter
- Faculty of Social and Behavioural Sciences, University of Amsterdam
- Dacher Keltner
- Department of Psychology, University of California, Berkeley
28
Barrett LF, Adolphs R, Marsella S, Martinez A, Pollak SD. Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements. Psychol Sci Public Interest 2019; 20:1-68. [PMID: 31313636] [PMCID: PMC6640856] [DOI: 10.1177/1529100619832930]
Abstract
It is commonly assumed that a person's emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. 
We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.
Affiliation(s)
- Lisa Feldman Barrett
- Northeastern University, Department of Psychology, Boston, MA
- Massachusetts General Hospital, Department of Psychiatry and the Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA
- Harvard Medical School, Department of Psychiatry, Boston, MA
- Ralph Adolphs
- California Institute of Technology, Departments of Psychology, Neuroscience, and Biology, Pasadena, CA
- Stacy Marsella
- Northeastern University, Department of Psychology, Boston, MA
- Northeastern University, College of Computer and Information Science, Boston, MA
- University of Glasgow, Glasgow, Scotland
- Aleix Martinez
- The Ohio State University, Department of Electrical and Computer Engineering, and Center for Cognitive and Brain Sciences, Columbus, OH
- Seth D. Pollak
- University of Wisconsin–Madison, Department of Psychology, Madison, WI
29
Cowen AS, Keltner D. What the face displays: Mapping 28 emotions conveyed by naturalistic expression. Am Psychol 2019; 75:349-364. [PMID: 31204816] [DOI: 10.1037/amp0000488]
Abstract
What emotions do the face and body express? Guided by new conceptual and quantitative approaches (Cowen, Elfenbein, Laukka, & Keltner, 2018; Cowen & Keltner, 2017, 2018), we explore the taxonomy of emotion recognized in facial-bodily expression. Participants (N = 1,794; 940 female, ages 18-76 years) judged the emotions captured in 1,500 photographs of facial-bodily expression in terms of emotion categories, appraisals, free response, and ecological validity. We find that facial-bodily expressions can reliably signal at least 28 distinct categories of emotion that occur in everyday life. Emotion categories, more so than appraisals such as valence and arousal, organize emotion recognition. However, categories of emotion recognized in naturalistic facial and bodily behavior are not discrete but bridged by smooth gradients that correspond to continuous variations in meaning. Our results support a novel view that emotions occupy a high-dimensional space of categories bridged by smooth gradients of meaning. They offer an approximation of a taxonomy of facial-bodily expressions, visualized within an online interactive map.