1. Domenici V, Collignon O, Lettieri G. Affect in the dark: Navigating the complex landscape of social cognition in blindness. Prog Brain Res 2025; 292:175-202. [PMID: 40409920] [DOI: 10.1016/bs.pbr.2025.02.002]
Abstract
Research on the consequences of blindness has primarily focused on how visual experience influences basic sensory abilities, largely overlooking the intricate world of social cognition. However, social cognition abilities are crucial as they enable individuals to navigate complex interactions, understand others' perspectives, regulate emotions, and establish meaningful connections, all essential for successful adaptation and integration into society. Emotional and social signals are frequently conveyed through nonverbal visual cues, and understanding the foundational role vision plays in shaping everyday affective experiences is fundamental. Here, we aim to summarize existing research on social cognition in individuals with blindness. By doing so, we strive to offer a comprehensive overview of social processing in sensory deprivation while pinpointing areas that are still largely unexplored. By identifying gaps in current knowledge, this review paves the way for future investigations to reveal how visual experience shapes the development of emotional and social cognition in the mind and the brain.
Affiliation(s)
- Veronica Domenici
- Affective Physiology and Interoception Group (API), MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy; Social and Affective Neuroscience Group (SANe), MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy; University of Camerino, Camerino, Italy
- Olivier Collignon
- Crossmodal Perception and Plasticity Laboratory, Institute of Research in Psychology (IPSY) and Institute of Neuroscience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne, Switzerland
- Giada Lettieri
- Affective Physiology and Interoception Group (API), MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy; Crossmodal Perception and Plasticity Laboratory, Institute of Research in Psychology (IPSY) and Institute of Neuroscience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium.
2. Merchie A, Ranty Z, Aguillon-Hernandez N, Aucouturier JJ, Wardak C, Gomot M. Emotional contagion to vocal smile revealed by combined pupil reactivity and motor resonance. Sci Rep 2024; 14:25043. [PMID: 39443497] [PMCID: PMC11499673] [DOI: 10.1038/s41598-024-74848-w]
Abstract
The interplay between the different components of emotional contagion (i.e. emotional state and facial motor resonance), during both implicit and explicit appraisal of emotion, remains controversial. The aims of this study were (i) to distinguish between these components using vocal smile processing and (ii) to assess how they reflect implicit processes and/or an explicit appraisal loop. Emotional contagion to subtle vocal emotions was studied in 25 adults through motor resonance and Autonomic Nervous System (ANS) reactivity. Facial expressions (fEMG: facial electromyography) and pupil dilation were assessed during the processing and judgement of artificially emotionally modified sentences. fEMG revealed that the Zygomaticus major was reactive to the perceived valence of sounds, whereas the activity of the Corrugator supercilii reflected explicit judgement. Timing analysis of pupil dilation provided further insight into both the emotional state and the implicit and explicit processing of vocal emotion, showing earlier activity for emotional stimuli than for neutral stimuli, followed by valence-dependent variations and a late judgement-dependent increase in pupil diameter. This innovative combination of different electrophysiological measures sheds new light on the debate between central and peripheral views within the framework of emotional contagion.
Affiliation(s)
- Annabelle Merchie
- INSERM, Imaging Brain & Neuropsychiatry iBraiN U1253, Université de Tours, Tours, 37032, France
- Zoé Ranty
- INSERM, Imaging Brain & Neuropsychiatry iBraiN U1253, Université de Tours, Tours, 37032, France
- Jean-Julien Aucouturier
- FEMTO-ST Institute, CNRS, Université de Bourgogne Franche Comté, Besançon, France
- STMS Lab IRCAM, CNRS, Sorbonne Université, Paris, France
- Claire Wardak
- INSERM, Imaging Brain & Neuropsychiatry iBraiN U1253, Université de Tours, Tours, 37032, France
- Marie Gomot
- INSERM, Imaging Brain & Neuropsychiatry iBraiN U1253, Université de Tours, Tours, 37032, France.
3. Vaessen M, Van der Heijden K, de Gelder B. Modality-specific brain representations during automatic processing of face, voice and body expressions. Front Neurosci 2023; 17:1132088. [PMID: 37869514] [PMCID: PMC10587395] [DOI: 10.3389/fnins.2023.1132088]
Abstract
A central question in affective science, and one that is relevant for its clinical applications, is how emotions provided by different stimuli are experienced and represented in the brain. Following the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals, such as an aggressive gesture, trigger rapid automatic behavioral responses, and this may take place before, or independently of, a full abstract representation of the emotion. This pleads in favor of the idea that specific emotion signals may trigger rapid adaptive behavior by mobilizing only modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target detection task presented in a different sensory modality than the stimulus. Using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that, under ecological conditions, emotion expressions of the face, body and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they may all exemplify the same emotion. This has implications for a neuroethologically grounded emotion research program, which should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.
4. Arias Sarah P, Hall L, Saitovitch A, Aucouturier JJ, Zilbovicius M, Johansson P. Pupil dilation reflects the dynamic integration of audiovisual emotional speech. Sci Rep 2023; 13:5507. [PMID: 37016041] [PMCID: PMC10073148] [DOI: 10.1038/s41598-023-32133-2]
Abstract
Emotional speech perception is a multisensory process. When speaking with an individual, we concurrently integrate the information from their voice and face to decode, e.g., their feelings, moods, and emotions. However, the physiological reactions associated with these processes, such as the reflexive dilation of the pupil, remain mostly unknown. The aim of the current article is to investigate whether pupillary reactions can index the processes underlying the audiovisual integration of emotional signals. To investigate this question, we used an algorithm able to increase or decrease the smiles seen in a person's face or heard in their voice, while preserving the temporal synchrony between visual and auditory channels. Using this algorithm, we created congruent and incongruent audiovisual smiles and investigated participants' gaze and pupillary reactions to the manipulated stimuli. We found that pupil reactions can reflect emotional information mismatch in audiovisual speech. In our data, when participants were explicitly asked to extract emotional information from the stimuli, the first fixation within emotionally mismatching areas (i.e., the mouth) triggered pupil dilation. These results reveal that pupil dilation can reflect the dynamic integration of audiovisual emotional speech and provide insights into how these reactions are triggered during stimulus perception.
Affiliation(s)
- Pablo Arias Sarah
- Lund University Cognitive Science, Lund University, Lund, Sweden.
- STMS Lab, UMR 9912 (IRCAM/CNRS/SU), Paris, France.
- School of Neuroscience and Psychology, Glasgow University, Glasgow, UK.
- Lars Hall
- STMS Lab, UMR 9912 (IRCAM/CNRS/SU), Paris, France
- Ana Saitovitch
- U1000 Brain Imaging in Psychiatry, INSERM-CEA, Pediatric Radiology Service, Necker Enfants Malades Hospital, Paris V René Descartes University, Paris, France
- Jean-Julien Aucouturier
- Department of Robotics and Automation FEMTO-ST Institute (CNRS/Université de Bourgogne Franche Comté), Besançon, France
- Monica Zilbovicius
- U1000 Brain Imaging in Psychiatry, INSERM-CEA, Pediatric Radiology Service, Necker Enfants Malades Hospital, Paris V René Descartes University, Paris, France
5.
Abstract
The goal of this article is to discuss theoretical arguments concerning the idea that emotional mimicry is an intrinsic part of our social being and thus can be considered a social act. For this, we will first present the theoretical assumptions underlying the Emotional Mimicry as Social Regulator view. We then provide a brief overview of recent developments in emotional mimicry research and specifically discuss new developments regarding the role of emotional mimicry in actual interactions and relationships, and individual differences in emotional mimicry. We conclude with open questions for future research.
Affiliation(s)
- Ursula Hess
- Department of Psychology, Humboldt-University of Berlin, Berlin, Germany
- Agneta Fischer
- Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands