1. Orlandi A, Candidi M. Toward a neuroaesthetics of interactions: Insights from dance on the aesthetics of individual and interacting bodies. iScience 2025; 28:112365. PMID: 40330884; PMCID: PMC12051600; DOI: 10.1016/j.isci.2025.112365.
Abstract
Neuroaesthetics has advanced our understanding of the neural processes underlying human aesthetic evaluation of crafted and natural entities, including the human body. While much research has examined the neurocognitive mechanisms behind evaluating "single-body" forms and movements, the perception and aesthetic evaluation of multiple individuals moving together have only recently gained attention. This review examines the neural foundations of static and dynamic body perception and how neural representations of observed and executed movements influence their aesthetic evaluation. Focusing on dance, it describes the role of stimulus features and individual characteristics in movement aesthetics. We review neural systems supporting visual processing of social interactions and propose a role for these systems in the aesthetic evaluation of interpersonal interactions, defined as the neuroaesthetics of interactions. Our goal is to highlight the benefits of integrating insights and methods from social cognition, neuroscience, and neuroaesthetics to understand mechanisms underlying interaction aesthetics, while addressing future challenges.
Affiliation(s)
- Andrea Orlandi: Department of Psychology, Sapienza University, Rome, Italy; IRCCS Santa Lucia Foundation, Rome, Italy; School of Psychological Sciences, Macquarie University, Sydney, NSW, Australia
- Matteo Candidi: Department of Psychology, Sapienza University, Rome, Italy; IRCCS Santa Lucia Foundation, Rome, Italy
2. Longo C, Mattavelli G, Beati A, Pennacchio M, Bertoldi B, Malaguti MC, Papagno C. Affective priming of body and facial expressions in Parkinson's disease. Cogn Affect Behav Neurosci 2025. PMID: 40113739; DOI: 10.3758/s13415-025-01290-4.
Abstract
Patients with Parkinson's disease (PD) often experience impairments in emotion processing. Previous literature has highlighted deficits in facial expression recognition and body movement processing, including social signals. However, to date, the integration of facial and bodily expressions has been investigated in healthy populations, but not in individuals with PD. The present study assessed the reciprocal influence between facial and body emotion recognition by using subliminal priming paradigms in a sample of PD patients and in healthy controls (HC). Participants completed both a Face-Body and a Body-Face priming task, in which facial or body expressions subliminally primed the discrimination of body or face emotions, respectively. Recognition of face and body emotions was also assessed. The results revealed that the discrimination of fearful and happy body expressions was not modulated by the previous congruent, incongruent, or neutral face in PD patients, whereas a significant Face-Body priming effect was observed in HC. In contrast, body emotion did not significantly prime face expression discrimination in either group. These findings suggest an impairment in the automatic integration of emotional information from faces and bodies in PD, which may hinder the detection of mismatches between emotional information from different cues.
Affiliation(s)
- Chiara Longo: Department of Neurology, "Santa Chiara Hospital", Azienda Provinciale per i Servizi Sanitari (APSS), 38122, Trento, Italy
- Giulia Mattavelli: IUSS Cognitive Neuroscience (ICoN) Center, Scuola Universitaria Superiore IUSS, Pavia, Italy; Cognitive Neuroscience Laboratory of Pavia Institute, Istituti Clinici Scientifici Maugeri IRCCS, Pavia, Italy
- Alice Beati: Psychology Service, Azienda Sanitaria dell'Alto Adige, 39100, Bolzano, Italy; Department of Psychology, University of Milano-Bicocca, 20126, Milan, Italy
- Maria Pennacchio: Center for Mind/Brain Sciences (CIMeC), University of Trento, 38068, Rovereto, Italy
- Bryan Bertoldi: Psychology Service, Azienda Sanitaria dell'Alto Adige, 39100, Bolzano, Italy; Department of Psychology, University of Milano-Bicocca, 20126, Milan, Italy
- Maria Chiara Malaguti: Department of Neurology, "Santa Chiara Hospital", Azienda Provinciale per i Servizi Sanitari (APSS), 38122, Trento, Italy
- Costanza Papagno: Center for Mind/Brain Sciences (CIMeC), University of Trento, 38068, Rovereto, Italy
3. Keck J, Bachmann J, Zabicki A, Munzert J, Krüger B. Decoding affect in emotional body language: valence representation in the action observation network. Soc Cogn Affect Neurosci 2025; 20:nsaf021. PMID: 39953789; PMCID: PMC11879420; DOI: 10.1093/scan/nsaf021.
Abstract
Humans are highly adept at inferring emotional states from body movements in social interactions. Nonetheless, it is under debate how this process is facilitated by neural activations across multiple brain regions. The specific contributions of various brain areas to the perception of valence in biological motion remain poorly understood, particularly those within the action observation network (AON) and those involved in processing emotional valence. This study explores which cortical regions involved in processing emotional body language depicted by kinematic stimuli contain valence information, and whether this is reflected either in the magnitude of activation or in distinct activation patterns. Results showed that neural patterns within the AON, notably the inferior parietal lobule (IPL), exhibit a neural geometry that reflects the valence impressions of the observed stimuli. However, the representational geometry of valence-sensitive areas mirrors these impressions to a lesser degree. Our findings also reveal that the activation magnitude in both AON and valence-sensitive regions does not correlate with the perceived valence of emotional interactions. Results underscore the critical role of the AON, particularly the IPL, in interpreting the valence of emotional interactions, indicating its essential function in the perception of valence, especially when observing biological movements.
Affiliation(s)
- Johannes Keck: Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany; Center for Mind, Brain and Behavior (CMBB), Philipps University of Marburg and Justus-Liebig-University Giessen, Marburg 35032, Germany
- Julia Bachmann: Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
- Adam Zabicki: Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
- Jörn Munzert: Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany; Center for Mind, Brain and Behavior (CMBB), Philipps University of Marburg and Justus-Liebig-University Giessen, Marburg 35032, Germany
- Britta Krüger: Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
4. Zanelli V, Lui F, Casadio C, Ricci F, Carpentiero O, Ballotta D, Ambrosecchia M, Ardizzi M, Gallese V, Porro CA, Benuzzi F. Unveiling the Truth in Pain: Neural and Behavioral Distinctions Between Genuine and Deceptive Pain. Brain Sci 2025; 15:185. PMID: 40002518; PMCID: PMC11852981; DOI: 10.3390/brainsci15020185.
Abstract
Background/Objectives: Fake pain expressions are more intense, prolonged, and include non-pain-related actions compared to genuine ones. Despite these differences, individuals struggle to detect deception in direct tasks (i.e., when asked to detect liars). Regarding neural correlates, while pain observation has been extensively studied, little is known about the neural distinctions between processing genuine, fake, and suppressed pain facial expressions. This study seeks to address this gap using authentic pain stimuli and an implicit emotional processing task. Methods: Twenty-four healthy women underwent an fMRI study, during which they were instructed to complete an implicit gender discrimination task. Stimuli were video clips showing genuine, fake, suppressed pain, and neutral facial expressions. After the scanning session, participants reviewed the stimuli and rated them indirectly according to the intensity of the facial expression (IE) and the intensity of the pain (IP). Results: Mean scores of IE and IP were significantly different for each category. A greater BOLD response for the observation of genuine pain compared to fake pain was observed in the pregenual anterior cingulate cortex (pACC). A parametric analysis showed a correlation between brain activity in the mid-cingulate cortex (aMCC) and the IP ratings. Conclusions: Higher IP ratings for genuine pain expressions and higher IE ratings for fake ones suggest that participants were indirectly able to recognize authenticity in facial expressions. At the neural level, pACC and aMCC appear to be involved in unveiling the genuine vs. fake pain and in coding the intensity of the perceived pain, respectively.
Affiliation(s)
- Vanessa Zanelli: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Fausta Lui: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Claudia Casadio: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Francesco Ricci: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Omar Carpentiero: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Daniela Ballotta: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Marianna Ambrosecchia: Neuroscience Unit, Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy; Center for Studies and Research in Cognitive Neuroscience of Cesena, 47522 Cesena, Italy
- Martina Ardizzi: Neuroscience Unit, Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy
- Vittorio Gallese: Neuroscience Unit, Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy
- Carlo Adolfo Porro: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
- Francesca Benuzzi: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
5. Lanzilotto M, Dal Monte O, Diano M, Panormita M, Battaglia S, Celeghin A, Bonini L, Tamietto M. Learning to fear novel stimuli by observing others in the social affordance framework. Neurosci Biobehav Rev 2025; 169:106006. PMID: 39788170; DOI: 10.1016/j.neubiorev.2025.106006.
Abstract
Fear responses to novel stimuli can be learned directly, through personal experiences (Fear Conditioning, FC), or indirectly, by observing conspecific reactions to a stimulus (Social Fear Learning, SFL). Although substantial knowledge exists about FC and SFL in humans and other species, they are typically conceived as mechanisms that engage separate neural networks and operate at different levels of complexity. Here, we propose a broader framework that links these two fear learning modes by supporting the view that social signals may act as unconditioned stimuli during SFL. In this context, we highlight the potential role of subcortical structures of ancient evolutionary origin in encoding social signals and argue that they play a pivotal function in transforming observed emotional expressions into adaptive behavioural responses. This perspective extends the social affordance hypothesis to subcortical circuits underlying vicarious learning in social contexts. Recognising the interplay between these two modes of fear learning paves the way for new empirical studies focusing on interspecies comparisons and broadens the boundaries of our knowledge of fear acquisition.
Affiliation(s)
- M Lanzilotto: Department of Medicine and Surgery, University of Parma, Parma, Italy; Department of Psychology, University of Turin, Turin, Italy
- O Dal Monte: Department of Psychology, University of Turin, Turin, Italy; Department of Psychology, Yale University, New Haven, USA
- M Diano: Department of Psychology, University of Turin, Turin, Italy
- M Panormita: Department of Psychology, University of Turin, Turin, Italy; Department of Neuroscience, KU Leuven University, Leuven, Belgium
- S Battaglia: Department of Psychology, University of Turin, Turin, Italy; Department of Psychology, University of Bologna, Cesena, Italy
- A Celeghin: Department of Psychology, University of Turin, Turin, Italy
- L Bonini: Department of Medicine and Surgery, University of Parma, Parma, Italy
- M Tamietto: Department of Psychology, University of Turin, Turin, Italy; Department of Medical and Clinical Psychology, Tilburg University, Netherlands; Centro Linceo Interdisciplinare "Beniamino Segre", Accademia Nazionale dei Lincei, Roma, Italy
6. Taubert J, Japee S. Real Face Value: The Processing of Naturalistic Facial Expressions in the Macaque Inferior Temporal Cortex. J Cogn Neurosci 2024; 36:2725-2741. PMID: 38261366; DOI: 10.1162/jocn_a_02108.
Abstract
For primates, expressions of fear are thought to be powerful social signals. In laboratory settings, faces with fearful expressions have reliably evoked valence effects in inferior temporal cortex. However, because macaques use so called "fear grins" in a variety of different contexts, the deeper question is whether the macaque inferior temporal cortex is tuned to the prototypical fear grin, or to conspecifics signaling fear? In this study, we combined neuroimaging with the results of a behavioral task to investigate how macaques encode a wide variety of fearful facial expressions. In Experiment 1, we identified two sets of macaque face stimuli using different approaches; we selected faces based on the emotional context (i.e., calm vs. fearful), and we selected faces based on the engagement of action units (i.e., neutral vs. fear grins). We also included human faces in Experiment 1. Then, using fMRI, we found that the faces selected based on context elicited a larger valence effect in the inferior temporal cortex than faces selected based on visual appearance. Furthermore, human facial expressions only elicited weak valence effects. These observations were further supported by the results of a two-alternative, forced-choice task (Experiment 2), suggesting that fear grins vary in their perceived pleasantness. Collectively, these findings indicate that the macaque inferior temporal cortex is more involved in social intelligence than commonly assumed, encoding emergent properties in naturalistic face stimuli that transcend basic visual features. These results demand a rethinking of theories surrounding the function and operationalization of primate inferior temporal cortex.
Affiliation(s)
- Jessica Taubert: The National Institute of Mental Health; The University of Queensland
7. Chen P, Zhang C, Li B, Tong L, Wang L, Ma S, Cao L, Yu Z, Yan B. An fMRI dataset in response to large-scale short natural dynamic facial expression videos. Sci Data 2024; 11:1247. PMID: 39562568; PMCID: PMC11576863; DOI: 10.1038/s41597-024-04088-0.
Abstract
Facial expression is among the most natural methods for human beings to convey their emotional information in daily life. Although the neural mechanisms of facial expression have been extensively studied employing lab-controlled images and a small number of lab-controlled video stimuli, how the human brain processes natural dynamic facial expression videos still needs to be investigated. To our knowledge, this type of data specifically on large-scale natural facial expression videos is currently missing. We describe here the Natural Facial Expressions Dataset (NFED), an fMRI dataset including responses to 1,320 short (3-second) natural facial expression video clips. These video clips are annotated with three types of labels: emotion, gender, and ethnicity, along with accompanying metadata. We validate that the dataset has good quality within and across participants and, notably, can capture temporal and spatial stimulus features. NFED provides researchers with fMRI data for understanding the visual processing of a large number of natural facial expression videos.
Affiliation(s)
- Panpan Chen: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Chi Zhang: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Bao Li: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Li Tong: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- LinYuan Wang: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- ShuXiao Ma: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Long Cao: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- ZiYa Yu: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Bin Yan: Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
8. Abassi E, Bognár A, de Gelder B, Giese M, Isik L, Lappe A, Mukovskiy A, Solanas MP, Taubert J, Vogels R. Neural Encoding of Bodies for Primate Social Perception. J Neurosci 2024; 44:e1221242024. PMID: 39358024; PMCID: PMC11450534; DOI: 10.1523/jneurosci.1221-24.2024.
Abstract
Primates, as social beings, have evolved complex brain mechanisms to navigate intricate social environments. This review explores the neural bases of body perception in both human and nonhuman primates, emphasizing the processing of social signals conveyed by body postures, movements, and interactions. Early studies identified selective neural responses to body stimuli in macaques, particularly within and ventral to the superior temporal sulcus (STS). These regions, known as body patches, represent visual features that are present in bodies but do not appear to be semantic body detectors. They provide information about posture and viewpoint of the body. Recent research using dynamic stimuli has expanded the understanding of the body-selective network, highlighting its complexity and the interplay between static and dynamic processing. In humans, body-selective areas such as the extrastriate body area (EBA) and fusiform body area (FBA) have been implicated in the perception of bodies and their interactions. Moreover, studies on social interactions reveal that regions in the human STS are also tuned to the perception of dyadic interactions, suggesting a specialized social lateral pathway. Computational work developed models of body recognition and social interaction, providing insights into the underlying neural mechanisms. Despite advances, significant gaps remain in understanding the neural mechanisms of body perception and social interaction. Overall, this review underscores the importance of integrating findings across species to comprehensively understand the neural foundations of body perception and the interaction between computational modeling and neural recording.
Affiliation(s)
- Etienne Abassi: The Neuro, Montreal Neurological Institute-Hospital, McGill University, Montréal, QC H3A 2B4, Canada
- Anna Bognár: Department of Neuroscience, KU Leuven, Leuven 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven 3000, Belgium
- Bea de Gelder: Cognitive Neuroscience, Maastricht University, Maastricht 6229 EV, Netherlands
- Martin Giese: Section Computational Sensomotorics, Hertie Institute for Clinical Brain Research & Centre for Integrative Neuroscience, University Clinic Tuebingen, Tuebingen D-72076, Germany
- Leyla Isik: Cognitive Science, Johns Hopkins University, Baltimore, Maryland 21218
- Alexander Lappe: Section Computational Sensomotorics, Hertie Institute for Clinical Brain Research & Centre for Integrative Neuroscience, University Clinic Tuebingen, Tuebingen D-72076, Germany
- Albert Mukovskiy: Section Computational Sensomotorics, Hertie Institute for Clinical Brain Research & Centre for Integrative Neuroscience, University Clinic Tuebingen, Tuebingen D-72076, Germany
- Marta Poyo Solanas: Cognitive Neuroscience, Maastricht University, Maastricht 6229 EV, Netherlands
- Jessica Taubert: The School of Psychology, University of Queensland, St Lucia, QLD 4072, Australia
- Rufin Vogels: Department of Neuroscience, KU Leuven, Leuven 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven 3000, Belgium
9. Antonioni A, Raho EM, Straudi S, Granieri E, Koch G, Fadiga L. The cerebellum and the Mirror Neuron System: A matter of inhibition? From neurophysiological evidence to neuromodulatory implications. A narrative review. Neurosci Biobehav Rev 2024; 164:105830. PMID: 39069236; DOI: 10.1016/j.neubiorev.2024.105830.
Abstract
Mirror neurons show activity during both the execution (AE) and observation of actions (AO). The Mirror Neuron System (MNS) could be involved during motor imagery (MI) as well. Extensive research suggests that the cerebellum is interconnected with the MNS and may be critically involved in its activities. We gathered evidence on the cerebellum's role in MNS functions, both theoretically and experimentally. Evidence shows that the cerebellum plays a major role during AO and MI and that its lesions impair MNS functions likely because, by modulating the activity of cortical inhibitory interneurons with mirror properties, the cerebellum may contribute to visuomotor matching, which is fundamental for shaping mirror properties. Indeed, the cerebellum may strengthen sensory-motor patterns that minimise the discrepancy between predicted and actual outcome, both during AE and AO. Furthermore, through its connections with the hippocampus, the cerebellum might be involved in internal simulations of motor programs during MI. Finally, as cerebellar neuromodulation might improve its impact on MNS activity, we explored its potential neurophysiological and neurorehabilitation implications.
Affiliation(s)
- Annibale Antonioni: Department of Neuroscience and Rehabilitation, University of Ferrara, Ferrara 44121, Italy; Department of Neuroscience, Ferrara University Hospital, Ferrara 44124, Italy; Doctoral Program in Translational Neurosciences and Neurotechnologies, University of Ferrara, Ferrara 44121, Italy
- Emanuela Maria Raho: Department of Neuroscience and Rehabilitation, University of Ferrara, Ferrara 44121, Italy
- Sofia Straudi: Department of Neuroscience and Rehabilitation, University of Ferrara, Ferrara 44121, Italy; Department of Neuroscience, Ferrara University Hospital, Ferrara 44124, Italy
- Enrico Granieri: Department of Neuroscience and Rehabilitation, University of Ferrara, Ferrara 44121, Italy
- Giacomo Koch: Department of Neuroscience and Rehabilitation, University of Ferrara, Ferrara 44121, Italy; Center for Translational Neurophysiology of Speech and Communication (CTNSC), Italian Institute of Technology (IIT), Ferrara 44121, Italy; Non Invasive Brain Stimulation Unit, Istituto di Ricovero e Cura a Carattere Scientifico Santa Lucia, Rome 00179, Italy
- Luciano Fadiga: Department of Neuroscience and Rehabilitation, University of Ferrara, Ferrara 44121, Italy; Center for Translational Neurophysiology of Speech and Communication (CTNSC), Italian Institute of Technology (IIT), Ferrara 44121, Italy
10. Oswald F, Samra SK. A scoping review and index of body stimuli in psychological science. Behav Res Methods 2024; 56:5434-5455. PMID: 38030921; DOI: 10.3758/s13428-023-02278-z.
Abstract
Naturalistic body stimuli are necessary for understanding many aspects of human psychology, yet there are no centralized databases of body stimuli. Furthermore, there are a high number of independently developed stimulus sets lacking in standardization and reproducibility potential, and a general lack of organization, contributing to issues of both replicability and generalizability in body-related research. We conducted a comprehensive scoping review to index and explore existing naturalistic whole-body stimuli. Our research questions were as follows: (1) What sets of naturalistic human whole-body stimuli are present in the literature? And (2) On what factors (e.g., demographics, emotion expression) do these stimuli vary? To be included, stimulus sets had to (1) include human bodies as stimuli; (2) be photographs, videos, or other depictions of real human bodies (not computer generated, drawn, etc.); (3) include the whole body (defined as torso, arms, and legs); and (4) could include edited images, but still had to be recognizable as human bodies. We identified a relatively large number of existing stimulus sets (N = 79) which offered relative variability in terms of main manipulated factors and the degree of visual information included (i.e., inclusion of heads and/or faces). However, stimulus sets were demographically homogenous, skewed towards White, young adult, and female bodies. We identified significant issues in reporting and availability practices, posing a challenge to the generalizability, reliability, and reproducibility of body-related research. Accordingly, we urge researchers to adopt transparent and accessible practices and to take steps to diversify body stimuli.
Affiliation(s)
- Flora Oswald: Department of Psychology, The Pennsylvania State University, University Park, PA, USA; Department of Women's, Gender, and Sexuality Studies, The Pennsylvania State University, University Park, PA, USA; Department of Psychological Sciences, University of Connecticut, Storrs, CT, USA; Department of Psychology, University of Denver, Denver, CO, USA
11. Dehnavi SI, Mortazavi SS, Ramezani MA, Gharraee B, Ashouri A. Identification of emotion schemes in women with premenstrual dysphoric disorder (PMDD) using an emotion-focused therapy (EFT) approach: A qualitative study. J Educ Health Promot 2024; 13:309. PMID: 39429842; PMCID: PMC11488770; DOI: 10.4103/jehp.jehp_935_23.
Abstract
BACKGROUND In women with premenstrual dysphoric disorder (PMDD), emotional problems constitute most of their symptoms. This study aimed to identify the emotion schemes of women with PMDD and to determine the core emotional pains at the center of the emotion scheme of PMDD to facilitate the treatment of this disorder using an emotion-focused therapy (EFT) approach. MATERIALS AND METHODS This study was performed using the directed content analysis method by Hsieh and Shannon. The participants were selected by purposive sampling. The Premenstrual Symptoms Screening Tool (PSST) was used for the primary diagnosis of women, and the Structured Clinical Interview for DSM-5 (SCID-5) was conducted for the final diagnosis. A total of 10 participants were examined via in-depth interviews in this study. The emotion scheme matrix was used as a framework to identify the emotion scheme of PMDD. RESULTS Based on the emotion scheme matrix, the participants' experiences were classified into two main themes, four categories, and eight subcategories, with 37 extracted codes. CONCLUSION Based on the present results, the primary and secondary emotions and behaviors in women with PMDD indicated perceptual-situational, bodily-expressive, motivational-behavioral, and symbolic-conceptual elements in line with the core emotional pains of desperation, despair, and feelings of worthlessness based on the primary maladaptive scheme of shame, which is responsible for different psychological symptoms.
Affiliation(s)
- Saeideh Izadi Dehnavi: Department of Clinical Psychology, School of Behavioral Sciences and Mental Health, Tehran Institute of Psychiatry, Iran University of Medical Sciences, Tehran, Iran
- Seyede Salehe Mortazavi: Geriatric Mental Health Research Center, School of Behavioral Sciences and Mental Health, Tehran Institute of Psychiatry, Iran University of Medical Sciences, Tehran, Iran
- Mohammad Arash Ramezani: Assistant Professor of Clinical Psychology, Rouzbahan Institute of Higher Education, Tehran, Iran
- Banafshe Gharraee: Department of Clinical Psychology, School of Behavioral Sciences and Mental Health, Tehran Institute of Psychiatry, Iran University of Medical Sciences, Tehran, Iran
- Ahmad Ashouri: Department of Clinical Psychology, School of Behavioral Sciences and Mental Health, Tehran Institute of Psychiatry, Iran University of Medical Sciences, Tehran, Iran
12. Shekhar S, Hirvi P, Maria A, Kotilahti K, Tuulari JJ, Karlsson L, Karlsson H, Nissilä I. Maternal prenatal depressive symptoms and child brain responses to affective touch at two years of age. J Affect Disord 2024; 356:177-189. PMID: 38508459; DOI: 10.1016/j.jad.2024.03.092.
Abstract
BACKGROUND Touch is an essential form of mother-child interaction, instigating better social bonding and emotional stability. METHODS We used diffuse optical tomography to explore the relationship between total haemoglobin (HbT) responses to affective touch in the child's brain at two years of age and maternal self-reported prenatal depressive symptoms (EPDS). Affective touch was implemented via slow brushing of the child's right forearm at 3 cm/s and non-affective touch via fast brushing at 30 cm/s and HbT responses were recorded on the left hemisphere. RESULTS We discovered a cluster in the postcentral gyrus exhibiting a negative correlation (Pearson's r = -0.84, p = 0.015 corrected for multiple comparisons) between child HbT response to affective touch and EPDS at gestational week 34. Based on region of interest (ROI) analysis, we found negative correlations between child responses to affective touch and maternal prenatal EPDS at gestational week 14 in the precentral gyrus, Rolandic operculum and secondary somatosensory cortex. The responses to non-affective touch did not correlate with EPDS in these regions. LIMITATIONS The number of mother-child dyads was 16. However, by utilising high-density optode arrangements, individualised anatomical models, and video and accelerometry to monitor movement, we were able to minimize methodological sources of variability in the data. CONCLUSIONS The results show that maternal depressive symptoms during pregnancy may be associated with reduced child responses to affective touch in the temporoparietal cortex. Responses to affective touch may be considered as potential biomarkers for psychosocial development in children. Early identification of and intervention in maternal depression may be important already during early pregnancy.
Affiliation(s)
- Shashank Shekhar: Duke University School of Medicine, Department of Neurology, Durham, NC, USA; University of Turku, Department of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Finland; University of Turku and Turku University Hospital, Department of Psychiatry, Finland
- Pauliina Hirvi: Aalto University, Department of Neuroscience and Biomedical Engineering, Finland; Aalto University, Department of Mathematics and Systems Analysis, Finland
- Ambika Maria: University of Turku, Department of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Finland; University of Turku and Turku University Hospital, Department of Psychiatry, Finland
- Kalle Kotilahti: Aalto University, Department of Neuroscience and Biomedical Engineering, Finland
- Jetro J Tuulari: University of Turku, Department of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Finland; University of Turku and Turku University Hospital, Department of Psychiatry, Finland; Turku Collegium for Science, Medicine and Technology, TCSMT, University of Turku, Finland
- Linnea Karlsson: University of Turku, Department of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Finland; University of Turku and Turku University Hospital, Department of Psychiatry, Finland; University of Turku and Turku University Hospital, Department of Paediatrics and Adolescent Medicine, Finland; Centre for Population Health Research, Turku University Hospital and University of Turku, Turku, Finland
- Hasse Karlsson: University of Turku, Department of Clinical Medicine, Turku Brain and Mind Center, FinnBrain Birth Cohort Study, Finland; University of Turku and Turku University Hospital, Department of Psychiatry, Finland
- Ilkka Nissilä: Aalto University, Department of Neuroscience and Biomedical Engineering, Finland
13. El Zein M, Mennella R, Sequestro M, Meaux E, Wyart V, Grèzes J. Prioritized neural processing of social threats during perceptual decision-making. iScience 2024; 27:109951. PMID: 38832023; PMCID: PMC11145357; DOI: 10.1016/j.isci.2024.109951.
Abstract
Emotional signals, notably those signaling threat, benefit from prioritized processing in the human brain. Yet, it remains unclear whether perceptual decisions about the emotional, threat-related aspects of stimuli involve specific or similar neural computations compared to decisions about their non-threatening/non-emotional components. We developed a novel behavioral paradigm in which participants performed two different detection tasks (emotion vs. color) on the same, two-dimensional visual stimuli. First, electroencephalographic (EEG) activity in a cluster of central electrodes reflected the amount of perceptual evidence around 100 ms following stimulus onset, when the decision concerned emotion, not color. Second, participants' choice could be predicted earlier for emotion (240 ms) than for color (380 ms) by the mu (10 Hz) rhythm, which reflects motor preparation. Taken together, these findings indicate that perceptual decisions about threat-signaling dimensions of facial displays are associated with prioritized neural coding in action-related brain regions, supporting the motivational value of socially relevant signals.
Affiliation(s)
- M. El Zein: Cognitive and Computational Neuroscience Laboratory (LNC), INSERM U960, DEC, Ecole Normale Supérieure, PSL University, 75005 Paris, France; Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany; Centre for Political Research (CEVIPOF), Sciences Po, Paris, France; Humans Matter, Paris, France
- R. Mennella: Cognitive and Computational Neuroscience Laboratory (LNC), INSERM U960, DEC, Ecole Normale Supérieure, PSL University, 75005 Paris, France; Laboratory of the Interactions Between Cognition Action and Emotion (LICAÉ, EA2931), UFR STAPS, Université Paris Nanterre, Nanterre, France
- M. Sequestro: Cognitive and Computational Neuroscience Laboratory (LNC), INSERM U960, DEC, Ecole Normale Supérieure, PSL University, 75005 Paris, France
- E. Meaux: Cognitive and Computational Neuroscience Laboratory (LNC), INSERM U960, DEC, Ecole Normale Supérieure, PSL University, 75005 Paris, France
- V. Wyart: Cognitive and Computational Neuroscience Laboratory (LNC), INSERM U960, DEC, Ecole Normale Supérieure, PSL University, 75005 Paris, France; Institut du Psychotraumatisme de l'Enfant et de l'Adolescent, Conseil Départemental Yvelines et Hauts-de-Seine, Versailles, France
- J. Grèzes: Cognitive and Computational Neuroscience Laboratory (LNC), INSERM U960, DEC, Ecole Normale Supérieure, PSL University, 75005 Paris, France
14. Whitehead JC, Spiousas I, Armony JL. Individual differences in the evaluation of ambiguous visual and auditory threat-related expressions. Eur J Neurosci 2024; 59:370-393. PMID: 38185821; DOI: 10.1111/ejn.16220.
Abstract
This study investigated the neural correlates of the judgement of auditory and visual ambiguous threat-related information, and the influence of state anxiety on this process. Healthy subjects were scanned using a fast, high-resolution functional magnetic resonance imaging (fMRI) multiband sequence while they performed a two-alternative forced-choice emotion judgement task on faces and vocal utterances conveying explicit anger or fear, as well as ambiguous ones. Critically, the latter was specific to each subject, obtained through a morphing procedure and selected prior to scanning following a perceptual decision-making task. Behavioural results confirmed a greater task-difficulty for subject-specific ambiguous stimuli and also revealed a judgement bias for visual fear, and, to a lesser extent, for auditory anger. Imaging results showed increased activity in regions of the salience and frontoparietal control networks (FPCNs) and deactivation in areas of the default mode network for ambiguous, relative to explicit, expressions. In contrast, the right amygdala (AMG) responded more strongly to explicit stimuli. Interestingly, its response to the same ambiguous stimulus depended on the subjective judgement of the expression. Finally, we found that behavioural and neural differences between ambiguous and explicit expressions decreased as a function of state anxiety scores. Taken together, our results show that behavioural and brain responses to emotional expressions are determined not only by emotional clarity but also modality and the subjects' subjective perception of the emotion expressed, and that some of these responses are modulated by state anxiety levels.
Affiliation(s)
- Jocelyne C Whitehead: Human Neuroscience, Douglas Mental Health University Institute, Verdun, Quebec, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada; Integrated Program in Neuroscience, McGill University, Montreal, Quebec, Canada
- Ignacio Spiousas: BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada; Laboratorio Interdisciplinario del Tiempo y la Experiencia (LITERA), CONICET, Universidad de San Andrés, Victoria, Argentina
- Jorge L Armony: Human Neuroscience, Douglas Mental Health University Institute, Verdun, Quebec, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada; Laboratorio Interdisciplinario del Tiempo y la Experiencia (LITERA), CONICET, Universidad de San Andrés, Victoria, Argentina; Department of Psychiatry, McGill University, Montreal, Quebec, Canada
15. Borgomaneri S, Vitale F, Battaglia S, de Vega M, Avenanti A. Task-related modulation of motor response to emotional bodies: A TMS motor-evoked potential study. Cortex 2024; 171:235-246. PMID: 38096756; DOI: 10.1016/j.cortex.2023.10.013.
Abstract
Exposure to emotional body postures during perceptual decision-making tasks has been linked to transient suppression of motor reactivity, supporting the monitoring of emotionally relevant information. However, it remains unclear whether this effect occurs implicitly, i.e., when emotional information is irrelevant to the task. To investigate this issue, we used single-pulse transcranial magnetic stimulation (TMS) to assess motor excitability while healthy participants were asked to categorize pictures of body expressions as emotional or neutral (emotion recognition task) or as belonging to a male or a female actor (gender recognition task) while receiving TMS over the motor cortex at 100 and 125 ms after picture onset. Results demonstrated that motor-evoked potentials (MEPs) were reduced for emotional body postures relative to neutral postures during the emotion recognition task. Conversely, MEPs increased for emotional body postures relative to neutral postures during the gender recognition task. These findings indicate that motor inhibition, contingent upon observing emotional body postures, is selectively associated with actively monitoring emotional features. In contrast, observing emotional body postures prompts motor facilitation when task-relevant features are non-emotional. These findings contribute to embodied cognition models that link emotion perception and action tendencies.
Affiliation(s)
- Sara Borgomaneri: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Francesca Vitale: Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Simone Battaglia: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Manuel de Vega: Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Alessio Avenanti: Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy; Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
16. Tanabe H, Yamamoto K. Structural equation modeling of female gait attractiveness using gait kinematics. Sci Rep 2023; 13:17823. PMID: 37857803; PMCID: PMC10587354; DOI: 10.1038/s41598-023-45130-2.
Abstract
In our social lives, movement's attractiveness greatly affects interpersonal cognition, and gait kinematics mediates walkers' attractiveness. However, no model using gait kinematics has so far predicted gait attractiveness. Thus, this study constructed models of female gait attractiveness with gait kinematics and physique factors as explanatory variables for both barefoot and high-heel walking. First, using motion capture data from 17 women walking, including seven professional runway models, we created gait animations. We also calculated the following gait kinematics as candidate variables to explain walking's attractiveness: four body-silhouette-related variables and six health-related variables. Then, 60 observers evaluated each gait animation's attractiveness and femininity. We performed correlation analysis between these variables and evaluation scores to obtain explanatory variables. Structural equation modeling suggested two models for gait attractiveness, one composed of trunk and head silhouette factors and the other of physique, trunk silhouette, and health-related gait factors. The study's results deepened our understanding of mechanisms behind nonverbal interpersonal cognition through physical movement and brought us closer to realization of artificial generation of attractive gait motions.
Affiliation(s)
- Hiroko Tanabe: Institutes of Innovation for Future Society, Nagoya University, Furo-cho, Chikusa-ku, Nagoya-shi, Aichi, 464-8601, Japan
- Kota Yamamoto: Japan Society for the Promotion of Science, 5-3-1 Kojimachi, Chiyoda-ku, Tokyo, 102-0083, Japan; Graduate School of Informatics, Nagoya University, Furo-cho, Chikusa-ku, Nagoya-shi, Aichi, 464-8601, Japan
17. Pazhoohi F, Arantes J, Kingstone A, Pinal D. Neural Correlates and Perceived Attractiveness of Male and Female Shoulder-to-Hip Ratio in Men and Women: An EEG Study. Arch Sex Behav 2023. PMID: 37170034; DOI: 10.1007/s10508-023-02610-w.
Abstract
While there are studies regarding the neural correlates of human facial attractiveness, there are few investigations considering neural responses for body form attractiveness. The most prominent physical feature defining men's attractiveness is their physical fitness and upper body strength. Shoulder-to-hip ratio (SHR), a sexually dimorphic trait in humans, is an indicator of men's attractiveness for both men and women. The current study is the first to report on the neurophysiological responses to male and female body forms varying in SHR in healthy heterosexual men and women observers. Electroencephalographic (EEG) signals were acquired while participants completed an oddball task as well as a subsequent attractiveness judgement task. Behavioral results showed larger SHRs were considered more attractive than smaller SHRs, regardless of stimuli and participants' sex. The electrophysiological results for both the oddball task and the explicit judgement of attractiveness showed that brain activity related to male SHR body stimuli differed depending on the specific ratios, both at early and late processing stages. For female avatars, SHR did not modulate neural activity. Collectively the data implicate posterior brain regions in the perception of body forms that differ in attractiveness vis-a-vis variation of SHR, and frontal brain regions when such perceptions are rated explicitly.
Affiliation(s)
- Farid Pazhoohi: Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, BC, V6T 1Z4, Canada
- Joana Arantes: Department of Basic Psychology, School of Psychology, University of Minho, Braga, Portugal
- Alan Kingstone: Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, BC, V6T 1Z4, Canada
- Diego Pinal: Psychological Neuroscience Lab, CIPsi, School of Psychology, University of Minho, Braga, Portugal
18. Bognár A, Raman R, Taubert N, Zafirova Y, Li B, Giese M, De Gelder B, Vogels R. The contribution of dynamics to macaque body and face patch responses. Neuroimage 2023; 269:119907. PMID: 36717042; PMCID: PMC9986793; DOI: 10.1016/j.neuroimage.2023.119907.
Abstract
Previous functional imaging studies demonstrated body-selective patches in the primate visual temporal cortex, comparing activations to static bodies and static images of other categories. However, the use of static instead of dynamic displays of moving bodies may have underestimated the extent of the body patch network. Indeed, body dynamics provide information about action and emotion and may be processed in patches not activated by static images. Thus, to map with fMRI the full extent of the macaque body patch system in the visual temporal cortex, we employed dynamic displays of natural-acting monkey bodies, dynamic monkey faces, objects, and scrambled versions of these videos, all presented during fixation. We found nine body patches in the visual temporal cortex, starting posteriorly in the superior temporal sulcus (STS) and ending anteriorly in the temporal pole. Unlike for static images, body patches were present consistently in both the lower and upper banks of the STS. Overall, body patches showed a higher activation by dynamic displays than by matched static images, which, for identical stimulus displays, was less the case for the neighboring face patches. These data provide the groundwork for future single-unit recording studies to reveal the spatiotemporal features the neurons of these body patches encode. These fMRI findings suggest that dynamics have a stronger contribution to population responses in body than face patches.
Affiliation(s)
- A Bognár: Department of Neurosciences, KU Leuven, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
- R Raman: Department of Neurosciences, KU Leuven, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
- N Taubert: Department of Cognitive Neurology, University of Tuebingen, Tuebingen, Germany
- Y Zafirova: Department of Neurosciences, KU Leuven, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
- B Li: Department of Cognitive Neuroscience, Maastricht University, Maastricht, the Netherlands
- M Giese: Department of Cognitive Neurology, University of Tuebingen, Tuebingen, Germany
- B De Gelder: Department of Cognitive Neuroscience, Maastricht University, Maastricht, the Netherlands; Department of Computer Science, University College London, London, UK
- R Vogels: Department of Neurosciences, KU Leuven, Leuven, Belgium; Leuven Brain Institute, KU Leuven, Leuven, Belgium
19. Dadario NB, Tanglay O, Stafford JF, Davis EJ, Young IM, Fonseka RD, Briggs RG, Yeung JT, Teo C, Sughrue ME. Topology of the lateral visual system: The fundus of the superior temporal sulcus and parietal area H connect nonvisual cerebrum to the lateral occipital lobe. Brain Behav 2023; 13:e2945. PMID: 36912573; PMCID: PMC10097165; DOI: 10.1002/brb3.2945.
Abstract
BACKGROUND AND PURPOSE Mapping the topology of the visual system is critical for understanding how complex cognitive processes like reading can occur. We aim to describe the connectivity of the visual system to understand how the cerebrum accesses visual information in the lateral occipital lobe. METHODS Using meta-analytic software focused on task-based functional MRI studies, an activation likelihood estimation (ALE) of the visual network was created. Regions of interest corresponding to the cortical parcellation scheme previously published under the Human Connectome Project were co-registered onto the ALE to identify the hub-like regions of the visual network. Diffusion Spectrum Imaging-based fiber tractography was performed to determine the structural connectivity of these regions with extraoccipital cortices. RESULTS The fundus of the superior temporal sulcus (FST) and parietal area H (PH) were identified as hub-like regions for the visual network. FST and PH demonstrated several areas of coactivation beyond the occipital lobe and visual network. Furthermore, these parcellations were highly interconnected with other cortical regions throughout extraoccipital cortices related to their nonvisual functional roles. A cortical model demonstrating connections to these hub-like areas was created. CONCLUSIONS FST and PH are two hub-like areas that demonstrate extensive functional coactivation and structural connections to nonvisual cerebrum. Their structural interconnectedness with language cortices along with the abnormal activation of areas commonly located in the temporo-occipital region in dyslexic individuals suggests possible important roles of FST and PH in the integration of information related to language and reading. Future studies should refine our model by examining the functional roles of these hub areas and their clinical significance.
Collapse
Affiliation(s)
- Nicholas B Dadario
- Robert Wood Johnson Medical School, Rutgers, The State University of New Jersey, New Brunswick, New Jersey, USA
| | - Onur Tanglay
- Omniscient Neurotechnology, Sydney, New South Wales, Australia
| | - Jordan F Stafford
- Department of Neurosurgery, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
| | | | | | - R Dineth Fonseka
- Centre for Minimally Invasive Neurosurgery, Prince of Wales Private Hospital, Sydney, New South Wales, Australia
| | - Robert G Briggs
- Department of Neurosurgery, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
| | | | - Charles Teo
- Cingulum Health, Sydney, New South Wales, Australia
| | - Michael E Sughrue
- Omniscient Neurotechnology, Sydney, New South Wales, Australia; Cingulum Health, Sydney, New South Wales, Australia; Centre for Minimally Invasive Neurosurgery, Prince of Wales Private Hospital, Sydney, New South Wales, Australia
| |
Collapse
|
20
|
Li B, Solanas MP, Marrazzo G, Raman R, Taubert N, Giese M, Vogels R, de Gelder B. A large-scale brain network of species-specific dynamic human body perception. Prog Neurobiol 2023; 221:102398. [PMID: 36565985 DOI: 10.1016/j.pneurobio.2022.102398] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2022] [Revised: 11/25/2022] [Accepted: 12/19/2022] [Indexed: 12/24/2022]
Abstract
This ultrahigh field 7 T fMRI study addressed the question of whether there exists a core network of brain areas at the service of different aspects of body perception. Participants viewed naturalistic videos of monkey and human faces, bodies, and objects, along with mosaic-scrambled videos to control for low-level features. Independent component analysis (ICA)-based network analysis was conducted to find body and species modulations at both the voxel and the network levels. Among the body areas, the highest species selectivity was found in the middle frontal gyrus and amygdala. Two large-scale networks were highly selective to bodies, dominated by the lateral occipital cortex and the right superior temporal sulcus (STS), respectively. The right STS network showed high species selectivity, and its significant human body-induced node connectivity was focused around the extrastriate body area (EBA), STS, temporoparietal junction (TPJ), premotor cortex, and inferior frontal gyrus (IFG). The human body-specific network discovered here may constitute a brain-wide internal model of the human body that serves as an entry point for a variety of processes relying on body descriptions as part of their more specific categorization, action, or expression recognition functions.
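A minimal sketch of ICA-based network analysis in the spirit of the study above: decompose the data into components and relate component time courses to condition regressors. The data matrix, design matrix, component count, and the "human body minus object" contrast are all hypothetical choices, not the authors' actual analysis.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical single-subject data: time points x voxels, plus a design matrix
# of condition regressors (human body, monkey body, face, object, scrambled).
rng = np.random.default_rng(2)
n_tr, n_vox, n_cond = 400, 5000, 5
data = rng.standard_normal((n_tr, n_vox))
design = rng.standard_normal((n_tr, n_cond))

# ICA decomposition into 20 components with associated time courses.
ica = FastICA(n_components=20, random_state=0)
timecourses = ica.fit_transform(data)          # (n_tr, 20)
spatial_maps = ica.components_                 # (20, n_vox)

# Relate each component's time course to the condition regressors (least squares),
# as a crude proxy for body/species modulation at the network level.
betas, *_ = np.linalg.lstsq(design, timecourses, rcond=None)   # (n_cond, 20)
body_selectivity = betas[0] - betas[3]          # e.g. human body minus object
print("most body-selective component:", int(np.argmax(body_selectivity)))
```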
Collapse
Affiliation(s)
- Baichen Li
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands
| | - Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands
| | - Giuseppe Marrazzo
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands
| | - Rajani Raman
- Laboratory for Neuro- and Psychophysiology, Department of Neurosciences, KU Leuven Medical School, Leuven 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven 3000, Belgium
| | - Nick Taubert
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen 72076, Germany
| | - Martin Giese
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen 72076, Germany
| | - Rufin Vogels
- Laboratory for Neuro- and Psychophysiology, Department of Neurosciences, KU Leuven Medical School, Leuven 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven 3000, Belgium
| | - Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands; Department of Computer Science, University College London, London WC1E 6BT, UK.
| |
Collapse
|
21
|
Smith RA, Cross ES. The McNorm library: creating and validating a new library of emotionally expressive whole body dance movements. PSYCHOLOGICAL RESEARCH 2023; 87:484-508. [PMID: 35385989 PMCID: PMC8985749 DOI: 10.1007/s00426-022-01669-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2021] [Accepted: 02/23/2022] [Indexed: 11/28/2022]
Abstract
The ability to exchange affective cues with others plays a key role in our ability to create and maintain meaningful social relationships. We express our emotions through a variety of socially salient cues, including facial expressions, the voice, and body movement. While significant advances have been made in our understanding of verbal and facial communication, to date, understanding of the role played by human body movement in our social interactions remains incomplete. To this end, here we describe the creation and validation of a new set of emotionally expressive whole-body dance movement stimuli, named the Motion Capture Norming (McNorm) Library, which was designed to address a number of limitations associated with previous movement stimuli. This library comprises a series of point-light representations of a dancer's movements, which were performed to communicate neutrality, happiness, sadness, anger, and fear to observers. Based on results from two validation experiments, participants could reliably discriminate the intended emotion expressed in the clips in this stimulus set, with accuracy rates up to 60% (chance = 20%). We further explored the impact of dance experience and trait empathy on emotion recognition and found that neither significantly impacted emotion discrimination. As all materials for presenting and analysing this movement library are openly available, we hope this resource will aid other researchers in further exploring affective communication expressed through human body movement.
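Testing recognition accuracy against the 20% chance level reported above can be done with a simple binomial test. The counts in results below are invented for illustration; only the five emotion categories and the 20% chance level come from the abstract.

```python
from scipy.stats import binomtest

# Hypothetical counts: for each emotion, number of correct classifications
# out of the total number of trials in a validation experiment.
results = {"happiness": (118, 200), "sadness": (104, 200),
           "anger": (96, 200), "fear": (90, 200), "neutral": (122, 200)}

for emotion, (correct, total) in results.items():
    test = binomtest(correct, total, p=0.20, alternative="greater")
    print(f"{emotion}: {correct/total:.0%} correct, "
          f"p = {test.pvalue:.2g} vs. 20% chance")
```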
Collapse
Affiliation(s)
- Rebecca A. Smith
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland
| | - Emily S. Cross
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland; Department of Cognitive Science, Macquarie University, Sydney, Australia
| |
Collapse
|
22
|
Zhang M, Li P, Yu L, Ren J, Jia S, Wang C, He W, Luo W. Emotional body expressions facilitate working memory: Evidence from the n‐back task. Psych J 2022; 12:178-184. [PMID: 36403986 DOI: 10.1002/pchj.616] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2022] [Accepted: 10/10/2022] [Indexed: 11/22/2022]
Abstract
In daily life, individuals need to recognize and update emotional information from others' changing body expressions. However, whether emotional bodies can enhance working memory (WM) remains unknown. In the present study, participants completed a modified n-back task in which they were required to indicate whether a presented image of an emotional body matched an item displayed before each block (0-back) or two positions previously (2-back). Each block comprised only fearful, happy, or neutral expressions. In the 0-back trials, participants responded faster to happy bodies, followed by fearful bodies, than to neutral body expressions, while accuracy showed comparable ceiling effects across conditions. When the WM load increased to 2-back, both fearful and happy bodies significantly facilitated WM performance (i.e., faster reaction times and higher accuracy) relative to the neutral condition. In summary, the current findings reveal the enhancing effect of emotional body expressions on WM and highlight the importance of emotional action information in WM.
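A minimal sketch of the kind of 2-back reaction-time comparison reported above, using paired t-tests across participants. The sample size, RT values, and effect sizes are simulated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant mean reaction times (ms) in the 2-back task.
rng = np.random.default_rng(3)
n = 30
rt_neutral = rng.normal(720, 60, n)
rt_fear = rt_neutral - rng.normal(25, 15, n)    # faster for fearful bodies
rt_happy = rt_neutral - rng.normal(30, 15, n)   # faster for happy bodies

for label, rt_emotion in [("fear", rt_fear), ("happy", rt_happy)]:
    t, p = stats.ttest_rel(rt_emotion, rt_neutral)
    print(f"2-back {label} vs. neutral: dRT = {np.mean(rt_emotion - rt_neutral):.1f} ms, "
          f"t({n - 1}) = {t:.2f}, p = {p:.3g}")
```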
Collapse
Affiliation(s)
- Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, China
| | - Ping Li
- School of Literature and Journalism, North Minzu University, Yinchuan, China
| | - Lu Yu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, China
| | - Jie Ren
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, China
| | - Shuxin Jia
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, China
| | - Chaolun Wang
- Department of Psychology, Sun Yat‐Sen University, Guangzhou, China
| | - Weiqi He
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, China
| | - Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Key Laboratory of Brain and Cognitive Neuroscience, Dalian, China
| |
Collapse
|
23
|
Landsiedel J, Daughters K, Downing PE, Koldewyn K. The role of motion in the neural representation of social interactions in the posterior temporal cortex. Neuroimage 2022; 262:119533. [PMID: 35931309 PMCID: PMC9485464 DOI: 10.1016/j.neuroimage.2022.119533] [Citation(s) in RCA: 23] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2022] [Revised: 07/15/2022] [Accepted: 08/01/2022] [Indexed: 11/30/2022] Open
Abstract
Humans are an inherently social species, with multiple focal brain regions sensitive to various visual social cues such as faces, bodies, and biological motion. More recently, research has begun to investigate how the brain responds to more complex, naturalistic social scenes, identifying a region in the posterior superior temporal sulcus (SI-pSTS; i.e., social interaction pSTS), amongst others, as an important region for processing social interaction. This research, however, has presented images or videos, and thus the contribution of motion to social interaction perception in these brain regions is not yet understood. In the current study, 22 participants viewed videos, image sequences, scrambled image sequences and static images of either social interactions or non-social independent actions. Combining univariate and multivariate analyses, we confirm that bilateral SI-pSTS plays a central role in dynamic social interaction perception but is much less involved when 'interactiveness' is conveyed solely with static cues. Regions in the social brain, including SI-pSTS and extrastriate body area (EBA), showed sensitivity to both motion and interactive content. While SI-pSTS is somewhat more tuned to video interactions than is EBA, both bilateral SI-pSTS and EBA showed a greater response to social interactions compared to non-interactions and both regions responded more strongly to videos than static images. Indeed, both regions showed higher responses to interactions than independent actions in videos and intact sequences, but not in other conditions. Exploratory multivariate regression analyses suggest that selectivity for simple visual motion does not in itself drive interactive sensitivity in either SI-pSTS or EBA. Rather, selectivity for interactions expressed in point-light animations, and selectivity for static images of bodies, make positive and independent contributions to this effect across the LOTC region. Our results strongly suggest that EBA and SI-pSTS work together during dynamic interaction perception, at least when interactive information is conveyed primarily via body information. As such, our results are also in line with proposals of a third visual stream supporting dynamic social scene perception.
Collapse
Affiliation(s)
| | | | - Paul E Downing
- School of Human and Behavioural Sciences, Bangor University
| | - Kami Koldewyn
- School of Human and Behavioural Sciences, Bangor University.
| |
Collapse
|
24
|
Ross P, George E. Are Face Masks a Problem for Emotion Recognition? Not When the Whole Body Is Visible. Front Neurosci 2022; 16:915927. [PMID: 35924222 PMCID: PMC9339646 DOI: 10.3389/fnins.2022.915927] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/08/2022] [Accepted: 06/23/2022] [Indexed: 01/10/2023] Open
Abstract
The rise of the novel COVID-19 virus has made face masks commonplace items around the globe. Recent research found that face masks significantly impair emotion recognition for isolated faces. However, faces are rarely seen in isolation, and the body is also a key cue for emotional portrayal. Here, therefore, we investigated the impact of face masks on emotion recognition when the full body is surveyed. Stimuli expressing anger, happiness, sadness, and fear were selected from the BEAST stimulus set. Masks were added to these images, and participants were asked to recognize the emotion and give a confidence level for that decision for both the masked and unmasked stimuli. We found that, contrary to some work viewing faces in isolation, emotion recognition was generally not impaired by face masks when the whole body was present. We did, however, find that only the recognition of happiness significantly decreased when masked faces were viewed together with the whole body. In contrast to actual performance, confidence levels declined in the Mask condition across all emotions. This research suggests that the impact of masks on emotion recognition may not be as pronounced as previously thought, as long as the whole body is also visible.
Collapse
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, Durham, United Kingdom
| | | |
Collapse
|
25
|
Haipt A, Rosenbaum D, Fuhr K, Giese M, Batra A, Ehlis AC. The effects of hypnotherapy compared to cognitive behavioral therapy in depression: a NIRS-study using an emotional gait paradigm. Eur Arch Psychiatry Clin Neurosci 2022; 272:729-739. [PMID: 35113202 PMCID: PMC9095550 DOI: 10.1007/s00406-021-01348-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/27/2021] [Accepted: 10/19/2021] [Indexed: 11/15/2022]
Abstract
Hypnotherapy (HT) is a promising approach to treating depression, but so far, no data are available on the neuronal mechanisms of functional reorganization after HT in depressed patients. Here, 75 patients with mild to moderate depression, who received either HT or Cognitive Behavioral Therapy (CBT), were measured before and after therapy using functional near-infrared spectroscopy. We investigated the patients' cerebral activation during an emotional human gait paradigm. Further, rumination was included as a predictor. Our results showed a decrease of functional connectivity (FC) between two regions that are crucial to emotional processing, the Extrastriate Body Area (EBA) and the Superior Temporal Sulcus (STS). This FC decrease was traced back to an activation change over the course of therapy in the right STS, not the EBA, and was only found in the HT group, depending on rumination: less ruminating HT patients showed a decrease in right STS activation, while highly ruminating patients showed an increase. We carefully propose that this activation change is due to the promotion of emotional experiences during HT, whereas CBT focused on activating behavior and changing negative cognitions. HT seemed to have differential effects on the patients, depending on their rumination style: the increase of right STS activation in highly ruminating patients might mirror the improvement of impaired emotional processing, whilst the decrease of activation in low ruminating patients might reflect the abandonment of an over-compensation associated with hyperactivity before therapy. We conclude that HT affects emotional processing and that this effect is moderated by rumination.
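The reported decrease in EBA-STS functional connectivity can be illustrated, in simplified form, as a pre/post comparison of Fisher-z correlations between two channel time series. Everything below (patient count, time series, channel names) is simulated; the actual study used fNIRS-specific preprocessing not shown here.

```python
import numpy as np
from scipy import stats

def fisher_z_fc(ts_a, ts_b):
    """Fisher-z transformed Pearson correlation between two channel time series."""
    r = np.corrcoef(ts_a, ts_b)[0, 1]
    return np.arctanh(r)

# Hypothetical channel time series: (n_patients, n_timepoints) per region and session.
rng = np.random.default_rng(4)
n_patients, n_time = 25, 600
pre_eba, pre_sts = rng.standard_normal((2, n_patients, n_time))
post_eba, post_sts = rng.standard_normal((2, n_patients, n_time))

fc_pre = np.array([fisher_z_fc(a, b) for a, b in zip(pre_eba, pre_sts)])
fc_post = np.array([fisher_z_fc(a, b) for a, b in zip(post_eba, post_sts)])

t, p = stats.ttest_rel(fc_post, fc_pre)
print(f"EBA-STS coupling change: {np.mean(fc_post - fc_pre):+.3f} (z), t = {t:.2f}, p = {p:.3f}")
```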
Collapse
Affiliation(s)
- Alina Haipt
- Department of Psychiatry and Psychotherapy, University Hospital of Tuebingen, Tuebingen, Germany.
| | - David Rosenbaum
- Department of Psychiatry and Psychotherapy, University Hospital of Tuebingen, Tuebingen, Germany
| | - Kristina Fuhr
- Department of Psychiatry and Psychotherapy, University Hospital of Tuebingen, Tuebingen, Germany
| | - Martin Giese
- Section for Computational Sensomotorics, Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research Centre for Integrative Neuroscience, University Hospital of Tuebingen, Tuebingen, Germany
| | - Anil Batra
- Tübingen Center for Mental Health (TüCMH), Tuebingen, Germany
- Department of Psychiatry and Psychotherapy, University Hospital of Tuebingen, Tuebingen, Germany
| | - Ann-Christine Ehlis
- Tübingen Center for Mental Health (TüCMH), Tuebingen, Germany
- Department of Psychiatry and Psychotherapy, University Hospital of Tuebingen, Tuebingen, Germany
| |
Collapse
|
26
|
Dezecache G, Sievers C, Gruber T. What non-Humans Can Teach us about the Role of Emotions in Cultural Evolution. EMOTION REVIEW 2022. [DOI: 10.1177/17540739221082217] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
The role emotions play in the dynamics of cultural phenomena has long been neglected. The collection of articles recently published in Emotion Review provides an important first step in this necessary endeavor. In this commentary, we discuss this contribution by emphasizing the role epistemological parsimony should play in the future of this research agenda. The cultural behavior and emotions of chimpanzees are taken as a reference.
Collapse
Affiliation(s)
| | | | - Thibaud Gruber
- Faculty of Psychology and Educational Sciences, and Swiss Center for Affective Sciences, University of Geneva, Switzerland
| |
Collapse
|
27
|
Whitehead JC, Armony JL. Intra-individual Reliability of Voice- and Music-elicited Responses and their Modulation by Expertise. Neuroscience 2022; 487:184-197. [PMID: 35182696 DOI: 10.1016/j.neuroscience.2022.02.011] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2021] [Revised: 01/19/2022] [Accepted: 02/10/2022] [Indexed: 10/19/2022]
Abstract
A growing number of functional neuroimaging studies have identified regions within the temporal lobe, particularly along the planum polare and planum temporale, that respond more strongly to music than to other types of acoustic stimuli, including voice. These "music-preferred" regions have been reported using a variety of stimulus sets, paradigms and analysis approaches, and their consistency across studies has been confirmed through meta-analyses. However, the critical question of the intra-subject reliability of these responses has received less attention. Here, we directly assessed this issue by contrasting brain responses to musical vs. vocal stimuli in the same subjects across three consecutive fMRI runs, using different types of stimuli. Moreover, we investigated whether these music- and voice-preferred responses were reliably modulated by expertise. Results demonstrated that music-preferred activity previously reported in temporal regions, and its modulation by expertise, exhibits high intra-subject reliability. However, we also found that activity in some extra-temporal regions, such as the precentral and middle frontal gyri, did depend on the particular stimuli employed, which may explain why these regions are less consistently reported in the literature. Taken together, our findings confirm and extend the notion that specific regions in the brain consistently respond more strongly to certain socially relevant stimulus categories, such as faces, voices and music, but that some of these responses appear to depend, at least to some extent, on the specific features of the paradigm employed.
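One simple way to quantify intra-subject reliability of a music-versus-voice contrast is to correlate contrast estimates across runs within a subject. The sketch below assumes ROI-wise contrast values per run; the array shapes and noise level are invented, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical ROI-wise contrast values (music > voice) for one subject,
# estimated separately in three runs: shape (n_runs, n_rois).
rng = np.random.default_rng(5)
true_profile = rng.standard_normal(50)
contrasts = true_profile + 0.5 * rng.standard_normal((3, 50))

# Intra-subject reliability: correlate the contrast profile between pairs of runs.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    r, p = stats.pearsonr(contrasts[i], contrasts[j])
    print(f"run {i + 1} vs run {j + 1}: r = {r:.2f} (p = {p:.3g})")
```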
Collapse
Affiliation(s)
- Jocelyne C Whitehead
- Douglas Mental Health University Institute, Verdun, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Canada; Integrated Program in Neuroscience, McGill University, Montreal, Canada.
| | - Jorge L Armony
- Douglas Mental Health University Institute, Verdun, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Canada; Department of Psychiatry, McGill University, Montreal, Canada
| |
Collapse
|
28
|
Ferrari C, Ciricugno A, Urgesi C, Cattaneo Z. Cerebellar contribution to emotional body language perception: a TMS study. Soc Cogn Affect Neurosci 2022; 17:81-90. [PMID: 31588511 PMCID: PMC8824541 DOI: 10.1093/scan/nsz074] [Citation(s) in RCA: 36] [Impact Index Per Article: 12.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/23/2019] [Revised: 07/23/2019] [Accepted: 09/06/2019] [Indexed: 11/14/2022] Open
Abstract
Consistent evidence suggests that the cerebellum contributes to the processing of emotional facial expressions. However, it is not yet known whether the cerebellum is recruited when emotions are expressed by body postures or movements, or whether it is recruited differently for positive and negative emotions. In this study, we asked healthy participants to discriminate between body postures (with masked face) expressing emotions of opposite valence (happiness vs anger, Experiment 1), or of the same valence (negative: anger vs sadness; positive: happiness vs surprise, Experiment 2). While performing the task, participants received online transcranial magnetic stimulation (TMS) over a region of the posterior left cerebellum and over two control sites (early visual cortex and vertex). We found that TMS over the cerebellum affected participants' ability to discriminate emotional body postures, but only when one of the emotions was negatively valenced (i.e. anger). These findings suggest that the cerebellar region we stimulated is involved in processing the emotional content conveyed by body postures and gestures. Our findings complement prior evidence on the role of the cerebellum in emotional face processing and have important implications from a clinical perspective, where non-invasive cerebellar stimulation is a promising tool for the treatment of motor, cognitive and affective deficits.
Collapse
Affiliation(s)
- Chiara Ferrari
- Department of Psychology, University of Milano–Bicocca, Milan 20126, Italy
| | - Andrea Ciricugno
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia 27100, Italy
- IRCCS Mondino Foundation, Pavia 27100, Italy
| | - Cosimo Urgesi
- Laboratory of Cognitive Neuroscience, Department of Languages and Literatures, Communication, Education and Society University of Udine, Udine 33100, Italy
- Scientific Institute, IRCCS E. Medea, Neuropsychiatry and Neurorehabilitation Unit, Bosisio Parini, Lecco 23900, Italy
| | - Zaira Cattaneo
- Department of Psychology, University of Milano–Bicocca, Milan 20126, Italy
- IRCCS Mondino Foundation, Pavia 27100, Italy
| |
Collapse
|
29
|
Mello M, Dupont L, Engelen T, Acciarino A, de Borst AW, de Gelder B. The influence of body expression, group affiliation and threat proximity on interactions in virtual reality. CURRENT RESEARCH IN BEHAVIORAL SCIENCES 2022. [DOI: 10.1016/j.crbeha.2022.100075] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022] Open
|
30
|
Marrazzo G, Vaessen MJ, de Gelder B. Decoding the difference between explicit and implicit body expression representation in high level visual, prefrontal and inferior parietal cortex. Neuroimage 2021; 243:118545. [PMID: 34478822 DOI: 10.1016/j.neuroimage.2021.118545] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2020] [Revised: 08/30/2021] [Accepted: 08/31/2021] [Indexed: 11/28/2022] Open
Abstract
Recent studies provide an increasing understanding of how visual object categories like faces or bodies are represented in the brain and have also raised the question of whether category-based or more dynamic, network-inspired models are more powerful. Two important and so far sidestepped issues in this debate are, first, how major category attributes like emotional expression directly influence category representation and, second, whether category and attribute representations are sensitive to task demands. This study investigated the impact of a crucial category attribute, emotional expression, on category area activity and whether this impact varies with the participants' task. Using functional MRI (fMRI), we measured BOLD responses while participants viewed whole-body expressions and performed either an explicit (emotion) or an implicit (shape) recognition task. Our results, based on multivariate methods, show that the type of task is the strongest determinant of brain activity and can be decoded in EBA, VLPFC and IPL. Brain activity was higher for the explicit task condition in VLPFC and was not emotion specific. This pattern suggests that during explicit recognition of the body expression, body category representation may be strengthened, and emotion- and action-related activity suppressed. Taken together, these results stress the importance of the task and of the role of category attributes for understanding the functional organization of high-level visual cortex.
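A minimal sketch of decoding task type (explicit vs. implicit) from multivoxel patterns, in the spirit of the multivariate analysis described above, using a linear SVM with cross-validation. Trial counts, voxel counts, and the simulated effect are assumptions; the authors' actual pipeline is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical trial-wise activity patterns from one ROI (e.g. EBA):
# X has shape (n_trials, n_voxels); y codes the task (0 = implicit, 1 = explicit).
rng = np.random.default_rng(6)
n_trials, n_voxels = 120, 300
y = np.repeat([0, 1], n_trials // 2)
X = rng.standard_normal((n_trials, n_voxels)) + 0.3 * y[:, None]

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"task decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```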
Collapse
Affiliation(s)
- Giuseppe Marrazzo
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Limburg 6200 MD, Maastricht, the Netherlands
| | - Maarten J Vaessen
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Limburg 6200 MD, Maastricht, the Netherlands
| | - Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Limburg 6200 MD, Maastricht, the Netherlands; Department of Computer Science, University College London, London WC1E 6BT, United Kingdom.
| |
Collapse
|
31
|
de Gelder B, Poyo Solanas M. A computational neuroethology perspective on body and expression perception. Trends Cogn Sci 2021; 25:744-756. [PMID: 34147363 DOI: 10.1016/j.tics.2021.05.010] [Citation(s) in RCA: 30] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Revised: 04/22/2021] [Accepted: 05/24/2021] [Indexed: 01/17/2023]
Abstract
Survival prompts organisms to prepare adaptive behavior in response to environmental and social threat. However, what are the specific features of the appearance of a conspecific that trigger such adaptive behaviors? For social species, the prime candidates for triggering defense systems are the visual features of the face and the body. We propose a novel approach for studying the ability of the brain to gather survival-relevant information from seeing conspecific body features. Specifically, we propose that behaviorally relevant information from bodies and body expressions is coded at the levels of midlevel features in the brain. These levels are relatively independent from higher-order cognitive and conscious perception of bodies and emotions. Instead, our approach is embedded in an ethological framework and mobilizes computational models for feature discovery.
Collapse
Affiliation(s)
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200, MD, The Netherlands; Department of Computer Science, University College London, London WC1E 6BT, UK.
| | - Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200, MD, The Netherlands
| |
Collapse
|
32
|
New Horizons on Non-invasive Brain Stimulation of the Social and Affective Cerebellum. THE CEREBELLUM 2021; 21:482-496. [PMID: 34270081 DOI: 10.1007/s12311-021-01300-4] [Citation(s) in RCA: 29] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 06/22/2021] [Indexed: 10/20/2022]
Abstract
The cerebellum is increasingly attracting scientists interested in basic and clinical research on neuromodulation. Here, we review available studies that used either transcranial magnetic stimulation (TMS) or transcranial direct current stimulation (tDCS) to examine the role of the posterior cerebellum in different aspects of social and affective cognition, from mood regulation to emotion discrimination, and from the ability to identify biological motion to higher-level social inferences (mentalizing). We discuss how, at the functional level, the role of the posterior cerebellum in these different processes may be explained by a generic prediction mechanism and how the posterior cerebellum may exert this function within different cortico-cerebellar and cerebellar-limbic networks involved in social cognition. Furthermore, we suggest deepening our understanding of the cerebro-cerebellar circuits involved in different aspects of social cognition by employing promising stimulation approaches that have so far been primarily used to study cortical functions and networks, such as paired-pulse TMS, frequency-tuned stimulation, state-dependent protocols, and chronometric TMS. The ability to modulate cerebro-cerebellar connectivity opens up possible clinical applications for improving impairments in social and affective skills associated with cerebellar abnormalities.
Collapse
|
33
|
Frontotemporal dementia, music perception and social cognition share neurobiological circuits: A meta-analysis. Brain Cogn 2021; 148:105660. [PMID: 33421942 DOI: 10.1016/j.bandc.2020.105660] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2020] [Revised: 10/27/2020] [Accepted: 11/26/2020] [Indexed: 01/18/2023]
Abstract
Frontotemporal dementia (FTD) is a neurodegenerative disease that presents with profound changes in social cognition. Music might be a sensitive probe for social cognition abilities, but the underlying neurobiological substrates are unclear. We performed a meta-analysis of voxel-based morphometry studies in FTD patients and of functional MRI studies of music perception and social cognition tasks in cognitively normal controls to identify robust patterns of atrophy (FTD) or activation (music perception or social cognition). Conjunction analyses were performed to identify overlapping brain regions. In total, 303 articles were included: 53 for FTD (n = 1153 patients, 42.5% female; 1337 controls, 53.8% female), 28 for music perception (n = 540, 51.8% female) and 222 for social cognition in controls (n = 5664, 50.2% female). We observed considerable overlap between the atrophy patterns associated with FTD and the functional activation associated with music perception and social cognition, mostly encompassing the ventral language network. We further observed overlap across all three modalities in mesolimbic, basal forebrain and striatal regions. The results of our meta-analysis suggest that music perception and social cognition share neurobiological circuits that are affected in FTD. This supports the idea that music might be a sensitive probe for social cognition abilities, with implications for diagnosis and monitoring.
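A conjunction across the three modalities can be sketched as the intersection of binarized, thresholded maps in a common space. The maps below are random placeholders; the real analysis used meta-analytic maps derived from the included studies.

```python
import numpy as np

# Hypothetical binarized, thresholded statistical maps in a shared voxel space:
# True where the effect is significant for each modality.
rng = np.random.default_rng(7)
n_vox = 50_000
ftd_atrophy = rng.random(n_vox) > 0.9
music_activation = rng.random(n_vox) > 0.9
social_activation = rng.random(n_vox) > 0.9

# Minimum-statistic conjunction: voxels significant in all three maps.
conjunction = ftd_atrophy & music_activation & social_activation
print(f"voxels in the three-way conjunction: {conjunction.sum()} "
      f"({conjunction.mean():.2%} of the volume)")
```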
Collapse
|
34
|
Asymmetric Contributions of the Fronto-Parietal Network to Emotional Conflict in the Word–Face Interference Task. Symmetry (Basel) 2020. [DOI: 10.3390/sym12101701] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Abstract
The fronto-parietal network is involved in top-down and bottom-up processes necessary to achieve cognitive control. We investigated the role of asymmetric enhancement of the left dorsolateral prefrontal cortex (lDLPFC) and right posterior parietal cortex (rPPC) in cognitive control under conditions of emotional conflict arising from emotional distractors. The effects of anodal tDCS over the lDLPFC/cathodal over the rPPC and the effects of anodal tDCS over the rPPC/cathodal over the lDLPFC were compared to sham tDCS in a double-blind design. The findings showed that anodal stimulation over the lDLPFC reduced interference from emotional distractors, but only when participants had already gained experience with the task. In contrast, having already performed the task only eliminated facilitation effects for positive stimuli. Importantly, anodal stimulation of the rPPC did not affect distractors’ interference. Therefore, the present findings indicate that the lDLPFC plays a crucial role in implementing top-down control to resolve emotional conflict, but that experience with the task is necessary to reveal this role.
Collapse
|
35
|
Poyo Solanas M, Vaessen M, de Gelder B. Computation-Based Feature Representation of Body Expressions in the Human Brain. Cereb Cortex 2020; 30:6376-6390. [DOI: 10.1093/cercor/bhaa196] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2020] [Revised: 06/04/2020] [Accepted: 06/26/2020] [Indexed: 01/31/2023] Open
Abstract
Humans and other primate species are experts at recognizing body expressions. To understand the underlying perceptual mechanisms, we computed postural and kinematic features from affective whole-body movement videos and related them to brain processes. Using representational similarity and multivoxel pattern analyses, we showed systematic relations between computation-based body features and brain activity. Our results revealed that postural rather than kinematic features reflect the affective category of the body movements. The feature limb contraction made a central contribution to fearful body expression perception and was differentially represented in action observation, motor preparation, and affect coding regions, including the amygdala. The posterior superior temporal sulcus differentiated fearful from other affective categories using limb contraction rather than kinematics. The extrastriate body area and fusiform body area also showed greater tuning to postural features. The discovery of midlevel body feature encoding in the brain moves affective neuroscience beyond research on high-level emotion representations and provides insights into the perceptual features that possibly drive automatic emotion perception.
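A bare-bones representational similarity analysis in the spirit of the study above: build dissimilarity matrices from postural and kinematic feature sets and from ROI activity patterns, then compare them with Spearman correlation. The stimulus count, feature dimensions, and patterns are hypothetical, not the study's data.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Hypothetical data for n_stimuli whole-body movement videos:
# postural features (e.g. limb contraction, symmetry), kinematic features
# (e.g. velocity, acceleration), and multivoxel patterns from one ROI.
rng = np.random.default_rng(8)
n_stimuli = 56
postural = rng.standard_normal((n_stimuli, 4))
kinematic = rng.standard_normal((n_stimuli, 4))
roi_patterns = rng.standard_normal((n_stimuli, 200))

# Representational dissimilarity matrices (condensed form) for each model and the ROI.
rdm_postural = pdist(postural, metric="euclidean")
rdm_kinematic = pdist(kinematic, metric="euclidean")
rdm_neural = pdist(roi_patterns, metric="correlation")

for name, rdm in [("postural", rdm_postural), ("kinematic", rdm_kinematic)]:
    rho, p = spearmanr(rdm, rdm_neural)
    print(f"{name} model vs. ROI RDM: Spearman rho = {rho:.2f} (p = {p:.3g})")
```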
Collapse
Affiliation(s)
- Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
| | - Maarten Vaessen
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
| | - Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Department of Computer Science, University College London, London WC1E 6BT, UK
| |
Collapse
|
36
|
Geangu E, Vuong QC. Look up to the body: An eye-tracking investigation of 7-months-old infants' visual exploration of emotional body expressions. Infant Behav Dev 2020; 60:101473. [PMID: 32739668 DOI: 10.1016/j.infbeh.2020.101473] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2019] [Revised: 07/22/2020] [Accepted: 07/22/2020] [Indexed: 02/02/2023]
Abstract
The human body is an important source of information for inferring a person's emotional state. Research with adult observers indicates that the posture of the torso, arms and hands provides important perceptual cues for recognising anger, fear and happy expressions. Much less is known about whether infants process body regions differently for different body expressions. To address this issue, we used eye tracking to investigate whether infants' visual exploration patterns differed when viewing body expressions. Forty-eight 7-month-old infants were randomly presented with static images of adult female bodies expressing anger, fear and happiness, as well as an emotionally neutral posture. Facial cues to emotional state were removed by masking the faces. We measured the proportion of looking time, the proportion and number of fixations, and the duration of fixations on the head, upper body and lower body regions for the different expressions. We showed that infants explored the upper body more than the lower body. Importantly, infants at this age fixated differently on different body regions depending on the expression of the body posture. In particular, infants spent a larger proportion of their looking time and had longer fixation durations on the upper body for fear relative to the other expressions. These results extend and replicate previous findings on infant processing of emotional expressions displayed by human bodies, and they support the hypothesis that infants' visual exploration of human bodies is driven by the upper body.
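The looking-time measures described above reduce to proportions of fixation duration per area of interest (AOI). The sketch below uses a handful of invented fixation records; the AOI labels are placeholders for the study's actual head, upper-body and lower-body regions.

```python
# Hypothetical fixation records for one infant viewing one expression:
# each fixation has a duration (ms) and an area-of-interest label.
fixations = [
    (310, "upper_body"), (220, "head"), (450, "upper_body"),
    (180, "lower_body"), (260, "upper_body"), (200, "head"),
]

def looking_proportions(fixations):
    """Proportion of total looking time spent on each body region."""
    total = sum(duration for duration, _ in fixations)
    regions = {}
    for duration, aoi in fixations:
        regions[aoi] = regions.get(aoi, 0) + duration
    return {aoi: dur / total for aoi, dur in regions.items()}

print(looking_proportions(fixations))
```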
Collapse
|
37
|
Williams JHG, Huggins CF, Zupan B, Willis M, Van Rheenen TE, Sato W, Palermo R, Ortner C, Krippl M, Kret M, Dickson JM, Li CSR, Lowe L. A sensorimotor control framework for understanding emotional communication and regulation. Neurosci Biobehav Rev 2020; 112:503-518. [PMID: 32070695 PMCID: PMC7505116 DOI: 10.1016/j.neubiorev.2020.02.014] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2018] [Revised: 01/22/2020] [Accepted: 02/11/2020] [Indexed: 12/12/2022]
Abstract
Our research team was asked to consider the relationship of the neuroscience of sensorimotor control to the language of emotions and feelings. Actions are the principal means for the communication of emotions and feelings in both humans and other animals, and the allostatic mechanisms controlling action also apply to the regulation of emotional states by the self and others. We consider how motor control of hierarchically organised, feedback-based, goal-directed action has evolved in humans, within a context of consciousness, appraisal and cultural learning, to serve emotions and feelings. In our linguistic analysis, we found that many emotion and feelings words could be assigned to stages in the sensorimotor learning process, but the assignment was often arbitrary. The embodied nature of emotional communication means that action words are frequently used, but that the meanings or senses of the word depend on its contextual use, just as the relationship of an action to an emotion is also contextually dependent.
Collapse
Affiliation(s)
- Justin H G Williams
- University of Aberdeen, Institute of Medical Sciences, Foresterhill, AB25 2ZD, Scotland, United Kingdom.
| | - Charlotte F Huggins
- University of Aberdeen, Institute of Medical Sciences, Foresterhill, AB25 2ZD, Scotland, United Kingdom
| | - Barbra Zupan
- Central Queensland University, School of Health, Medical and Applied Sciences, Bruce Highway, Rockhampton, QLD 4702, Australia
| | - Megan Willis
- Australian Catholic University, School of Psychology, ARC Centre for Excellence in Cognition and its Disorders, Sydney, NSW 2060, Australia
| | - Tamsyn E Van Rheenen
- University of Melbourne, Melbourne Neuropsychiatry Centre, Department of Psychiatry, 161 Barry Street, Carlton, VIC 3053, Australia
| | - Wataru Sato
- Kyoto University, Kokoro Research Centre, 46 Yoshidashimoadachicho, Sakyo Ward, Kyoto, 606-8501, Japan
| | - Romina Palermo
- University of Western Australia, School of Psychological Science, Perth, WA, 6009, Australia
| | - Catherine Ortner
- Thompson Rivers University, Department of Psychology, 805 TRU Way, Kamloops, BC V2C 0C8, Canada
| | - Martin Krippl
- Otto von Guericke University Magdeburg, Faculty of Natural Sciences, Department of Psychology, Universitätsplatz 2, Magdeburg, 39106, Germany
| | - Mariska Kret
- Leiden University, Cognitive Psychology, Pieter de la Court, Waassenaarseweg 52, Leiden, 2333 AK, the Netherlands
| | - Joanne M Dickson
- Edith Cowan University, Psychology Department, School of Arts and Humanities, 270 Joondalup Dr, Joondalup, WA 6027, Australia
| | - Chiang-Shan R Li
- Yale University, Connecticut Mental Health Centre, S112, 34 Park Street, New Haven, CT 06519-1109, USA
| | - Leroy Lowe
- Neuroqualia, Room 229A, Forrester Hall, 36 Arthur Street, Truro, Nova Scotia, B2N 1X5, Canada
| |
Collapse
|
38
|
Mazzoni N, Landi I, Ricciardelli P, Actis-Grosso R, Venuti P. "Motion or Emotion? Recognition of Emotional Bodily Expressions in Children With Autism Spectrum Disorder With and Without Intellectual Disability". Front Psychol 2020; 11:478. [PMID: 32269539 PMCID: PMC7109394 DOI: 10.3389/fpsyg.2020.00478] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2019] [Accepted: 03/02/2020] [Indexed: 01/03/2023] Open
Abstract
The recognition of emotional body movement (BM) is impaired in individuals with Autism Spectrum Disorder (ASD), yet it is not clear whether the difficulty is related to the encoding of body motion, emotions, or both. Moreover, BM recognition has traditionally been studied using point-light display stimuli (PLDs) and is still underexplored in individuals with ASD and intellectual disability (ID). In the present study, we investigated the recognition of happy, fearful, and neutral BM in children with ASD with and without ID. In a non-verbal recognition task, participants were asked to recognize pure-body-motion and visible-body-form stimuli (point-light displays, PLDs, and full-light displays, FLDs, respectively). We found that children with ASD were less accurate than TD children in recognizing both emotional and neutral BM, whether presented as FLDs or PLDs. These results suggest that the difficulty in understanding observed BM may stem from atypical processing of body motion information rather than of emotion. Moreover, we found that accuracy improved with age and IQ only in children with ASD without ID, suggesting that a high level of cognitive resources can mediate the acquisition of compensatory mechanisms that develop with age.
Collapse
Affiliation(s)
- Noemi Mazzoni
- ODFLab - Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy
| | - Isotta Landi
- ODFLab - Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; MPBA, Fondazione Bruno Kessler, Trento, Italy
| | - Paola Ricciardelli
- Department of Psychology, University of Milano - Bicocca, Milan, Italy; Milan Centre for Neuroscience, Milan, Italy
| | - Rossana Actis-Grosso
- Department of Psychology, University of Milano - Bicocca, Milan, Italy; Milan Centre for Neuroscience, Milan, Italy
| | - Paola Venuti
- ODFLab - Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy
| |
Collapse
|
39
|
Lima Portugal LC, Alves RDCS, Junior OF, Sanchez TA, Mocaiber I, Volchan E, Smith Erthal F, David IA, Kim J, Oliveira L, Padmala S, Chen G, Pessoa L, Pereira MG. Interactions between emotion and action in the brain. Neuroimage 2020; 214:116728. [PMID: 32199954 PMCID: PMC7485650 DOI: 10.1016/j.neuroimage.2020.116728] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/26/2019] [Revised: 03/04/2020] [Accepted: 03/07/2020] [Indexed: 11/16/2022] Open
Abstract
A growing literature supports the existence of interactions between emotion and action in the brain, and the central participation of the anterior midcingulate cortex (aMCC) in this regard. In the present functional magnetic resonance imaging study, we sought to investigate the role of self-relevance during such interactions by varying the context in which threatening pictures were presented (with guns pointed towards or away from the observer). Participants performed a simple visual detection task following exposure to such stimuli. Except for voxelwise tests, we adopted a Bayesian analysis framework that evaluated evidence for the hypotheses of interest, given the data, in a continuous fashion. Behaviorally, our results demonstrated a valence by context interaction, such that responses to targets tended to be faster after viewing threat pictures directed towards the participant. In the brain, interaction patterns that paralleled those observed behaviorally were found most notably in the middle temporal gyrus, supplementary motor area, precentral gyrus, and anterior insula. In these regions, activity was overall greater during threat conditions relative to neutral ones, and this effect was enhanced in the directed-towards context. A valence by context interaction was also observed in the aMCC, where we additionally observed a correlation (across participants) between evoked responses and reaction time data. Taken together, our study revealed the context-sensitive engagement of motor-related areas during emotional perception, thus supporting the idea that emotion and action interact in important ways in the brain.
Collapse
Affiliation(s)
- Liana Catarina Lima Portugal
- Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| | - Rita de Cássia Soares Alves
- Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| | - Orlando Fernandes Junior
- Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
| | - Tiago Arruda Sanchez
- Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
| | - Izabela Mocaiber
- Laboratory of Cognitive Psychophysiology, Department of Natural Sciences, Institute of Humanities and Health, Federal Fluminense University, Rio das Ostras, RJ, Brazil
| | - Eliane Volchan
- Laboratory of Neurobiology II, Institute of Biophysics Carlos Chagas Filho, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
| | - Fátima Smith Erthal
- Laboratory of Neurobiology II, Institute of Biophysics Carlos Chagas Filho, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
| | - Isabel Antunes David
- Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| | - Jongwan Kim
- Department of Psychology, University of Maryland, College Park, MD, USA
| | - Leticia Oliveira
- Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| | | | - Gang Chen
- Scientific and Statistical Computing Core, National Institute of Mental Health, USA
| | - Luiz Pessoa
- Department of Psychology, University of Maryland, College Park, MD, USA; Maryland Neuroimaging Center, University of Maryland, College Park, MD, USA
| | - Mirtes Garcia Pereira
- Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil.
| |
Collapse
|
40
|
Waist to hip ratio and breast size modulate the processing of female body silhouettes: An EEG study. EVOL HUM BEHAV 2020. [DOI: 10.1016/j.evolhumbehav.2020.01.001] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
|
41
|
Ross P, Flack T. Removing Hand Form Information Specifically Impairs Emotion Recognition for Fearful and Angry Body Stimuli. Perception 2019; 49:98-112. [PMID: 31801026 DOI: 10.1177/0301006619893229] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Emotion perception research has largely been dominated by work on facial expressions, but emotion is also strongly conveyed from the body. Research exploring emotion recognition from the body tends to refer to “the body” as a whole entity. However, the body is made up of different components (hands, arms, trunk, etc.), all of which could be differentially contributing to emotion recognition. We know that the hands can help to convey actions and, in particular, are important for social communication through gestures, but we currently do not know to what extent the hands influence emotion recognition from the body. Here, 93 adults viewed static emotional body stimuli with either the hands, arms, or both components removed and completed a forced-choice emotion recognition task. Removing the hands significantly reduced recognition accuracy for fear and anger but made no significant difference to the recognition of happiness and sadness. Removing the arms had no effect on emotion recognition accuracy compared with the full-body stimuli. These results suggest the hands may play a key role in the recognition of emotions from the body.
Collapse
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, UK
| | - Tessa Flack
- School of Psychology, University of Lincoln, UK
| |
Collapse
|
42
|
Liang Y, Liu B, Ji J, Li X. Network Representations of Facial and Bodily Expressions: Evidence From Multivariate Connectivity Pattern Classification. Front Neurosci 2019; 13:1111. [PMID: 31736683 PMCID: PMC6828617 DOI: 10.3389/fnins.2019.01111] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2019] [Accepted: 10/02/2019] [Indexed: 01/21/2023] Open
Abstract
Emotions can be perceived from both facial and bodily expressions. Our previous study found that facial expressions could be successfully decoded based on functional connectivity (FC) patterns. However, the role of FC patterns in the recognition of bodily expressions remains unclear, and no neuroimaging studies have adequately addressed whether emotions perceived from facial and bodily expressions are processed by common or distinct neural networks. To address this, the present study collected functional magnetic resonance imaging (fMRI) data from a block design experiment with facial and bodily expression videos as stimuli (three emotions: anger, fear, and joy), and conducted multivariate pattern classification analysis based on the estimated FC patterns. We found that, in addition to facial expressions, bodily expressions could also be successfully decoded based on large-scale FC patterns. Emotion classification accuracies were higher for facial than for bodily expressions. Further analysis of contributive FCs showed that emotion-discriminative networks were widely distributed across both hemispheres, containing regions ranging from primary visual areas to higher-level cognitive areas. Moreover, for a particular emotion, the discriminative FCs for facial and bodily expressions were distinct. Together, our findings highlight the key role of FC patterns in emotion processing, indicating how large-scale FC patterns reconfigure during the processing of facial and bodily expressions, and suggest a distributed neural representation for emotion recognition. Furthermore, our results suggest that the human brain employs separate network representations for facial and bodily expressions of the same emotions. This study provides new evidence for network representations in emotion perception and may further our understanding of the potential mechanisms underlying body language emotion recognition.
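A compact sketch of classification based on functional connectivity patterns, as described above: compute the upper triangle of an ROI-by-ROI correlation matrix per block and feed it to a cross-validated classifier. Block counts, ROI counts, and labels are simulated; the study's exact connectivity estimation and classifier settings may differ.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical block-wise ROI time series: (n_blocks, n_timepoints, n_rois),
# with an emotion label per block (0 = anger, 1 = fear, 2 = joy).
rng = np.random.default_rng(9)
n_blocks, n_time, n_rois = 90, 20, 30
ts = rng.standard_normal((n_blocks, n_time, n_rois))
labels = np.repeat([0, 1, 2], n_blocks // 3)

def fc_features(block_ts):
    """Upper triangle of the ROI-by-ROI correlation matrix for one block."""
    fc = np.corrcoef(block_ts.T)
    iu = np.triu_indices_from(fc, k=1)
    return fc[iu]

X = np.array([fc_features(b) for b in ts])      # (n_blocks, n_rois*(n_rois-1)/2)
scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=5)
print(f"emotion decoding from FC patterns: {scores.mean():.2f} accuracy (chance = 0.33)")
```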
Collapse
Affiliation(s)
- Yin Liang
- Faculty of Information Technology, Beijing Artificial Intelligence Institute, Beijing University of Technology, Beijing, China
| | - Baolin Liu
- Tianjin Key Laboratory of Cognitive Computing and Application, School of Computer Science and Technology, Tianjin University, Tianjin, China; School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China; State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China
| | - Junzhong Ji
- Faculty of Information Technology, Beijing Artificial Intelligence Institute, Beijing University of Technology, Beijing, China
| | - Xianglin Li
- Medical Imaging Research Institute, Binzhou Medical University, Yantai, China
| |
Collapse
|
43
|
Falagiarda F, Collignon O. Time-resolved discrimination of audio-visual emotion expressions. Cortex 2019; 119:184-194. [DOI: 10.1016/j.cortex.2019.04.017] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2018] [Revised: 04/05/2019] [Accepted: 04/29/2019] [Indexed: 10/26/2022]
|
44
|
Ding X, Liu J, Kang T, Wang R, Kret ME. Automatic Change Detection of Emotional and Neutral Body Expressions: Evidence From Visual Mismatch Negativity. Front Psychol 2019; 10:1909. [PMID: 31507485 PMCID: PMC6716465 DOI: 10.3389/fpsyg.2019.01909] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/30/2019] [Accepted: 08/05/2019] [Indexed: 11/13/2022] Open
Abstract
Rapidly and effectively detecting emotions in others is an important social skill. Since emotions expressed by the face are relatively easy to fake or hide, we often use body language to gauge the genuine emotional state of others. Recent studies suggest that expression-related visual mismatch negativity (vMMN) reflects the automatic processing of emotional changes in facial expression; however, the automatic processing of changes in body expression has not yet been studied systematically. The current study used an oddball paradigm in which neutral body actions served as standard stimuli, while fearful body expressions and other neutral body actions served as two different deviants, in order to define a body-related vMMN and to compare the mechanisms underlying the processing of emotional changes with those underlying neutral postural changes. The results show a more negative vMMN amplitude for fear deviants 210-260 ms after stimulus onset, which corresponds to the negativity bias obtained on the N190 component. In earlier time windows, the vMMN amplitudes following the two types of deviant stimuli were identical. We therefore propose a two-stage model in which changes in body posture are processed in the 170-210 ms window and emotional changes in the 210-260 ms window.
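The vMMN analysis described above amounts to a deviant-minus-standard difference wave whose mean amplitude is tested in the 210-260 ms window. The sketch below uses simulated ERPs at one electrode; the sampling rate, epoch limits, and effect size are assumptions, and only the 210-260 ms window comes from the abstract.

```python
import numpy as np
from scipy import stats

# Hypothetical epoched EEG at one posterior electrode, 500 Hz, epoch from -100 to 500 ms:
# arrays of shape (n_subjects, n_samples) holding average ERPs per condition.
sfreq, t0 = 500, -0.1
rng = np.random.default_rng(10)
n_subjects, n_samples = 24, 300
erp_standard = rng.standard_normal((n_subjects, n_samples))
erp_fear_deviant = erp_standard + rng.normal(-0.4, 0.2, (n_subjects, 1))  # simulated offset

times = t0 + np.arange(n_samples) / sfreq
window = (times >= 0.210) & (times <= 0.260)

# vMMN = deviant minus standard; test its mean amplitude in the 210-260 ms window.
vmmn = erp_fear_deviant - erp_standard
mean_amp = vmmn[:, window].mean(axis=1)
t, p = stats.ttest_1samp(mean_amp, 0.0)
print(f"fear vMMN, 210-260 ms: {mean_amp.mean():.2f} uV, "
      f"t({n_subjects - 1}) = {t:.2f}, p = {p:.3g}")
```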
Collapse
Affiliation(s)
- Xiaobin Ding
- Psychology Department, Northwest Normal University, Lanzhou, China; Key Laboratory of Behavioral and Mental Health of Gansu Province, Lanzhou, China
| | - Jianyi Liu
- Psychology Department, Northwest Normal University, Lanzhou, China; Key Laboratory of Behavioral and Mental Health of Gansu Province, Lanzhou, China
| | - Tiejun Kang
- Psychology Department, Northwest Normal University, Lanzhou, China; Key Laboratory of Behavioral and Mental Health of Gansu Province, Lanzhou, China
| | - Rui Wang
- Psychology Department, Northwest Normal University, Lanzhou, China; Key Laboratory of Behavioral and Mental Health of Gansu Province, Lanzhou, China
| | - Mariska E Kret
- Cognitive Psychology Department, Leiden University, Leiden, Netherlands; Leiden Institute for Brain and Cognition (LIBC), Leiden, Netherlands
| |
Collapse
|
45
|
Fernandes O, Portugal LCL, Alves RDCS, Arruda-Sanchez T, Volchan E, Pereira MG, Mourão-Miranda J, Oliveira L. How do you perceive threat? It's all in your pattern of brain activity. Brain Imaging Behav 2019; 14:2251-2266. [PMID: 31446554 PMCID: PMC7648008 DOI: 10.1007/s11682-019-00177-6] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Whether subtle differences in the emotional context during threat perception can be detected by multi-voxel pattern analysis (MVPA) remains a topic of debate. To investigate this question, we compared the ability of pattern recognition analysis to discriminate between patterns of brain activity to a threatening versus a physically paired neutral stimulus in two different emotional contexts (the stimulus being directed towards or away from the viewer). The directionality of the stimuli is known to be an important factor in activating different defensive responses. Using multiple kernel learning (MKL) classification models, we accurately discriminated patterns of brain activation to threat versus neutral stimuli in the directed towards context but not during the directed away context. Furthermore, we investigated whether it was possible to decode an individual’s subjective threat perception from patterns of whole-brain activity to threatening stimuli in the different emotional contexts using MKL regression models. Interestingly, we were able to accurately predict the subjective threat perception index from the pattern of brain activation to threat only during the directed away context. These results show that subtle differences in the emotional context during threat perception can be detected by MVPA. In the directed towards context, the threat perception was more intense, potentially producing more homogeneous patterns of brain activation across individuals. In the directed away context, the threat perception was relatively less intense and more variable across individuals, enabling the regression model to successfully capture the individual differences and predict the subjective threat perception.
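The multiple kernel learning (MKL) models mentioned above combine information from several feature sets through a weighted sum of kernels. The sketch below is a simplified stand-in: it fixes the kernel weights instead of learning them and uses synthetic "region" features with threat/neutral labels, so it illustrates the kernel-combination idea rather than the authors' MKL implementation.

# Simplified multi-kernel combination with a precomputed-kernel SVM on synthetic data (Python).
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import linear_kernel

rng = np.random.default_rng(2)
n_subjects = 40
X_region_a = rng.standard_normal((n_subjects, 200))   # e.g., voxels from one brain region (placeholder)
X_region_b = rng.standard_normal((n_subjects, 300))   # voxels from another region (placeholder)
y = rng.integers(0, 2, size=n_subjects)               # 1 = threat, 0 = neutral (placeholder labels)

# Build one linear kernel per feature set and combine them with fixed weights.
weights = [0.5, 0.5]
K = weights[0] * linear_kernel(X_region_a) + weights[1] * linear_kernel(X_region_b)

train = np.arange(0, 30)
test = np.arange(30, n_subjects)
clf = SVC(kernel="precomputed").fit(K[np.ix_(train, train)], y[train])
acc = clf.score(K[np.ix_(test, train)], y[test])
print(f"Held-out accuracy with the combined kernel: {acc:.2f}")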
Collapse
Affiliation(s)
- Orlando Fernandes
- Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, 255 Rodolpho Paulo Rocco st., Ilha do Fundão, Rio de Janeiro, RJ, 21941-590, Brazil.
| | - Liana Catrina Lima Portugal
- Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| | - Rita de Cássia S Alves
- Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil; IBMR University Center, Rio de Janeiro, RJ, Brazil
| | - Tiago Arruda-Sanchez
- Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, 255 Rodolpho Paulo Rocco st., Ilha do Fundão, Rio de Janeiro, RJ, 21941-590, Brazil
| | - Eliane Volchan
- Laboratory of Neurobiology II, Institute of Biophysics Carlos Chagas Filho, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
| | - Mirtes Garcia Pereira
- Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| | - Janaina Mourão-Miranda
- Centre for Medical Image Computing, Department of Computer Science, University College London, London, UK; Max Planck University College London Centre for Computational Psychiatry and Ageing Research, University College London, London, UK
| | - Letícia Oliveira
- Laboratory of Behavioral Neurophysiology, Department of Physiology and Pharmacology, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
| |
Collapse
|
46
|
Pollux PM, Craddock M, Guo K. Gaze patterns in viewing static and dynamic body expressions. Acta Psychol (Amst) 2019; 198:102862. [PMID: 31226535 DOI: 10.1016/j.actpsy.2019.05.014] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2019] [Revised: 05/09/2019] [Accepted: 05/26/2019] [Indexed: 11/25/2022] Open
Abstract
Evidence for the importance of bodily cues for emotion recognition has grown over the last two decades. Despite this growing literature, it is underspecified how observers view whole bodies for body expression recognition. Here we investigate to what extent body-viewing is face- and context-specific when participants are categorizing whole body expressions in static (Experiment 1) and dynamic displays (Experiment 2). Eye-movement recordings showed that observers viewed the face exclusively when it was visible in dynamic displays, whereas viewing was distributed over the head, torso and arms in static displays and in dynamic displays with faces not visible. The strong face bias in dynamic face-visible expressions suggests that viewing of the body responds flexibly to the informativeness of facial cues for emotion categorisation. However, when facial expressions are static or not visible, observers adopt a viewing strategy that includes all upper body regions. This viewing strategy is further influenced by subtle viewing biases directed towards emotion-specific body postures and movements, optimising the recruitment of diagnostic information for emotion categorisation.
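A minimal sketch of the kind of region-of-interest gaze analysis such a study typically involves (assumed here, not taken from the paper): assign each fixation to a body region and compute the proportion of fixations per region within each display condition. The region boxes, coordinates, and condition labels are invented placeholders.

# Toy region-of-interest gaze analysis on synthetic fixation data (Python).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
regions = {"head": (0.3, 0.7, 0.75, 1.0),      # (x_min, x_max, y_min, y_max) in normalized screen units
           "torso": (0.3, 0.7, 0.35, 0.75),
           "arms": (0.1, 0.9, 0.35, 0.75)}

def label_fixation(x, y):
    """Return the first region whose bounding box contains the fixation, else 'other'."""
    for name, (x0, x1, y0, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "other"

fixations = pd.DataFrame({
    "condition": rng.choice(["static", "dynamic_face_visible"], size=500),
    "x": rng.uniform(0, 1, 500),
    "y": rng.uniform(0, 1, 500),
})
fixations["region"] = [label_fixation(x, y) for x, y in zip(fixations.x, fixations.y)]

# Proportion of fixations per region within each condition.
proportions = (fixations.groupby("condition")["region"]
               .value_counts(normalize=True)
               .unstack(fill_value=0))
print(proportions.round(2))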
Collapse
|
47
|
Addabbo M, Vacaru SV, Meyer M, Hunnius S. 'Something in the way you move': Infants are sensitive to emotions conveyed in action kinematics. Dev Sci 2019; 23:e12873. [PMID: 31144771 DOI: 10.1111/desc.12873] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2018] [Revised: 05/20/2019] [Accepted: 05/27/2019] [Indexed: 11/28/2022]
Abstract
Body movements, as well as faces, communicate emotions. Research in adults has shown that the perception of action kinematics plays a crucial role in understanding others' emotional experiences. Still, little is known about infants' sensitivity to emotional body expressions, since most research in infancy has focused on faces. While there is some first evidence that infants can recognize emotions conveyed in whole-body postures, it remains an open question whether they can extract emotional information from action kinematics. We measured electromyographic (EMG) activity over the muscles involved in happy (zygomaticus major, ZM), angry (corrugator supercilii, CS) and fearful (frontalis, F) facial expressions, while 11-month-old infants observed the same action performed with either happy or angry kinematics. Results demonstrate that infants responded to angry and happy kinematics with matching facial reactions. In particular, ZM activity increased while CS activity decreased in response to happy kinematics, and vice versa for angry kinematics. Our results show for the first time that infants can rely on kinematic information to pick up on the emotional content of an action. Thus, from very early in life, action kinematics represent a fundamental and powerful source of information for revealing others' emotional states.
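As a rough illustration of the facial-EMG logic described above (not the authors' processing pipeline), the sketch below baseline-corrects synthetic trials and compares mean activity of the three recorded muscles between happy- and angry-kinematics conditions. The sampling rate, window boundaries, and signals are assumptions.

# Minimal facial-EMG condition comparison on synthetic trials (Python).
import numpy as np

rng = np.random.default_rng(4)
sfreq = 1000                                   # Hz (assumed)
baseline = slice(0, 500)                       # first 500 ms as baseline (assumption)
stimulus = slice(500, 2500)                    # 2 s stimulus window (assumption)

def mean_response(trials):
    """Baseline-corrected mean amplitude in the stimulus window, averaged over trials."""
    corrected = trials - trials[:, baseline].mean(axis=1, keepdims=True)
    return corrected[:, stimulus].mean()

for muscle in ["zygomaticus_major", "corrugator_supercilii", "frontalis"]:
    happy = rng.standard_normal((30, 2500))    # 30 trials x samples (placeholder EMG)
    angry = rng.standard_normal((30, 2500))
    print(f"{muscle}: happy={mean_response(happy):.3f}, angry={mean_response(angry):.3f}")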
Collapse
Affiliation(s)
- Margaret Addabbo
- Department of Psychology, University of Milano-Bicocca, Milano, Italy
| | - Stefania V Vacaru
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands
| | - Marlene Meyer
- Department of Psychology, University of Chicago, Chicago, Illinois
| | - Sabine Hunnius
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands
| |
Collapse
|
48
|
Ross P, de Gelder B, Crabbe F, Grosbras MH. Emotion modulation of the body-selective areas in the developing brain. Dev Cogn Neurosci 2019; 38:100660. [PMID: 31128318 PMCID: PMC6969350 DOI: 10.1016/j.dcn.2019.100660] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/26/2018] [Revised: 05/02/2019] [Accepted: 05/13/2019] [Indexed: 01/18/2023] Open
Abstract
Passive viewing fMRI task using dynamic emotional bodies and non-human objects. Adults showed increased activation in the body-selective areas compared with children. Adults also showed more activation than adolescents, but only in the right hemisphere. Crucially, we found no age differences in the emotion modulation of these areas.
Emotions are strongly conveyed by the human body, and the ability to recognize emotions from body posture or movement is still developing through childhood and adolescence. To date, very few studies have explored how these behavioural observations are paralleled by functional brain development, and no studies have examined the development of emotion modulation in body-selective areas. In this study, we used functional magnetic resonance imaging (fMRI) to compare the brain activity of 25 children (age 6–11), 18 adolescents (age 12–17) and 26 adults while they passively viewed short videos of angry, happy or neutral body movements. When viewing dynamic bodies generally, adults showed higher activity than children bilaterally in the body-selective areas, namely the extra-striate body area (EBA), fusiform body area (FBA) and posterior superior temporal sulcus (pSTS), as well as in the amygdala (AMY). Adults also showed higher activity than adolescents, but only in the right hemisphere. Crucially, however, we found no age differences in the emotion modulation of activity in these areas. These results indicate, for the first time, that although activity selective to body perception increases across childhood and adolescence, emotion modulation of these areas is adult-like from 7 years of age.
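One simple way to frame the "no age differences in emotion modulation" question is to compute a per-participant modulation index (emotional minus neutral response in a body-selective ROI) and compare it across age groups. The sketch below does this with a one-way ANOVA on synthetic betas, borrowing only the group sizes from the abstract; it is an illustrative approximation, not the study's fMRI analysis.

# Group comparison of an emotion-modulation index on synthetic ROI betas (Python).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
groups = {"children": 25, "adolescents": 18, "adults": 26}   # sample sizes from the abstract

modulation = {}
for name, n in groups.items():
    beta_emotional = rng.standard_normal(n)    # ROI response to angry/happy bodies (placeholder)
    beta_neutral = rng.standard_normal(n)      # ROI response to neutral bodies (placeholder)
    modulation[name] = beta_emotional - beta_neutral

stat, p = f_oneway(*modulation.values())
print(f"Group effect on emotion modulation: F={stat:.2f}, p={p:.3f}")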
Collapse
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, Durham, UK; Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK.
| | - Beatrice de Gelder
- Department of Cognitive Neuroscience, Maastricht University, Maastricht, Netherlands
| | - Frances Crabbe
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
| | - Marie-Hélène Grosbras
- Laboratoire de Neurosciences Cognitives, Aix Marseille Université, Marseille, France; Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
| |
Collapse
|
49
|
Santamaría-García H, Ibáñez A, Montaño S, García AM, Patiño-Saenz M, Idarraga C, Pino M, Baez S. Out of Context, Beyond the Face: Neuroanatomical Pathways of Emotional Face-Body Language Integration in Adolescent Offenders. Front Behav Neurosci 2019; 13:34. [PMID: 30863291 PMCID: PMC6399662 DOI: 10.3389/fnbeh.2019.00034] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2018] [Accepted: 02/07/2019] [Indexed: 12/29/2022] Open
Abstract
Background: Adolescent offenders (AOs) are characterized by social-norm transgression and aggressive behaviors. These traits have been associated with alterations in socio-cognitive processes, including facial emotion recognition. While this suggests that AOs tend to interpret negative emotional cues as threatening information, most research has relied on context-free stimuli, thus failing to directly track the integrative processes typical of everyday cognition. Methods: In this study, we assessed the impact of body language and surrounding context on facial emotion recognition in AOs and non-offenders (NOs). We recruited 35 AOs from a reform school for young male offenders and 30 NOs matched for age and sex with the former group. All participants completed a well-validated task designed to determine how contextual cues (i.e., emotional body language and surrounding context) influence facial emotion recognition, using congruent and incongruent combinations of facial and bodily emotional information. Results: AOs tended to overvalue bodily and contextual signals in emotion recognition, showing poorer facial-emotion categorization and increased sensitivity to context information in incongruent face-body scenarios. This pattern was associated with executive dysfunction and disruptive behaviors, as well as with the gray matter (GM) volume of brain regions supporting body-face recognition [fusiform gyrus (FG)], emotion processing [cingulate cortex (CC), superior temporal gyrus (STG)], contextual integration (precuneus, STG), and motor resonance [cerebellum, supplementary motor area (SMA)]. Discussion: Together, our results pave the way for a better understanding of the neurocognitive associations between contextual emotion recognition, behavioral regulation, cognitive control, and externalized behaviors in AOs.
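A toy sketch of the congruence logic described above (not the task's actual scoring): compute facial-emotion categorization accuracy for congruent versus incongruent face-body pairings in each group, plus a simple context-sensitivity index defined as their difference. All trial data are invented.

# Accuracy and context-sensitivity index on synthetic face-body congruence trials (Python).
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
trials = pd.DataFrame({
    "group": rng.choice(["AO", "NO"], size=400),
    "congruence": rng.choice(["congruent", "incongruent"], size=400),
    "correct": rng.integers(0, 2, size=400),
})

accuracy = trials.groupby(["group", "congruence"])["correct"].mean().unstack()
accuracy["context_sensitivity"] = accuracy["congruent"] - accuracy["incongruent"]
print(accuracy.round(2))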
Collapse
Affiliation(s)
- Hernando Santamaría-García
- Departamentos de Psiquiatría y Fisiología, Pontificia Universidad Javeriana, Bogotá, Colombia; Centro de memoria y cognición Intellectus, Hospital Universitario San Ignacio, Bogotá, Colombia; Grupo de Investigación Cerebro y Cognición Social, Bogotá, Colombia
| | - Agustin Ibáñez
- Laboratory of Experimental Psychology and Neuroscience (LPEN), Institute of Cognitive and Translational Neuroscience (INCYT), INECO Foundation, Favaloro University, Buenos Aires, Argentina; National Scientific and Technical Research Council (CONICET), Buenos Aires, Argentina; Departamento de Psicología, Universidad Autónoma del Caribe, Barranquilla, Colombia; Center for Social and Cognitive Neuroscience (CSCN), School of Psychology, Universidad Adolfo Ibáñez, Santiago de Chile, Chile; Australian Research Council Centre of Excellence in Cognition and its Disorders, Sydney, NSW, Australia
| | - Synella Montaño
- Departamento de Psicología, Universidad Autónoma del Caribe, Barranquilla, Colombia
| | - Adolfo M García
- Laboratory of Experimental Psychology and Neuroscience (LPEN), Institute of Cognitive and Translational Neuroscience (INCYT), INECO Foundation, Favaloro University, Buenos Aires, Argentina; National Scientific and Technical Research Council (CONICET), Buenos Aires, Argentina; Faculty of Education, National University of Cuyo (UNCuyo), Mendoza, Argentina
| | | | - Claudia Idarraga
- Departamento de Psicología, Universidad de la Costa, Barranquilla, Colombia
| | - Mariana Pino
- Departamento de Psicología, Universidad Autónoma del Caribe, Barranquilla, Colombia
| | - Sandra Baez
- Grupo de Investigación Cerebro y Cognición Social, Bogotá, Colombia; Departamento de Psicología, Universidad de los Andes, Bogotá, Colombia
| |
Collapse
|
50
|
Whitehead JC, Armony JL. Multivariate fMRI pattern analysis of fear perception across modalities. Eur J Neurosci 2019; 49:1552-1563. [DOI: 10.1111/ejn.14322] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/26/2018] [Revised: 11/23/2018] [Accepted: 12/17/2018] [Indexed: 01/04/2023]
Affiliation(s)
- Jocelyne C. Whitehead
- Douglas Mental Health University Institute, Verdun, Quebec, Canada
- BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada
- Integrated Program in Neuroscience, McGill University, Montreal, Quebec, Canada
| | - Jorge L. Armony
- Douglas Mental Health University Institute, Verdun, Quebec, Canada
- BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada
- Department of Psychiatry, McGill University, Montreal, Quebec, Canada
| |
Collapse
|