1
Li Q, Zhang X, Yang X, Pan N, Li X, Kemp GJ, Wang S, Gong Q. Pre-COVID brain network topology prospectively predicts social anxiety alterations during the COVID-19 pandemic. Neurobiol Stress 2023; 27:100578. [PMID: 37842018] [PMCID: PMC10570707] [DOI: 10.1016/j.ynstr.2023.100578]
Abstract
Background Social anxiety (SA) is a negative emotional response that can lead to mental health problems, and one that many people experienced during the coronavirus disease 2019 (COVID-19) pandemic. Little attention has been given to the neurobiological mechanisms underlying inter-individual differences in COVID-related SA alterations. This study aimed to identify neurofunctional markers of COVID-specific SA development. Methods 110 healthy participants underwent resting-state magnetic resonance imaging and behavioral tests before the pandemic (T1, October 2019 to January 2020) and completed follow-up behavioral measurements during the pandemic (T2, February to May 2020). We constructed individual functional networks, used graph theoretical analysis to estimate their global and nodal topological properties, and then used Pearson correlation and partial least squares correlation to examine associations between these properties and COVID-specific SA alterations. Results At the global level, SA alterations (T2 - T1) were negatively related to pre-pandemic brain small-worldness and normalized clustering coefficient. At the nodal level, SA alterations were positively linked to a pronounced degree centrality pattern encompassing both high-level cognitive networks (dorsal attention, cingulo-opercular task control, default mode, memory retrieval, fronto-parietal task control, and subcortical networks) and low-level perceptual networks (sensory/somatomotor, auditory, and visual networks). These findings were robust after controlling for pre-pandemic general anxiety, other stressful life events, and family socioeconomic status, and when SA alterations were treated as categorical variables.
Conclusions The individual functional network associated with SA alterations showed a disrupted topological organization with a more random state, which may shed light on the neurobiological basis of COVID-related SA changes at the network level.
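The two global metrics at the heart of this entry, the normalized clustering coefficient (gamma) and small-worldness (sigma), can be illustrated with NetworkX. This is a hedged sketch, not the authors' pipeline: the example graph, the swap parameters, and the number of degree-preserving random reference networks are all illustrative assumptions.

```python
import networkx as nx

def small_worldness(G, n_rand=10, seed=0):
    """Return (sigma, gamma) for a connected graph G.

    gamma  = C / C_rand : normalized clustering coefficient
    lambda = L / L_rand : normalized characteristic path length
    sigma  = gamma / lambda ; sigma > 1 suggests small-world topology.
    """
    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)
    C_rand, L_rand = [], []
    for i in range(n_rand):
        # degree-preserving randomization via double-edge swaps
        R = nx.random_reference(G, niter=5, seed=seed + i)
        C_rand.append(nx.average_clustering(R))
        L_rand.append(nx.average_shortest_path_length(R))
    gamma = C / (sum(C_rand) / n_rand)
    lam = L / (sum(L_rand) / n_rand)
    return gamma / lam, gamma

# a Watts-Strogatz graph is small-world by construction
G = nx.connected_watts_strogatz_graph(60, k=6, p=0.1, seed=1)
sigma, gamma = small_worldness(G)
```

In the study, lower pre-pandemic sigma and gamma of individual functional networks predicted larger SA increases; here the metrics are simply computed on a synthetic graph.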
Affiliation(s)
- Qingyuan Li
- Department of Interventional Therapy, National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, 100021, China
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, 610041, China
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, 610041, China
- Functional & Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, 610041, China
- Xun Zhang
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, 610041, China
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, 610041, China
- Functional & Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, 610041, China
- Xun Yang
- School of Public Affairs, Chongqing University, Chongqing, 400044, China
- Nanfang Pan
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, 610041, China
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, 610041, China
- Functional & Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, 610041, China
- Xiao Li
- Department of Interventional Therapy, National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, 100021, China
- Graham J. Kemp
- Liverpool Magnetic Resonance Imaging Centre (LiMRIC) and Institute of Life Course and Medical Sciences, University of Liverpool, Liverpool, L69 3BX, UK
- Song Wang
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, 610041, China
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, 610041, China
- Functional & Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, 610041, China
- Qiyong Gong
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, 610041, China
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, 610041, China
- Department of Radiology, West China Xiamen Hospital of Sichuan University, Xiamen, 361000, China
2
Tanabe H, Yamamoto K. Structural equation modeling of female gait attractiveness using gait kinematics. Sci Rep 2023; 13:17823. [PMID: 37857803] [PMCID: PMC10587354] [DOI: 10.1038/s41598-023-45130-2]
Abstract
In our social lives, the attractiveness of body movement strongly affects interpersonal cognition, and gait kinematics mediate a walker's attractiveness. However, no model based on gait kinematics has so far predicted gait attractiveness. This study therefore constructed models of female gait attractiveness, with gait kinematics and physique factors as explanatory variables, for both barefoot and high-heel walking. First, using motion capture data from 17 women walking, including seven professional runway models, we created gait animations. We also calculated candidate explanatory variables of gait attractiveness: four body-silhouette-related variables and six health-related variables. Then, 60 observers rated each gait animation's attractiveness and femininity. We performed correlation analyses between these variables and the evaluation scores to select explanatory variables. Structural equation modeling suggested two models of gait attractiveness, one composed of trunk and head silhouette factors, the other of physique, trunk silhouette, and health-related gait factors. These results deepen our understanding of the mechanisms behind nonverbal interpersonal cognition through physical movement and bring closer the artificial generation of attractive gait motions.
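The variable-selection step described above (correlating candidate kinematic variables with mean attractiveness ratings before modeling) can be sketched as follows; the variable names and the synthetic data are illustrative assumptions, not the paper's measures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_walkers = 17  # matches the sample size reported above

# hypothetical kinematic variables per walker (names are illustrative)
candidates = {
    "trunk_sway": rng.normal(size=n_walkers),
    "head_stability": rng.normal(size=n_walkers),
    "step_cadence": rng.normal(size=n_walkers),
}
# synthetic mean attractiveness score per walker, driven here by trunk_sway
attractiveness = 0.9 * candidates["trunk_sway"] + 0.1 * rng.normal(size=n_walkers)

# keep only variables significantly correlated with the ratings (p < .05)
results = {name: stats.pearsonr(x, attractiveness) for name, x in candidates.items()}
explanatory = [name for name, (r, p) in results.items() if p < 0.05]
```

The surviving variables would then enter the structural equation model as observed indicators.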
Affiliation(s)
- Hiroko Tanabe
- Institutes of Innovation for Future Society, Nagoya University, Furo-cho, Chikusa-ku, Nagoya-shi, Aichi, 464-8601, Japan.
- Kota Yamamoto
- Japan Society for the Promotion of Science, 5-3-1 Kojimachi, Chiyoda-ku, Tokyo, 102-0083, Japan
- Graduate School of Informatics, Nagoya University, Furo-cho, Chikusa-ku, Nagoya-shi, Aichi, 464-8601, Japan
3
Wang G, Zeng M, Li J, Liu Y, Wei D, Long Z, Chen H, Zang X, Yang J. Neural Representation of Collective Self-esteem in Resting-state Functional Connectivity and its Validation in Task-dependent Modality. Neuroscience 2023; 530:66-78. [PMID: 37619767] [DOI: 10.1016/j.neuroscience.2023.08.017]
Abstract
INTRODUCTION Collective self-esteem (CSE) is an important personality variable, defined as self-worth derived from membership in social groups. A previous study explored the neural basis of CSE using a task-based functional magnetic resonance imaging (fMRI) paradigm; however, the task-independent neural basis of CSE remains unexplored, and it is unclear whether the resting-state neural basis of CSE is consistent with the task-based one. METHODS We built support vector regression (SVR) models to predict CSE scores using topological metrics of the resting-state functional connectivity (RSFC) network as features. Then, to test the reliability of the SVR analysis, the activation pattern of the brain regions identified by SVR was used as features to distinguish collective self-worth from other conditions by multivariate pattern classification in a task-based fMRI dataset. RESULTS SVR analysis showed that leverage centrality successfully decoded individual differences in CSE. The ventromedial prefrontal cortex, anterior cingulate cortex, posterior cingulate gyrus, precuneus, orbitofrontal cortex, posterior insula, postcentral gyrus, inferior parietal lobule, temporoparietal junction, and inferior frontal gyrus, regions involved in self-referential processing, affective processing, and social cognition networks, contributed to this prediction. Multivariate pattern classification found that the activation pattern of the regions identified by SVR successfully distinguished collective self-worth from relational self-worth, personal self-worth, and semantic control. CONCLUSION Our findings reveal the neural basis of CSE in the whole-brain RSFC network and establish the concordance, in representing CSE, between leverage centrality and the activation pattern of the identified regions evoked during the collective self-worth task.
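The prediction step can be sketched with scikit-learn: topological features (here, hypothetical leverage-centrality values) feed a support vector regression whose cross-validated predictions are correlated with the observed scores. Dimensions, kernel, and data are synthetic assumptions, not the study's settings.

```python
import numpy as np
from scipy import stats
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.svm import SVR

rng = np.random.default_rng(42)
n_subjects, n_nodes = 100, 50   # synthetic dimensions

# hypothetical leverage-centrality features: one value per node per subject
X = rng.normal(size=(n_subjects, n_nodes))
# synthetic CSE scores weakly dependent on the first five nodes
cse = X[:, :5].sum(axis=1) + rng.normal(size=n_subjects)

# linear-kernel SVR with 10-fold cross-validated prediction
model = SVR(kernel="linear")
cv = KFold(n_splits=10, shuffle=True, random_state=0)
pred = cross_val_predict(model, X, cse, cv=cv)

# prediction accuracy: correlation between predicted and observed scores
r, p = stats.pearsonr(pred, cse)
```

A significant predicted-observed correlation is the usual criterion for saying the features "decode" individual differences.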
Affiliation(s)
- Guangtong Wang
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Mei Zeng
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Jiwen Li
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Yadong Liu
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Dongtao Wei
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Zhiliang Long
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Haopeng Chen
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Xinlei Zang
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China
- Juan Yang
- Faculty of Psychology, Southwest University, Chongqing 400715, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing 400715, China.
4
Zhang X, Cheng B, Yang X, Suo X, Pan N, Chen T, Wang S, Gong Q. Emotional intelligence mediates the protective role of the orbitofrontal cortex spontaneous activity measured by fALFF against depressive and anxious symptoms in late adolescence. Eur Child Adolesc Psychiatry 2023; 32:1957-1967. [PMID: 35737106] [DOI: 10.1007/s00787-022-02020-8]
Abstract
As a stable personality construct, trait emotional intelligence (TEI) refers to a set of perceived emotion-related skills that enable individuals to adapt effectively to their environment and maintain well-being. Abundant evidence has consistently shown that TEI is important for the outcomes of many mental health issues, particularly depression and anxiety. However, the neural substrates of TEI and the neurobehavioral mechanism by which TEI reduces depressive and anxious symptoms remain largely unknown. Here, resting-state functional magnetic resonance imaging and a battery of behavioral measures were used to examine these questions in a large sample of 231 adolescent students aged 16-20 years (52% female). Whole-brain correlation and prediction analyses demonstrated that TEI was negatively linked with spontaneous activity (measured by the fractional amplitude of low-frequency fluctuations) in the bilateral medial orbitofrontal cortex (OFC), a critical site in emotion-related processes. Furthermore, structural equation modeling showed that TEI mediated the link between OFC spontaneous activity and depressive and anxious symptoms. Collectively, these findings present new evidence for the neurofunctional bases of TEI and suggest a potential "brain-personality-symptom" pathway for alleviating depressive and anxious symptoms among students in late adolescence.
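The fALFF measure named above can be sketched from its definition: the spectral amplitude within the low-frequency band (typically 0.01-0.08 Hz) divided by the amplitude over the whole detectable frequency range. TR, band limits, and the synthetic time series are assumptions.

```python
import numpy as np

def falff(ts, tr=2.0, band=(0.01, 0.08)):
    """Fractional ALFF: summed spectral amplitude inside the low-frequency
    band divided by the summed amplitude over the whole frequency range."""
    ts = ts - ts.mean()
    amp = np.abs(np.fft.rfft(ts))
    freqs = np.fft.rfftfreq(ts.size, d=tr)
    low = (freqs >= band[0]) & (freqs <= band[1])
    return amp[low].sum() / amp[1:].sum()   # skip the DC bin

rng = np.random.default_rng(0)
t = np.arange(240) * 2.0                    # 240 volumes at TR = 2 s
slow = np.sin(2 * np.pi * 0.025 * t)        # 0.025 Hz: inside the band
signal = slow + 0.3 * rng.normal(size=t.size)
value = falff(signal, tr=2.0)
```

A voxel dominated by slow fluctuations yields a higher fALFF than one containing only broadband noise.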
Affiliation(s)
- Xun Zhang
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, China
- Functional and Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, China
- Bochao Cheng
- Department of Radiology, West China Second University Hospital of Sichuan University, Chengdu, China
- Xun Yang
- School of Public Affairs, Chongqing University, Chongqing, China
- Xueling Suo
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, China
- Functional and Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, China
- Nanfang Pan
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, China
- Functional and Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, China
- Taolin Chen
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, China
- Functional and Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, China
- Song Wang
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China.
- Research Unit of Psychoradiology, Chinese Academy of Medical Sciences, Chengdu, China.
- Functional and Molecular Imaging Key Laboratory of Sichuan Province, West China Hospital of Sichuan University, Chengdu, China.
- Qiyong Gong
- Huaxi MR Research Center (HMRRC), Department of Radiology, West China Hospital of Sichuan University, Chengdu, China.
- Department of Radiology, West China Xiamen Hospital of Sichuan University, Xiamen, China.
5
Scheliga S, Kellermann T, Lampert A, Rolke R, Spehr M, Habel U. Neural correlates of multisensory integration in the human brain: an ALE meta-analysis. Rev Neurosci 2023; 34:223-245. [PMID: 36084305] [DOI: 10.1515/revneuro-2022-0065]
Abstract
Previous fMRI research identified the superior temporal sulcus as a central integration area for audiovisual stimuli. However, less is known about a general multisensory integration network across the senses. We therefore conducted an activation likelihood estimation (ALE) meta-analysis across multiple sensory modalities to identify a common brain network. We included 49 studies covering all the Aristotelian senses, i.e., auditory, visual, tactile, gustatory, and olfactory stimuli. The analysis revealed significant activation in the bilateral superior temporal gyrus, middle temporal gyrus, thalamus, right insula, and left inferior frontal gyrus. We propose that these regions form a general multisensory integration network with distinct functional roles: the thalamus operates as a first subcortical relay projecting sensory information to higher cortical integration centers in the superior temporal gyrus/sulcus, while conflict-processing regions such as the insula and inferior frontal gyrus facilitate the integration of incongruent information. We additionally performed meta-analytic connectivity modelling and found that each brain region showed co-activations within the identified network. By including multiple sensory modalities, this meta-analysis thus provides evidence for a common brain network that supports distinct functional roles in multisensory integration.
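The ALE procedure can be sketched in simplified form: each reported focus is modelled as a 3-D Gaussian probability map, the maps are combined within each experiment, and the ALE value is the voxelwise union across experiments (1 minus the product of 1 minus each modelled-activation map). Grid size, kernel width, and foci are toy assumptions; real ALE uses sample-size-dependent kernels and permutation-based thresholding.

```python
import numpy as np

def gaussian_map(shape, center, sigma):
    """3-D Gaussian probability kernel around one reported focus."""
    zz, yy, xx = np.indices(shape)
    d2 = (zz - center[0])**2 + (yy - center[1])**2 + (xx - center[2])**2
    return np.exp(-d2 / (2.0 * sigma**2))

def ale_map(experiments, shape=(20, 20, 20), sigma=2.0):
    """ALE = 1 - prod_j(1 - MA_j), the voxelwise union of the
    per-experiment modelled-activation (MA) maps."""
    ale = np.zeros(shape)
    for foci in experiments:
        ma = np.zeros(shape)
        for focus in foci:          # union over the experiment's foci
            ma = 1 - (1 - ma) * (1 - gaussian_map(shape, focus, sigma))
        ale = 1 - (1 - ale) * (1 - ma)
    return ale

# two hypothetical experiments reporting foci near (10, 10, 10)
experiments = [[(10, 10, 10)], [(11, 10, 10), (3, 3, 3)]]
ale = ale_map(experiments)
```

Voxels where foci from several experiments converge receive high ALE values, which are then tested against a null distribution of random foci.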
Affiliation(s)
- Sebastian Scheliga
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Thilo Kellermann
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany; JARA-Institute Brain Structure Function Relationship, Pauwelsstraße 30, 52074 Aachen, Germany
- Angelika Lampert
- Institute of Physiology, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Roman Rolke
- Department of Palliative Medicine, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Marc Spehr
- Department of Chemosensation, RWTH Aachen University, Institute for Biology, Worringerweg 3, 52074 Aachen, Germany
- Ute Habel
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany; JARA-Institute Brain Structure Function Relationship, Pauwelsstraße 30, 52074 Aachen, Germany
6
Resting-state BOLD temporal variability in sensorimotor and salience networks underlies trait emotional intelligence and explains differences in emotion regulation strategies. Sci Rep 2022; 12:15163. [PMID: 36071093] [PMCID: PMC9452559] [DOI: 10.1038/s41598-022-19477-x]
Abstract
A converging body of behavioural findings supports the hypothesis that the dispositional use of emotion regulation (ER) strategies depends on trait emotional intelligence (trait EI) levels. However, neuroscientific investigations of this relationship are lacking. To fill this gap, we analysed trait measures and resting-state data from 79 healthy participants to investigate whether trait EI and ER processes are associated with similar neural circuits. An unsupervised machine learning approach (independent component analysis) was used to decompose resting-state functional networks and to assess whether they predict trait EI and specific ER strategies. Individual-differences analyses showed that high trait EI significantly predicts, and negatively correlates with, the frequency of use of typical dysfunctional ER strategies. Crucially, increased BOLD temporal variability within sensorimotor and salience networks was associated with both high trait EI and frequent use of cognitive reappraisal, whereas decreased variability in the salience network was associated with the use of suppression. These findings support the tight connection between trait EI and the individual tendency to use functional ER strategies, and provide the first evidence that modulations of BOLD temporal variability in specific brain networks may help explain this relationship.
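BOLD temporal variability, the key measure here, is commonly operationalized as the standard deviation of a network's BOLD time series; a minimal sketch on synthetic data (network labels and amplitudes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
n_timepoints = 200
labels = ["sensorimotor", "salience", "default_mode"]   # illustrative networks

# synthetic network-averaged BOLD time series with different amplitudes
amplitudes = np.array([2.0, 1.5, 1.0])
bold = rng.normal(size=(n_timepoints, len(labels))) * amplitudes

# temporal variability = standard deviation of each network's time series
variability = dict(zip(labels, bold.std(axis=0, ddof=1)))
```

Per-network variability values like these would then be entered as predictors of trait measures across participants.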
7
Zuberer A, Schwarz L, Kreifelts B, Wildgruber D, Erb M, Fallgatter A, Scheffler K, Ethofer T. Neural Basis of Impaired Emotion Recognition in Adult Attention-Deficit/Hyperactivity Disorder. Biol Psychiatry Cogn Neurosci Neuroimaging 2022; 7:680-687. [PMID: 33551283] [DOI: 10.1016/j.bpsc.2020.11.013]
Abstract
BACKGROUND Deficits in emotion recognition have been repeatedly documented in patients diagnosed with attention-deficit/hyperactivity disorder (ADHD), but their neural basis has so far remained unknown. METHODS In the current study, adult patients with ADHD (n = 44) and healthy control subjects (n = 43) underwent functional magnetic resonance imaging during explicit emotion recognition of stimuli expressing affective information in faces, voices, or face-voice combinations. The experimental paradigm allowed us to delineate areas for processing audiovisual information based on their functional activation profile, including the bilateral posterior superior temporal gyrus/middle temporal gyrus, amygdala, medial prefrontal cortex, and precuneus, as well as the right posterior thalamus. RESULTS As expected, unbiased hit rates for correct classification of the expressed emotions were lower in patients with ADHD than in healthy control subjects, irrespective of the presented sensory modality. This behavioral deficit was accompanied by lower activation in patients with ADHD versus healthy control subjects in the cortex adjacent to the right superior temporal gyrus/middle temporal gyrus and in the right posterior thalamus, key areas for processing socially relevant signals and integrating them across modalities. A cortical region adjacent to the right posterior superior temporal gyrus was the only brain region showing a significant correlation between brain activation and emotion identification performance. CONCLUSIONS Altogether, these results provide the first evidence for a potential neural substrate of the observed impairments in emotion recognition in adults with ADHD.
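The unbiased hit rate used as the behavioral measure here is typically Wagner's (1993) statistic, which corrects raw accuracy for response bias; a minimal sketch with a hypothetical confusion matrix:

```python
import numpy as np

def unbiased_hit_rates(confusion):
    """Wagner's (1993) unbiased hit rate per category:
    Hu_i = hits_i**2 / (stimulus_total_i * response_total_i)."""
    confusion = np.asarray(confusion, dtype=float)
    hits = np.diag(confusion)
    stim_totals = confusion.sum(axis=1)   # rows: presented category
    resp_totals = confusion.sum(axis=0)   # columns: chosen category
    return hits**2 / (stim_totals * resp_totals)

# hypothetical 3-emotion confusion matrix (rows = presented, cols = chosen)
conf = [[8, 1, 1],
        [2, 6, 2],
        [1, 3, 6]]
hu = unbiased_hit_rates(conf)
```

Unlike raw accuracy, Hu penalizes a category that is chosen indiscriminately, since its response total inflates the denominator.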
Affiliation(s)
- Agnieszka Zuberer
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany; Department of Psychiatry and Psychotherapy, Jena University Hospital, Jena, Germany.
- Lena Schwarz
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Dirk Wildgruber
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Michael Erb
- Department of Biomedical Magnetic Resonance, University of Tübingen, Tübingen, Germany
- Andreas Fallgatter
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Klaus Scheffler
- Department of Biomedical Magnetic Resonance, University of Tübingen, Tübingen, Germany; Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Thomas Ethofer
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany; Department of Biomedical Magnetic Resonance, University of Tübingen, Tübingen, Germany
8
Correlates of individual voice and face preferential responses during resting state. Sci Rep 2022; 12:7117. [PMID: 35505233] [PMCID: PMC9065073] [DOI: 10.1038/s41598-022-11367-6]
Abstract
Human nonverbal social signals are transmitted to a large extent by vocal and facial cues. The prominent importance of these cues is reflected in specialized cerebral regions that respond preferentially to them, e.g., the temporal voice area (TVA) for human voices and the fusiform face area (FFA) for human faces. However, it has remained unknown whether such specializations also exist during resting state, i.e., in the absence of any cues, and if so, whether these representations share neural substrates across sensory modalities. In the present study, resting-state functional connectivity (RSFC) as well as voice- and face-preferential activations were analysed from functional magnetic resonance imaging (fMRI) datasets of 60 healthy individuals. Data analysis comprised seed-based analyses using the TVA and FFA as regions of interest (ROIs) as well as multivoxel pattern analyses (MVPA). Using the face- and voice-preferential responses of the FFA and TVA as regressors, we identified several correlated clusters during resting state spread across frontal, temporal, parietal, and occipital regions. Using these regions as seeds, characteristic and distinct network patterns emerged: a predominantly convergent pattern for the bilateral TVAs and a largely divergent pattern for the bilateral FFAs. One region in the anterior medial frontal cortex displayed a maximum of supramodal convergence of informative connectivity patterns reflecting the voice- and face-preferential responses of both TVAs and the right FFA, pointing to shared neural resources in supramodal voice and face processing. The association of individual voice- and face-preferential neural activity with resting-state connectivity patterns supports the perspective of a network function of the brain beyond the activation of specialized regions.
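The seed-based RSFC analysis described here reduces to correlating a seed ROI's time series with every other voxel's time series; a minimal sketch on synthetic data (the "seed-coupled" voxels and all dimensions are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n_timepoints, n_voxels = 150, 500

# synthetic voxel time series; the first 20 voxels share signal with the seed
voxels = rng.normal(size=(n_timepoints, n_voxels))
seed = rng.normal(size=n_timepoints)        # e.g. a mean ROI time series
voxels[:, :20] += 1.5 * seed[:, None]

# seed-based connectivity: Pearson correlation of the seed with every voxel
seed_z = (seed - seed.mean()) / seed.std()
vox_z = (voxels - voxels.mean(axis=0)) / voxels.std(axis=0)
rsfc = (vox_z * seed_z[:, None]).mean(axis=0)
```

The mean of the products of z-scored signals equals the Pearson correlation, so `rsfc` is a whole-brain connectivity map for the seed.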
9
Proverbio AM, Santoni S, Adorni R. ERP Markers of Valence Coding in Emotional Speech Processing. iScience 2020; 23:100933. [PMID: 32151976] [PMCID: PMC7063241] [DOI: 10.1016/j.isci.2020.100933]
Abstract
How is auditory emotional information processed? The aim of this study was to compare cerebral responses to emotionally positive versus negative spoken phrases matched for structure and content. Twenty participants listened to 198 vocal stimuli while detecting filler phrases containing first names. EEG was recorded from 128 sites. Three event-related potential (ERP) components were quantified and found to be sensitive to emotional valence from 350 ms of latency onward. The P450 and a late positivity were enhanced by positive content, whereas an anterior negativity was larger for negative content. A similar set of markers (P300, N400, LP) was found previously for the processing of positive versus negative affective vocalizations, prosody, and music, suggesting a common neural mechanism for extracting the emotional content of auditory information. SwLORETA applied to potentials recorded between 350 and 550 ms showed that negative speech activated right temporo-parietal areas (BA40, BA20/21), whereas positive speech activated the left homologous and inferior frontal areas.
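ERP components such as the P450 reported here are commonly quantified as the mean amplitude within a latency window; a minimal sketch with a synthetic waveform (peak latency, window, and noise level are assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 500                                  # sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1.0 / fs)    # epoch from -200 to 800 ms

# synthetic ERP at one electrode: positivity peaking near 450 ms plus noise
erp = 4.0 * np.exp(-((times - 0.45) ** 2) / (2 * 0.05 ** 2))
erp += 0.5 * rng.normal(size=times.size)

# quantify the component as the mean amplitude in the 350-550 ms window
window = (times >= 0.35) & (times <= 0.55)
p450_amplitude = erp[window].mean()
baseline = erp[times < 0].mean()
```

Windowed mean amplitudes per condition and electrode would then be compared statistically, e.g. positive versus negative phrases.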
Affiliation(s)
- Alice Mado Proverbio
- Milan Center for Neuroscience, Department of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, Milan, Italy.
- Sacha Santoni
- Milan Center for Neuroscience, Department of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, Milan, Italy
- Roberta Adorni
- Milan Center for Neuroscience, Department of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, Milan, Italy
10
Kreifelts B, Ethofer T, Wiegand A, Brück C, Wächter S, Erb M, Lotze M, Wildgruber D. The Neural Correlates of Face-Voice-Integration in Social Anxiety Disorder. Front Psychiatry 2020; 11:657. [PMID: 32765311] [PMCID: PMC7381153] [DOI: 10.3389/fpsyt.2020.00657]
Abstract
Faces and voices are very important sources of threat in social anxiety disorder (SAD), a common psychiatric disorder whose core elements are fear of social exclusion and negative evaluation. Previous research in social anxiety has shown increased cerebral responses to negative facial or vocal expressions, as well as generally increased hemodynamic responses to voices and faces. But it is unclear whether the cerebral integration of faces and voices is also altered in SAD. Using functional magnetic resonance imaging, we investigated the correlates of the audiovisual integration of dynamic faces and voices in SAD as compared to healthy individuals. In the bilateral midsections of the superior temporal sulcus (STS), increased integration effects were observed in SAD, driven by greater activation increases during audiovisual as compared to auditory stimulation. This effect was accompanied by increased functional connectivity with the visual association cortex and a more anterior position of the individual integration maxima along the STS in SAD. These findings demonstrate that the audiovisual integration of facial and vocal cues in SAD is systematically altered not only in intensity and connectivity but also in the individual location of the integration areas within the STS. Together, these findings offer a novel perspective on the neural representation of social signal processing in individuals suffering from SAD.
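Audiovisual integration effects of the kind tested here are often operationalized with voxelwise criteria such as the audiovisual response exceeding the stronger unisensory response (the "max criterion"); a minimal sketch on synthetic voxel betas (all numbers are assumptions, not necessarily this study's criterion):

```python
import numpy as np

rng = np.random.default_rng(11)
n_voxels = 1000

beta_a = rng.normal(size=n_voxels)            # auditory-condition betas
beta_v = rng.normal(size=n_voxels)            # visual-condition betas
unisensory_max = np.maximum(beta_a, beta_v)

# audiovisual betas: at chance level except 50 'integration' voxels
beta_av = unisensory_max + 0.1 * rng.normal(size=n_voxels)
beta_av[:50] += 1.0                           # superadditive responses

# max criterion: AV response exceeding the stronger unisensory response
integration_mask = beta_av > unisensory_max
```

Group differences in such integration effects (and in the location of their maxima) are then compared between patients and controls.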
Affiliation(s)
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Thomas Ethofer
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany; Department for Biomedical Magnetic Resonance, University of Tübingen, Tübingen, Germany
- Ariane Wiegand
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Carolin Brück
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Sarah Wächter
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Michael Erb
- Department for Biomedical Magnetic Resonance, University of Tübingen, Tübingen, Germany
- Martin Lotze
- Functional Imaging Group, Department for Diagnostic Radiology and Neuroradiology, University of Greifswald, Greifswald, Germany
- Dirk Wildgruber
- Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
11
Gao C, Weber CE, Shinkareva SV. The brain basis of audiovisual affective processing: Evidence from a coordinate-based activation likelihood estimation meta-analysis. Cortex 2019; 120:66-77. [DOI: 10.1016/j.cortex.2019.05.016]
12
Aryani A, Hsu CT, Jacobs AM. Affective iconic words benefit from additional sound-meaning integration in the left amygdala. Hum Brain Mapp 2019; 40:5289-5300. [PMID: 31444898 PMCID: PMC6864889 DOI: 10.1002/hbm.24772] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2019] [Revised: 07/21/2019] [Accepted: 07/31/2019] [Indexed: 01/01/2023] Open
Abstract
Recent studies have shown that similarity between the sound and the meaning of a word (i.e., iconicity) can facilitate access to that word's meaning, but the neural mechanisms underlying this beneficial role of iconicity in semantic processing remain largely unknown. In an fMRI study, we focused on the affective domain and examined whether affective iconic words (e.g., high arousal in both sound and meaning) activate additional brain regions that integrate emotional information from different domains (i.e., sound and meaning). In line with our hypothesis, affective iconic words, compared to their non-iconic counterparts, elicited additional BOLD responses in the left amygdala, known for its role in the multimodal representation of emotions. Functional connectivity analyses revealed that the observed amygdalar activity was modulated by an interaction of iconic condition and activations in two hubs representative of processing the sound (left superior temporal gyrus) and meaning (left inferior frontal gyrus) of words. These results provide a neural explanation for the facilitative role of iconicity in language processing and indicate that language users are sensitive to the interaction between the sound and meaning aspects of words, suggesting that iconicity is a general property of human language.
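The connectivity finding above (amygdalar activity modulated by an interaction of condition and hub activation) is the kind of result typically obtained with a psychophysiological interaction (PPI) style model. As a purely illustrative sketch, assuming a +1/-1 condition coding and a mean-centered seed time course (none of these variable names or values are from the study):

```python
# Hedged sketch of a PPI-style interaction regressor: the elementwise
# product of a psychological condition vector and a mean-centered
# physiological seed time course. All values below are hypothetical.

def ppi_regressor(condition, seed):
    """Elementwise product of a +1/-1 condition vector and a
    mean-centered seed time course."""
    mu = sum(seed) / len(seed)
    return [c * (s - mu) for c, s in zip(condition, seed)]

condition = [1, 1, -1, -1]   # iconic vs. non-iconic trials, coded +1/-1
seed = [2.0, 3.0, 1.0, 2.0]  # hypothetical STG seed time course
ppi = ppi_regressor(condition, seed)  # -> [0.0, 1.0, 1.0, 0.0]
```

In practice the interaction regressor is entered into a GLM alongside the condition and seed regressors, so that the interaction term captures variance beyond their main effects.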
Affiliation(s)
- Arash Aryani
- Department of Experimental and Neurocognitive Psychology, Freie Universität Berlin, Germany
- Chun-Ting Hsu
- Kokoro Research Center, Kyoto University, Kyoto, Japan
- Arthur M Jacobs
- Department of Experimental and Neurocognitive Psychology, Freie Universität Berlin, Germany; Centre for Cognitive Neuroscience Berlin (CCNB), Berlin, Germany
13
Song L, Meng J, Liu Q, Huo T, Zhu X, Li Y, Ren Z, Wang X, Qiu J. Polygenic Score of Subjective Well-Being Is Associated with the Brain Morphology in Superior Temporal Gyrus and Insula. Neuroscience 2019; 414:210-218. [PMID: 31173807 DOI: 10.1016/j.neuroscience.2019.05.055] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2019] [Revised: 05/07/2019] [Accepted: 05/27/2019] [Indexed: 01/15/2023]
Abstract
Subjective well-being (SWB) is closely related to our physical and mental health. Existing studies show that both neural and genetic factors underpin individual differences in SWB. Moreover, researchers have found high enrichment of SWB-related variants in the central nervous system, but the relationship between the genetic architecture of SWB and brain morphology has not been explored. Considering the polygenic nature of SWB, in this study we aimed to establish a measure of the additive genetic effect on SWB and explore its relationship to brain anatomical structure. Based on the results of a genome-wide association study (GWAS) of SWB, polygenic scores (PGSs) for SWB at eight different thresholds were calculated in a large Chinese sample (N = 585). We then analyzed the associations between the PGSs of SWB and cortical thickness (CT) or gray matter volume (GMV) measured from 3.0-T structural imaging data. In whole-brain analyses, we found that a higher PGS was significantly associated with increased CT in the right superior temporal gyrus (STG) and increased GMV in the right insula, both regions involved in social cognition and emotional processing. More importantly, these findings were repeatable across several thresholds. The results suggest that the brain morphology of the right STG and insula is partly regulated by SWB-related genetic factors.
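The thresholded polygenic scoring described above reduces, in essence, to a weighted sum of risk-allele dosages over the SNPs whose GWAS p-value passes each threshold. A minimal sketch with hypothetical numbers (real pipelines such as PLINK or PRSice additionally handle LD clumping and strand alignment):

```python
# Minimal sketch of polygenic score (PGS) computation at multiple p-value
# thresholds. All per-SNP data below are hypothetical, for illustration only.

def polygenic_score(dosages, betas, pvals, threshold):
    """Weighted sum of risk-allele dosages over SNPs passing the threshold."""
    return sum(d * b for d, b, p in zip(dosages, betas, pvals) if p <= threshold)

# Hypothetical data for one individual at four SNPs:
dosages = [0, 1, 2, 1]               # risk-allele counts
betas   = [0.10, -0.05, 0.20, 0.02]  # GWAS effect sizes for SWB
pvals   = [1e-8, 1e-4, 0.03, 0.4]    # GWAS association p-values

# More liberal thresholds include more SNPs in the score:
scores = {t: polygenic_score(dosages, betas, pvals, t)
          for t in (1e-5, 0.05, 0.5)}
```

Each individual's scores at the chosen thresholds would then be correlated with the imaging measures (CT, GMV), as in the analysis the abstract describes.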
Affiliation(s)
- Li Song
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Jie Meng
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Qiang Liu
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Tengbin Huo
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Xingxing Zhu
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Yiman Li
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Zhiting Ren
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Xiao Wang
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China
- Jiang Qiu
- Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Chongqing 400715, China; School of Psychology, Southwest University (SWU), Chongqing 400715, China; Southwest University Branch, Collaborative Innovation Center of Assessment toward Basic Education Quality, Beijing Normal University, Beijing 100875, China
14
Zhang D, Chen Y, Hou X, Wu YJ. Near-infrared spectroscopy reveals neural perception of vocal emotions in human neonates. Hum Brain Mapp 2019; 40:2434-2448. [PMID: 30697881 DOI: 10.1002/hbm.24534] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/01/2018] [Revised: 01/19/2019] [Accepted: 01/20/2019] [Indexed: 12/20/2022] Open
Abstract
Processing affective prosody, that is, the emotional tone of a speaker, is fundamental to human communication and adaptive behavior. Previous studies have mainly focused on adults and infants; thus the neural mechanisms underlying the processing of affective prosody in newborns remain unclear. Here, we used near-infrared spectroscopy to examine the ability of 0-to-4-day-old neonates to discriminate emotions conveyed by speech prosody in their maternal language and a foreign language. Happy, fearful, and angry prosodies enhanced neural activation in the right superior temporal gyrus relative to neutral prosody in the maternal but not the foreign language. Happy prosody elicited greater activation than negative prosody in the left superior frontal gyrus and the left angular gyrus, regions that have not been associated with affective prosody processing in infants or adults. These findings suggest that sensitivity to affective prosody is formed through prenatal exposure to vocal stimuli of the maternal language. Furthermore, the sensitive neural correlates appeared more distributed in neonates than in infants, indicating a high level of neural specialization between the neonatal stage and early infancy. Finally, neonates showed preferential neural responses to positive over negative prosody, which is contrary to the "negativity bias" phenomenon established in adult and infant studies.
Affiliation(s)
- Dandan Zhang
- College of Psychology and Sociology, Shenzhen University, Shenzhen, China; Shenzhen Key Laboratory of Affective and Social Cognitive Science, Shenzhen University, Shenzhen, China
- Yu Chen
- College of Psychology and Sociology, Shenzhen University, Shenzhen, China
- Xinlin Hou
- Department of Pediatrics, Peking University First Hospital, Beijing, China
- Yan Jing Wu
- Faculty of Foreign Languages, Ningbo University, Ningbo, China
15
Cerebral resting state markers of biased perception in social anxiety. Brain Struct Funct 2018; 224:759-777. [PMID: 30506458 DOI: 10.1007/s00429-018-1803-1] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2018] [Accepted: 11/24/2018] [Indexed: 01/29/2023]
Abstract
Social anxiety (SA) comprises a multitude of persistent fears centered on the dreaded experience of negative evaluation and exclusion. This very common anxiety is distributed as a spectrum across the general population and is associated with social perception biases deemed causal in its maintenance. Here, we investigated cerebral resting-state markers linking SA and biased social perception. To this end, in the first step of the experiment, resting-state functional connectivity (RSFC) was assessed with fMRI as the neurobiological marker in a study population with widely varying SA. One month later, the impact of unattended laughter, exemplifying social threat, on a face rating task was evaluated as a measure of biased social perception. Applying a dimensional approach, we identified SA-related cognitive biases tied to the valence, dominance, and arousal of the threat signal, together with their underlying RSFC patterns among central nodes of the cerebral emotion, voice, and face processing networks. In particular, the connectivity patterns between the amygdalae and the right temporal voice area met all criteria for a cerebral mediation of the association between SA and the laughter valence-related interpretation bias. Thus, beyond identifying non-state-dependent cerebral markers of biased perception in SA, this study highlights both a starting point and targets for future research on the causal relationships between cerebral connectivity patterns, SA, and biased perception, potentially via neurofeedback methods.
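The cerebral mediation claim above (SA, via amygdala-voice-area connectivity, to interpretation bias) is commonly formalized as a product-of-coefficients test: the indirect effect is the product of the X-to-M path (a) and the M-to-Y path controlling for X (b). The simulation below is an illustrative sketch under made-up effect sizes, not the study's data or pipeline:

```python
# Purely illustrative product-of-coefficients mediation test:
# X = social anxiety, M = amygdala-voice-area connectivity,
# Y = laughter-related interpretation bias. All effect sizes are invented.
import random

random.seed(0)

def ols_slope(x, y):
    """Slope of the simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

n = 500
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.6 * x + random.gauss(0, 1) for x in X]                       # true path a = 0.6
Y = [0.5 * m + 0.1 * x + random.gauss(0, 1) for x, m in zip(X, M)]  # true path b = 0.5

a = ols_slope(X, M)  # estimated X -> M effect
# Frisch-Waugh residualization: partial X out of both M and Y, then the
# slope of the residuals estimates path b (M -> Y controlling for X).
rM = [m - a * x for x, m in zip(X, M)]
rY = [y - ols_slope(X, Y) * x for x, y in zip(X, Y)]
b = ols_slope(rM, rY)
indirect = a * b     # indirect (mediated) effect, close to 0.6 * 0.5 = 0.3
```

In applied work the significance of the indirect effect is usually established with a bootstrap confidence interval rather than a point estimate alone.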
16
Karle KN, Ethofer T, Jacob H, Brück C, Erb M, Lotze M, Nizielski S, Schütz A, Wildgruber D, Kreifelts B. Neurobiological correlates of emotional intelligence in voice and face perception networks. Soc Cogn Affect Neurosci 2018; 13:233-244. [PMID: 29365199 PMCID: PMC5827352 DOI: 10.1093/scan/nsy001] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2016] [Accepted: 01/07/2018] [Indexed: 01/27/2023] Open
Abstract
Facial expressions and voice modulations are among the most important communicative signals conveying emotional information. The ability to correctly interpret this information is highly relevant for successful social interaction and represents an integral component of the emotional competencies that have been conceptualized under the term emotional intelligence. Here, we investigated the relationship of emotional intelligence, as measured with the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), with cerebral voice and face processing using functional and structural magnetic resonance imaging. MSCEIT scores were positively correlated with voice-sensitivity and gray matter volume of the insula, accompanied by voice-sensitivity-enhanced connectivity between the insula and the temporal voice area, indicating generally increased salience of voices. Conversely, in the face processing system, higher MSCEIT scores were associated with decreased face-sensitivity and gray matter volume of the fusiform face area. Taken together, these findings point to an alteration in the balance of cerebral voice and face processing systems in the form of an attenuated face-vs-voice bias as one potential factor underpinning emotional intelligence.
Affiliation(s)
- Kathrin N Karle
- Department of Psychiatry and Psychotherapy, University of Tübingen, 72076 Tübingen, Germany
- Thomas Ethofer
- Department of Psychiatry and Psychotherapy, University of Tübingen, 72076 Tübingen, Germany; Department for Biomedical Magnetic Resonance, University of Tübingen, 72076 Tübingen, Germany
- Heike Jacob
- Department of Psychiatry and Psychotherapy, University of Tübingen, 72076 Tübingen, Germany
- Carolin Brück
- Department of Psychiatry and Psychotherapy, University of Tübingen, 72076 Tübingen, Germany
- Michael Erb
- Department for Biomedical Magnetic Resonance, University of Tübingen, 72076 Tübingen, Germany
- Martin Lotze
- Functional Imaging Group, Department for Diagnostic Radiology and Neuroradiology, University of Greifswald, 17475 Greifswald, Germany
- Sophia Nizielski
- Department of Psychology, Technical University Chemnitz, 09111 Chemnitz, Germany
- Astrid Schütz
- Department of Psychology, University of Bamberg, 96045 Bamberg, Germany
- Dirk Wildgruber
- Department of Psychiatry and Psychotherapy, University of Tübingen, 72076 Tübingen, Germany
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, University of Tübingen, 72076 Tübingen, Germany
17
Gao C, Wedell DH, Green JJ, Jia X, Mao X, Guo C, Shinkareva SV. Temporal dynamics of audiovisual affective processing. Biol Psychol 2018; 139:59-72. [DOI: 10.1016/j.biopsycho.2018.10.001] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2018] [Revised: 08/28/2018] [Accepted: 10/01/2018] [Indexed: 11/16/2022]
18
He L, Mao Y, Sun J, Zhuang K, Zhu X, Qiu J, Chen X. Examining Brain Structures Associated With Emotional Intelligence and the Mediated Effect on Trait Creativity in Young Adults. Front Psychol 2018; 9:925. [PMID: 29962984 PMCID: PMC6014059 DOI: 10.3389/fpsyg.2018.00925] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2018] [Accepted: 05/22/2018] [Indexed: 01/06/2023] Open
Abstract
Little is known about the association between emotional intelligence (EI) and trait creativity (TC), or about the brain structures involved. This study investigated the neuroanatomical basis of the association between EI and TC, measured by the Schutte self-report EI scale and the Williams creativity aptitude test. First, voxel-based morphometry (VBM) analysis was used to explore the brain structures closely related to EI in a large young-adult sample (n = 213). The results showed that EI was positively correlated with regional gray matter volume (rGMV) in the right orbitofrontal cortex (OFC), which is regarded as a key region for emotional processing. More importantly, mediation analysis revealed that rGMV in the right OFC partially mediated the association between EI and TC, indicating that OFC volume could partly account for the relationship between the two. These findings confirm the close relationship between EI and TC and highlight that volumetric variation in the OFC, a region associated with top-down emotion regulation, may play a critical role in promoting TC. Together, these findings sharpen our understanding of the complex relationship between EI and TC from the perspective of brain structure.
Affiliation(s)
- Li He
- School of Education, Chongqing Normal University, Chongqing, China; Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China; Faculty of Psychology, Southwest University, Chongqing, China
- Yu Mao
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China; Faculty of Psychology, Southwest University, Chongqing, China
- Jiangzhou Sun
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China; Faculty of Psychology, Southwest University, Chongqing, China
- Kaixiang Zhuang
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China; Faculty of Psychology, Southwest University, Chongqing, China
- Xingxing Zhu
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China; Faculty of Psychology, Southwest University, Chongqing, China
- Jiang Qiu
- Key Laboratory of Cognition and Personality, Ministry of Education, Southwest University, Chongqing, China; Faculty of Psychology, Southwest University, Chongqing, China
- Xiaoyi Chen
- School of Education, Chongqing Normal University, Chongqing, China; Student Mental Health Education and Consultation Center, Chongqing Normal University, Chongqing, China
19
Speech Prosodies of Different Emotional Categories Activate Different Brain Regions in Adult Cortex: an fNIRS Study. Sci Rep 2018; 8:218. [PMID: 29317758 PMCID: PMC5760650 DOI: 10.1038/s41598-017-18683-2] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2017] [Accepted: 12/14/2017] [Indexed: 11/12/2022] Open
Abstract
Emotional expressions of others embedded in speech prosodies are important for social interactions. This study used functional near-infrared spectroscopy to investigate how speech prosodies of different emotional categories are processed in the cortex. The results demonstrated several cerebral areas critical for emotional prosody processing. We confirmed that the superior temporal cortex, especially the right middle and posterior parts of the superior temporal gyrus (BA 22/42), primarily works to discriminate between emotional and neutral prosodies. Furthermore, the results suggested that categorization of emotions occurs within a high-level brain region, the frontal cortex, since the brain activation patterns were distinct when positive (happy) prosody was contrasted with negative (fearful and angry) prosody in the left middle part of the inferior frontal gyrus (BA 45) and the frontal eye field (BA 8), and when angry was contrasted with neutral prosody in bilateral orbital frontal regions (BA 10/11). These findings verified and extended previous fMRI findings in the adult brain and also provided a "developed version" of brain activation as a reference for our subsequent neonatal study.
20
Tanaka S, Kirino E. Dynamic Reconfiguration of the Supplementary Motor Area Network during Imagined Music Performance. Front Hum Neurosci 2017; 11:606. [PMID: 29311870 PMCID: PMC5732967 DOI: 10.3389/fnhum.2017.00606] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2017] [Accepted: 11/28/2017] [Indexed: 11/18/2022] Open
Abstract
The supplementary motor area (SMA) has been shown to be the center for motor planning and is active during music listening and performance. However, limited data exist on the role of the SMA in music. Music performance requires complex information processing in auditory, visual, spatial, emotional, and motor domains, and this information is integrated for the performance. We hypothesized that the SMA is engaged in multimodal integration of information, distributed across several regions of the brain, to prepare for ongoing music performance. To test this hypothesis, functional networks involving the SMA were extracted from functional magnetic resonance imaging (fMRI) data that were acquired from musicians during imagined music performance and during the resting state. Compared with the resting condition, imagined music performance increased connectivity of the SMA with widespread regions in the brain including the sensorimotor cortices, parietal cortex, posterior temporal cortex, occipital cortex, and inferior and dorsolateral prefrontal cortex. Increased connectivity of the SMA with the dorsolateral prefrontal cortex suggests that the SMA is under cognitive control, while increased connectivity with the inferior prefrontal cortex suggests the involvement of syntax processing. Increased connectivity with the parietal cortex, posterior temporal cortex, and occipital cortex likely serves the integration of spatial, emotional, and visual information. Finally, increased connectivity with the sensorimotor cortices was potentially involved with the translation of thought planning into motor programs. Therefore, the reconfiguration of the SMA network observed in this study is considered to reflect the multimodal integration required for imagined and actual music performance. We propose that the SMA network constructs "the internal representation of music performance" by integrating multimodal information required for the performance.
Affiliation(s)
- Shoji Tanaka
- Department of Information and Communication Sciences, Sophia University, Tokyo, Japan
- Eiji Kirino
- Department of Psychiatry, School of Medicine, Juntendo University, Tokyo, Japan; Department of Psychiatry, Juntendo Shizuoka Hospital, Shizuoka, Japan
21
Liebenthal E, Silbersweig DA, Stern E. The Language, Tone and Prosody of Emotions: Neural Substrates and Dynamics of Spoken-Word Emotion Perception. Front Neurosci 2016; 10:506. [PMID: 27877106 PMCID: PMC5099784 DOI: 10.3389/fnins.2016.00506] [Citation(s) in RCA: 49] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2016] [Accepted: 10/24/2016] [Indexed: 11/24/2022] Open
Abstract
Rapid assessment of emotions is important for detecting and prioritizing salient input. Emotions are conveyed in spoken words via verbal and non-verbal channels that are mutually informative and unfold in parallel over time, but the neural dynamics and interactions of these processes are not well understood. In this paper, we review the literature on emotion perception in faces, written words, and voices, as a basis for understanding the functional organization of emotion perception in spoken words. The characteristics of visual and auditory routes to the amygdala, a subcortical center for emotion perception, are compared across these stimulus classes in terms of neural dynamics, hemispheric lateralization, and functionality. Converging results from neuroimaging, electrophysiological, and lesion studies suggest the existence of an afferent route to the amygdala and primary visual cortex for fast and subliminal processing of coarse emotional face cues. We suggest that a fast route to the amygdala may also function for brief non-verbal vocalizations (e.g., laugh, cry), in which emotional category is conveyed effectively by voice tone and intensity. However, emotional prosody, which evolves on longer time scales and is conveyed by fine-grained spectral cues, appears to be processed via a slower, indirect cortical route. For verbal emotional content, the bulk of current evidence, indicating predominant left lateralization of the amygdala response and timing of emotional effects attributable to speeded lexical access, is more consistent with an indirect cortical route to the amygdala. Top-down linguistic modulation may play an important role in prioritized perception of emotions in words. Understanding the neural dynamics and interactions of emotion and language perception is important for selecting potent stimuli and devising effective training and/or treatment approaches for the alleviation of emotional dysfunction across a range of neuropsychiatric states.
Affiliation(s)
- Einat Liebenthal
- Department of Psychiatry, Brigham and Women's Hospital, Boston, MA, USA
- Emily Stern
- Department of Psychiatry, Brigham and Women's Hospital, Boston, MA, USA; Department of Radiology, Brigham and Women's Hospital, Boston, MA, USA
22
Mitchell RLC, Jazdzyk A, Stets M, Kotz SA. Recruitment of Language-, Emotion- and Speech-Timing Associated Brain Regions for Expressing Emotional Prosody: Investigation of Functional Neuroanatomy with fMRI. Front Hum Neurosci 2016; 10:518. [PMID: 27803656 PMCID: PMC5067951 DOI: 10.3389/fnhum.2016.00518] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2016] [Accepted: 09/29/2016] [Indexed: 12/02/2022] Open
Abstract
We aimed to progress understanding of prosodic emotion expression by establishing brain regions active when expressing specific emotions, those activated irrespective of the target emotion, and those whose activation intensity varied depending on individual performance. BOLD contrast data were acquired whilst participants spoke nonsense words in happy, angry, or neutral tones, or performed jaw movements. Emotion-specific analyses demonstrated that when expressing angry prosody, activated brain regions included the inferior frontal and superior temporal gyri, the insula, and the basal ganglia. When expressing happy prosody, the activated brain regions also included the superior temporal gyrus, insula, and basal ganglia, with additional activation in the anterior cingulate. Conjunction analysis confirmed that the superior temporal gyrus and basal ganglia were activated regardless of the specific emotion concerned. Nevertheless, disjunctive comparisons between the expression of angry and happy prosody established that anterior cingulate activity was significantly higher for angry than for happy prosody production. The degree of inferior frontal gyrus activity correlated with the ability to express the target emotion through prosody. We conclude that expressing prosodic emotions (vs. neutral intonation) requires generic brain regions involved in comprehending numerous aspects of language, emotion-related processes such as experiencing emotions, and the time-critical integration of speech information.
Affiliation(s)
- Rachel L C Mitchell
- Centre for Affective Disorders, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Manuela Stets
- Department of Psychology, University of Essex, Colchester, UK
- Sonja A Kotz
- Section of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, Netherlands
23
Hogeveen J, Salvi C, Grafman J. 'Emotional Intelligence': Lessons from Lesions. Trends Neurosci 2016; 39:694-705. [PMID: 27647325 DOI: 10.1016/j.tins.2016.08.007] [Citation(s) in RCA: 29] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2016] [Revised: 08/19/2016] [Accepted: 08/23/2016] [Indexed: 01/12/2023]
Abstract
'Emotional intelligence' (EI) is one of the most widely used psychological terms in popular nomenclature, yet its construct, divergent, and predictive validities are contentiously debated. Despite this debate, the EI construct is composed of a set of emotional abilities - recognizing emotional states in the self and others, using emotions to guide thought and behavior, understanding how emotions shape behavior, and emotion regulation - that undoubtedly influence important social and personal outcomes. In this review, we draw on evidence from human lesion studies to provide insight into the brain regions necessary for each of these core emotional abilities. Critically, we consider how this neuropsychological evidence might help guide efforts to define and measure EI.
Affiliation(s)
- J Hogeveen
- MIND Institute, University of California-Davis, Sacramento, CA, USA; Department of Psychiatry & Behavioral Sciences, University of California-Davis, Sacramento, CA, USA
- C Salvi
- Cognitive Neuroscience Laboratory, Rehabilitation Institute of Chicago, Chicago, IL, USA; Department of Psychology, Northwestern University, Evanston, IL, USA
- J Grafman
- Cognitive Neuroscience Laboratory, Rehabilitation Institute of Chicago, Chicago, IL, USA; Department of Psychology, Northwestern University, Evanston, IL, USA; Department of Physical Medicine and Rehabilitation, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA; Department of Neurology, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
24
Petrides KV, Mikolajczak M, Mavroveli S, Sanchez-Ruiz MJ, Furnham A, Pérez-González JC. Developments in Trait Emotional Intelligence Research. EMOTION REVIEW 2016. [DOI: 10.1177/1754073916650493] [Citation(s) in RCA: 219] [Impact Index Per Article: 27.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Trait emotional intelligence (“trait EI”) concerns our perceptions of our emotional abilities, that is, how good we believe we are in terms of understanding, regulating, and expressing emotions in order to adapt to our environment and maintain well-being. In this article, we present succinct summaries of selected findings from research on (a) the location of trait EI in personality factor space, (b) the biological underpinnings of the construct, (c) indicative applications in the areas of clinical, health, social, educational, organizational, and developmental psychology, and (d) trait EI training. Findings to date suggest that individual differences in trait EI are a consistent predictor of human behavior across the life span.
Affiliation(s)
- K. V. Petrides
- London Psychometric Laboratory, University College London, UK
- Adrian Furnham
- Department of Clinical, Educational, and Health Psychology, University College London, UK
25
The sound of emotions-Towards a unifying neural network perspective of affective sound processing. Neurosci Biobehav Rev 2016; 68:96-110. [PMID: 27189782 DOI: 10.1016/j.neubiorev.2016.05.002] [Citation(s) in RCA: 109] [Impact Index Per Article: 13.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2016] [Revised: 05/01/2016] [Accepted: 05/04/2016] [Indexed: 12/15/2022]
Abstract
Affective sounds are an integral part of the natural and social environment that shape and influence behavior across a multitude of species. In human primates, these affective sounds span a repertoire of environmental and human sounds when we vocalize or produce music. In terms of neural processing, cortical and subcortical brain areas constitute a distributed network that supports our listening experience to these affective sounds. Taking an exhaustive cross-domain view, we accordingly suggest a common neural network that facilitates the decoding of the emotional meaning from a wide source of sounds rather than a traditional view that postulates distinct neural systems for specific affective sound types. This new integrative neural network view unifies the decoding of affective valence in sounds, and ascribes differential as well as complementary functional roles to specific nodes within a common neural network. It also highlights the importance of an extended brain network beyond the central limbic and auditory brain systems engaged in the processing of affective sounds.
26
Reduced functional connectivity to the frontal cortex during processing of social cues in autism spectrum disorder. J Neural Transm (Vienna) 2016; 123:937-47. [DOI: 10.1007/s00702-016-1544-3] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/26/2015] [Accepted: 03/29/2016] [Indexed: 11/25/2022]
27
Quarto T, Blasi G, Maddalena C, Viscanti G, Lanciano T, Soleti E, Mangiulli I, Taurisano P, Fazio L, Bertolino A, Curci A. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions. PLoS One 2016; 11:e0148621. [PMID: 26859495 PMCID: PMC4747486 DOI: 10.1371/journal.pone.0148621] [Citation(s) in RCA: 29] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2015] [Accepted: 01/21/2016] [Indexed: 11/19/2022] Open
Abstract
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.
Affiliation(s)
- Tiziana Quarto
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari “Aldo Moro”, Bari, Italy
- Cognitive Brain Research Unit, Institute of Behavioral Science, University of Helsinki, Helsinki, Finland
- Giuseppe Blasi
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari “Aldo Moro”, Bari, Italy
- Chiara Maddalena
- Department of Education Science, Psychology and Communication Science, University of Bari "Aldo Moro", Bari, Italy
- Giovanna Viscanti
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari “Aldo Moro”, Bari, Italy
- Tiziana Lanciano
- Department of Education Science, Psychology and Communication Science, University of Bari "Aldo Moro", Bari, Italy
- Emanuela Soleti
- Department of Education Science, Psychology and Communication Science, University of Bari "Aldo Moro", Bari, Italy
- Ivan Mangiulli
- Department of Education Science, Psychology and Communication Science, University of Bari "Aldo Moro", Bari, Italy
- Paolo Taurisano
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari “Aldo Moro”, Bari, Italy
- Leonardo Fazio
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari “Aldo Moro”, Bari, Italy
- Alessandro Bertolino
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari “Aldo Moro”, Bari, Italy
- pRED, NORD DTA, Hoffman-La Roche Ltd, Basel, Switzerland
- Antonietta Curci
- Department of Education Science, Psychology and Communication Science, University of Bari "Aldo Moro", Bari, Italy
28
Pannese A, Grandjean D, Frühholz S. Subcortical processing in auditory communication. Hear Res 2015; 328:67-77. [DOI: 10.1016/j.heares.2015.07.003] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/03/2015] [Revised: 06/23/2015] [Accepted: 07/01/2015] [Indexed: 12/21/2022]
29
Pan W, Liu C, Yang Q, Gu Y, Yin S, Chen A. The neural basis of trait self-esteem revealed by the amplitude of low-frequency fluctuations and resting state functional connectivity. Soc Cogn Affect Neurosci 2015; 11:367-76. [PMID: 26400859 DOI: 10.1093/scan/nsv119] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2015] [Accepted: 09/17/2015] [Indexed: 02/06/2023] Open
Abstract
Self-esteem is an affective self-evaluation of oneself and has a significant effect on mental and behavioral health. Although research has focused on the neural substrates of self-esteem, little is known about the spontaneous brain activity that is associated with trait self-esteem (TSE) during the resting state. In this study, we used the resting-state functional magnetic resonance imaging (fMRI) signal of the amplitude of low-frequency fluctuations (ALFFs) and resting state functional connectivity (RSFC) to identify TSE-related regions and networks. We found that a higher level of TSE was associated with higher ALFFs in the left ventral medial prefrontal cortex (vmPFC) and lower ALFFs in the left cuneus/lingual gyrus and right lingual gyrus. RSFC analyses revealed that the strengths of functional connectivity between the left vmPFC and bilateral hippocampus were positively correlated with TSE; however, the connections between the left vmPFC and right inferior frontal gyrus and posterior superior temporal sulcus were negatively associated with TSE. Furthermore, the strengths of functional connectivity between the left cuneus/lingual gyrus and right dorsolateral prefrontal cortex and anterior cingulate cortex were positively related to TSE. These findings indicate that TSE is linked to core regions in the default mode network and social cognition network, which are involved in self-referential processing, autobiographical memory and social cognition.
Affiliation(s)
- Weigang Pan
- Key Laboratory of Cognition and Personality of Ministry of Education, Faculty of Psychology, Southwest University, Chongqing, China
- Congcong Liu
- Key Laboratory of Cognition and Personality of Ministry of Education, Faculty of Psychology, Southwest University, Chongqing, China
- Qian Yang
- Key Laboratory of Cognition and Personality of Ministry of Education, Faculty of Psychology, Southwest University, Chongqing, China
- Yan Gu
- Key Laboratory of Cognition and Personality of Ministry of Education, Faculty of Psychology, Southwest University, Chongqing, China
- Shouhang Yin
- Key Laboratory of Cognition and Personality of Ministry of Education, Faculty of Psychology, Southwest University, Chongqing, China
- Antao Chen
- Key Laboratory of Cognition and Personality of Ministry of Education, Faculty of Psychology, Southwest University, Chongqing, China
30
P8. Social brain network and autism spectrum disorder: Reduced connectivity to the frontal cortex. Clin Neurophysiol 2015. [DOI: 10.1016/j.clinph.2015.04.130] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022]
31
Kong F, Hu S, Wang X, Song Y, Liu J. Neural correlates of the happy life: The amplitude of spontaneous low frequency fluctuations predicts subjective well-being. Neuroimage 2015; 107:136-145. [PMID: 25463465 DOI: 10.1016/j.neuroimage.2014.11.033] [Citation(s) in RCA: 90] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/25/2013] [Revised: 09/17/2014] [Accepted: 11/14/2014] [Indexed: 10/24/2022] Open
32
Frühholz S, Trost W, Grandjean D. The role of the medial temporal limbic system in processing emotions in voice and music. Prog Neurobiol 2014; 123:1-17. [PMID: 25291405 DOI: 10.1016/j.pneurobio.2014.09.003] [Citation(s) in RCA: 83] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/05/2014] [Revised: 09/16/2014] [Accepted: 09/29/2014] [Indexed: 01/15/2023]
Abstract
Subcortical brain structures of the limbic system, such as the amygdala, are thought to decode the emotional value of sensory information. Recent neuroimaging studies, as well as lesion studies in patients, have shown that the amygdala is sensitive to emotions in voice and music. Similarly, the hippocampus, another part of the temporal limbic system (TLS), is responsive to vocal and musical emotions, but its specific roles in emotional processing from music and especially from voices have been largely neglected. Here we review recent research on vocal and musical emotions, and outline commonalities and differences in the neural processing of emotions in the TLS in terms of emotional valence, emotional intensity and arousal, as well as in terms of acoustic and structural features of voices and music. We summarize the findings in a neural framework including several subcortical and cortical functional pathways between the auditory system and the TLS. This framework proposes that some vocal expressions might already receive a fast emotional evaluation via a subcortical pathway to the amygdala, whereas cortical pathways to the TLS are thought to be equally used for vocal and musical emotions. While the amygdala might be specifically involved in a coarse decoding of the emotional value of voices and music, the hippocampus might process more complex vocal and musical emotions, and might have an important role especially for the decoding of musical emotions by providing memory-based and contextual associations.
Affiliation(s)
- Sascha Frühholz
- Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland.
- Wiebke Trost
- Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Didier Grandjean
- Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
33
Crossmodal adaptation in right posterior superior temporal sulcus during face-voice emotional integration. J Neurosci 2014; 34:6813-21. [PMID: 24828635 DOI: 10.1523/jneurosci.4478-13.2014] [Citation(s) in RCA: 67] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
The integration of emotional information from the face and voice of other persons is known to be mediated by a number of "multisensory" cerebral regions, such as the right posterior superior temporal sulcus (pSTS). However, whether multimodal integration in these regions is attributable to interleaved populations of unisensory neurons responding to face or voice, or rather to multimodal neurons receiving input from the two modalities, is not fully clear. Here, we examine this question using functional magnetic resonance adaptation and dynamic audiovisual stimuli in which emotional information was manipulated parametrically and independently in the face and voice via morphing between angry and happy expressions. Healthy human adult subjects were scanned while performing a happy/angry emotion categorization task on a series of such stimuli included in a fast event-related, continuous carryover design. Subjects integrated both face and voice information when categorizing emotion, although there was a greater weighting of face information, and showed behavioral adaptation effects both within and across modality. Adaptation also occurred at the neural level: in addition to modality-specific adaptation in visual and auditory cortices, we observed for the first time a crossmodal adaptation effect. Specifically, fMRI signal in the right pSTS was reduced in response to a stimulus in which facial emotion was similar to the vocal emotion of the preceding stimulus. These results suggest that the integration of emotional information from face and voice in the pSTS involves a detectable proportion of bimodal neurons that combine inputs from visual and auditory cortices.
34
Heckemann B, Schols JM, Halfens RJ. A reflective framework to foster emotionally intelligent leadership in nursing. J Nurs Manag 2014; 23:744-53. [DOI: 10.1111/jonm.12204] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/11/2013] [Indexed: 11/27/2022]
Affiliation(s)
- Birgit Heckemann
- CAPHRI - School for Public Health and Primary Care; Maastricht University; Maastricht, the Netherlands
- Jos M.G.A. Schols
- Faculty of Health, Medicine and Life Sciences, Department of Family Medicine and Department of Health Services Research; CAPHRI - School for Public Health and Primary Care; Maastricht University; Maastricht, the Netherlands
- Ruud J.G. Halfens
- Faculty of Health, Medicine and Life Sciences, Department of Health Care and Nursing Science; CAPHRI - School for Public Health and Primary Care; Maastricht University; Maastricht, the Netherlands
35
Lee H, Ku J, Kim J, Jang DP, Yoon KJ, Kim SI, Kim JJ. Aberrant neural responses to social rejection in patients with schizophrenia. Soc Neurosci 2014; 9:412-23. [PMID: 24731078 DOI: 10.1080/17470919.2014.907202] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
36
Song JJ, Lee HJ, Kang H, Lee DS, Chang SO, Oh SH. Effects of congruent and incongruent visual cues on speech perception and brain activity in cochlear implant users. Brain Struct Funct 2014; 220:1109-25. [DOI: 10.1007/s00429-013-0704-6] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2013] [Accepted: 12/30/2013] [Indexed: 12/01/2022]
37
Watson R, Latinus M, Noguchi T, Garrod O, Crabbe F, Belin P. Dissociating task difficulty from incongruence in face-voice emotion integration. Front Hum Neurosci 2013; 7:744. [PMID: 24294196 PMCID: PMC3826561 DOI: 10.3389/fnhum.2013.00744] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2013] [Accepted: 10/18/2013] [Indexed: 11/13/2022] Open
Abstract
In the everyday environment, affective information is conveyed by both the face and the voice. Studies have demonstrated that a concurrently presented voice can alter the way that an emotional face expression is perceived, and vice versa, leading to emotional conflict if the information in the two modalities is mismatched. Additionally, evidence suggests that incongruence of emotional valence activates cerebral networks involved in conflict monitoring and resolution. However, it is currently unclear whether this is due to task difficulty—that incongruent stimuli are harder to categorize—or simply to the detection of mismatching information in the two modalities. The aim of the present fMRI study was to examine the neurophysiological correlates of processing incongruent emotional information, independent of task difficulty. Subjects were scanned while judging the emotion of face-voice affective stimuli. Both the face and voice were parametrically morphed between anger and happiness and then paired in all audiovisual combinations, resulting in stimuli each defined by two separate values: the degree of incongruence between the face and voice, and the degree of clarity of the combined face-voice information. Due to the specific morphing procedure utilized, we hypothesized that the clarity value, rather than incongruence value, would better reflect task difficulty. Behavioral data revealed that participants integrated face and voice affective information, and that the clarity, as opposed to incongruence value correlated with categorization difficulty. Cerebrally, incongruence was more associated with activity in the superior temporal region, which emerged after task difficulty had been accounted for. Overall, our results suggest that activation in the superior temporal region in response to incongruent information cannot be explained simply by task difficulty, and may rather be due to detection of mismatching information between the two modalities.
Affiliation(s)
- Rebecca Watson
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands; Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
38
Kreifelts B, Jacob H, Brück C, Erb M, Ethofer T, Wildgruber D. Non-verbal emotion communication training induces specific changes in brain function and structure. Front Hum Neurosci 2013; 7:648. [PMID: 24146641 PMCID: PMC3797968 DOI: 10.3389/fnhum.2013.00648] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2013] [Accepted: 09/18/2013] [Indexed: 01/20/2023] Open
Abstract
The perception of emotional cues from voice and face is essential for social interaction. However, this process is altered in various psychiatric conditions along with impaired social functioning. Emotion communication trainings have been demonstrated to improve social interaction in healthy individuals and to reduce emotional communication deficits in psychiatric patients. Here, we investigated the impact of a non-verbal emotion communication training (NECT) on cerebral activation and brain structure in a controlled and combined functional magnetic resonance imaging (fMRI) and voxel-based morphometry study. NECT-specific reductions in brain activity occurred in a distributed set of brain regions including face and voice processing regions as well as emotion processing- and motor-related regions presumably reflecting training-induced familiarization with the evaluation of face/voice stimuli. Training-induced changes in non-verbal emotion sensitivity at the behavioral level and the respective cerebral activation patterns were correlated in the face-selective cortical areas in the posterior superior temporal sulcus and fusiform gyrus for valence ratings and in the temporal pole, lateral prefrontal cortex and midbrain/thalamus for the response times. A NECT-induced increase in gray matter (GM) volume was observed in the fusiform face area. Thus, NECT induces both functional and structural plasticity in the face processing system as well as functional plasticity in the emotion perception and evaluation system. We propose that functional alterations are presumably related to changes in sensory tuning in the decoding of emotional expressions. Taken together, these findings highlight that the present experimental design may serve as a valuable tool to investigate the altered behavioral and neuronal processing of emotional cues in psychiatric disorders as well as the impact of therapeutic interventions on brain function and structure.
Affiliation(s)
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, Eberhard Karls University of Tübingen, Tübingen, Germany
39
Show me how you walk and I tell you how you feel - a functional near-infrared spectroscopy study on emotion perception based on human gait. Neuroimage 2013; 85 Pt 1:380-90. [PMID: 23921096 DOI: 10.1016/j.neuroimage.2013.07.078] [Citation(s) in RCA: 40] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2013] [Revised: 07/22/2013] [Accepted: 07/29/2013] [Indexed: 11/20/2022] Open
Abstract
The ability to recognize and adequately interpret emotional states in others plays a fundamental role in regulating social interaction. Body language presents an essential element of nonverbal communication which is often perceived prior to mimic expression. However, the neural networks that underlie the processing of emotionally expressive body movement and body posture are poorly understood. 33 healthy subjects have been investigated using the optically based imaging method functional near-infrared spectroscopy (fNIRS) during the performance of a newly developed emotion discrimination paradigm consisting of faceless avatars expressing fearful, angry, sad, happy or neutral gait patterns. Participants were instructed to judge (a) the presented emotional state (emotion task) and (b) the observed walking speed of the respective avatar (speed task). We measured increases in cortical oxygenated haemoglobin (O2HB) in response to visual stimulation during emotion discrimination. These O2HB concentration changes were enhanced for negative emotions in contrast to neutral gait sequences in right occipito-temporal and left temporal and temporo-parietal brain regions. Moreover, fearful and angry bodies elicited higher activation increases during the emotion task compared to the speed task. Haemodynamic responses were correlated with a number of behavioural measures, whereby a positive relationship between emotion regulation strategy preference and O2HB concentration increases after sad walks was mediated by the ability to accurately categorize sad walks. Our results support the idea of a distributed brain network involved in the recognition of bodily emotion expression that comprises visual association areas as well as body/movement perception specific cortical regions that are also sensitive to emotion. This network is activated less when the emotion is not intentionally processed (i.e. during the speed task). 
Furthermore, activity of this perceptive network is indirectly connected to active emotion regulation processes, mediated by the ability to correctly recognize emotions. We conclude that a full understanding of emotion perception and its neural substrate requires the investigation of dynamic representations and means of expression other than the face.
40
Watson R, Latinus M, Charest I, Crabbe F, Belin P. People-selectivity, audiovisual integration and heteromodality in the superior temporal sulcus. Cortex 2013; 50:125-36. [PMID: 23988132 PMCID: PMC3884128 DOI: 10.1016/j.cortex.2013.07.011] [Citation(s) in RCA: 60] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2012] [Revised: 06/27/2013] [Accepted: 07/25/2013] [Indexed: 11/16/2022]
Abstract
The functional role of the superior temporal sulcus (STS) has been implicated in a number of studies, including those investigating face perception, voice perception, and face–voice integration. However, the nature of the STS preference for these ‘social stimuli’ remains unclear, as does the location within the STS for specific types of information processing. The aim of this study was to directly examine properties of the STS in terms of selective response to social stimuli. We used functional magnetic resonance imaging (fMRI) to scan participants whilst they were presented with auditory, visual, or audiovisual stimuli of people or objects, with the intention of localising areas preferring both faces and voices (i.e., ‘people-selective’ regions) and audiovisual regions designed to specifically integrate person-related information. Results highlighted a ‘people-selective, heteromodal’ region in the trunk of the right STS which was activated by both faces and voices, and a restricted portion of the right posterior STS (pSTS) with an integrative preference for information from people, as compared to objects. These results point towards the dedicated role of the STS as a ‘social-information processing’ centre.
Affiliation(s)
- Rebecca Watson
- Maastricht Brain Imaging Center, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands; Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK.
- Marianne Latinus
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK; Institut des Neurosciences de La Timone, UMR 7289, CNRS & Université Aix-Marseille, Marseille, France
- Ian Charest
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK; Medical Research Council-Cognition and Brain Sciences Unit (MRC-CBU), Cambridge, UK
- Frances Crabbe
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
- Pascal Belin
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK; Institut des Neurosciences de La Timone, UMR 7289, CNRS & Université Aix-Marseille, Marseille, France; International Laboratories for Brain, Music and Sound (BRAMS), Université de Montréal & McGill University, Montreal, Canada
41
Maurage P, Campanella S. Experimental and clinical usefulness of crossmodal paradigms in psychiatry: an illustration from emotional processing in alcohol-dependence. Front Hum Neurosci 2013; 7:394. [PMID: 23898250 PMCID: PMC3722513 DOI: 10.3389/fnhum.2013.00394] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/21/2013] [Accepted: 07/05/2013] [Indexed: 11/24/2022] Open
Abstract
Crossmodal processing (i.e., the construction of a unified representation stemming from inputs from distinct sensory modalities) constitutes a crucial ability in humans' everyday life. It has been extensively explored at cognitive and cerebral levels during the last decade among healthy controls. Paradoxically, however, while difficulties in performing this integrative process have been suggested in a large range of psychopathological states (e.g., schizophrenia and autism), these crossmodal paradigms have been very rarely used in the exploration of psychiatric populations. The main aim of the present paper is thus to underline the experimental and clinical usefulness of exploring crossmodal processes in psychiatry. We will illustrate this proposal by means of the recent data obtained in the crossmodal exploration of emotional alterations in alcohol-dependence. Indeed, emotional decoding impairments might have a role in the development and maintenance of alcohol-dependence, and have been extensively investigated by means of experiments using separate visual or auditory stimulation. Besides these unimodal explorations, we have recently conducted several studies using audio-visual crossmodal paradigms, which have allowed us to improve the ecological validity of the unimodal experimental designs and to offer new insights on the emotional alterations among alcohol-dependent individuals. We will show how these preliminary results can be extended to develop a coherent and ambitious research program using crossmodal designs in various psychiatric populations and sensory modalities. We will finally end the paper by underlining the various potential clinical applications and the fundamental implications that can be raised by this emerging project.
Affiliation(s)
- Pierre Maurage
- Laboratory for Experimental Psychopathology, Faculty of Psychology, Institute of Psychology, Université Catholique de Louvain, Louvain-la-Neuve, Belgium
42
Killgore WDS, Schwab ZJ, Tkachenko O, Webb CA, DelDonno SR, Kipman M, Rauch SL, Weber M. Emotional intelligence correlates with functional responses to dynamic changes in facial trustworthiness. Soc Neurosci 2013; 8:334-46. [DOI: 10.1080/17470919.2013.807300] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
43
Jola C, McAleer P, Grosbras MH, Love SA, Morison G, Pollick FE. Uni- and multisensory brain areas are synchronised across spectators when watching unedited dance recordings. Iperception 2013; 4:265-84. [PMID: 24349687 PMCID: PMC3859570 DOI: 10.1068/i0536] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2012] [Revised: 02/20/2013] [Indexed: 11/17/2022] Open
Abstract
The superior temporal sulcus (STS) and gyrus (STG) are commonly identified to be functionally relevant for multisensory integration of audiovisual (AV) stimuli. However, most neuroimaging studies on AV integration used stimuli of short duration in explicit evaluative tasks. Importantly though, many of our AV experiences are of a long duration and ambiguous. It is unclear if the enhanced activity in audio, visual, and AV brain areas would also be synchronised over time across subjects when they are exposed to such multisensory stimuli. We used intersubject correlation to investigate which brain areas are synchronised across novices for uni- and multisensory versions of a 6-min 26-s recording of an unfamiliar, unedited Indian dance recording (Bharatanatyam). In Bharatanatyam, music and dance are choreographed together in a highly intermodal-dependent manner. Activity in the middle and posterior STG was significantly correlated between subjects and showed also significant enhancement for AV integration when the functional magnetic resonance signals were contrasted against each other using a general linear model conjunction analysis. These results extend previous studies by showing an intermediate step of synchronisation for novices: while there was a consensus across subjects' brain activity in areas relevant for unisensory processing and AV integration of related audio and visual stimuli, we found no evidence for synchronisation of higher level cognitive processes, suggesting these were idiosyncratic.
Affiliation(s)
- Corinne Jola
- INSERM-CEA Cognitive Neuroimaging Unit, NeuroSpin Center, F-91191 Gif-sur-Yvette, France, and School of Psychology, University of Glasgow, Glasgow G12 8QB, UK
- Phil McAleer
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow G12 8QB, UK
- Marie-Hélène Grosbras
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow G12 8QB, UK
- Scott A Love
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, USA
- Gordon Morison
- Computer, Communication and Interactive Systems, Glasgow Caledonian University, Glasgow G4 0BA, UK
- Frank E Pollick
- School of Psychology, University of Glasgow, Glasgow G12 8QB, UK
44
Cerebral integration of verbal and nonverbal emotional cues: Impact of individual nonverbal dominance. Neuroimage 2012; 61:738-47. [DOI: 10.1016/j.neuroimage.2012.03.085]
45
Müller VI, Cieslik EC, Turetsky BI, Eickhoff SB. Crossmodal interactions in audiovisual emotion processing. Neuroimage 2011; 60:553-61. [PMID: 22182770 DOI: 10.1016/j.neuroimage.2011.12.007]
Abstract
Emotion in daily life is often expressed in a multimodal fashion; consequently, emotional information from one modality can influence processing in another. In a previous fMRI study we assessed the neural correlates of audio-visual integration and found that activity in the left amygdala is significantly attenuated when a neutral stimulus is paired with an emotional one, compared to conditions where emotional stimuli are present in both channels. Here we used dynamic causal modelling to investigate the effective connectivity in the neuronal network underlying this emotion presence congruence effect. Our results provided strong evidence in favor of a model family differing only in the interhemispheric interactions. All winning models share a connection from the bilateral fusiform gyrus (FFG) into the left amygdala and a non-linear modulatory influence of the bilateral posterior superior temporal sulcus (pSTS) on these connections. This result indicates that the pSTS not only integrates multimodal information from visual and auditory regions (as reflected in our model by significant feed-forward connections) but also gates the influence of the sensory information on the left amygdala, leading to attenuation of amygdala activity when a neutral stimulus is integrated. Moreover, we found a significant lateralization of the FFG due to stronger driving input by the stimuli (faces) into the right hemisphere, whereas no such lateralization was present for sound-driven input into the superior temporal gyrus. In summary, our data provide further evidence for a rightward lateralization of the FFG and, in particular, for a key role of the pSTS in the integration and gating of audio-visual emotional information.
Affiliation(s)
- Veronika I Müller
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Germany.
46
Abstract
Supramodal representation of emotion and its neural substrates have recently attracted attention as a marker of social cognition. However, the question of whether perceptual integration of facial and vocal emotions takes place in primary sensory areas, multimodal cortices, or affective structures remains unanswered. Using novel computer-generated stimuli, we combined emotional faces and voices in congruent and incongruent ways and acquired functional brain data (fMRI) during an emotional classification task. Both congruent and incongruent audiovisual stimuli evoked larger responses in the thalamus and superior temporal regions compared with unimodal conditions. Congruent emotions were characterized by activation in the amygdala, insula, ventral posterior cingulate cortex (vPCC), and temporo-occipital and auditory cortices; incongruent emotions activated a frontoparietal network and the bilateral caudate nucleus, indicating a greater processing load in working memory and emotion-encoding areas. The vPCC alone exhibited differential reactions to congruency and incongruency for all emotion categories and can thus be considered a central structure for the supramodal representation of complex emotional information. Moreover, the left amygdala reflected supramodal representation of happy stimuli. These findings indicate that emotional information does not merge at the perceptual audiovisual integration level in unimodal or multimodal areas, but in the vPCC and amygdala.
47
Russo PM, Mancini G, Trombini E, Baldaro B, Mavroveli S, Petrides KV. Trait Emotional Intelligence and the Big Five. Journal of Psychoeducational Assessment 2011. [DOI: 10.1177/0734282911426412]
Abstract
Trait emotional intelligence (EI) is a constellation of emotion-related self-perceptions located at the lower levels of personality hierarchies. This article examines the validity of the Trait Emotional Intelligence Questionnaire-Child Form (TEIQue-CF) and investigates its relationships with the Big Five factors and cognitive ability. A total of 690 children (317 males; mean age = 10.25 years; SD = 1.58 years) completed the TEIQue-CF, the Raven's Progressive Matrices, and the Big Five Questionnaire; in addition, a subsample of 136 participants completed depression and anxiety scales. Results showed that the TEIQue-CF is a reliable measure of trait EI that is partially determined by all of the Big Five factors but independent of cognitive ability. Trait EI predicted depression and anxiety scores over and above the five higher-order personality dimensions.
48
Brück C, Kreifelts B, Wildgruber D. Emotional voices in context: A neurobiological model of multimodal affective information processing. Phys Life Rev 2011; 8:383-403. [DOI: 10.1016/j.plrev.2011.10.002]
49
Föcker J, Gondan M, Röder B. Preattentive processing of audio-visual emotional signals. Acta Psychol (Amst) 2011; 137:36-47. [PMID: 21397889 DOI: 10.1016/j.actpsy.2011.02.004]
Abstract
Previous research has shown that redundant emotional information in faces and voices leads to faster emotional categorization than incongruent emotional information, even when only one modality is attended. The aim of the present study was to test whether these crossmodal effects are predominantly due to a response conflict rather than to interference at earlier, e.g. perceptual, processing stages. In Experiment 1, participants categorized the valence and rated the intensity of happy, sad, angry, and neutral unimodal or bimodal face-voice stimuli. They were asked to rate either the facial or the vocal expression and to ignore the emotion expressed in the other modality. Participants responded faster and more accurately to emotionally congruent than to incongruent face-voice pairs in both the Attend Face and the Attend Voice conditions. Moreover, when attending to faces, emotionally congruent bimodal stimuli were processed more efficiently than unimodal visual stimuli. To examine the role of a possible response conflict, Experiment 2 used a modified paradigm in which emotional and response conflicts were disentangled. Incongruency effects were significant even in the absence of response conflicts. The results suggest that emotional signals available through different sensory channels are automatically combined prior to response selection.
50