1
Putkinen V, Nazari-Farsani S, Karjalainen T, Santavirta S, Hudson M, Seppälä K, Sun L, Karlsson HK, Hirvonen J, Nummenmaa L. Pattern recognition reveals sex-dependent neural substrates of sexual perception. Hum Brain Mapp 2023; 44:2543-2556. [PMID: 36773282; PMCID: PMC10028630; DOI: 10.1002/hbm.26229]
Abstract
Sex differences in brain activity evoked by sexual stimuli remain elusive despite robust evidence for stronger enjoyment of, and interest toward, sexual stimuli in men than in women. To test whether visual sexual stimuli evoke different brain activity patterns in men and women, we measured hemodynamic brain activity induced by visual sexual stimuli in two experiments with 91 subjects (46 males). In one experiment, the subjects viewed sexual and nonsexual film clips, and dynamic annotations of nudity in the clips were used to predict hemodynamic activity. In the second experiment, the subjects viewed sexual and nonsexual pictures in an event-related design. Men showed stronger activation than women in the visual and prefrontal cortices and the dorsal attention network in both experiments. Furthermore, using multivariate pattern classification we could accurately predict the sex of the subject on the basis of the brain activity elicited by the sexual stimuli. The classification generalized across the experiments, indicating that the sex differences were task-independent. Eye-tracking data obtained from an independent sample of subjects (N = 110) showed that men looked longer than women at the chest area of the nude female actors in the film clips. These results indicate that visual sexual stimuli evoke discernible brain activity patterns in men and women, which may reflect stronger attentional engagement with sexual stimuli in men.
Affiliation(s)
- Vesa Putkinen
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Sanaz Nazari-Farsani
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Tomi Karjalainen
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Severi Santavirta
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Matthew Hudson
- Turku PET Centre, University of Turku, Turku, Finland
- School of Psychology, University of Plymouth, Plymouth, UK
- Kerttu Seppälä
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Department of Medical Physics, Turku University Hospital, Turku, Finland
- Lihua Sun
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Henry K Karlsson
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Jussi Hirvonen
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Department of Radiology, Turku University Hospital, Turku, Finland
- Lauri Nummenmaa
- Turku PET Centre, University of Turku, Turku, Finland
- Turku University Hospital, Turku, Finland
- Department of Psychology, University of Turku, Turku, Finland
2
Hoffmann J, Travers-Podmaniczky G, Pelzl MA, Brück C, Jacob H, Hölz L, Martinelli A, Wildgruber D. Impairments in recognition of emotional facial expressions, affective prosody, and multisensory facilitation of response time in high-functioning autism. Front Psychiatry 2023; 14:1151665. [PMID: 37168084; PMCID: PMC10165112; DOI: 10.3389/fpsyt.2023.1151665]
Abstract
Introduction: Deficits in emotional perception are common in autistic people, but it remains unclear to what extent these perceptual impairments are linked to specific sensory modalities, specific emotions, or multisensory facilitation.
Methods: This study aimed to investigate uni- and bimodal perception of emotional cues as well as multisensory facilitation in autistic (n = 18, mean age: 36.72 years, SD: 11.36) compared to non-autistic (n = 18, mean age: 36.41 years, SD: 12.18) people using auditory, visual, and audiovisual stimuli.
Results: Lower identification accuracy and longer response times were found in high-functioning autistic people. These differences were independent of modality and emotion and showed large effect sizes (Cohen's d = 0.8-1.2). Furthermore, multisensory facilitation of response time was observed in non-autistic people but was absent in autistic people, whereas no differences between the two groups were found in multisensory facilitation of accuracy.
Discussion: These findings suggest that the auditory and visual components of audiovisual stimuli are processed more separately in autistic individuals (with equivalent temporal demands for processing the respective unimodal cues), but still with a similar relative improvement in accuracy, whereas earlier integrative multimodal merging of stimulus properties seems to occur in non-autistic individuals.
Affiliation(s)
- Jonatan Hoffmann
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Correspondence: Jonatan Hoffmann
- Michael Alexander Pelzl
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Carolin Brück
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Heike Jacob
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Lea Hölz
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Anne Martinelli
- School of Psychology, Fresenius University of Applied Sciences, Frankfurt am Main, Germany
- Dirk Wildgruber
- Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
3
Lenschow C, Mendes ARP, Lima SQ. Hearing, touching, and multisensory integration during mate choice. Front Neural Circuits 2022; 16:943888. [PMID: 36247731; PMCID: PMC9559228; DOI: 10.3389/fncir.2022.943888]
Abstract
Mate choice is a potent generator of diversity and a fundamental pillar of sexual selection and evolution. Mate choice is a multistage affair, in which complex sensory information and elaborate actions are used to identify, scrutinize, and evaluate potential mating partners. While it is widely accepted that communication during mate assessment relies on multimodal cues, most studies investigating the mechanisms controlling this fundamental behavior have restricted their focus to the dominant sensory modality used by the species under examination, such as vision in humans and smell in rodents. However, despite their undeniable importance for the initial recognition, attraction, and approach toward a potential mate, other modalities gain relevance as the interaction progresses, among them touch and audition. In this review, we will (1) focus on recent findings on how touch and audition can contribute to the evaluation and choice of mating partners, (2) outline our current knowledge of the neuronal circuits processing touch and audition (among others) in the context of mate choice, and (3) ask how these neural circuits are connected to areas that have been studied in the light of multisensory integration.
Affiliation(s)
- Constanze Lenschow
- Champalimaud Foundation, Champalimaud Research, Neuroscience Program, Lisbon, Portugal
- Ana Rita P Mendes
- Champalimaud Foundation, Champalimaud Research, Neuroscience Program, Lisbon, Portugal
- Susana Q Lima
- Champalimaud Foundation, Champalimaud Research, Neuroscience Program, Lisbon, Portugal
4
van 't Hof SR, Cera N. Specific factors and methodological decisions influencing brain responses to sexual stimuli in women. Neurosci Biobehav Rev 2021; 131:164-178. [PMID: 34560132; DOI: 10.1016/j.neubiorev.2021.09.013]
Abstract
Most neuroimaging studies of sexual behavior have been conducted with male participants, leading to male-based models of sexual arousal. Here, we review factors and methodological decisions that might influence brain responses to sexual stimuli, specifically with regard to the inclusion of women. Based on this review, we suggest that future studies consider the following factors: menstrual phase, hormonal contraception use, history of sexual or psychiatric disorders or diseases, and medication use. Moreover, when researching sexual arousal, we suggest that future studies assess sexual orientation and preferences, let women select the visual sexual stimuli, and use a longer stimulus duration than is commonly used. This review is intended as a useful guideline for future research on sexual arousal, which will hopefully lead to a higher inclusion of women and therefore more accurate neurobiological models of sexual arousal.
Affiliation(s)
- Nicoletta Cera
- Centre for Psychology at University of Porto (CPUP), Faculty of Psychology and Education Sciences, University of Porto, Portugal
5
Lin Y, Ding H, Zhang Y. Gender Differences in Identifying Facial, Prosodic, and Semantic Emotions Show Category- and Channel-Specific Effects Mediated by Encoder's Gender. J Speech Lang Hear Res 2021; 64:2941-2955. [PMID: 34310173; DOI: 10.1044/2021_jslhr-20-00553]
Abstract
Purpose: The nature of gender differences in emotion processing has remained unclear due to discrepancies in the existing literature. This study examined the modulatory effects of emotion categories and communication channels on gender differences in verbal and nonverbal emotion perception.
Method: Eighty-eight participants (43 females and 45 males) were asked to identify three basic emotions (i.e., happiness, sadness, and anger) and neutrality encoded by female or male actors through verbal (i.e., semantic) or nonverbal (i.e., facial and prosodic) channels.
Results: While women showed an overall advantage in performance, their superiority depended on the specific type of emotion and channel. Specifically, women outperformed men for two basic emotions (happiness and sadness) in the nonverbal channels and only for the anger category with verbal content. Conversely, men did better for the anger category in the nonverbal channels and for the other two emotions (happiness and sadness) in verbal content. There was an emotion- and channel-specific interaction between the two types of gender differences, with male subjects showing higher sensitivity to sad faces and prosody portrayed by female encoders.
Conclusion: These findings reveal explicit emotion processing as a highly dynamic, complex process with significant gender differences tied to specific emotion categories and communication channels. Supplemental Material: https://doi.org/10.23641/asha.15032583.
Affiliation(s)
- Yi Lin
- Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai, China
- Hongwei Ding
- Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai, China
- Yang Zhang
- Department of Speech-Language-Hearing Sciences & Center for Neurobehavioral Development, University of Minnesota Twin Cities, Minneapolis
6
Frühholz S, Dietziker J, Staib M, Trost W. Neurocognitive processing efficiency for discriminating human non-alarm rather than alarm scream calls. PLoS Biol 2021; 19:e3000751. [PMID: 33848299; PMCID: PMC8043411; DOI: 10.1371/journal.pbio.3000751]
Abstract
Across many species, scream calls signal the affective significance of events to other agents. Scream calls have often been thought to be of a generic alarming and fearful nature, signaling potential threats, with instantaneous, involuntary, and accurate recognition by perceivers. However, scream calls are more diverse in their affective signaling than merely fearfully alarming of a threat, and thus the broader sociobiological relevance of various scream types is unclear. Here we used four different psychoacoustic, perceptual decision-making, and neuroimaging experiments in humans. First, we demonstrate the existence of at least six psychoacoustically distinctive types of scream calls of both alarming and non-alarming nature, rather than only screams caused by fear or aggression. Second, based on perceptual and processing sensitivity measures for decision-making during scream recognition, we found that alarm screams (with some exceptions) were overall discriminated the worst, were responded to the slowest, and were associated with lower perceptual sensitivity for their recognition compared with non-alarm screams. Third, the neural processing of alarm compared with non-alarm screams during an implicit processing task elicited only minimal neural signal and connectivity in perceivers, contrary to the frequent assumption of a threat-processing bias in the primate neural system. These findings show that scream calls are more diverse in their signaling and communicative nature in humans than previously assumed, and that, in contrast to the commonly observed threat-processing bias in perceptual discrimination and neural processing, non-alarm screams, and positive screams in particular, seem to have higher efficiency in speeded discrimination and implicit neural processing in humans.
Affiliation(s)
- Sascha Frühholz
- Cognitive and Affective Neuroscience Unit, University of Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, University of Zurich and ETH Zurich, Zurich, Switzerland
- Department of Psychology, University of Oslo, Oslo, Norway
- Center for the Interdisciplinary Study of Language Evolution, University of Zurich, Zurich, Switzerland
- Joris Dietziker
- Cognitive and Affective Neuroscience Unit, University of Zurich, Zurich, Switzerland
- Matthias Staib
- Cognitive and Affective Neuroscience Unit, University of Zurich, Zurich, Switzerland
- Wiebke Trost
- Cognitive and Affective Neuroscience Unit, University of Zurich, Zurich, Switzerland
7
Johnson JF, Belyk M, Schwartze M, Pinheiro AP, Kotz SA. Expectancy changes the self-monitoring of voice identity. Eur J Neurosci 2021; 53:2681-2695. [PMID: 33638190; PMCID: PMC8252045; DOI: 10.1111/ejn.15162]
Abstract
Self-voice attribution can become difficult when voice characteristics are ambiguous, but functional magnetic resonance imaging (fMRI) investigations of such ambiguity are sparse. We utilized voice-morphing (self-other) to manipulate (un-)certainty in self-voice attribution in a button-press paradigm. This allowed us to investigate how levels of self-voice certainty alter activation in brain regions monitoring voice identity and unexpected changes in voice playback quality. fMRI results confirmed a self-voice suppression effect in the right anterior superior temporal gyrus (aSTG) when self-voice attribution was unambiguous. Although the right inferior frontal gyrus (IFG) was more active during a self-generated than a passively heard voice, the putative role of this region in detecting unexpected self-voice changes during the action was demonstrated only when hearing the voice of another speaker, not when attribution was uncertain. Further research on the link between the right aSTG and IFG is required and may establish a threshold for monitoring voice identity in action. The current results have implications for a better understanding of the altered experience of self-voice feedback in auditory verbal hallucinations.
Affiliation(s)
- Joseph F Johnson
- Department of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, the Netherlands
- Michel Belyk
- Division of Psychology and Language Sciences, University College London, London, UK
- Michael Schwartze
- Department of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, the Netherlands
- Ana P Pinheiro
- Faculdade de Psicologia, Universidade de Lisboa, Lisbon, Portugal
- Sonja A Kotz
- Department of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, the Netherlands
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
8
Rigoulot S, Jiang X, Vergis N, Pell MD. Neurophysiological correlates of sexually evocative speech. Biol Psychol 2020; 154:107909. [DOI: 10.1016/j.biopsycho.2020.107909]
9
Ethofer T, Stegmaier S, Koch K, Reinl M, Kreifelts B, Schwarz L, Erb M, Scheffler K, Wildgruber D. Are you laughing at me? Neural correlates of social intent attribution to auditory and visual laughter. Hum Brain Mapp 2019; 41:353-361. [PMID: 31642167; PMCID: PMC7268062; DOI: 10.1002/hbm.24806]
Abstract
Laughter is a multifaceted signal that can convey social acceptance, facilitating social bonding, as well as social rejection, inflicting social pain. In the current study, we addressed the neural correlates of social intent attribution to auditory or visual laughter in an fMRI study to identify brain areas showing linear increases in activation with social intent ratings. Negative social intent attributions were associated with activation increases within the medial prefrontal cortex/anterior cingulate cortex (mPFC/ACC). Interestingly, negative social intent attributions to auditory laughter were represented more rostrally within this area than those to visual laughter. Our findings corroborate the role of the mPFC/ACC as a key node for processing “social pain”, with distinct modality-specific subregions. Other brain areas showing an increase in activation included the bilateral inferior frontal gyrus and right superior/middle temporal gyrus (STG/MTG) for visually presented laughter and the bilateral STG for auditorily presented laughter, with no overlap across modalities. Similarly, positive social intent attributions were linked to hemodynamic responses within the right inferior parietal lobe and right middle frontal gyrus, but there was no overlap of activity for visual and auditory laughter. Our findings demonstrate that social intent attribution to auditory and visual laughter is located in neighboring, but spatially distinct, neural structures.
Affiliation(s)
- Thomas Ethofer
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Sophia Stegmaier
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Katharina Koch
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Maren Reinl
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Benjamin Kreifelts
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Lena Schwarz
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Michael Erb
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Klaus Scheffler
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Max-Planck-Institute for Biological Cybernetics, Tuebingen, Germany
- Dirk Wildgruber
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
10
Emotional prosody Stroop effect in Hindi: An event-related potential study. Prog Brain Res 2019. [PMID: 31196434; DOI: 10.1016/bs.pbr.2019.04.003]
Abstract
Prosody processing is an important aspect of language comprehension. Previous research on emotional word-prosody conflict has shown that participants perform worse when emotional prosody and word meaning are incongruent. Event-related potential studies have shown a congruency effect in the N400 component. There has been no study of emotional processing in the Hindi language in the context of conflict between emotional word meaning and prosody. We used happy and angry words spoken with happy and angry prosody. Participants had to identify whether the word had a happy or angry meaning. The results showed a congruency effect, with worse performance in incongruent trials, indicating an emotional Stroop effect in Hindi. The ERP results showed that prosody information is detected very early, as seen in the N1 component. In addition, there was a congruency effect in the N400. The results show that prosody is processed very early and that an emotional meaning-prosody congruency effect is obtained in Hindi. Further studies are needed to investigate similarities and differences in the cognitive control associated with language processing.
11
Koch K, Stegmaier S, Schwarz L, Erb M, Thomas M, Scheffler K, Wildgruber D, Nieratschker V, Ethofer T. CACNA1C risk variant affects microstructural connectivity of the amygdala. Neuroimage Clin 2019; 22:101774. [PMID: 30909026; PMCID: PMC6434179; DOI: 10.1016/j.nicl.2019.101774]
Abstract
Deficits in the perception of emotional prosody have been described in patients with affective disorders at the behavioral and neural levels. In the current study, we used an imaging genetics approach to examine the impact of CACNA1C, one of the most promising genetic risk factors for psychiatric disorders, on prosody processing at the behavioral, functional, and microstructural levels. Using functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI), we examined key areas involved in prosody processing, i.e., the amygdala and the voice areas, in a healthy population. We found stronger activation to emotional than to neutral prosody in the voice areas and the amygdala, but CACNA1C rs1006737 genotype had no influence on fMRI activity. However, significant microstructural differences (i.e., mean diffusivity) between CACNA1C rs1006737 risk allele carriers and non-carriers were found in the amygdala, but not in the voice areas. These modifications in brain architecture associated with CACNA1C might reflect a neurobiological marker predisposing to affective disorders and concomitant alterations in emotion perception.
Affiliation(s)
- Katharina Koch
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Sophia Stegmaier
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Lena Schwarz
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Michael Erb
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Mara Thomas
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Klaus Scheffler
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Max-Planck-Institute for Biological Cybernetics, Tuebingen, Germany
- Dirk Wildgruber
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Vanessa Nieratschker
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Werner Reichardt Center for Integrative Neuroscience, University of Tuebingen, Tuebingen, Germany
- Thomas Ethofer
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
12
Koch K, Stegmaier S, Schwarz L, Erb M, Reinl M, Scheffler K, Wildgruber D, Ethofer T. Neural correlates of processing emotional prosody in unipolar depression. Hum Brain Mapp 2018; 39:3419-3427. [PMID: 29682814; DOI: 10.1002/hbm.24185]
Abstract
Major depressive disorder (MDD) is characterized by biased emotion perception. In the auditory domain, MDD patients have been shown to exhibit attenuated processing of positive emotions expressed by speech melody (prosody). So far, no neuroimaging studies examining the neural basis of altered processing of emotional prosody in MDD are available. In this study, we addressed this issue by examining the emotion bias in MDD during evaluation of happy, neutral, and angry prosodic stimuli on a five-point Likert scale during functional magnetic resonance imaging (fMRI). As expected, MDD patients rated happy prosody as less intense than healthy controls (HC) did. At the neural level, stronger activation in the middle superior temporal gyrus (STG) and the amygdala was found in all participants when processing emotional as compared to neutral prosody. MDD patients exhibited increased activation of the amygdala during prosody processing irrespective of valence, while no significant differences between groups were found for the STG, indicating that altered processing of prosodic emotions in MDD occurs within the amygdala rather than in auditory areas. Concurring with the valence-specific behavioral effect of attenuated evaluation of positive prosodic stimuli, activation within the left amygdala of MDD patients correlated with ratings of happy, but not neutral or angry, prosody. Our study provides first insights into the neural basis of the reduced experience of positive information and the abnormally increased amygdala activity during prosody processing in MDD.
Affiliation(s)
- Katharina Koch
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Sophia Stegmaier
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Lena Schwarz
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Michael Erb
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Maren Reinl
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Klaus Scheffler
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
- Max-Planck-Institute for Biological Cybernetics, Tuebingen, Germany
- Dirk Wildgruber
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Thomas Ethofer
- Department of General Psychiatry, University of Tuebingen, Tuebingen, Germany
- Department of Biomedical Resonance, University of Tuebingen, Tuebingen, Germany
13
Schirmer A, Gunter TC. Temporal signatures of processing voiceness and emotion in sound. Soc Cogn Affect Neurosci 2018; 12:902-909. [PMID: 28338796; PMCID: PMC5472162; DOI: 10.1093/scan/nsx020]
Abstract
This study explored the temporal course of vocal and emotional sound processing. Participants detected rare repetitions in a stimulus stream comprising neutral and surprised non-verbal exclamations and spectrally rotated control sounds. Spectral rotation preserved some acoustic and emotional properties of the vocal originals. Event-related potentials elicited to unrepeated sounds revealed effects of voiceness and emotion. Relative to non-vocal sounds, vocal sounds elicited a larger centro-parietally distributed N1. This effect was followed by greater positivity to vocal relative to non-vocal sounds beginning with the P2 and extending throughout the recording epoch (N4, late positive potential) with larger amplitudes in female than in male listeners. Emotion effects overlapped with the voiceness effects but were smaller and differed topographically. Voiceness and emotion interacted only for the late positive potential, which was greater for vocal-emotional as compared with all other sounds. Taken together, these results point to a multi-stage process in which voiceness and emotionality are represented independently before being integrated in a manner that biases responses to stimuli with socio-emotional relevance.
Collapse
Affiliation(s)
- Annett Schirmer
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Department of Psychology, Chinese University of Hong Kong, Hong Kong
- Thomas C Gunter
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
14
Speech Prosodies of Different Emotional Categories Activate Different Brain Regions in Adult Cortex: an fNIRS Study. Sci Rep 2018; 8:218. [PMID: 29317758; PMCID: PMC5760650; DOI: 10.1038/s41598-017-18683-2]
Abstract
Emotional expressions of others embedded in speech prosody are important for social interactions. This study used functional near-infrared spectroscopy to investigate how speech prosodies of different emotional categories are processed in the cortex. The results identified several cerebral areas critical for emotional prosody processing. We confirmed that the superior temporal cortex, especially the right middle and posterior parts of the superior temporal gyrus (BA 22/42), primarily works to discriminate between emotional and neutral prosodies. Furthermore, the results suggested that categorization of emotions occurs within a high-level brain region, the frontal cortex, since the brain activation patterns were distinct when positive (happy) prosody was contrasted with negative (fearful and angry) prosody in the left middle part of the inferior frontal gyrus (BA 45) and the frontal eye field (BA 8), and when angry prosody was contrasted with neutral prosody in bilateral orbital frontal regions (BA 10/11). These findings verified and extended previous fMRI findings in the adult brain and also provide a “developed version” of brain activation patterns for our following neonatal study.
15. Neural correlates of the affective properties of spontaneous and volitional laughter types. Neuropsychologia 2016; 95:30-39. [PMID: 27940151; DOI: 10.1016/j.neuropsychologia.2016.12.012]
Abstract
Previous investigations of vocal expressions of emotion have identified acoustic and perceptual distinctions between expressions of different emotion categories, and between spontaneous and volitional (or acted) variants of a given category. Recent work on laughter has identified relationships between acoustic properties of laughs and their perceived affective properties (arousal and valence) that are similar across spontaneous and volitional types (Bryant & Aktipis, 2014; Lavan et al., 2016). In the current study, we explored the neural correlates of such relationships by measuring modulations of the BOLD response in the presence of itemwise variability in the subjective affective properties of spontaneous and volitional laughter. Across all laughs, and within spontaneous and volitional sets, we consistently observed linear increases in the response of bilateral auditory cortices (including Heschl's gyrus and superior temporal gyrus [STG]) associated with higher ratings of perceived arousal, valence and authenticity. Areas in the anterior medial prefrontal cortex (amPFC) showed negative linear correlations with valence and authenticity ratings across the full set of spontaneous and volitional laughs; in line with previous research (McGettigan et al., 2015; Szameitat et al., 2010), we suggest that this reflects increased engagement of these regions in response to laughter of greater social ambiguity. Strikingly, an investigation of higher-order relationships between the entire laughter set and the neural response revealed a positive quadratic profile of the BOLD response in right-dominant STG (extending onto the dorsal bank of the STS), where this region responded most strongly to laughs rated at the extremes of the authenticity scale. While previous studies claimed a role for right STG in bipolar representation of emotional valence, we instead argue that this may in fact exhibit a relatively categorical response to emotional signals, whether positive or negative.
16. Vogel BD, Brück C, Jacob H, Eberle M, Wildgruber D. Effects of cue modality and emotional category on recognition of nonverbal emotional signals in schizophrenia. BMC Psychiatry 2016; 16:218. [PMID: 27388011; PMCID: PMC4936116; DOI: 10.1186/s12888-016-0913-7]
Abstract
BACKGROUND Impaired interpretation of nonverbal emotional cues in patients with schizophrenia has been reported in several studies, and these deficits are assumed to be clinically relevant for social functioning. However, it is unclear to what extent the impairments depend on specific emotions or specific channels of nonverbal communication. METHODS Here, the effects of cue modality and emotional category on the accuracy of emotion recognition were evaluated in 21 patients with schizophrenia and compared with a healthy control group (n = 21). To this end, dynamic stimuli comprising speakers of both genders in three sensory modalities (auditory, visual, and audiovisual) and five emotional categories (happy, alluring, neutral, angry, and disgusted) were used. RESULTS Patients with schizophrenia were impaired in emotion recognition in comparison with the control group across all stimuli. Considering specific emotions, more severe deficits were revealed in the recognition of alluring stimuli and less severe deficits in the recognition of disgusted stimuli, as compared with all other emotions. Regarding cue modality, the extent of the impairment in emotion recognition did not differ significantly between auditory and visual cues across all emotional categories. However, patients with schizophrenia showed significantly more severe disturbances for vocal than for facial cues when sexual interest was expressed (alluring stimuli), whereas more severe disturbances for facial than for vocal cues were observed when happiness or anger was expressed. CONCLUSION Our results confirmed that perceptual impairments can be observed for vocal as well as facial cues conveying various social and emotional connotations.
The observed differences in severity of impairment, with the most severe deficits for alluring expressions, might be related to specific difficulties in recognizing the complex social-emotional information of interpersonal intentions as compared with "basic" emotional states. Therefore, future studies evaluating the perception of nonverbal cues should consider a broader range of social and emotional signals beyond basic emotions, including attitudes and interpersonal intentions. Identifying specific domains of social perception that are particularly prone to misunderstanding in patients with schizophrenia might allow for a refinement of interventions aimed at improving social functioning.
17. The sound of emotions-Towards a unifying neural network perspective of affective sound processing. Neurosci Biobehav Rev 2016; 68:96-110. [PMID: 27189782; DOI: 10.1016/j.neubiorev.2016.05.002]
Abstract
Affective sounds are an integral part of the natural and social environment and shape and influence behavior across a multitude of species. In humans, these affective sounds span a repertoire of environmental sounds and of human vocalizations and music. In terms of neural processing, cortical and subcortical brain areas constitute a distributed network that supports our listening experience of these affective sounds. Taking an exhaustive cross-domain view, we accordingly suggest a common neural network that facilitates the decoding of emotional meaning from a wide range of sounds, rather than the traditional view that postulates distinct neural systems for specific affective sound types. This new integrative neural network view unifies the decoding of affective valence in sounds and ascribes differential as well as complementary functional roles to specific nodes within a common neural network. It also highlights the importance of an extended brain network, beyond the central limbic and auditory brain systems, in the processing of affective sounds.
18. Pannese A, Grandjean D, Frühholz S. Subcortical processing in auditory communication. Hear Res 2015; 328:67-77. [DOI: 10.1016/j.heares.2015.07.003]
19. Wierzba M, Riegel M, Pucz A, Leśniewska Z, Dragan WŁ, Gola M, Jednoróg K, Marchewka A. Erotic subset for the Nencki Affective Picture System (NAPS ERO): cross-sexual comparison study. Front Psychol 2015; 6:1336. [PMID: 26441715; PMCID: PMC4564755; DOI: 10.3389/fpsyg.2015.01336]
Abstract
Research on the processing of sexual stimuli has shown that such material has high priority in human cognition. Yet, although sex differences in response to sexual stimuli have been extensively discussed in the literature, sexual orientation has been given relatively little consideration, and material suitable for relevant research is difficult to come by. With this in mind, we present a collection of 200 erotic images, accompanied by self-report ratings of emotional valence and arousal from homo- and heterosexual males and females (n = 80, divided into four equal-sized subsamples). The collection complements the Nencki Affective Picture System (NAPS) and is intended to be used as stimulus material in experimental research. The erotic images are divided into five categories, depending on their content: opposite-sex couple (50), male couple (50), female couple (50), male (25), and female (25). An additional 100 control images from the NAPS depicting people in non-erotic contexts were also used in the study. We showed that recipient sex and sexual orientation strongly influenced the evaluation of erotic content. Thus, comparisons of valence and arousal ratings in different subject groups will help researchers select stimulus sets for various experimental designs. To facilitate the use of the dataset, we provide an on-line tool, which allows the user to browse the images interactively and select appropriate stimuli on the basis of several parameters. The NAPS ERO image collection together with the data are available to the scientific community for non-commercial use at http://naps.nencki.gov.pl.
20. Zhao L, Guan M, Zhang X, Karama S, Khundrakpam B, Wang M, Dong M, Qin W, Tian J, Evans AC, Shi D. Structural insights into aberrant cortical morphometry and network organization in psychogenic erectile dysfunction. Hum Brain Mapp 2015; 36:4469-82. [PMID: 26264575; DOI: 10.1002/hbm.22925]
Abstract
Functional neuroimaging studies have revealed abnormal brain dynamics of male sexual arousal (SA) in psychogenic erectile dysfunction (pED). However, the neuroanatomical correlates of pED are still unclear. In this work, we obtained cortical thickness (CTh) measurements from structural magnetic resonance images of 40 pED patients and 39 healthy control subjects. Abnormalities in CTh related to pED were explored using a scale-space search based brain morphometric analysis, and the organization of brain structural covariance networks was analyzed as well. Compared with healthy men, pED patients showed significantly decreased CTh in widespread cortical regions, most of which were previously reported to show abnormal dynamics of male SA in pED, such as the medial prefrontal, orbitofrontal, cingulate, inferotemporal, and insular cortices. CTh reductions in these areas were significantly correlated with degradation of male sexual functioning. Moreover, pED patients showed decreased interregional CTh correlations from the right lateral orbitofrontal cortex to the right supramarginal gyrus and the left angular cortex, implying dissociations between the cognitive, motivational, and inhibitory networks of male SA in pED. This work provides structural insights into the complex phenomenon of psychogenic sexual dysfunction in men and suggests a specific vulnerability factor, possibly an additional "organic" factor, that may play an important role in pED.
21.
Abstract
The neuroanatomical correlates of human sexual desire, arousal, and behavior have been characterized in recent years with functional brain imaging techniques such as magnetic resonance imaging (MRI) and positron emission tomography (PET). Here, we briefly review the results of functional neuroimaging studies in humans, whether healthy or suffering from sexual disorders, and the current models of regional and network activation in sexual arousal. Attention is paid, in particular, to findings from both regional and network studies in the past 3 years. We also identify yet unanswered and pressing questions of interest to areas of ongoing investigations for psychiatric, scientific, and forensic disciplines.
22. Pernet CR, McAleer P, Latinus M, Gorgolewski KJ, Charest I, Bestelmeyer PEG, Watson RH, Fleming D, Crabbe F, Valdes-Sosa M, Belin P. The human voice areas: Spatial organization and inter-individual variability in temporal and extra-temporal cortices. Neuroimage 2015; 119:164-74. [PMID: 26116964; PMCID: PMC4768083; DOI: 10.1016/j.neuroimage.2015.06.050]
Abstract
fMRI studies increasingly examine functions and properties of non-primary areas of human auditory cortex. However, there is currently no standardized localization procedure to reliably identify specific areas across individuals, comparable to the standard 'localizers' available in the visual domain. Here we present an fMRI 'voice localizer' scan allowing rapid and reliable localization of the voice-sensitive 'temporal voice areas' (TVA) of human auditory cortex. We describe results obtained using this standardized localizer scan in a large cohort of normal adult subjects. Most participants (94%) showed bilateral patches of significantly greater response to vocal than non-vocal sounds along the superior temporal sulcus/gyrus (STS/STG). Individual activation patterns, although reproducible, showed high inter-individual variability in precise anatomical location. Cluster analysis of individual peaks from the large cohort highlighted three bilateral clusters of voice sensitivity, or "voice patches", along posterior (TVAp), mid (TVAm), and anterior (TVAa) STS/STG, respectively. A series of extra-temporal areas, including bilateral inferior prefrontal cortex and the amygdalae, showed small but reliable voice sensitivity as part of a large-scale cerebral voice network. Stimuli for the voice localizer scan and probabilistic maps in MNI space are available for download.
23.
Abstract
Accents provide information about a speaker's geographical, socio-economic, and ethnic background. Research in applied psychology and sociolinguistics suggests that we generally prefer our own accent to other varieties of our native language and attribute more positive traits to it. Despite the widespread influence of accents on social interactions and on educational and work settings, the neural underpinnings of this social bias toward our own accent, and what may drive it, are unexplored. We measured brain activity while participants from two different geographical backgrounds listened passively to three English accent types embedded in an adaptation design. Cerebral activity in several regions, including the bilateral amygdalae, revealed a significant interaction between the participants' own accent and the accent they listened to: while repetition of their own accent elicited an enhanced neural response, repetition of the other group's accent resulted in reduced responses classically associated with adaptation. Our findings suggest that increased social relevance of, or greater emotional sensitivity to, in-group accents may underlie the own-accent bias. Our results provide a neural marker for this bias and show, for the first time, that the neural response to speech is partly shaped by the geographical background of the listener.
24. Li Y, Gu F, Zhang X, Yang L, Chen L, Wei Z, Zha R, Wang Y, Li X, Zhou Y, Zhang X. Cerebral activity to opposite-sex voices reflected by event-related potentials. PLoS One 2014; 9:e94976. [PMID: 24727971; PMCID: PMC3984274; DOI: 10.1371/journal.pone.0094976]
Abstract
The human voice is a gender-discriminating cue and is important for mate selection. This study employed electrophysiological recordings to examine whether specific cerebral activity arises in response to opposite-sex voices as compared with same-sex voices. Male and female voices were pseudo-randomly presented to male and female participants. In Experiment 1, participants were instructed to determine the gender of each voice. A late positivity (LP) response around 750 ms after voice onset was elicited by opposite-sex voices, reflected in a more positive deflection of the ERP to opposite-sex than to same-sex voices. This LP response was prominent around parieto-occipital recording sites and suggests an opposite-sex-specific process, which may reflect emotion- and/or reward-related cerebral activity. In Experiment 2, participants were instructed to press a key when hearing a non-voice pure tone and to give no response to voice stimuli. In this task, no differences were found between the ERPs to same-sex and opposite-sex voices, suggesting that the cerebral response to opposite-sex voices may disappear without gender-related attention. These results have significant implications for the cognitive mechanisms of opposite-sex-specific voice processing.
25.
Abstract
Through this study, we aimed to validate a new tool for inducing moods in experimental contexts. Five audio stories with sad, joyful, frightening, erotic, or neutral content were presented to 60 participants (33 women, 27 men) in a within-subjects design, each for about 10 min. Participants were asked (1) to report their moods before and after listening to each story, (2) to assess the emotional content of the excerpts on various emotional scales, and (3) to rate their level of projection into the stories. The results confirmed our a priori emotional classification. The emotional stories were effective in inducing the desired mood, with no difference found between male and female participants. These stories therefore constitute a valuable corpus for inducing moods in French-speaking participants, and they are made freely available for use in scientific research.
26. Vocal emotion of humanoid robots: a study from brain mechanism. ScientificWorldJournal 2014; 2014:216341. [PMID: 24587712; PMCID: PMC3920811; DOI: 10.1155/2014/216341]
Abstract
Driven by rapid ongoing advances in humanoid robots, increasing attention has shifted to the emotional intelligence of AI robots, which could facilitate communication between machines and human beings, especially through vocal emotion in the interactive systems of future humanoid robots. This paper explored the brain mechanisms of vocal emotion by reviewing previous research and developed an fMRI experiment to observe brain responses to human vocal emotion. The findings provide a new approach to designing and evaluating the vocal emotion of humanoid robots based on the brain mechanisms of human beings.
27. Lambrecht L, Kreifelts B, Wildgruber D. Gender differences in emotion recognition: Impact of sensory modality and emotional category. Cogn Emot 2013; 28:452-69. [PMID: 24151963; DOI: 10.1080/02699931.2013.837378]
Abstract
Results from studies on gender differences in emotion recognition vary, depending on the types of emotion and the sensory modalities used for stimulus presentation, which makes comparability between studies problematic. This study investigated emotion recognition in healthy participants (N = 84; 40 males; ages 20 to 70 years), using dynamic stimuli portrayed by both genders in three sensory modalities (auditory, visual, audio-visual) and five emotional categories. The participants were asked to categorize the stimuli on the basis of their nonverbal emotional content (happy, alluring, neutral, angry, and disgusted). Hit rates and category selection biases were analyzed. Women were found to be more accurate in the recognition of emotional prosody. This effect was partially mediated by hearing loss at the frequency of 8,000 Hz. Moreover, there was a gender-specific selection bias for alluring stimuli: men, as compared with women, chose "alluring" more often when a stimulus was presented by a woman than by a man.
28. Affective auditory stimuli: Adaptation of the International Affective Digitized Sounds (IADS-2) for European Portuguese. Behav Res Methods 2013; 45:1168-81. [DOI: 10.3758/s13428-012-0310-1]
29. Gädeke JC, Föcker J, Röder B. Is the processing of affective prosody influenced by spatial attention? An ERP study. BMC Neurosci 2013; 14:14. [PMID: 23360491; PMCID: PMC3616832; DOI: 10.1186/1471-2202-14-14]
Abstract
BACKGROUND The present study asked whether the processing of affective prosody is modulated by spatial attention. Pseudo-words with a neutral, happy, threatening, and fearful prosody were presented at two spatial positions. Participants attended to one position in order to detect infrequent targets. Emotional prosody was task irrelevant. The electro-encephalogram (EEG) was recorded to assess processing differences as a function of spatial attention and emotional valence. RESULTS Event-related potentials (ERPs) differed as a function of emotional prosody both when attended and when unattended. While emotional prosody effects interacted with effects of spatial attention at early processing levels (< 200 ms), these effects were additive at later processing stages (> 200 ms). CONCLUSIONS Emotional prosody, therefore, seems to be partially processed outside the focus of spatial attention. Whereas at early sensory processing stages spatial attention modulates the degree of emotional voice processing as a function of emotional valence, emotional prosody is processed outside of the focus of spatial attention at later processing stages.
30. Frühholz S, Grandjean D. Multiple subregions in superior temporal cortex are differentially sensitive to vocal expressions: A quantitative meta-analysis. Neurosci Biobehav Rev 2013; 37:24-35. [DOI: 10.1016/j.neubiorev.2012.11.002]
31. Witteman J, Van Heuven VJP, Schiller NO. Hearing feelings: a quantitative meta-analysis on the neuroimaging literature of emotional prosody perception. Neuropsychologia 2012; 50:2752-2763. [PMID: 22841991; DOI: 10.1016/j.neuropsychologia.2012.07.026]
Abstract
With the advent of neuroimaging considerable progress has been made in uncovering the neural network involved in the perception of emotional prosody. However, the exact neuroanatomical underpinnings of the emotional prosody perception process remain unclear. Furthermore, it is unclear what the intrahemispheric basis might be of the relative right-hemispheric specialization for emotional prosody perception that has been found previously in the lesion literature. In an attempt to shed light on these issues, quantitative meta-analyses of the neuroimaging literature were performed to investigate which brain areas are robustly associated with stimulus-driven and task-dependent perception of emotional prosody. Also, lateralization analyses were performed to investigate whether statistically reliable hemispheric specialization across studies can be found in these networks. A bilateral temporofrontal network was found to be implicated in emotional prosody perception, generally supporting previously proposed models of emotional prosody perception. Right-lateralized convergence across studies was found in (early) auditory processing areas, suggesting that the right hemispheric specialization for emotional prosody perception reported previously in the lesion literature might be driven by hemispheric specialization for non-prosody-specific fundamental acoustic dimensions of the speech signal.
Affiliation(s)
- Jurriaan Witteman, Vincent J P Van Heuven, Niels O Schiller: Leiden Institute for Brain and Cognition, Leiden University, The Netherlands; Leiden University Centre for Linguistics, Leiden University, The Netherlands.
32
Chun JW, Park HJ, Park IH, Kim JJ. Common and differential brain responses in men and women to nonverbal emotional vocalizations by the same and opposite sex. Neurosci Lett 2012; 515:157-161. [DOI: 10.1016/j.neulet.2012.03.038]
33
Functional neuroimaging studies of sexual arousal and orgasm in healthy men and women: a review and meta-analysis. Neurosci Biobehav Rev 2012; 36:1481-1509. [PMID: 22465619] [DOI: 10.1016/j.neubiorev.2012.03.006]
Abstract
In the last fifteen years, functional neuroimaging techniques have been used to investigate the neuroanatomical correlates of sexual arousal in healthy human subjects. In most studies, subjects have been requested to watch visual sexual stimuli and control stimuli. Our review and meta-analysis found that in heterosexual men, sites of cortical activation consistently reported across studies are the lateral occipitotemporal, inferotemporal, parietal, orbitofrontal, medial prefrontal, insular, anterior cingulate, and frontal premotor cortices as well as, for subcortical regions, the amygdalas, claustrum, hypothalamus, caudate nucleus, thalami, cerebellum, and substantia nigra. Heterosexual and gay men show a similar pattern of activation. Visual sexual stimuli activate the amygdalas and thalami more in men than in women. Ejaculation is associated with decreased activation throughout the prefrontal cortex. We present a neurophenomenological model to understand how these multiple regional brain responses could account for the varied facets of the subjective experience of sexual arousal. Further research should shift from passive to active paradigms, focus on functional connectivity and use subliminal presentation of stimuli.
34
Brück C, Kreifelts B, Wildgruber D. Emotional voices in context: a neurobiological model of multimodal affective information processing. Phys Life Rev 2011; 8:383-403. [DOI: 10.1016/j.plrev.2011.10.002]
35
Viinikainen M, Kätsyri J, Sams M. Representation of perceived sound valence in the human brain. Hum Brain Mapp 2011; 33:2295-2305. [PMID: 21826759] [DOI: 10.1002/hbm.21362]
Abstract
Perceived emotional valence of sensory stimuli influences their processing in various cortical and subcortical structures. Recent evidence suggests that negative and positive valences are processed separately, not along a single linear continuum. Here, we examined how the brain is activated when subjects listen to auditory stimuli varying parametrically in perceived valence (very unpleasant-neutral-very pleasant). Seventeen healthy volunteers were scanned at 3 Tesla while listening to International Affective Digital Sounds (IADS-2) in a block design paradigm. We found a strong quadratic U-shaped relationship between valence and blood oxygen level dependent (BOLD) signal strength in the medial prefrontal cortex, auditory cortex, and amygdala. Signals were the weakest for neutral stimuli and increased progressively for more unpleasant or pleasant stimuli. The results strengthen the view that valence is a crucial factor in the neural processing of emotions. An alternative explanation is salience, which increases with both negative and positive valences.
Affiliation(s)
- Mikko Viinikainen: Mind and Brain Laboratory, Department of Biomedical Engineering and Computational Science, Aalto University School of Science, Finland.
36
Fonteille V, Stoléru S. Les corrélats cérébraux du désir sexuel : approche en neuro-imagerie fonctionnelle [The cerebral correlates of sexual desire: a functional neuroimaging approach]. Sexologies 2011. [DOI: 10.1016/j.sexol.2010.03.010]
37

38
Whittle S, Yücel M, Yap MBH, Allen NB. Sex differences in the neural correlates of emotion: evidence from neuroimaging. Biol Psychol 2011; 87:319-333. [PMID: 21600956] [DOI: 10.1016/j.biopsycho.2011.05.003]
Affiliation(s)
- Sarah Whittle: Centre for Youth Mental Health, Orygen Youth Health Research Centre, The University of Melbourne, 35 Poplar Road, Parkville, Victoria 3052, Australia.
39
Ethofer T, Bretscher J, Gschwind M, Kreifelts B, Wildgruber D, Vuilleumier P. Emotional voice areas: anatomic location, functional properties, and structural connections revealed by combined fMRI/DTI. Cereb Cortex 2011; 22:191-200. [PMID: 21625012] [DOI: 10.1093/cercor/bhr113]
Affiliation(s)
- Thomas Ethofer: Department of General Psychiatry, University of Tübingen, 72076 Tübingen, Germany.
40
Bidirectional connectivity between hemispheres occurs at multiple levels in language processing but depends on sex. J Neurosci 2010; 30:11576-11585. [PMID: 20810879] [DOI: 10.1523/jneurosci.1245-10.2010]
Abstract
Our aim was to determine the direction of interhemispheric communication in a phonological task in regions involved in different levels of processing. Effective connectivity analysis was conducted on functional magnetic resonance imaging data from 39 children (ages 9-15 years) performing rhyming judgments on spoken words. The results show interaction between hemispheres at multiple levels. First, there is unidirectional transfer of information from right to left at the sensory level of primary auditory cortex. Second, bidirectional connections between superior temporal gyri (STGs) suggest a reciprocal cooperation between hemispheres at the level of phonological and prosodic processing. Third, a direct connection from right STG to left inferior frontal gyrus suggests that information processed in the right STG is integrated into the final stages of phonological segmentation required for the rhyming decision. Intrahemispheric connectivity from primary auditory cortex to STG was stronger in the left than in the right hemisphere. These results support a model of cooperation between hemispheres, with asymmetric interhemispheric and intrahemispheric connectivity consistent with the left-hemisphere specialization for phonological processing. Finally, we found greater interhemispheric connectivity in girls than in boys, consistent with the hypothesis of a more bilateral representation of language in females than in males. However, interhemispheric communication was associated with slow performance and low verbal intelligence quotient among girls. We suggest that females may have the potential for greater interhemispheric cooperation, which may be an advantage in certain tasks. However, in other tasks too much communication between hemispheres may interfere with task performance.
41
Hughes SM, Farley SD, Rhodes BC. Vocal and physiological changes in response to the physical attractiveness of conversational partners. J Nonverbal Behav 2010. [DOI: 10.1007/s10919-010-0087-9]
42
Ethofer T, Kreifelts B, Wiethoff S, Wolf J, Grodd W, Vuilleumier P, Wildgruber D. Differential influences of emotion, task, and novelty on brain regions underlying the processing of speech melody. J Cogn Neurosci 2009; 21:1255-1268. [DOI: 10.1162/jocn.2009.21099]
Abstract
We investigated the functional characteristics of brain regions implicated in the processing of speech melody by presenting words spoken in either neutral or angry prosody during a functional magnetic resonance imaging experiment using a factorial habituation design. Subjects judged either affective prosody or word class for these vocal stimuli, which could be heard for either the first, second, or third time. Voice-sensitive temporal cortices, as well as the amygdala, insula, and mediodorsal thalami, responded more strongly to angry than to neutral prosody. These stimulus-driven effects were not influenced by the task, suggesting that these brain structures are automatically engaged during processing of emotional information in the voice and operate relatively independently of cognitive demands. By contrast, the right middle temporal gyrus and the bilateral orbito-frontal cortices (OFC) responded more strongly during emotion classification than during word classification, but were also sensitive to anger expressed by the voices, suggesting that some perceptual aspects of prosody are also encoded within these regions subserving explicit processing of vocal emotion. The bilateral OFC showed a selective modulation by emotion and repetition, with particularly pronounced responses to angry prosody during the first presentation only, indicating a critical role of the OFC in the detection of vocal information that is both novel and behaviorally relevant. These results converge with previous findings obtained for angry faces and suggest a general involvement of the OFC in the recognition of anger irrespective of the sensory modality. Taken together, our study reveals that different aspects of voice stimuli and perceptual demands modulate distinct areas involved in the processing of emotional prosody.
Affiliation(s)
- Thomas Ethofer: University of Tübingen, Tübingen, Germany; University Medical Center of Geneva, Geneva, Switzerland.
43
Decoding of emotional information in voice-sensitive cortices. Curr Biol 2009; 19:1028-1033. [DOI: 10.1016/j.cub.2009.04.054]
44
Aeschlimann M, Knebel JF, Murray MM, Clarke S. Emotional pre-eminence of human vocalizations. Brain Topogr 2008; 20:239-248. [PMID: 18347967] [DOI: 10.1007/s10548-008-0051-8]
Abstract
Human vocalizations (HV), as well as environmental sounds, convey a wide range of information, including emotional expressions. The latter have been relatively rarely investigated, and, in particular, it is unclear whether duration-controlled non-linguistic HV sequences can reliably convey both positive and negative emotional information. The aims of the present psychophysical study were: (i) to generate a battery of duration-controlled and acoustically controlled extreme-valence stimuli, and (ii) to compare the emotional impact of HV with that of other environmental sounds. A set of 144 HV and other environmental sounds was selected to cover emotionally positive, negative, and neutral values. Sequences of 2 s duration were rated on Likert scales by 16 listeners along three emotional dimensions (arousal, intensity, and valence) and two non-emotional dimensions (confidence in identifying the sound source and perceived loudness). The 2 s stimuli were reliably perceived as emotionally positive, negative, or neutral. We observed a linear relationship between intensity and arousal ratings and a "boomerang-shaped" intensity-valence distribution, as previously reported for longer, duration-variable stimuli. Moreover, the emotional intensity ratings for HV were higher than for other environmental sounds, suggesting that HV constitute a characteristic class of emotional auditory stimuli. Emotionally positive HV were also more readily identified than other sounds, and emotionally negative stimuli, irrespective of their source, were perceived as louder than their positive and neutral counterparts. In conclusion, HV are a distinct emotional category of environmental sounds, and they retain this emotional pre-eminence even when presented for brief periods.
Affiliation(s)
- Mélanie Aeschlimann: Service de Neuropsychologie et de Neuroréhabilitation, Centre Hospitalier Universitaire Vaudois (CHUV) and Université de Lausanne (UNIL), Av. Pierre Decker 5, 1011 Lausanne, Switzerland.