201
A Cognitive Neuroscience View of Schizophrenic Symptoms: Abnormal Activation of a System for Social Perception and Communication. Brain Imaging Behav 2008; 3:85-110. [PMID: 19809534] [PMCID: PMC2757313] [DOI: 10.1007/s11682-008-9052-1]
Abstract
We review converging evidence that language-related symptoms of the schizophrenic syndrome, such as auditory verbal hallucinations, arise at least in part from processing abnormalities in posterior language regions. These language regions are either adjacent to or overlapping with regions in the posterior temporal cortex and temporo-parieto-occipital junction that are part of a system for processing social cognition, emotion, and self-representation or agency. The inferior parietal and posterior superior temporal regions contain multimodal representational systems that may also provide rapid feedback and feed-forward activation to unimodal regions such as auditory cortex. We propose that over-activation of these regions could not only produce erroneous activation of semantic and speech (auditory word) representations, resulting in thought disorder and voice hallucinations, but could also give rise to many of the other symptoms of schizophrenia. These regions are also part of the so-called "default network", a network of regions that are normally active, and their activity is correlated with activity within the hippocampal system.
202
fMRI evidence for the effect of verbal complexity on lateralisation of the neural response associated with decoding prosodic emotion. Neuropsychologia 2008; 46:2880-7. [DOI: 10.1016/j.neuropsychologia.2008.05.024]
203
Robins DL, Hunyadi E, Schultz RT. Superior temporal activation in response to dynamic audio-visual emotional cues. Brain Cogn 2008; 69:269-78. [PMID: 18809234] [DOI: 10.1016/j.bandc.2008.08.007]
Abstract
Perception of emotion is critical for successful social interaction, yet the neural mechanisms underlying the perception of dynamic, audio-visual emotional cues are poorly understood. Evidence from language and sensory paradigms suggests that the superior temporal sulcus and gyrus (STS/STG) play a key role in the integration of auditory and visual cues. Emotion perception research has focused on static facial cues; however, dynamic audio-visual (AV) cues mimic real-world social cues more accurately than static and/or unimodal stimuli. Novel dynamic AV stimuli were presented using a block design in two fMRI studies, comparing bimodal stimuli to unimodal conditions, and emotional to neutral stimuli. Results suggest that the bilateral superior temporal region plays distinct roles in the perception of emotion and in the integration of auditory and visual cues. Given the greater ecological validity of the stimuli developed for this study, this paradigm may be helpful in elucidating the deficits in emotion perception experienced by clinical populations.
Affiliation(s)
- Diana L Robins
- Department of Psychology, Georgia State University, P.O. Box 5010, Atlanta, GA 30302-5010, USA.
204
Pichon S, de Gelder B, Grèzes J. Emotional modulation of visual and motor areas by dynamic body expressions of anger. Soc Neurosci 2008; 3:199-212. [DOI: 10.1080/17470910701394368]
205
Kober H, Barrett LF, Joseph J, Bliss-Moreau E, Lindquist K, Wager TD. Functional grouping and cortical-subcortical interactions in emotion: a meta-analysis of neuroimaging studies. Neuroimage 2008; 42:998-1031. [PMID: 18579414] [PMCID: PMC2752702] [DOI: 10.1016/j.neuroimage.2008.03.059]
Abstract
We performed an updated quantitative meta-analysis of 162 neuroimaging studies of emotion using a novel multi-level kernel-based approach, focusing on locating brain regions consistently activated in emotional tasks and their functional organization into distributed functional groups, independent of semantically defined emotion category labels (e.g., "anger," "fear"). Such brain-based analyses are critical if our ways of labeling emotions are to be evaluated and revised based on consistency with brain data. Consistent activations were limited to specific cortical sub-regions, including multiple functional areas within medial, orbital, and inferior lateral frontal cortices. Consistent with a wealth of animal literature, multiple subcortical activations were identified, including amygdala, ventral striatum, thalamus, hypothalamus, and periaqueductal gray. We used multivariate parcellation and clustering techniques to identify groups of co-activated brain regions across studies. These analyses identified six distributed functional groups, including medial and lateral frontal groups, two posterior cortical groups, and paralimbic and core limbic/brainstem groups. These functional groups provide information on potential organization of brain regions into large-scale networks. Specific follow-up analyses focused on amygdala, periaqueductal gray (PAG), and hypothalamic (Hy) activations, and identified frontal cortical areas co-activated with these core limbic structures. While multiple areas of frontal cortex co-activated with amygdala sub-regions, a specific region of dorsomedial prefrontal cortex (dmPFC, Brodmann's Area 9/32) was the only area co-activated with both PAG and Hy. Subsequent mediation analyses were consistent with a pathway from dmPFC through PAG to Hy. 
These results suggest that medial frontal areas are more closely associated with core limbic activation than their lateral counterparts, and that dmPFC may play a particularly important role in the cognitive generation of emotional states.
Affiliation(s)
- Hedy Kober
- Department of Psychology, Columbia University, USA
- Lisa Feldman Barrett
- Department of Psychology, Boston College, USA
- Psychiatric Neuroimaging Research Program, Massachusetts General Hospital, Harvard Medical School, USA
- Josh Joseph
- Department of Psychology, Columbia University, USA
- Tor D. Wager
- Department of Psychology, Columbia University, USA
206
Bach DR, Grandjean D, Sander D, Herdener M, Strik WK, Seifritz E. The effect of appraisal level on processing of emotional prosody in meaningless speech. Neuroimage 2008; 42:919-27. [DOI: 10.1016/j.neuroimage.2008.05.034]
207
Auditory processing abnormalities in schizotypal personality disorder: an fMRI experiment using tones of deviant pitch and duration. Schizophr Res 2008; 103:26-39. [PMID: 18555666] [PMCID: PMC3188851] [DOI: 10.1016/j.schres.2008.04.041]
Abstract
BACKGROUND: One of the cardinal features of schizotypal personality disorder (SPD) is language abnormalities. The focus of this study was to determine whether there are also processing abnormalities of pure tones differing in pitch and duration in SPD.
METHODS: Thirteen neuroleptic-naïve male subjects met full criteria for SPD and were group-matched on age and parental socio-economic status to 13 comparison subjects. Verbal learning was measured with the California Verbal Learning Test. Heschl's gyrus volumes were measured using structural MRI. Whole-brain fMRI activation patterns in an auditory task of listening to tones, including pitch and duration deviants, were compared between SPD and control subjects. In a second and separate ROI analysis, peak activation in superior temporal gyrus (STG), Brodmann areas 41 and 42, was correlated with verbal learning and clinical measures derived from the SCID-II interview.
RESULTS: In the region of the STG, SPD subjects demonstrated more activation to pitch deviants bilaterally (p<0.001) and to duration deviants in the left hemisphere (p=0.005) (two-sample t). SPD subjects also showed more bilateral parietal cortex activation to duration deviants. In no region did comparison subjects activate more than SPD subjects in either experiment. Exploratory correlations for SPD subjects suggest a relationship between peak activation on the right for deviant tones in the pitch experiment and both odd speech and impaired verbal learning. There was no difference between groups in Heschl's gyrus volume.
CONCLUSIONS: These data suggest that SPD subjects have inefficient or hyper-responsive processing of pure tones, in terms of both pitch and duration deviance, that is not attributable to smaller Heschl's gyrus volumes. These auditory processing abnormalities may have significance for the odd speech heard in some SPD subjects and for downstream language and verbal learning deficits.
208
Hoekert M, Bais L, Kahn RS, Aleman A. Time course of the involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum in emotional prosody perception. PLoS One 2008; 3:e2244. [PMID: 18493307] [PMCID: PMC2373925] [DOI: 10.1371/journal.pone.0002244]
Abstract
In verbal communication, not only the meaning of the words convey information, but also the tone of voice (prosody) conveys crucial information about the emotional state and intentions of others. In various studies right frontal and right temporal regions have been found to play a role in emotional prosody perception. Here, we used triple-pulse repetitive transcranial magnetic stimulation (rTMS) to shed light on the precise time course of involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum. We hypothesized that information would be processed in the right anterior superior temporal gyrus before being processed in the right fronto-parietal operculum. Right-handed healthy subjects performed an emotional prosody task. During listening to each sentence a triplet of TMS pulses was applied to one of the regions at one of six time points (400-1900 ms). Results showed a significant main effect of Time for right anterior superior temporal gyrus and right fronto-parietal operculum. The largest interference was observed half-way through the sentence. This effect was stronger for withdrawal emotions than for the approach emotion. A further experiment with the inclusion of an active control condition, TMS over the EEG site POz (midline parietal-occipital junction), revealed stronger effects at the fronto-parietal operculum and anterior superior temporal gyrus relative to the active control condition. No evidence was found for sequential processing of emotional prosodic information from right anterior superior temporal gyrus to the right fronto-parietal operculum, but the results revealed more parallel processing. Our results suggest that both right fronto-parietal operculum and right anterior superior temporal gyrus are critical for emotional prosody perception at a relatively late time period after sentence onset. 
This may reflect that emotional cues can still be ambiguous at the beginning of sentences, but become more apparent half-way through the sentence.
Affiliation(s)
- Marjolijn Hoekert
- BCN Neuroimaging Center, University of Groningen, Groningen, The Netherlands.
209
Quadflieg S, Mohr A, Mentzel HJ, Miltner WH, Straube T. Modulation of the neural network involved in the processing of anger prosody: the role of task-relevance and social phobia. Biol Psychol 2008; 78:129-37. [DOI: 10.1016/j.biopsycho.2008.01.014]
210

211
Glasser MF, Rilling JK. DTI tractography of the human brain's language pathways. Cereb Cortex 2008; 18:2471-82. [PMID: 18281301] [DOI: 10.1093/cercor/bhn011]
Abstract
Diffusion Tensor Imaging (DTI) tractography has been used to detect leftward asymmetries in the arcuate fasciculus, a pathway that links temporal and inferior frontal language cortices. In this study, we more specifically define this asymmetry with respect to both anatomy and function. Twenty right-handed male subjects were scanned with DTI, and the arcuate fasciculus was reconstructed using deterministic tractography. The arcuate was divided into 2 segments with different hypothesized functions, one terminating in the posterior superior temporal gyrus (STG) and another terminating in the middle temporal gyrus (MTG). Tractography results were compared with peak activation coordinates from prior functional neuroimaging studies of phonology, lexical-semantic processing, and prosodic processing to assign putative functions to these pathways. STG terminations were strongly left lateralized and overlapped with phonological activations in the left but not the right hemisphere, suggesting that only the left hemisphere phonological cortex is directly connected with the frontal lobe via the arcuate fasciculus. MTG terminations were also strongly left lateralized, overlapping with left lateralized lexical-semantic activations. Smaller right hemisphere MTG terminations overlapped with right lateralized prosodic activations. We combine our findings with a recent model of brain language processing to explain 6 aphasia syndromes.
Affiliation(s)
- Matthew F Glasser
- Department of Anthropology, Emory University, 1557 Dickey Drive, Atlanta, GA 30322, USA
212
Schirmer A, Escoffier N, Zysset S, Koester D, Striano T, Friederici AD. When vocal processing gets emotional: on the role of social orientation in relevance detection by the human amygdala. Neuroimage 2008; 40:1402-10. [PMID: 18299209] [DOI: 10.1016/j.neuroimage.2008.01.018]
Abstract
Previous work on vocal emotional processing provided little evidence for involvement of emotional processing areas such as the amygdala or the orbitofrontal cortex (OFC). Here, we sought to specify whether involvement of these areas depends on how relevant vocal expressions are for the individual. To this end, we assessed participants' social orientation--a measure of the interest and concern for other individuals and hence the relevance of social signals. We then presented task-irrelevant syllable sequences that contained rare changes in tone of voice that could be emotional or neutral. Processing differences between emotional and neutral vocal change in the right amygdala and the bilateral OFC were significantly correlated with the social orientation measure. Specifically, higher social orientation scores were associated with enhanced amygdala and OFC activity to emotional as compared to neutral change. Given the presumed role of the amygdala in the detection of emotionally relevant information, our results suggest that social orientation enhances this detection process and the activation of emotional representations mediated by the OFC. Moreover, social orientation may predict listener responses to vocal emotional cues and explain interindividual variability in vocal emotional processing.
Affiliation(s)
- Annett Schirmer
- Department of Psychology, Faculty of Arts and Social Sciences, National University of Singapore, Singapore.
213
Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: implications for neuropsychological models of aging. Neurosci Biobehav Rev 2008; 32:863-81. [PMID: 18276008] [DOI: 10.1016/j.neubiorev.2008.01.001]
Abstract
This meta-analysis of 28 data sets (N=705 older adults, N=962 younger adults) examined age differences in emotion recognition across four modalities: faces, voices, bodies/contexts, and matching of faces to voices. The results indicate that older adults have increased difficulty recognising at least some of the basic emotions (anger, sadness, fear, disgust, surprise, happiness) in each modality, with some emotions (anger and sadness) and some modalities (face-voice matching) creating particular difficulties. The predominant pattern across all emotions and modalities was one of age-related decline, with the exception of a trend for older adults to be better than young adults at recognising disgusted facial expressions. These age-related changes are examined in the context of three theoretical perspectives: positivity effects, general cognitive decline, and more specific neuropsychological change in the social brain. We argue that the pattern of age-related change observed is most consistent with a neuropsychological model of adult aging stemming from changes in frontal and temporal volume, and/or changes in neurotransmitters.
Affiliation(s)
- Ted Ruffman
- Department of Psychology, University of Otago, Box 56, Dunedin 9054, New Zealand.
214
Vocal emotion processing in Parkinson's disease: reduced sensitivity to negative emotions. Brain Res 2008; 1188:100-11. [DOI: 10.1016/j.brainres.2007.10.034]
215
Ross ED, Monnot M. Neurology of affective prosody and its functional-anatomic organization in right hemisphere. Brain Lang 2008; 104:51-74. [PMID: 17537499] [DOI: 10.1016/j.bandl.2007.04.007]
Abstract
Unlike the aphasic syndromes, the organization of affective prosody in brain has remained controversial because affective-prosodic deficits may occur after left or right brain damage. However, different patterns of deficits are observed following left and right brain damage that suggest affective prosody is a dominant and lateralized function of the right hemisphere. Using the Aprosodia Battery, which was developed to differentiate left and right hemisphere patterns of affective-prosodic deficits, functional-anatomic evidence is presented in patients with focal ischemic strokes to support the concepts that (1) affective prosody is a dominant and lateralized function of the right hemisphere, (2) the intrahemispheric organization of affective prosody in the right hemisphere, with the partial exception of Repetition, is analogous to the organization of propositional language in the left hemisphere and (3) the aprosodic syndromes are cortically based as part of evolutionary adaptations underlying human language and communication.
Affiliation(s)
- Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center and the VA Medical Center (11AZ), Oklahoma City, OK 73104, USA.
216
Processing of inconsistent emotional information: an fMRI study. Exp Brain Res 2007; 186:401-7. [PMID: 18094962] [PMCID: PMC2755755] [DOI: 10.1007/s00221-007-1242-3]
Abstract
Previous studies investigating the anterior cingulate cortex (ACC) have relied on a number of tasks involving cognitive control and attentional demands. In this fMRI study, we tested the model that the ACC functions as an attentional network in the processing of language. We employed a paradigm that requires the processing of concurrent linguistic information, predicting that the cognitive costs imposed by competing trials would engender the activation of the ACC. Subjects were confronted with sentences in which the semantic content conflicted with the prosodic intonation (CONF condition), randomly interspersed with sentences that conveyed coherent discourse components (NOCONF condition). We observed activation of the rostral ACC and the middle frontal gyrus when the NOCONF condition was subtracted from the CONF condition. Our findings provide evidence for the involvement of the rostral ACC in the processing of complex competing linguistic stimuli, supporting theories that claim its relevance as part of the cortical attentional circuit. The processing of emotional prosody involved a bilateral network encompassing the superior and medial temporal cortices. This evidence confirms previous research investigating the neuronal network that supports the processing of emotional information.
217
Golan O, Baron-Cohen S, Hill JJ, Rutherford MD. The 'Reading the Mind in the Voice' test-revised: a study of complex emotion recognition in adults with and without autism spectrum conditions. J Autism Dev Disord 2007; 37:1096-106. [PMID: 17072749] [DOI: 10.1007/s10803-006-0252-5]
Abstract
This study reports a revised version of the 'Reading the Mind in the Voice' (RMV) task. The original task (Rutherford et al., (2002), Journal of Autism and Developmental Disorders, 32, 189-194) suffered from ceiling effects and limited sensitivity. To address these limitations, the task was shortened and two more foils were added to each of the remaining items. About 50 adults with Asperger Syndrome (AS) or High Functioning Autism (HFA) and 22 matched controls completed the revised task. Results show that the revised task has good reliability and validity, is harder, and is more sensitive in distinguishing the AS/HFA group from controls. Verbal IQ was positively correlated with performance, and females performed worse than males in the AS/HFA group. Results are discussed with regard to multimodal empathizing deficits in autism spectrum conditions (ASC).
Affiliation(s)
- Ofer Golan
- Department of Psychiatry, Autism Research Centre, Cambridge University, Douglas House, 18b Trumpington Road, CB2 2AH, Cambridge, UK.
218
Meyer M, Baumann S, Wildgruber D, Alter K. How the brain laughs. Behav Brain Res 2007; 182:245-60. [PMID: 17568693] [DOI: 10.1016/j.bbr.2007.04.023]
Abstract
Laughter is an affective nonspeech vocalization that is not reserved to humans, but can also be observed in other mammalians, in particular monkeys and great apes. This observation makes laughter an interesting subject for brain research as it allows us to learn more about parallels and differences of human and animal communication by studying the neural underpinnings of expressive and perceptive laughter. In the first part of this review we will briefly sketch the acoustic structure of a bout of laughter and relate this to the differential anatomy of the larynx and the vocal tract in human and monkey. The subsequent part of the article introduces the present knowledge on behavioral and brain mechanisms of "laughter-like responses" and other affective vocalizations in monkeys and apes, before we describe the scant evidence on the cerebral organization of laughter provided by neuroimaging studies. Our review indicates that a densely intertwined network of auditory and (pre-) motor functions subserves perceptive and expressive aspects of human laughter. Even though there is a tendency in the present literature to suggest a rightward asymmetry of the cortical representation of laughter, there is no doubt that left cortical areas are also involved. In addition, subcortical areas, namely the amygdala, have also been identified as part of this network. Furthermore, we can conclude from our overview that research on the brain mechanisms of affective vocalizations in monkeys and great apes report the recruitment of similar cortical and subcortical areas similar to those attributed to laughter in humans. Therefore, we propose the existence of equivalent brain representations of emotional tone in human and great apes. 
This reasoning receives support from neuroethological models that describe laughter as a primal behavioral tool used by individuals - be they human or ape - to prompt other individuals of a peer group and to create a mirthful context for social interaction and communication.
Affiliation(s)
- Martin Meyer
- Institute of Neuroradiology, Department of Medical Radiology, University Hospital of Zurich, Frauenklinikstrasse 10, CH-8091 Zurich, Switzerland.
219
Redcay E. The superior temporal sulcus performs a common function for social and speech perception: implications for the emergence of autism. Neurosci Biobehav Rev 2007; 32:123-42. [PMID: 17706781] [DOI: 10.1016/j.neubiorev.2007.06.004]
Abstract
Within the cognitive neuroscience literature, discussion of the functional role of the superior temporal sulcus (STS) has traditionally been divided into two domains: one focuses on its activity during language processing, while the other emphasizes its role in biological motion and social attention, such as eye-gaze processing. I will argue that the STS performs a common process underlying both of these functional domains, namely analyzing changing sequences of input, whether in the auditory or visual domain, and interpreting the communicative significance of those inputs. From a developmental perspective, the fact that these two domains share an anatomical substrate suggests the acquisition of social and speech perception may be linked. In addition, I will argue that because of the STS's role in interpreting social and speech input, impairments in STS function may underlie many of the social and language abnormalities seen in autism.
Affiliation(s)
- Elizabeth Redcay
- Department of Psychology, University of California, San Diego, 8110 La Jolla Shores Dr., Suite 201, La Jolla, CA 92037, USA.
220
Kreifelts B, Ethofer T, Grodd W, Erb M, Wildgruber D. Audiovisual integration of emotional signals in voice and face: an event-related fMRI study. Neuroimage 2007; 37:1445-56. [PMID: 17659885] [DOI: 10.1016/j.neuroimage.2007.06.020]
Abstract
In a natural environment, non-verbal emotional communication is multimodal (i.e. speech melody, facial expression) and multifaceted concerning the variety of expressed emotions. Understanding these communicative signals and integrating them into a common percept is paramount to successful social behaviour. While many previous studies have focused on the neurobiology of emotional communication in the auditory or visual modality alone, far less is known about multimodal integration of auditory and visual non-verbal emotional information. The present study investigated this process using event-related fMRI. Behavioural data revealed that audiovisual presentation of non-verbal emotional information resulted in a significant increase in correctly classified stimuli when compared with visual and auditory stimulation. This behavioural gain was paralleled by enhanced activation in bilateral posterior superior temporal gyrus (pSTG) and right thalamus, when contrasting audiovisual to auditory and visual conditions. Further, a characteristic of these brain regions, substantiating their role in the emotional integration process, is a linear relationship between the gain in classification accuracy and the strength of the BOLD response during the bimodal condition. Additionally, enhanced effective connectivity between audiovisual integration areas and associative auditory and visual cortices was observed during audiovisual stimulation, offering further insight into the neural process accomplishing multimodal integration. Finally, we were able to document an enhanced sensitivity of the putative integration sites to stimuli with emotional non-verbal content as compared to neutral stimuli.
Affiliation(s)
- Benjamin Kreifelts
- Department of Psychiatry and Psychotherapy, University of Tuebingen, Osianderstrasse 24, 72076 Tuebingen, Germany.
221
Wang AT, Lee SS, Sigman M, Dapretto M. Reading affect in the face and voice: neural correlates of interpreting communicative intent in children and adolescents with autism spectrum disorders. Arch Gen Psychiatry 2007; 64:698-708. [PMID: 17548751] [PMCID: PMC3713233] [DOI: 10.1001/archpsyc.64.6.698]
Abstract
CONTEXT Understanding a speaker's communicative intent in everyday interactions is likely to draw on cues such as facial expression and tone of voice. Prior research has shown that individuals with autism spectrum disorders (ASD) show reduced activity in brain regions that respond selectively to the face and voice. However, there is also evidence that activity in key regions can be increased if task demands allow for explicit processing of emotion. OBJECTIVES To examine the neural circuitry underlying impairments in interpreting communicative intentions in ASD using irony comprehension as a test case, and to determine whether explicit instructions to attend to facial expression and tone of voice will elicit more normative patterns of brain activity. DESIGN, SETTING, AND PARTICIPANTS Eighteen boys with ASD (aged 7-17 years, full-scale IQ >70) and 18 typically developing (TD) boys underwent functional magnetic resonance imaging at the Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles. MAIN OUTCOME MEASURES Blood oxygenation level-dependent brain activity during the presentation of short scenarios involving irony. Behavioral performance (accuracy and response time) was also recorded. RESULTS Reduced activity in the medial prefrontal cortex and right superior temporal gyrus was observed in children with ASD relative to TD children during the perception of potentially ironic vs control scenarios. Importantly, a significant group x condition interaction in the medial prefrontal cortex showed that activity was modulated by explicit instructions to attend to facial expression and tone of voice only in the ASD group. Finally, medial prefrontal cortex activity was inversely related to symptom severity in children with ASD such that children with greater social impairment showed less activity in this region. 
CONCLUSIONS Explicit instructions to attend to facial expression and tone of voice can elicit increased activity in the medial prefrontal cortex, part of a network important for understanding the intentions of others, in children with ASD. These findings suggest a strategy for future intervention research.
Affiliation(s)
- A Ting Wang
- Department of Psychiatry, University of California, Los Angeles, USA.
|
222
|
Wilson SM, Molnar-Szakacs I, Iacoboni M. Beyond superior temporal cortex: intersubject correlations in narrative speech comprehension. Cereb Cortex 2007; 18:230-42. [PMID: 17504783 DOI: 10.1093/cercor/bhm049] [Citation(s) in RCA: 191] [Impact Index Per Article: 10.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/29/2023]
Abstract
The role of superior temporal cortex in speech comprehension is well established, but the complete network of regions involved in understanding language in ecologically valid contexts is less clearly understood. In a functional magnetic resonance imaging (fMRI) study, we presented 24 subjects with auditory or audiovisual narratives, and used model-free intersubject correlational analyses to reveal brain areas that were modulated in a consistent way across subjects during the narratives. Conventional comparisons to a resting state were also performed. Both analyses showed the expected recruitment of superior temporal areas; however, the intersubject correlational analyses also revealed an extended network of areas involved in narrative speech comprehension. Two findings stand out in particular. Firstly, many areas in the "default mode" network (typically deactivated relative to rest) were systematically modulated by the time-varying properties of the auditory or audiovisual input. These areas included the anterior cingulate and adjacent medial frontal cortex, and the posterior cingulate and adjacent precuneus. Secondly, extensive bilateral inferior frontal and premotor regions were implicated in auditory as well as audiovisual language comprehension. This extended network of regions may be important for higher-level linguistic processes, and interfaces with extralinguistic cognitive, affective, and interpersonal systems.
Affiliation(s)
- Stephen M Wilson
- Ahmanson-Lovelace Brain Mapping Center, Brain Research Institute, David Geffen School of Medicine, University of California, Los Angeles, CA 90095, USA.
|
223
|
Saito Y, Kondo T, Aoyama S, Fukumoto R, Konishi N, Nakamura K, Kobayashi M, Toshima T. The function of the frontal lobe in neonates for response to a prosodic voice. Early Hum Dev 2007; 83:225-30. [PMID: 16839715 DOI: 10.1016/j.earlhumdev.2006.05.017] [Citation(s) in RCA: 50] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/14/2006] [Revised: 05/18/2006] [Accepted: 05/25/2006] [Indexed: 11/17/2022]
Abstract
We examined how neonates responded at the brain level to an element of acoustic stimulation using near-infrared spectroscopy (NIRS). Twenty full-term, healthy neonates were included in the study. The neonates were tested in their cribs while they slept in a silent room. First, two probe holders were placed on the left and right sides of the forehead over the eyebrows using double-sided adhesive tape. Then the neonates were exposed to the auditory stimuli from an external auditory speaker. The stimuli, readings of the first scene of "Little Red Riding Hood," were made with a digital voice. The stimuli consisted of two conditions: variably pitched speech (variable speech: VS) and monotonous flat-pitched speech (monotonous speech: MS). The analyses focused on changes in O2Hb because O2Hb is the most sensitive indicator of changes in cerebral blood flow in NIRS measurement. The O2Hb level promptly increased at the beginning of the VS condition, and then returned to baseline again, while O2Hb did not show any changes during the MS condition. Differences between baseline-stimulation relative values were used to perform a 2 (condition) × 2 (recording site) × 2 (gender) analysis of variance. The results show that VS (M=0.45, S.D.=1.33) produced a greater increase of oxygenated blood to the frontal area of the brain than MS (M=-0.19, S.D.=1.28). Neonates' brain activation patterns suggest that they can discriminate differences in the prosodic patterns of utterances.
Affiliation(s)
- Yuri Saito
- Department of Psychology, Graduate School of Education, Hiroshima University, Kagamiyama 1-1-1, Higashi-Hiroshima, Hiroshima 739-8524, Japan.
|
224
|
Mitchell RLC. fMRI delineation of working memory for emotional prosody in the brain: commonalities with the lexico-semantic emotion network. Neuroimage 2007; 36:1015-25. [PMID: 17481919 DOI: 10.1016/j.neuroimage.2007.03.016] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/03/2006] [Revised: 03/15/2007] [Accepted: 03/19/2007] [Indexed: 10/23/2022] Open
Abstract
Decoding emotional prosody is crucial for successful social interactions, and continuous monitoring of emotional intent via prosody requires working memory. It has been proposed by Ross and others that emotional prosody cognitions in the right hemisphere are organized in an analogous fashion to propositional language functions in the left hemisphere. This study aimed to test the applicability of this model in the context of prefrontal cortex working memory functions. BOLD response data were therefore collected during performance of two emotional working memory tasks by participants undergoing fMRI. In the prosody task, participants identified the emotion conveyed in pre-recorded sentences, and working memory load was manipulated in the style of an N-back task. In the matched lexico-semantic task, participants identified the emotion conveyed by sentence content. Block-design neuroimaging data were analyzed parametrically with SPM5. At first, working memory for emotional prosody appeared to be right-lateralized in the PFC; however, further analyses revealed that it shared much bilateral prefrontal functional neuroanatomy with working memory for lexico-semantic emotion. Supplementary separate analyses of males and females suggested that these language functions were less bilateral in females, but their inclusion did not alter the direction of laterality. It is concluded that Ross et al.'s model is not applicable to prefrontal cortex working memory functions, that evidence that working memory cannot be subdivided in prefrontal cortex according to material type is increased, and that incidental working memory demands may explain the frontal lobe involvement in emotional prosody comprehension as revealed by neuroimaging studies.
Affiliation(s)
- Rachel L C Mitchell
- School of Psychology and Clinical Language Sciences, University of Reading, Reading, Berkshire, RG6 6AL, UK.
|
225
|
Kleber B, Birbaumer N, Veit R, Trevorrow T, Lotze M. Overt and imagined singing of an Italian aria. Neuroimage 2007; 36:889-900. [PMID: 17478107 DOI: 10.1016/j.neuroimage.2007.02.053] [Citation(s) in RCA: 125] [Impact Index Per Article: 6.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/29/2006] [Revised: 02/20/2007] [Accepted: 02/23/2007] [Indexed: 11/17/2022] Open
Abstract
Activation maps of 16 professional classical singers were evaluated during overt singing and imagined singing of an Italian aria utilizing a sparse sampling functional magnetic resonance imaging (fMRI) technique. Overt singing involved bilateral primary and secondary sensorimotor and auditory cortices but also areas associated with speech and language production. Activation magnitude within the gyri of Heschl (A1) was comparable in both hemispheres. Subcortical motor areas (cerebellum, thalamus, medulla and basal ganglia) were also active. Areas associated with emotional processing showed slight activation (anterior cingulate cortex, anterior insula). Cerebral activation sites during imagined singing were centered on fronto-parietal areas and involved primary and secondary sensorimotor areas in both hemispheres. Areas processing emotions showed intense activation (ACC and bilateral insula, hippocampus and anterior temporal poles, bilateral amygdala). Imagery showed no significant activation in A1. Overt minus imagined singing revealed increased activation in cortical (bilateral primary motor; M1) and subcortical (right cerebellar hemisphere, medulla) motor as well as in sensory areas (primary somatosensory cortex, bilateral A1). Imagined minus overt singing showed enhanced activity in the medial Brodmann's area 6, the ventrolateral and medial prefrontal cortex (PFC), the anterior cingulate cortex and the inferior parietal lobe. Additionally, Wernicke's area and Broca's area and their homologues were increasingly active during imagery. We conclude that imagined and overt singing involve partly different brain systems in professional singers, with more prefrontal and limbic activation and a larger network of higher order associative functions during imagery.
Affiliation(s)
- B Kleber
- Institute of Medical Psychology and Behavioral Neurobiology, University of Tübingen, Germany.
|
226
|
Abstract
Developmental psychology and psychopathology have in the past been more concerned with the quality of self-representation than with the development of the subjective agency which underpins our experience of feeling, thought and action, a key function of mentalisation. This review begins by contrasting a Cartesian view of pre-wired introspective subjectivity with a constructionist model based on the assumption of an innate contingency detector which orients the infant towards aspects of the social world that react congruently and in a specifically cued informative manner that expresses and facilitates the assimilation of cultural knowledge. Research on the neural mechanisms associated with mentalisation and social influences on its development is reviewed. It is suggested that the infant focuses on the attachment figure as a source of reliable information about the world. The construction of the sense of a subjective self is then an aspect of acquiring knowledge about the world through the caregiver's pedagogical communicative displays which in this context focuses on the child's thoughts and feelings. We argue that a number of possible mechanisms, including complementary activation of attachment and mentalisation, the disruptive effect of maltreatment on parent-child communication, and the biobehavioural overlap of cues for learning and cues for attachment, may have a role in ensuring that the quality of relationship with the caregiver influences the development of the child's experience of thoughts and feelings.
Affiliation(s)
- Peter Fonagy
- Sub-department of Clinical Health Psychology, University College London, UK.
|
227
|
Neuroanatomical correlates of personality in the elderly. Neuroimage 2007; 35:263-72. [PMID: 17229578 DOI: 10.1016/j.neuroimage.2006.11.039] [Citation(s) in RCA: 86] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2006] [Revised: 11/08/2006] [Accepted: 11/09/2006] [Indexed: 10/23/2022] Open
Abstract
Extraversion and neuroticism are two important and frequently studied dimensions of human personality. They describe individual differences in emotional responding that are quite stable across the adult lifespan. Neuroimaging research has begun to provide evidence that neuroticism and extraversion have specific neuroanatomical correlates within the cerebral cortex and amygdala of young adults. However, these brain areas undergo alterations in size with aging, which may influence the nature of these personality factor-brain structure associations in the elderly. One study in the elderly demonstrated associations between perisylvian cortex structure and measures of self transcendence [Kaasinen, V., Maguire, R.P., Kurki, T., Bruck, A., Rinne, J.O., 2005. Mapping brain structure and personality in late adulthood. NeuroImage 24, 315-322], but the neuroanatomical correlates of extraversion and neuroticism, or other measures of the Five Factor Model of personality have not been explored. The purpose of the present study was to investigate the structural correlates of neuroticism and extraversion in healthy elderly subjects (n=29) using neuroanatomic measures of the cerebral cortex and amygdala. We observed that the thickness of specific lateral prefrontal cortex (PFC) regions, but not amygdala volume, correlates with measures of extraversion and neuroticism. The results suggest differences in the regional neuroanatomic correlates of specific personality traits with aging. We speculate that this relates to the influences of age-related structural changes in the PFC.
|
228
|
Lotze M, Veit R, Anders S, Birbaumer N. Evidence for a different role of the ventral and dorsal medial prefrontal cortex for social reactive aggression: An interactive fMRI study. Neuroimage 2007; 34:470-8. [PMID: 17071110 DOI: 10.1016/j.neuroimage.2006.09.028] [Citation(s) in RCA: 143] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/26/2006] [Revised: 09/15/2006] [Accepted: 09/21/2006] [Indexed: 11/28/2022] Open
Abstract
Interactive paradigms inducing reactive aggression are absent in the brain mapping literature. We used a competitive reaction time task to investigate brain regions involved in social interaction and reactive aggression in sixteen healthy male subjects with fMRI. Subjects were provoked by increasingly aversive stimuli and were given the opportunity to respond aggressively against their opponent by administering a stimulus as retaliation. fMRI revealed an increase of medial prefrontal cortex (mPFC) activity during retaliation. The dorsal mPFC was active when subjects had to select the intensity of the retaliation stimulus, and its activity correlated with the selected stimulus strength. In contrast, ventral mPFC was active during observing the opponent suffering but also during retaliation independent of the stimulus strength. Ventral mPFC activation, stronger in low callous subjects, correlated positively with skin conductance response during observation of the suffering opponent. In conclusion, dorsal mPFC activation seems to represent cognitive operations related to more intense social interaction processes whereas the ventral mPFC might be involved in affective processes associated with compassion to the suffering opponent.
Affiliation(s)
- M Lotze
- Institute of Medical Psychology and Behavioral Neurobiology, University of Tübingen, Germany.
|
229
|
Mitchell RLC. How does the brain mediate interpretation of incongruent auditory emotions? The neural response to prosody in the presence of conflicting lexico-semantic cues. Eur J Neurosci 2006; 24:3611-8. [PMID: 17229109 DOI: 10.1111/j.1460-9568.2006.05231.x] [Citation(s) in RCA: 43] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
We frequently encounter conflicting emotion cues. This study examined how the neural response to emotional prosody differed in the presence of congruent and incongruent lexico-semantic cues. Two hypotheses were assessed: (i) decoding emotional prosody with conflicting lexico-semantic cues would activate brain regions associated with cognitive conflict (anterior cingulate and dorsolateral prefrontal cortex) or (ii) the increased attentional load of incongruent cues would modulate the activity of regions that decode emotional prosody (right lateral temporal cortex). While the participants indicated the emotion conveyed by prosody, functional magnetic resonance imaging data were acquired on a 3T scanner using blood oxygenation level-dependent contrast. Using SPM5, the response to congruent cues was contrasted with that to emotional prosody alone, as was the response to incongruent lexico-semantic cues (for the 'cognitive conflict' hypothesis). The right lateral temporal lobe region of interest analyses examined modulation of activity in this brain region between these two contrasts (for the 'prosody cortex' hypothesis). Dorsolateral prefrontal and anterior cingulate cortex activity was not observed, and neither was attentional modulation of activity in the right lateral temporal cortex. However, decoding emotional prosody with incongruent lexico-semantic cues was strongly associated with left inferior frontal gyrus activity. This specialist form of conflict is therefore not processed by the brain using the same neural resources as non-affective cognitive conflict and neither can it be handled by associated sensory cortex alone. The recruitment of inferior frontal cortex may indicate increased semantic processing demands but other contributory functions of this region should be explored.
Affiliation(s)
- Rachel L C Mitchell
- School of Psychology and Clinical Language Sciences, University of Reading, Whiteknights Road, Reading, Berkshire RG6 6AL, UK.
|
230
|
Harciarek M, Heilman KM, Jodzio K. Defective comprehension of emotional faces and prosody as a result of right hemisphere stroke: modality versus emotion-type specificity. J Int Neuropsychol Soc 2006; 12:774-81. [PMID: 17064441 DOI: 10.1017/s1355617706061121] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/27/2005] [Revised: 06/16/2006] [Accepted: 06/16/2006] [Indexed: 11/06/2022]
Abstract
Studies of patients with brain damage, as well as studies with normal subjects, have revealed that the right hemisphere is important for recognizing emotions expressed by faces and prosody. It is unclear, however, if the knowledge needed to perform recognition of emotional stimuli is organized by modality or by the type of emotion. Thus, the purpose of this study is to assess these alternative a priori hypotheses. The participants of this study were 30 stroke patients with right hemisphere damage (RHD) and 31 normal controls (NC). Subjects were assessed with the Polish adaptation of the Right Hemisphere Language Battery of Bryan and the Facial Affect Recognition Test based on the work of Ekman and Friesen. RHD participants were significantly impaired on both emotional tasks. Whereas on the visual-faces task the RHD subjects recognized happiness better than anger or sadness, the reverse dissociation was found in the auditory-prosody test. These results confirm prior studies demonstrating the role of the right hemisphere in understanding facial and prosodic emotional expressions. These results also suggest that the representations needed to recognize these emotional stimuli are organized by modality (prosodic-echoic and facial-eidetic) and that some modality-specific features are more impaired than others.
|
231
|
Ethofer T, Anders S, Erb M, Droll C, Royen L, Saur R, Reiterer S, Grodd W, Wildgruber D. Impact of voice on emotional judgment of faces: an event-related fMRI study. Hum Brain Mapp 2006; 27:707-14. [PMID: 16411179 PMCID: PMC6871326 DOI: 10.1002/hbm.20212] [Citation(s) in RCA: 114] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022] Open
Abstract
Emotional information can be conveyed by various means of communication, such as propositional content, speech intonation, facial expression, and gestures. Prior studies have demonstrated that inputs from one modality can alter perception in another modality. To evaluate the impact of emotional intonation on ratings of emotional faces, a behavioral study first was carried out. Second, functional magnetic resonance imaging (fMRI) was used to identify brain regions that mediate crossmodal effects of emotional prosody on judgments of facial expressions. In the behavioral study, subjects rated fearful and neutral facial expressions as being more fearful when accompanied by a fearful voice as compared to the same facial expressions without concomitant auditory stimulus, whereas no such influence on rating of faces was found for happy voices. In the fMRI experiment, this shift in rating of facial expressions in the presence of a fearfully spoken sentence was correlated with the hemodynamic response in the left amygdala extending into the periamygdaloid cortex, which suggests that crossmodal effects on cognitive judgments of emotional information are mediated via these neuronal structures. Furthermore, significantly stronger activations were found in the mid-portion of the right fusiform gyrus during judgment of facial expressions in the presence of fearful as compared to happy intonations, indicating that enhanced processing of faces within this region can be induced by the presence of threat-related information perceived via the auditory modality. Presumably, these increased extrastriate activations correspond to enhanced alertness, whereas responses within the left amygdala modulate cognitive evaluation of emotional facial expressions.
Affiliation(s)
- Thomas Ethofer
- Section of Experimental MR of the CNS, Department of Neuroradiology, University of Tuebingen, Tuebingen, Germany.
|
232
|
Rymarczyk K, Grabowska A. Sex differences in brain control of prosody. Neuropsychologia 2006; 45:921-30. [PMID: 17005213 DOI: 10.1016/j.neuropsychologia.2006.08.021] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2005] [Revised: 08/22/2006] [Accepted: 08/25/2006] [Indexed: 11/29/2022]
Abstract
Affective (emotional) prosody is a neuropsychological function that encompasses non-verbal aspects of language that are necessary for recognizing and conveying emotions in communication, whereas non-affective (linguistic) prosody indicates whether the sentence is a question, an order or a statement. Considerable evidence points to a dominant role for the right hemisphere in both aspects of prosodic function. However, it has yet to be established whether separate parts of the right hemisphere are involved in processing different kinds of emotional intonation. The aim of this study was to answer this question. In addition, the issue of sex differences in the ability to understand prosody was considered. Fifty-two patients with damage to frontal, temporo-parietal or subcortical (basal) parts of the right hemisphere and 26 controls were tested for their ability to assess prosody information in normal (well-formed) sentences and in pseudo-sentences. General impairment of prosody processing was seen in all patient groups but the effect of damage was more apparent for emotional rather than linguistic prosody. Interestingly, appreciation of emotional prosody appeared to depend on the type of emotional expression and the location of the brain lesion. The patients with frontal damage were mostly impaired in comprehension of happy intonations; those with temporo-parietal damage in assessment of sad intonations, while subcortical lesions mostly affected comprehension of angry intonations. Differential effects of lesion location on the performance of men and women were also observed. Frontal lesions were more detrimental to women, whereas subcortical lesions led to stronger impairment in men. This suggests sex differences in brain organization of prosodic functions.
Affiliation(s)
- Krystyna Rymarczyk
- Department of Neurophysiology, Nencki Institute of Experimental Biology, Pasteur 3, 02-093 Warsaw, Poland.
|
233
|
Lewis JW, Phinney RE, Brefczynski-Lewis JA, DeYoe EA. Lefties get it "right" when hearing tool sounds. J Cogn Neurosci 2006; 18:1314-30. [PMID: 16859417 DOI: 10.1162/jocn.2006.18.8.1314] [Citation(s) in RCA: 62] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Our ability to manipulate and understand the use of a wide range of tools is a feature that sets humans apart from other animals. In right-handers, we previously reported that hearing hand-manipulated tool sounds preferentially activates a left hemisphere network of motor-related brain regions hypothesized to be related to handedness. Using functional magnetic resonance imaging, we compared cortical activation in strongly right-handed versus left-handed listeners categorizing tool sounds relative to animal vocalizations. Here we show that tool sounds preferentially evoke activity predominantly in the hemisphere "opposite" the dominant hand, in specific high-level motor-related and multisensory cortical regions, as determined by a separate task involving pantomiming tool-use gestures. This organization presumably reflects the idea that we typically learn the "meaning" of tool sounds in the context of using them with our dominant hand, such that the networks underlying motor imagery or action schemas may be recruited to facilitate recognition.
Affiliation(s)
- James W Lewis
- Department of Physiology and Pharmacology, West Virginia University, WV 26506-9229, USA.
|
234
|
Golan O, Baron-Cohen S, Hill JJ, Golan Y. The “Reading the Mind in Films” Task: Complex emotion recognition in adults with and without autism spectrum conditions. Soc Neurosci 2006; 1:111-23. [PMID: 18633780 DOI: 10.1080/17470910600980986] [Citation(s) in RCA: 109] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 12/28/2022]
|
235
|
Hashimoto T, Usui N, Taira M, Nose I, Haji T, Kojima S. The neural mechanism associated with the processing of onomatopoeic sounds. Neuroimage 2006; 31:1762-70. [PMID: 16616863 DOI: 10.1016/j.neuroimage.2006.02.019] [Citation(s) in RCA: 33] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2005] [Revised: 02/09/2006] [Accepted: 02/14/2006] [Indexed: 11/17/2022] Open
Abstract
This event-related fMRI study was conducted to examine the blood-oxygen-level-dependent responses to the processing of auditory onomatopoeic sounds. We used a sound categorization task in which the participants heard four types of stimuli: onomatopoeic sounds, nouns (verbal), animal (nonverbal) sounds, and pure tone/noise (control). By discriminating between the categories of target sounds (birds/nonbirds), the nouns resulted in activations in the left anterior superior temporal gyrus (STG), whereas the animal sounds resulted in activations in the bilateral superior temporal sulcus (STS) and the left inferior frontal gyrus (IFG). In contrast, the onomatopoeias activated extensive brain regions, including the left anterior STG, the region from the bilateral STS to the middle temporal gyrus, and the bilateral IFG. The onomatopoeic sounds showed greater activation in the right middle STS than did the nouns and environmental sounds. These results indicate that onomatopoeic sounds are processed by extensive brain regions involved in the processing of both verbal and nonverbal sounds. Thus, we can posit that onomatopoeic sounds can serve as a bridge between nouns and animal sounds. This is the first evidence to demonstrate the way in which onomatopoeic sounds are processed in the human brain.
Affiliation(s)
- Teruo Hashimoto
- Department of Psychology, Faculty of Letters, Keio University, 2-15-45 Mita, Tokyo 108-8345, Japan
|
236
|
Ethofer T, Anders S, Erb M, Herbert C, Wiethoff S, Kissler J, Grodd W, Wildgruber D. Cerebral pathways in processing of affective prosody: A dynamic causal modeling study. Neuroimage 2006; 30:580-7. [PMID: 16275138 DOI: 10.1016/j.neuroimage.2005.09.059] [Citation(s) in RCA: 173] [Impact Index Per Article: 9.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2005] [Revised: 08/05/2005] [Accepted: 09/19/2005] [Indexed: 11/26/2022] Open
Abstract
This study was conducted to investigate the connectivity architecture of neural structures involved in processing of emotional speech melody (prosody). Twenty-four subjects underwent event-related functional magnetic resonance imaging (fMRI) while rating the emotional valence of either prosody or semantics of binaurally presented adjectives. Conventional analysis of fMRI data revealed activation within the right posterior middle temporal gyrus and bilateral inferior frontal cortex during evaluation of affective prosody and left temporal pole, orbitofrontal, and medial superior frontal cortex during judgment of affective semantics. Dynamic causal modeling (DCM) in combination with Bayes factors was used to compare competing neurophysiological models with different intrinsic connectivity structures and input regions within the network of brain regions underlying comprehension of affective prosody. Comparison at the group level revealed superiority of a model in which the right temporal cortex serves as input region as compared to models in which one of the frontal areas is assumed to receive external inputs. Moreover, models with parallel information conductance from the right temporal cortex were superior to models in which the two frontal lobes accomplish serial processing steps. In conclusion, connectivity analysis supports the view that evaluation of affective prosody requires prior analysis of acoustic features within the temporal cortex and that transfer of information from the temporal cortex to the frontal lobes occurs via parallel pathways.
Collapse
Affiliation(s)
- Thomas Ethofer
- Section of Experimental MR of the CNS, Department of Neuroradiology, Otfried-Mueller-Strasse 51, University of Tuebingen, 72076 Tuebingen, Germany.
Collapse
|
237
|
Ethofer T, Anders S, Wiethoff S, Erb M, Herbert C, Saur R, Grodd W, Wildgruber D. Effects of prosodic emotional intensity on activation of associative auditory cortex. Neuroreport 2006; 17:249-53. [PMID: 16462592 DOI: 10.1097/01.wnr.0000199466.32036.5d] [Citation(s) in RCA: 98] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
Functional magnetic resonance imaging was used to investigate hemodynamic responses to adjectives pronounced in happy and angry intonations of varying emotional intensity. In separate sessions, participants judged the emotional valence of either intonation or semantics. To disentangle effects of emotional prosodic intensity from confounding acoustic parameters, mean and variability of volume and fundamental frequency of each stimulus were included as nuisance variables in the statistical models. A linear dependency between hemodynamic responses and emotional intensity of happy and angry intonations was found in the bilateral superior temporal sulcus during both tasks, indicating that increases of hemodynamic responses in this region are elicited by both positive and negative prosody independent of low-level acoustic properties and task instructions.
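The nuisance-variable strategy this abstract describes — partialling low-level acoustic confounds out of the statistical model so that the intensity effect is not driven by loudness or pitch — can be sketched with ordinary least squares. All data below are simulated and all effect sizes are illustrative, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # hypothetical number of trials

# Regressor of interest: emotional intensity of the intonation (standardised)
intensity = rng.standard_normal(n)

# Nuisance regressors: acoustic parameters, deliberately correlated with
# intensity (more intense prosody tends to be louder and more variable)
volume_mean = 0.6 * intensity + 0.8 * rng.standard_normal(n)
f0_variability = 0.4 * intensity + 0.9 * rng.standard_normal(n)

# Simulated BOLD response driven by BOTH intensity and volume
y = 2.0 * intensity + 1.5 * volume_mean + rng.standard_normal(n)

# Full design matrix: intercept, regressor of interest, nuisance columns
X = np.column_stack([np.ones(n), intensity, volume_mean, f0_variability])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Naive model without the nuisance columns, for comparison
X_naive = np.column_stack([np.ones(n), intensity])
beta_naive, *_ = np.linalg.lstsq(X_naive, y, rcond=None)

# beta[1] estimates the intensity effect with the acoustic confounds
# partialled out; beta_naive[1] absorbs the volume effect and is inflated.
print(round(beta[1], 2), round(beta_naive[1], 2))
```

The comparison shows the point of the design: without the nuisance columns, the "intensity" coefficient silently absorbs whatever loudness contributes to the response.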
Collapse
Affiliation(s)
- Thomas Ethofer
- Section of Experimental MR of the CNS, Department of Neuroradiology, University of Tuebingen, Tuebingen, Germany.
Collapse
|
238
|
Wang AT, Lee SS, Sigman M, Dapretto M. Neural basis of irony comprehension in children with autism: the role of prosody and context. Brain 2006; 129:932-43. [PMID: 16481375 PMCID: PMC3713234 DOI: 10.1093/brain/awl032] [Citation(s) in RCA: 217] [Impact Index Per Article: 11.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
While individuals with autism spectrum disorders (ASD) are typically impaired in interpreting the communicative intent of others, little is known about the neural bases of higher-level pragmatic impairments. Here, we used functional MRI (fMRI) to examine the neural circuitry underlying deficits in understanding irony in high-functioning children with ASD. Participants listened to short scenarios and decided whether the speaker was sincere or ironic. Three types of scenarios were used in which we varied the information available to guide this decision. Scenarios included (i) both knowledge of the event outcome and strong prosodic cues (sincere or sarcastic intonation), (ii) prosodic cues only or (iii) knowledge of the event outcome only. Although children with ASD performed well above chance, they were less accurate than typically developing (TD) children at interpreting the communicative intent behind a potentially ironic remark, particularly with regard to taking advantage of available contextual information. In contrast to prior research showing hypoactivation of regions involved in understanding the mental states of others, children with ASD showed significantly greater activity than TD children in the right inferior frontal gyrus (IFG) as well as in bilateral temporal regions. Increased activity in the ASD group fell within the network recruited in the TD group and may reflect more effortful processing needed to interpret the intended meaning of an utterance. These results confirm that children with ASD have difficulty interpreting the communicative intent of others and suggest that these individuals can recruit regions activated as part of the normative neural circuitry when task demands require explicit attention to socially relevant cues.
Collapse
Affiliation(s)
- A. Ting Wang
- Department of Psychology, University of California, Los Angeles, CA
- Department of Psychiatry, Mount Sinai School of Medicine, New York, NY, USA
- Susan S. Lee
- Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles, CA
- Marian Sigman
- Department of Psychology, University of California, Los Angeles, CA
- Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, CA
- Mirella Dapretto
- Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, CA
- Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles, CA
Collapse
|
239
|
Lotze M, Heymans U, Birbaumer N, Veit R, Erb M, Flor H, Halsband U. Differential cerebral activation during observation of expressive gestures and motor acts. Neuropsychologia 2006; 44:1787-95. [PMID: 16730755 DOI: 10.1016/j.neuropsychologia.2006.03.016] [Citation(s) in RCA: 106] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2005] [Revised: 02/24/2006] [Accepted: 03/08/2006] [Indexed: 11/18/2022]
Abstract
We compared brain activation involved in the observation of isolated right hand movements (e.g. twisting a lid), body-referred movements (e.g. brushing teeth) and expressive gestures (e.g. threatening) in 20 healthy subjects by using functional magnetic resonance imaging (fMRI). Perception-related areas in the occipital and inferior temporal lobe, as well as the mirror neuron system in the lateral frontal (ventral premotor cortex and BA 44) and superior parietal lobe, were active during all three conditions. Observation of body-referred compared to common hand actions induced increased activity in the bilateral posterior superior temporal sulcus (STS), the left temporo-parietal lobe and left BA 45. Expressive gestures involved additional areas related to social perception (bilateral STS, temporal poles, medial prefrontal lobe), emotional processing (bilateral amygdala, bilateral ventrolateral prefrontal cortex (VLPFC)), speech and language processing (Broca's and Wernicke's areas) and the pre-supplementary motor area (pre-SMA). In comparison to body-referred actions, expressive gestures evoked additional activity only in the left VLPFC (BA 47). The valence ratings for expressive gestures correlated significantly with activation intensity in the VLPFC during expressive gesture observation. Valence ratings for negative expressive gestures correlated with right STS activity. Our data suggest that both the VLPFC and the STS code differential emotional valence during the observation of expressive gestures.
Collapse
Affiliation(s)
- M Lotze
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Germany.
Collapse
|
240
|
Kotz SA, Meyer M, Paulmann S. Lateralization of emotional prosody in the brain: an overview and synopsis on the impact of study design. PROGRESS IN BRAIN RESEARCH 2006; 156:285-94. [PMID: 17015086 DOI: 10.1016/s0079-6123(06)56015-7] [Citation(s) in RCA: 59] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Recently, research on the lateralization of linguistic and nonlinguistic (emotional) prosody has experienced a revival. However, neither neuroimaging nor patient evidence draws a coherent picture substantiating right-hemispheric lateralization of prosody, and of emotional prosody in particular. The current overview summarizes positions and data on the lateralization of emotion and emotional prosodic processing in the brain and proposes that: (1) the realization of emotional prosodic processing in the brain is based on differentially lateralized subprocesses and (2) methodological factors can influence the lateralization of emotional prosody in neuroimaging investigations. The latter evidence reveals that emotional valence effects are strongly right lateralized in studies using compact blocked presentation of emotional stimuli. In contrast, data obtained from event-related studies are indicative of bilateral or left-accented lateralization of emotional prosodic valence. These findings suggest a strong interaction between language and emotional prosodic processing.
Collapse
Affiliation(s)
- Sonja A Kotz
- Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a, 04103 Leipzig, Germany.
Collapse
|
241
|
Schirmer A, Kotz SA. Beyond the right hemisphere: brain mechanisms mediating vocal emotional processing. Trends Cogn Sci 2006; 10:24-30. [PMID: 16321562 DOI: 10.1016/j.tics.2005.11.009] [Citation(s) in RCA: 420] [Impact Index Per Article: 22.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2005] [Revised: 09/29/2005] [Accepted: 11/16/2005] [Indexed: 11/17/2022]
Abstract
Vocal perception is particularly important for understanding a speaker's emotional state and intentions because, unlike facial perception, it is relatively independent of speaker distance and viewing conditions. The idea, derived from brain lesion studies, that vocal emotional comprehension is a special domain of the right hemisphere has failed to receive consistent support from neuroimaging. This conflict can be reconciled if vocal emotional comprehension is viewed as a multi-step process with individual neural representations. This view reveals a processing chain that proceeds from the ventral auditory pathway to brain structures implicated in cognition and emotion. Thus, vocal emotional comprehension appears to be mediated by bilateral mechanisms anchored within sensory, cognitive and emotional processing systems.
Collapse
Affiliation(s)
- Annett Schirmer
- Department of Psychology, University of Georgia, Athens, Georgia, USA.
Collapse
|
242
|
Grandjean D, Bänziger T, Scherer KR. Intonation as an interface between language and affect. PROGRESS IN BRAIN RESEARCH 2006; 156:235-47. [PMID: 17015083 DOI: 10.1016/s0079-6123(06)56012-1] [Citation(s) in RCA: 72] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/05/2022]
Abstract
The vocal expression of human emotions is embedded within language, and the study of intonation has to take into account two interacting levels of information: emotional and semantic meaning. In addition to the discussion of this dual coding system, an extension of Brunswik's lens model is proposed. This model includes the influences of conventions, norms, and display rules (pull effects) and psychobiological mechanisms (push effects) on emotional vocalizations produced by the speaker (encoding) and the reciprocal influences of these two aspects on attributions made by the listener (decoding), allowing the dissociation and systematic study of the production and perception of intonation. Three empirical studies are described as examples of possibilities of dissociating these different phenomena at the behavioral and neurological levels in the study of intonation.
Collapse
Affiliation(s)
- Didier Grandjean
- Swiss Center for Affective Sciences, University of Geneva, 7 rue des Battoirs, 1205 Geneva, Switzerland.
Collapse
|
243
|
Fonseca RP, Ferreira GD, Liedtke FV, Müller JDL, Sarmento TF, Parente MADMP. Alterações cognitivas, comunicativas e emocionais após lesão hemisférica direita: em busca de uma caracterização da Síndrome do Hemisfério Direito. PSICOLOGIA USP 2006. [DOI: 10.1590/s0103-65642006000400013] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
[Translated from Portuguese] The set of signs and symptoms observed after neurological damage to the right hemisphere may be termed Right Hemisphere Syndrome (RHS). This theoretical essay aims to present a characterization of this neuropsychological condition. RHS is characterized by deficits in the cognitive functions of attention, perception, memory, praxis and executive functions, with the presence of anosognosia, sensory hemineglect, prosopagnosia, impairments of visuospatial and working memory, constructional dyspraxia and executive dysfunction. Regarding communicative abilities, RHS encompasses impairments in the discursive, pragmatic-inferential, lexical-semantic and prosodic components. Deficits in emotional processing include difficulties in comprehending and producing emotions conveyed by facial expressions or vocal emissions, as well as neuropsychiatric disturbances. There is, however, heterogeneity in its manifestation. Given this variability of signs and symptoms, further case and group studies with right-hemisphere-damaged individuals should be conducted for a better understanding of RHS.
Collapse
|
244
|
Wildgruber D, Ackermann H, Kreifelts B, Ethofer T. Cerebral processing of linguistic and emotional prosody: fMRI studies. PROGRESS IN BRAIN RESEARCH 2006; 156:249-68. [PMID: 17015084 DOI: 10.1016/s0079-6123(06)56013-3] [Citation(s) in RCA: 196] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
During acoustic communication in humans, information about a speaker's emotional state is predominantly conveyed by modulation of the tone of voice (emotional or affective prosody). Based on lesion data, a right hemisphere superiority for cerebral processing of emotional prosody has been assumed. However, the available clinical studies do not yet provide a coherent picture with respect to interhemispheric lateralization effects of prosody recognition and intrahemispheric localization of the respective brain regions. To further delineate the cerebral network engaged in the perception of emotional tone, a series of experiments was carried out based upon functional magnetic resonance imaging (fMRI). The findings obtained from these investigations allow for the separation of three successive processing stages during recognition of emotional prosody: (1) extraction of suprasegmental acoustic information predominantly subserved by right-sided primary and higher order acoustic regions; (2) representation of meaningful suprasegmental acoustic sequences within posterior aspects of the right superior temporal sulcus; (3) explicit evaluation of emotional prosody at the level of the bilateral inferior frontal cortex. Moreover, implicit processing of affective intonation seems to be bound to subcortical regions mediating automatic induction of specific emotional reactions such as activation of the amygdala in response to fearful stimuli. As concerns lower-level processing of the underlying suprasegmental acoustic cues, linguistic and emotional prosody seem to share the same right hemisphere neural resources. Explicit judgment of linguistic aspects of speech prosody, however, appears to be linked to left-sided language areas, whereas bilateral orbitofrontal cortex has been found involved in explicit evaluation of emotional prosody. These differences in hemispheric lateralization might explain why specific impairments in nonverbal emotional communication subsequent to focal brain lesions are relatively rare clinical observations compared with the more frequent aphasic disorders.
Collapse
Affiliation(s)
- D Wildgruber
- Department of Psychiatry, University of Tübingen, Osianderstr. 24, 72076 Tübingen, Germany.
Collapse
|
245
|
Sander D, Grandjean D, Pourtois G, Schwartz S, Seghier ML, Scherer KR, Vuilleumier P. Emotion and attention interactions in social cognition: Brain regions involved in processing anger prosody. Neuroimage 2005; 28:848-58. [PMID: 16055351 DOI: 10.1016/j.neuroimage.2005.06.023] [Citation(s) in RCA: 253] [Impact Index Per Article: 12.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2004] [Revised: 06/03/2005] [Accepted: 06/10/2005] [Indexed: 11/21/2022] Open
Abstract
Multiple levels of processing are thought to be involved in the appraisal of emotionally relevant events, with some processes being engaged relatively independently of attention, whereas other processes may depend on attention and current task goals or context. We conducted an event-related fMRI experiment to examine how processing angry voice prosody, an affectively and socially salient signal, is modulated by voluntary attention. To manipulate attention orthogonally to emotional prosody, we used a dichotic listening paradigm in which meaningless utterances, pronounced with either angry or neutral prosody, were presented simultaneously to both ears on each trial. In two successive blocks, participants selectively attended to either the left or right ear and performed a gender decision on the voice heard on the target side. Our results revealed a functional dissociation between different brain areas. Whereas the right amygdala and bilateral superior temporal sulcus responded to anger prosody irrespective of whether it was heard from a to-be-attended or to-be-ignored voice, the orbitofrontal cortex and the cuneus in medial occipital cortex showed greater activation to the same emotional stimuli when the angry voice was to-be-attended rather than to-be-ignored. Furthermore, regression analyses revealed a strong correlation between activation in orbitofrontal regions and scores on a behavioral inhibition scale measuring proneness to anxiety reactions. Our results underscore the importance of emotion and attention interactions in social cognition by demonstrating that multiple levels of processing are involved in the appraisal of emotionally relevant cues in voices, and by showing a modulation of some emotional responses by both current task demands and individual differences.
Collapse
Affiliation(s)
- David Sander
- Geneva Emotion Research Group, Department of Psychology, University of Geneva, Switzerland.
Collapse
|
246
|
Fecteau S, Armony JL, Joanette Y, Belin P. Sensitivity to Voice in Human Prefrontal Cortex. J Neurophysiol 2005; 94:2251-4. [PMID: 15928057 DOI: 10.1152/jn.00329.2005] [Citation(s) in RCA: 69] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
We report two functional MRI (fMRI) experiments showing sensitivity to human voice in a region of human left inferior prefrontal cortex, pars orbitalis. The voice-enhanced response was observed for speech as well as nonlinguistic vocalizations and was stronger for emotional than neutral vocalizations. This region could constitute a human prefrontal auditory domain similar to the one recently identified in the macaque brain.
Collapse
Affiliation(s)
- Shirley Fecteau
- Départment de Psychologie, Université de Montréal, C.P. 6128, Succ. Centre-ville, Montreal, Quebec H3C 3J7, Canada.
Collapse
|
247
|
Dias AE, Chien HF, Barbosa ER. O método Lee Silverman para reabilitação da fala na doença de Parkinson. ACTA ACUST UNITED AC 2001. [DOI: 10.34024/rnc.2011.v19.8356] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
[Translated from Portuguese] Speech alterations (dysphonia and dysarthria) frequently accompany the progression of Parkinson's disease (PD). Objective. This study reviews the Lee Silverman method, considered the most effective for rehabilitating speech alterations in PD, and updates advances in its application. Method. A search was carried out in the MEDLINE, PubMed and Bireme databases for indexed articles published from 1990 to 2010, using the following keywords: Parkinson's disease, PD, Lee Silverman Voice Treatment, LSVT, LSVT LOUD, LSVT parkinson, voice treatment and PD, voice therapy and PD, communication and PD, dysarthria and PD, dysphonia and PD, speech disorders and PD, voice disorders and PD, hypophonia and PD, speech motor system and PD. Results. The literature contains extensive descriptions of the results of studies of the Lee Silverman method in PD. The articles found show improvement in prosody, articulation, resonance, respiration, intelligibility, voice intensity and voice quality, as well as in swallowing and facial expressiveness. Conclusions. Efficient speech-language therapy techniques are available for speech rehabilitation. However, according to the scientific literature, the Lee Silverman method is an advantageous option, as it was developed specifically for PD. There is abundant evidence of its efficacy, and it continues to be evaluated, broadening its applicability.
Collapse
|