1. De Oliveira ACS, Alcântara YB, De Góes VB, Menezes PDL, Chagas EFB, Machado MS, Frizzo ACF. Study of aged central auditory function using the auditory middle latency response. Clinics (Sao Paulo) 2023;78:100245. PMID: 37478629; PMCID: PMC10387568; DOI: 10.1016/j.clinsp.2023.100245.
Abstract
OBJECTIVE To investigate the auditory function of the elderly using middle latency potentials. METHODOLOGY Group 1 (G1): 20 healthy individuals of both sexes, older than 60 years, without hearing loss. Group 2 (G2): 20 healthy individuals of both sexes, older than 60 years, with hearing loss at frequencies from 4 to 8 kHz. Potentials were recorded with unilateral and bilateral stimulation, and the binaural interaction component was calculated. RESULTS Na latency at C3A1 was longer with right-ear stimulation in G2, and Na-Pa amplitude was larger with right-ear stimulation recorded at C3A1 in G1. Pa latency was longer with right-ear stimulation recorded at C4A2. The Pb component in G2 showed longer latency with bilateral stimulation recorded at C4A2. The first and second negative and positive peaks showed larger amplitudes in G1. At C3A1, the first negative peak was more negative in G1, and the second positive peak showed larger amplitude at C4A2 in both groups. CONCLUSION Transmission of auditory information to the primary auditory cortex is impaired with aging, especially under unilateral stimulation, and further degraded in elderly people with peripheral hearing loss, as seen in binaural interaction at cortical and subcortical levels. The AMLR thus proved a sensitive examination for investigating neuroauditory disorders in the elderly, especially those related to high-frequency hearing loss and primary auditory cortex dysfunction caused by the aging process.
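The binaural interaction component mentioned in this abstract is conventionally computed as the binaurally evoked response minus the sum of the two monaurally evoked responses. A minimal sketch on synthetic averaged waveforms (all variable names and data here are illustrative, not from the study):

```python
import numpy as np

# Synthetic averaged AMLR waveforms (hypothetical data): one value per sample.
n_samples = 500
rng = np.random.default_rng(0)
left_ear = rng.normal(size=n_samples)    # monaural left stimulation
right_ear = rng.normal(size=n_samples)   # monaural right stimulation
# Binaural response approximates the monaural sum plus residual interaction/noise.
binaural = left_ear + right_ear + rng.normal(scale=0.1, size=n_samples)

# BIC = binaural response - (sum of the monaural responses), sample by sample
bic = binaural - (left_ear + right_ear)

print(bic.shape)  # one BIC value per time sample
```

A nonzero BIC waveform is then inspected for deflections attributable to binaural interaction rather than simple superposition.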
Affiliation(s)
- Yara Bagali Alcântara
- Postgraduate Program, Faculty of Philosophy and Sciences (FFC), Universidade Estadual Paulista (UNESP), Marília, SP, Brazil
- Viviane Borim De Góes
- Postgraduate Program, Faculty of Philosophy and Sciences (FFC), Universidade Estadual Paulista (UNESP), Marília, SP, Brazil
- Pedro de Lemos Menezes
- Postgraduate Program of the Northeast Network of Biotechnology (RENORBIO), Universidade Federal Rural de Pernambuco (UFRPE), Recife, PE, Brazil; Health Research Program, Centro Universitário CESMAC, Maceió, AL, Brazil; Speech Language Pathology Department, Universidade Estadual de Ciências da Saúde de Alagoas (UNCISAL), Maceió, AL, Brazil
- Eduardo Federighi Baisi Chagas
- Postgraduate Program in Structural and Functional Interactions in Rehabilitation, Universidade de Marília (UNIMAR), Marília, SP, Brazil; Postgraduate Program, Faculdade de Medicina de Marília (FAMEMA), Marília, SP, Brazil
- Milena Sonsini Machado
- Postgraduate Program, Faculty of Philosophy and Sciences (FFC), Universidade Estadual Paulista (UNESP), Marília, SP, Brazil
- Ana Claudia Figueiredo Frizzo
- Speech Language Pathology Department and Graduate Program in Speech Language Pathology, Faculdade de Filosofia e Ciências (FFC), Universidade Estadual Paulista (UNESP), Marília, SP, Brazil
2. Canonical finger-numeral configurations facilitate the processing of Arabic numerals in adults: An Event-Related Potential study. Neuropsychologia 2022;170:108214. DOI: 10.1016/j.neuropsychologia.2022.108214.
3. Electrophysiological evidence for internalized representations of canonical finger-number gestures and their facilitating effects on adults' math verification performance. Sci Rep 2021;11:11776. PMID: 34083708; PMCID: PMC8175394; DOI: 10.1038/s41598-021-91303-2.
Abstract
Fingers facilitate number learning and arithmetic processing in early childhood. The current study investigated whether images of early-learned, culturally typical (canonical) finger montring patterns presenting smaller (2, 3, 4) or larger (7, 8, 9) quantities still facilitate adults' performance and neural processing in a math verification task. Twenty-eight adults verified solutions to simple addition problems shown in the form of canonical or non-canonical finger-number montring patterns while event-related potentials (ERPs) were recorded. Results showed more accurate and faster sum verification when sum solutions were shown by canonical (versus non-canonical) finger patterns. Canonical finger montring patterns 2-4 led to faster responses independent of whether they presented correct or incorrect sum solutions and elicited an enhanced early right-parietal P2p response, whereas canonical configurations 7-9 only facilitated performance in correct sum solution trials without evoking P2p effects. The later central-parietal P3 was enhanced for all canonical finger patterns irrespective of numerical range. These combined results provide behavioral and brain evidence that canonical cardinal finger patterns still facilitate adults' number processing. They further suggest that finger montring configurations of the numbers 2-4 have stronger internalized associations with other magnitude representations, possibly established through their mediating role in the developmental phase in which children acquire the numerical meaning of the first four number symbols.
4. Fronda G, Balconi M. The effect of interbrain synchronization in gesture observation: A fNIRS study. Brain Behav 2020;10:e01663. PMID: 32469153; PMCID: PMC7375069; DOI: 10.1002/brb3.1663.
Abstract
INTRODUCTION Gestures characterize individuals' nonverbal communicative exchanges and take on different functions. Neuroscientific research has investigated the neural correlates underlying the observation and execution of different gesture categories. In particular, studies of gesture observation have emphasized the presence of mirroring mechanisms in specific brain areas involved in gesture observation and planning. MATERIALS AND METHODS The present study used functional near-infrared spectroscopy (fNIRS) to investigate the neural mechanisms underlying the observation of affective, social, and informative gestures with positive and negative valence in dyads composed of an encoder and a decoder. Variations in oxygenated (O2Hb) and deoxygenated (HHb) hemoglobin concentrations were collected from both individuals simultaneously in a hyperscanning paradigm, allowing the recording of brain responsiveness and interbrain connectivity. RESULTS Brain activation differed, and interbrain connectivity increased, according to the type of gesture observed: O2Hb responsiveness and interbrain connectivity increased significantly, and HHb responsiveness decreased, for affective gestures in the dorsolateral prefrontal cortex (DLPFC) and for social gestures in the superior frontal gyrus (SFG). Furthermore, concerning the valence of the observed gestures, O2Hb activity and interbrain connectivity in the left DLPFC increased for positive affective gestures compared with negative ones. CONCLUSION The present study showed distinct brain responses underlying the observation of different types of positive and negative gestures. Moreover, the interbrain connectivity analysis underlined the presence of mirroring mechanisms in gesture-specific frontal regions during gesture observation and action planning.
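Interbrain connectivity of the kind described here is often quantified, in its simplest form, as the correlation between the encoder's and decoder's hemoglobin time series at homologous channels. A toy illustration with synthetic O2Hb signals (the variables and the common-drive model are assumptions for the sketch, not the authors' pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)
n_t = 300                                  # time points in one observation block
shared = rng.normal(size=n_t)              # common drive (e.g., shared gesture event)
# Each participant's channel = shared drive + participant-specific noise.
encoder_o2hb = shared + 0.5 * rng.normal(size=n_t)  # encoder's DLPFC channel
decoder_o2hb = shared + 0.5 * rng.normal(size=n_t)  # decoder's DLPFC channel

# Pearson correlation as a simple inter-brain connectivity index
r = np.corrcoef(encoder_o2hb, decoder_o2hb)[0, 1]
print(r > 0)
```

Higher r for one gesture type than another would then be read as condition-specific interbrain coupling; published hyperscanning analyses typically add filtering and channel-wise statistics on top of this core idea.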
Affiliation(s)
- Giulia Fronda
- Department of Psychology, Catholic University of Milan, Milan, Italy; Research Unit in Affective and Social Neuroscience, Catholic University of Milan, Milan, Italy
- Michela Balconi
- Department of Psychology, Catholic University of Milan, Milan, Italy; Research Unit in Affective and Social Neuroscience, Catholic University of Milan, Milan, Italy
5. Balconi M, Fronda G, Bartolo A. Affective, Social, and Informative Gestures Reproduction in Human Interaction: Hyperscanning and Brain Connectivity. J Mot Behav 2020;53:296-315. PMID: 32525458; DOI: 10.1080/00222895.2020.1774490.
Abstract
Gestural communication characterizes individuals' daily interactions, serving to share information and to modify others' behavior. Social neuroscience has investigated the neural bases that support the recognition of different gestures. The present research used the hyperscanning approach, which allows the simultaneous recording of the activity of two or more individuals involved in a joint action, to investigate the neural bases of gestural communication. Specifically, we explored inter-brain connectivity between two interagents, the one who performed the gesture (encoder) and the one who received it (decoder), with functional near-infrared spectroscopy (fNIRS) during the reproduction of affective, social, and informative gestures with positive and negative valence. Results showed an increase in oxygenated hemoglobin concentration (O2Hb) and inter-brain connectivity in the dorsolateral prefrontal cortex (DLPFC) for affective gestures, in the superior frontal gyrus (SFG) for social gestures, and in the frontal eye fields (FEF) for informative gestures, for both encoder and decoder. Furthermore, positive gestures activated the left DLPFC more strongly, with an increase in inter-brain connectivity in the DLPFC and SFG. The present study revealed the relevant role of gesture type and valence in shaping intra- and inter-brain connectivity.
Affiliation(s)
- Michela Balconi
- Research Unit in Affective and Social Neuroscience, Catholic University of the Sacred Heart, Milan, Italy; Department of Psychology, Catholic University of the Sacred Heart, Milan, Italy
- Giulia Fronda
- Research Unit in Affective and Social Neuroscience, Catholic University of the Sacred Heart, Milan, Italy; Department of Psychology, Catholic University of the Sacred Heart, Milan, Italy
- Angela Bartolo
- Univ. Lille, CNRS, CHU Lille, UMR 9193 - SCALab - Sciences Cognitives et Sciences Affectives, F-59000 Lille, France; Institut Universitaire de France (IUF), France
6. Zhang X, Ran G, Xu W, Ma Y, Chen X. Adult Attachment Affects Neural Response to Preference-Inferring in Ambiguous Scenarios: Evidence From an fMRI Study. Front Psychol 2018;9:139. PMID: 29559932; PMCID: PMC5845741; DOI: 10.3389/fpsyg.2018.00139.
Abstract
Humans are highly social animals, and the ability to cater to the preferences of other individuals is encouraged by society. Preference-inferring is an important aspect of theory of mind (ToM). Many previous studies have shown that attachment style is closely related to ToM ability. However, little is known about the effects of adult attachment style on preference-inferring under different levels of certainty. Here, we investigated how adult attachment style affects the neural activity underlying preference-inferring under different levels of certainty using functional magnetic resonance imaging (fMRI). The fMRI results demonstrated that adult attachment influenced the activation of the anterior insula (AI) and inferior parietal lobule (IPL) in response to ambiguous preference-inferring. More specifically, in the ambiguous preference condition, the avoidant attached group exhibited significantly greater activation than the secure and anxious attached groups in the left IPL, and the anxious attached group exhibited significantly reduced activation relative to the secure attached group in the left IPL. In addition, the anxious attached group exhibited significantly reduced activation relative to the secure and avoidant attached groups in the left AI. These results were further confirmed by a subsequent PPI analysis. The findings suggest that, under ambiguous situations, avoidant attached individuals show lower sensitivity to the preferences of other individuals and need to invest more cognitive resources in preference-reasoning, whereas, compared with the avoidant attached group, anxious attached individuals express a higher tolerance for uncertainty and higher ToM proficiency. These results imply that differences in preference-inferring under ambiguous conditions, associated with different attachment styles, may explain differences in interpersonal interaction.
Affiliation(s)
- Xing Zhang
- Faculty of Psychology, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Southwest University, Chongqing, China
- Guangming Ran
- Institute of Education, China West Normal University, Nanchong, China
- Wenjian Xu
- Faculty of Psychology, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Southwest University, Chongqing, China
- Yuanxiao Ma
- Faculty of Psychology, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Southwest University, Chongqing, China
- Xu Chen
- Faculty of Psychology, Southwest University, Chongqing, China
- Key Laboratory of Cognition and Personality, Southwest University, Chongqing, China
- Correspondence: Xu Chen
7. Wolf D, Rekittke LM, Mittelberg I, Klasen M, Mathiak K. Perceived Conventionality in Co-speech Gestures Involves the Fronto-Temporal Language Network. Front Hum Neurosci 2017;11:573. PMID: 29249945; PMCID: PMC5714878; DOI: 10.3389/fnhum.2017.00573.
Abstract
Face-to-face communication is multimodal; it encompasses spoken words, facial expressions, gaze, and co-speech gestures. In contrast to linguistic symbols (e.g., spoken words or signs in sign language), which rely on mostly explicit conventions, gestures vary in their degree of conventionality. Bodily signs may have a generally accepted or conventionalized meaning (e.g., a head shake) or less so (e.g., self-grooming). We hypothesized that the subjective perception of conventionality in co-speech gestures relies on the classical language network, i.e., the left-hemispheric inferior frontal gyrus (IFG, Broca's area) and the posterior superior temporal gyrus (pSTG, Wernicke's area), and studied 36 subjects watching video-recorded story retellings during a behavioral and a functional magnetic resonance imaging (fMRI) experiment. It is well documented that neural correlates of such naturalistic videos emerge as intersubject covariance (ISC) in fMRI even without an explicit stimulus model (model-free analysis). The subjects attended either to perceived conventionality or to a control condition (any hand movements or gesture-speech relations). Such tasks modulate ISC in contributing neural structures, and we therefore studied ISC changes under task demands in language networks. Indeed, the conventionality task significantly increased covariance of the button-press time series and neuronal synchronization in the left IFG relative to the other tasks. In the left IFG, synchronous activity was observed during the conventionality task only. In contrast, the left pSTG exhibited correlated activation patterns during all conditions, with an increase in the conventionality task at the trend level only. Conceivably, the left IFG can be considered a core region for the processing of perceived conventionality in co-speech gestures, similar to spoken language. In general, the interpretation of conventionalized signs may rely on neural mechanisms that engage during language comprehension.
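Intersubject correlation of the kind used in such model-free analyses is commonly computed leave-one-out: each subject's regional time course is correlated with the average time course of all remaining subjects. A toy sketch on synthetic data (the function and variable names are illustrative assumptions, not the authors' pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
n_subj, n_t = 36, 200
stimulus_drive = rng.normal(size=n_t)            # shared naturalistic stimulus signal
# Each subject's regional time course = shared drive + idiosyncratic noise.
data = stimulus_drive + rng.normal(size=(n_subj, n_t))

def leave_one_out_isc(x):
    """Correlate each row (subject) with the mean of the remaining rows."""
    iscs = []
    for i in range(x.shape[0]):
        others_mean = np.delete(x, i, axis=0).mean(axis=0)
        iscs.append(np.corrcoef(x[i], others_mean)[0, 1])
    return np.array(iscs)

isc = leave_one_out_isc(data)
print(isc.mean() > 0)   # positive under a shared stimulus drive
```

Task-related ISC modulation, as in the study above, is then assessed by comparing such per-subject ISC values between attention conditions within a region.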
Affiliation(s)
- Dhana Wolf
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Aachen, Germany; Natural Media Lab, Human Technology Centre, RWTH Aachen, Aachen, Germany; Center for Sign Language and Gesture (SignGes), RWTH Aachen, Aachen, Germany
- Linn-Marlen Rekittke
- Natural Media Lab, Human Technology Centre, RWTH Aachen, Aachen, Germany; Center for Sign Language and Gesture (SignGes), RWTH Aachen, Aachen, Germany
- Irene Mittelberg
- Natural Media Lab, Human Technology Centre, RWTH Aachen, Aachen, Germany; Center for Sign Language and Gesture (SignGes), RWTH Aachen, Aachen, Germany
- Martin Klasen
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen, Aachen, Germany
- Klaus Mathiak
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Aachen, Germany; Center for Sign Language and Gesture (SignGes), RWTH Aachen, Aachen, Germany; JARA-Translational Brain Medicine, RWTH Aachen, Aachen, Germany
8. Gunter TC, Weinbrenner JED. When to Take a Gesture Seriously: On How We Use and Prioritize Communicative Cues. J Cogn Neurosci 2017;29:1355-1367. PMID: 28358659; DOI: 10.1162/jocn_a_01125.
Abstract
When people talk, their speech is often accompanied by gestures. Although it is known that co-speech gestures can influence face-to-face communication, it is currently unclear to what extent they are actively used and under which premises they are prioritized to facilitate communication. We investigated these open questions in two experiments that varied how pointing gestures disambiguate the utterances of an interlocutor. Participants, whose event-related brain responses were measured, watched a video, where an actress was interviewed about, for instance, classical literature (e.g., Goethe and Shakespeare). While responding, the actress pointed systematically to the left side to refer to, for example, Goethe, or to the right to refer to Shakespeare. Her final statement was ambiguous and combined with a pointing gesture. The P600 pattern found in Experiment 1 revealed that, when pointing was unreliable, gestures were only monitored for their cue validity and not used for reference tracking related to the ambiguity. However, when pointing was a valid cue (Experiment 2), it was used for reference tracking, as indicated by a reduced N400 for pointing. In summary, these findings suggest that a general prioritization mechanism is in use that constantly monitors and evaluates the use of communicative cues against communicative priors on the basis of accumulated error information.
Affiliation(s)
- Thomas C Gunter
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
9. Schlaffke L, Rüther NN, Heba S, Haag LM, Schultz T, Rosengarth K, Tegenthoff M, Bellebaum C, Schmidt-Wilcke T. From perceptual to lexico-semantic analysis: cortical plasticity enabling new levels of processing. Hum Brain Mapp 2015;36:4512-28. PMID: 26304153; PMCID: PMC5049624; DOI: 10.1002/hbm.22939.
Abstract
Certain kinds of stimuli can be processed on multiple levels. While the neural correlates of different levels of processing (LOPs) have been investigated to some extent, most studies involve skills and/or knowledge already present when performing the task. In this study we specifically sought to identify the neural correlates of an evolving skill that allows the transition from a perceptual to a lexico-semantic stimulus analysis. Eighteen participants were trained to decode 12 letters of Morse code, presented acoustically inside and outside of the scanner environment. Morse code was presented in trains of three letters while brain activity was assessed with fMRI. Participants either attended to the stimulus length (perceptual analysis) or evaluated its meaning, distinguishing words from nonwords (lexico-semantic analysis). Perceptual and lexico-semantic analyses shared a mutual network comprising the left premotor cortex, the supplementary motor area (SMA), and the inferior parietal lobule (IPL). Perceptual analysis was associated with strong brain activation in the SMA and the superior temporal gyrus (STG) bilaterally, which remained unaltered from pre- to post-training. In the lexico-semantic analysis after learning, participants showed additional activation in the left inferior frontal cortex (IFC) and the left occipitotemporal cortex (OTC), regions known to be critically involved in lexical processing. Our data provide evidence for cortical plasticity evolving with a learning process that enables the transition from perceptual to lexico-semantic stimulus analysis. Importantly, the activation pattern remains tied to the task-related LOP and is thus the result of a decision process as to which LOP to engage in.
Affiliation(s)
- Lara Schlaffke
- Department of Neurology, BG-University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
- Naima N Rüther
- Department of Neuropsychology, Ruhr-University Bochum, Bochum, Germany
- Stefanie Heba
- Department of Neurology, BG-University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
- Lauren M Haag
- Department of Neurology, BG-University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
- Thomas Schultz
- Department of Computer Science, University of Bonn, Germany
- Martin Tegenthoff
- Department of Neurology, BG-University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
- Christian Bellebaum
- Department of Neuropsychology, Ruhr-University Bochum, Bochum, Germany; Department of Psychology, Heinrich-Heine University Düsseldorf, Germany
- Tobias Schmidt-Wilcke
- Department of Neurology, BG-University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
10. Modulation of Gestural-verbal Semantic Integration by tDCS. Brain Stimul 2015;8:493-8. DOI: 10.1016/j.brs.2014.12.001.
11. Reis ACMB, Frizzo ACF, Isaac MDL, Garcia CFD, Funayama CAR, Iório MCM. P300 in individuals with sensorineural hearing loss. Braz J Otorhinolaryngol 2015;81:126-32. PMID: 25458253; PMCID: PMC9448995; DOI: 10.1016/j.bjorl.2014.10.001.
Abstract
Introduction Behavioral and electrophysiological auditory evaluations contribute to the understanding of the auditory system and of the process of intervention. Objective To study the P300 in subjects with severe or profound sensorineural hearing loss. Methods This was a descriptive, cross-sectional, prospective study. It included 29 individuals of both sexes with severe or profound sensorineural hearing loss and no other disorders, aged 11 to 42 years; all were assessed by behavioral audiological evaluation and auditory evoked potentials. Results A recording of the P3 wave was obtained in 17 individuals, with a mean latency of 326.97 ms and a mean amplitude of 3.76 µV. There were significant differences in latency in relation to age and in amplitude according to the degree of hearing loss. There was a statistically significant association of the P300 results with the degree of hearing loss (p = 0.04), with the predominant auditory communication channels (p < 0.0001), and with the duration of hearing loss. Conclusions The P300 can be recorded in individuals with severe and profound congenital sensorineural hearing loss; it may contribute to the understanding of cortical development and is a good predictor of the early intervention outcome.
12. Fabbri-Destro M, Avanzini P, De Stefani E, Innocenti A, Campi C, Gentilucci M. Interaction Between Words and Symbolic Gestures as Revealed By N400. Brain Topogr 2014;28:591-605. DOI: 10.1007/s10548-014-0392-4.
13. Malaia E, Talavage TM, Wilbur RB. Functional connectivity in task-negative network of the Deaf: effects of sign language experience. PeerJ 2014;2:e446. PMID: 25024915; PMCID: PMC4081178; DOI: 10.7717/peerj.446.
Abstract
Prior studies investigating cortical processing in Deaf signers suggest that life-long experience with sign language and/or auditory deprivation may alter the brain's anatomical structure and the function of brain regions typically recruited for auditory processing (Emmorey et al., 2010; Pénicaud et al., 2013, inter alia). We report the first investigation of the task-negative network in Deaf signers and its functional connectivity, that is, the temporal correlations among spatially remote neurophysiological events. We show that Deaf signers manifest increased functional connectivity between the posterior cingulate/precuneus and the left medial temporal gyrus (MTG), and also between the inferior parietal lobe and the medial temporal gyrus in the right hemisphere, areas that show functional recruitment specifically during sign language processing. These findings suggest that the organization of the brain at the level of inter-network connectivity is likely affected by experience with processing visual language, although sensory deprivation could be another source of the difference. We hypothesize that connectivity alterations in the task-negative network reflect predictive/automatized processing of the visual signal.
Affiliation(s)
- Evie Malaia
- Center for Mind, Brain, and Education, University of Texas at Arlington, TX, USA
- Thomas M Talavage
- Weldon School of Biomedical Engineering, Purdue University, IN, USA; School of Electrical and Computer Engineering, Purdue University, IN, USA
- Ronnie B Wilbur
- Speech, Language, and Hearing Sciences, and Linguistics Program, Purdue University, IN, USA
14. Spatiotemporal dynamics of early cortical gesture processing. Neuroimage 2014;99:42-9. PMID: 24875144; DOI: 10.1016/j.neuroimage.2014.05.061.
Abstract
Gesture processing has been consistently shown to be associated with activation of the inferior parietal lobe (IPL); however, little is known about the integration of IPL activation into the temporal dynamics of early sensory areas. Using a temporally graded repetition suppression paradigm, we examined the activation and time course of brain areas involved in hand gesture processing. We recorded event-related potentials in response to stimulus pairs of static hand images forming gestures of the popular rock-paper-scissors game and estimated their neuronal generators. We identified two main components associated with adaptive patterns related to stimulus repetition. The N190 component elicited at temporo-parietal sites adapted to repetitions of the same gesture and was associated with right-hemispheric extrastriate body area activation. A later component at parieto-occipital sites demonstrated temporally graded adaptation effects for all gestures with a left-hemispheric dominance. Source localization revealed concurrent activations of the right extrastriate body area, fusiform gyri bilaterally, and the left IPL at about 250 ms. The adaptation pattern derived from the graded repetition suppression paradigm demonstrates the functional sensitivity of these sources to gesture processing. Given the literature on IPL contribution to imitation, action recognition, and action execution, IPL activation at about 250 ms may represent the access into specific cognitive routes for gesture processing and may thus be involved in integrating sensory information from cortical body areas into subsequent visuo-motor transformation processes.
15. Nakamura A, Maess B, Knösche TR, Friederici AD. Different hemispheric roles in recognition of happy expressions. PLoS One 2014;9:e88628. PMID: 24520407; PMCID: PMC3919788; DOI: 10.1371/journal.pone.0088628.
Abstract
The emotional expression of the face provides an important social signal that allows humans to make inferences about other people's state of mind. However, the underlying brain mechanisms are complex and still not completely understood. Using magnetoencephalography (MEG), we analyzed the spatiotemporal structure of regional electrical brain activity in human adults during a categorization task (faces or hands) and an emotion discrimination task (happy faces or neutral faces). Brain regions that are specifically important for different aspects of processing emotional facial expressions showed interesting hemispheric dominance patterns. The dorsal brain regions showed a right predominance when participants paid attention to facial expressions: The right parietofrontal regions, including the somatosensory, motor/premotor, and inferior frontal cortices showed significantly increased activation in the emotion discrimination task, compared to in the categorization task, in latencies of 350 to 550 ms, while no activation was found in their left hemispheric counterparts. Furthermore, a left predominance of the ventral brain regions was shown for happy faces, compared to neutral faces, in latencies of 350 to 550 ms within the emotion discrimination task. Thus, the present data suggest that the right and left hemispheres play different roles in the recognition of facial expressions depending on cognitive context.
Affiliation(s)
- Akinori Nakamura
- Department of Clinical and Experimental Neuroimaging, National Center for Geriatrics and Gerontology, Obu, Japan
- Method and Developmental Group “MEG and EEG: Signal Analysis and Modelling”, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Burkhard Maess
- Method and Developmental Group “MEG and EEG: Signal Analysis and Modelling”, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Thomas R. Knösche
- Method and Developmental Group “Cortical Networks and Cognitive Functions”, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Angela D. Friederici
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
16
Straube B, He Y, Steines M, Gebhardt H, Kircher T, Sammer G, Nagels A. Supramodal neural processing of abstract information conveyed by speech and gesture. Front Behav Neurosci 2013; 7:120. [PMID: 24062652 PMCID: PMC3772311 DOI: 10.3389/fnbeh.2013.00120] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2013] [Accepted: 08/24/2013] [Indexed: 11/13/2022] Open
Abstract
Abstractness and modality of interpersonal communication have a considerable impact on comprehension. They are relevant for determining thoughts and constituting internal models of the environment. Whereas concrete object-related information can be represented in mind irrespective of language, abstract concepts require a representation in speech. Consequently, modality-independent processing of abstract information can be expected. Here we investigated the neural correlates of abstractness (abstract vs. concrete) and modality (speech vs. gestures), to identify an abstractness-specific supramodal neural network. During fMRI data acquisition 20 participants were presented with videos of an actor either speaking sentences with an abstract-social [AS] or concrete-object-related content [CS], or performing meaningful abstract-social emblematic [AG] or concrete-object-related tool-use gestures [CG]. Gestures were accompanied by a foreign language to increase the comparability between conditions and to frame the communication context of the gesture videos. Participants performed a content judgment task referring to the person vs. object-relatedness of the utterances. The behavioral data suggest a comparable comprehension of contents communicated by speech or gesture. Furthermore, we found common neural processing for abstract information independent of modality (AS > CS ∩ AG > CG) in a left hemispheric network including the left inferior frontal gyrus (IFG), temporal pole, and medial frontal cortex. Modality specific activations were found in bilateral occipital, parietal, and temporal as well as right inferior frontal brain regions for gesture (G > S) and in left anterior temporal regions and the left angular gyrus for the processing of speech semantics (S > G). These data support the idea that abstract concepts are represented in a supramodal manner. Consequently, gestures referring to abstract concepts are processed in a predominantly left hemispheric language related neural network.
Affiliation(s)
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg Marburg, Germany
17
Lexical and gestural symbols in left-damaged patients. Cortex 2013; 49:1668-78. [DOI: 10.1016/j.cortex.2012.09.003] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2012] [Revised: 08/07/2012] [Accepted: 09/05/2012] [Indexed: 11/20/2022]
18
Straube B, Green A, Weis S, Kircher T. A supramodal neural network for speech and gesture semantics: an fMRI study. PLoS One 2012; 7:e51207. [PMID: 23226488 PMCID: PMC3511386 DOI: 10.1371/journal.pone.0051207] [Citation(s) in RCA: 51] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/19/2012] [Accepted: 10/30/2012] [Indexed: 12/03/2022] Open
Abstract
In a natural setting, speech is often accompanied by gestures. Like language, speech-accompanying iconic gestures convey semantic information to some extent. However, whether comprehension of the information contained in the auditory and visual modalities depends on the same or on different brain networks remains largely unknown. In this fMRI study, we aimed to identify the cortical areas engaged in supramodal processing of semantic information. BOLD changes were recorded in 18 healthy right-handed male subjects watching video clips showing an actor who either performed speech (S, acoustic) or gestures (G, visual) in more (+) or less (−) meaningful varieties. In the experimental conditions, familiar speech or isolated iconic gestures were presented; during the visual control condition the volunteers watched meaningless gestures (G−), while during the acoustic control condition a foreign language was presented (S−). The conjunction of visual and acoustic semantic processing revealed activations extending from the left inferior frontal gyrus to the precentral gyrus, and including bilateral posterior temporal regions. We conclude that proclaiming this frontotemporal network the brain's core language system takes too narrow a view. Our results rather indicate that these regions constitute a supramodal semantic processing network.
Affiliation(s)
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Marburg, Germany.
19
Abstract
When people talk to each other, they often make arm and hand movements that accompany what they say. These manual movements, called “co-speech gestures,” can convey meaning by way of their interaction with the oral message. Another class of manual gestures, called “emblematic gestures” or “emblems,” also conveys meaning, but in contrast to co-speech gestures, they can do so directly and independently of speech. There is currently significant interest in the behavioral and biological relationships between action and language. Since co-speech gestures are actions that rely on spoken language, and emblems convey meaning to the extent that they can sometimes substitute for speech, these actions may be important, and potentially informative, examples of language–motor interactions. Researchers have recently been examining how the brain processes these actions. The current results of this work do not yet give a clear understanding of gesture processing at the neural level. For the most part, however, it seems that two complementary sets of brain areas respond when people see gestures, reflecting their role in disambiguating meaning. These include areas thought to be important for understanding actions and areas ordinarily related to processing language. The shared and distinct responses across these two sets of areas during communication are just beginning to emerge. In this review, we talk about the ways that the brain responds when people see gestures, how these responses relate to brain activity when people process language, and how these might relate in normal, everyday communication.
Affiliation(s)
- Michael Andric
- Department of Psychology, The University of Chicago Chicago, IL, USA
20
Whitmarsh S, Nieuwenhuis ILC, Barendregt HP, Jensen O. Sensorimotor Alpha Activity is Modulated in Response to the Observation of Pain in Others. Front Hum Neurosci 2011; 5:91. [PMID: 22007165 PMCID: PMC3188815 DOI: 10.3389/fnhum.2011.00091] [Citation(s) in RCA: 48] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/29/2011] [Accepted: 08/11/2011] [Indexed: 11/13/2022] Open
Abstract
The perception-action account of empathy states that observation of another person's state automatically activates a similar state in the observer. It is still unclear in what way ongoing sensorimotor alpha oscillations are involved in this process. Although they have been repeatedly implicated in (biological) action observation and in understanding communicative gestures, less is known about their role in vicarious pain observation. Alpha oscillations are understood to provide graded functional inhibition, thereby streamlining information flow through the cortex. Although alpha oscillations have been shown to have at least visual and sensorimotor origins, only the latter are expected to be involved in the empathetic response. Here, we used magnetoencephalography, allowing us to spatially distinguish and localize oscillatory components using beamformer source reconstruction. Subjects observed realistic pictures of limbs in painful and no-pain (control) conditions. As predicted, time-frequency analysis showed increased alpha suppression in the pain condition compared to the no-pain condition. Although both the pain and no-pain conditions suppressed alpha- and beta-band activity at both posterior and central sensors, the pain condition suppressed alpha more strongly only at central sensors. Source reconstruction localized these differences along the central sulcus. Our results could not be accounted for by differences in the evoked fields, suggesting a unique role of oscillatory activity in empathetic responses. We argue that alpha oscillations provide a unique measure of the underlying functional architecture of the brain, suggesting an automatic disinhibition of the sensorimotor cortices in response to the observation of pain in others.
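The alpha-suppression contrast described above can be illustrated numerically. The sketch below (plain NumPy on simulated data; it is not the authors' MEG pipeline, and the sampling rate, amplitudes, and noise level are invented) compares 8–13 Hz band power between a condition with a strong 10-Hz rhythm and one with an attenuated rhythm:

```python
import numpy as np

def band_power(signal, fs, fmin, fmax):
    """Mean periodogram power of `signal` within [fmin, fmax] Hz."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    band = (freqs >= fmin) & (freqs <= fmax)
    return psd[band].mean()

fs = 250                          # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)   # 2 s of data
rng = np.random.default_rng(0)

# Simulated sensor traces: a 10-Hz alpha rhythm plus noise; the "pain"
# condition carries a weaker alpha component (i.e., alpha suppression).
no_pain = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
pain = 0.8 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)

alpha_no_pain = band_power(no_pain, fs, 8, 13)
alpha_pain = band_power(pain, fs, 8, 13)
assert alpha_pain < alpha_no_pain   # stronger alpha suppression with pain stimuli
```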
Affiliation(s)
- Stephen Whitmarsh
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen Nijmegen, Netherlands
21
Enrici I, Adenzato M, Cappa S, Bara BG, Tettamanti M. Intention Processing in Communication: A Common Brain Network for Language and Gestures. J Cogn Neurosci 2011; 23:2415-31. [DOI: 10.1162/jocn.2010.21594] [Citation(s) in RCA: 73] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Human communicative competence is based on the ability to process a specific class of mental states, namely, communicative intention. The present fMRI study aims to analyze whether intention processing in communication is affected by the expressive means through which a communicative intention is conveyed, that is, the linguistic or extralinguistic gestural means. Combined factorial and conjunction analyses were used to test two sets of predictions: first, that a common brain network is recruited for the comprehension of communicative intentions independently of the modality through which they are conveyed; second, that additional brain areas are specifically recruited depending on the communicative modality used, reflecting distinct sensorimotor gateways. Our results clearly showed that a common neural network is engaged in communicative intention processing independently of the modality used. This network includes the precuneus, the left and right posterior STS and TPJ, and the medial pFC. Additional brain areas outside those involved in intention processing are specifically engaged by the particular communicative modality, that is, a peri-sylvian language network for the linguistic modality and a sensorimotor network for the extralinguistic modality. Thus, common representation of communicative intention may be accessed by modality-specific gateways, which are distinct for linguistic versus extralinguistic expressive means. Taken together, our results indicate that the information acquired by different communicative modalities is equivalent from a mental processing standpoint, in particular, at the point at which the actor's communicative intention has to be reconstructed.
Affiliation(s)
- Mauro Adenzato
- 1University of Torino, Italy
- 2Neuroscience Institute of Turin, Italy
- Stefano Cappa
- 3Vita-Salute San Raffaele University, Milan, Italy
- 4Scientific Institute HSR, Milan, Italy
- Bruno G. Bara
- 1University of Torino, Italy
- 2Neuroscience Institute of Turin, Italy
22
The effect of musical experience on hemispheric lateralization in musical feature processing. Neurosci Lett 2011; 496:141-5. [DOI: 10.1016/j.neulet.2011.04.002] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2010] [Revised: 04/01/2011] [Accepted: 04/05/2011] [Indexed: 11/20/2022]
23
Streltsova A, Berchio C, Gallese V, Umilta' MA. Time course and specificity of sensory-motor alpha modulation during the observation of hand motor acts and gestures: a high density EEG study. Exp Brain Res 2010; 205:363-73. [PMID: 20680250 PMCID: PMC2923333 DOI: 10.1007/s00221-010-2371-7] [Citation(s) in RCA: 51] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2010] [Accepted: 07/14/2010] [Indexed: 11/12/2022]
Abstract
The main aim of the present study was to explore, by means of high-density EEG, the intensity and the temporal pattern of event-related sensory-motor alpha desynchronization (ERD) during the observation of different types of hand motor acts and gestures. In particular, we aimed to investigate whether the sensory-motor ERD would show a specific modulation during the observation of hand behaviors differing for goal-relatedness (hand grasping of an object and meaningless hand movements) and social relevance (communicative hand gestures and grasping within a social context). Time course analysis of alpha suppression showed that all types of hand behaviors were effective in triggering sensory-motor alpha ERD, but to a different degree depending on the category of observed hand motor acts and gestures. Meaningless gestures and hand grasping were the most effective stimuli, resulting in the strongest ERD. The observation of social hand behaviors such as social grasping and communicative gestures, triggered a more dynamic time course of ERD compared to that driven by the observation of simple grasping and meaningless gestures. These findings indicate that the observation of hand motor acts and gestures evoke the activation of a motor resonance mechanism that differs on the basis of the goal-relatedness and the social relevance of the observed hand behavior.
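Event-related desynchronization (ERD) of this kind is commonly quantified as a percentage power decrease relative to a baseline interval. A minimal sketch of that computation, with invented band-power values rather than the study's data (and using one common sign convention; conventions vary across papers):

```python
def erd_percent(power_event, power_baseline):
    """ERD as a percentage power decrease relative to baseline;
    positive values indicate suppression (desynchronization)."""
    return 100.0 * (power_baseline - power_event) / power_baseline

# Invented alpha band-power values (arbitrary units) for two conditions.
baseline = 10.0
simple_grasping = 6.0   # strong alpha suppression
social_gesture = 8.0    # weaker suppression

assert erd_percent(simple_grasping, baseline) == 40.0
assert erd_percent(simple_grasping, baseline) > erd_percent(social_gesture, baseline)
```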
24
Flaisch T, Schupp HT, Renner B, Junghöfer M. Neural systems of visual attention responding to emotional gestures. Neuroimage 2009; 45:1339-46. [PMID: 19349245 DOI: 10.1016/j.neuroimage.2008.12.073] [Citation(s) in RCA: 44] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2008] [Revised: 12/16/2008] [Accepted: 12/31/2008] [Indexed: 11/27/2022] Open
Affiliation(s)
- Tobias Flaisch
- Department of Psychology, University of Konstanz, P.O. Box D 36, 78457 Konstanz, Germany
25
Commonalities in the neural mechanisms underlying automatic attentional shifts by gaze, gestures, and symbols. Neuroimage 2009; 45:984-92. [PMID: 19167506 DOI: 10.1016/j.neuroimage.2008.12.052] [Citation(s) in RCA: 58] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2008] [Revised: 11/27/2008] [Accepted: 12/29/2008] [Indexed: 11/23/2022] Open
Abstract
Eye gaze, hand-pointing gestures, and arrows automatically trigger attentional shifts. Although it has been suggested that common neural mechanisms underlie these three types of attentional shifts, this issue remains unsettled. We measured brain activity using fMRI while participants observed directional and non-directional stimuli, including eyes, hands, and arrows, to investigate this issue. Conjunction analyses revealed that the posterior superior temporal sulcus (STS), the inferior parietal lobule, the inferior frontal gyrus, and the occipital cortices in the right hemisphere were more active in common in response to directional versus non-directional stimuli. These results suggest commonalities in the neurocognitive mechanisms underlying the automatic attentional shifts triggered by gaze, gestures, and symbols.
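Conjunction analyses of this kind are often implemented with a minimum-statistic rule: a voxel counts as commonly activated only if it is supra-threshold in every contrast. A toy sketch (the t-values and threshold below are invented for illustration, not taken from the study):

```python
import numpy as np

# Invented voxel-wise t-values for three directional-vs-non-directional
# contrasts (eyes, hands, arrows); four "voxels" for illustration.
t_eyes = np.array([4.2, 1.1, 3.8, 0.5])
t_hands = np.array([3.9, 0.9, 4.5, 2.8])
t_arrows = np.array([5.0, 1.5, 3.1, 0.2])

# Minimum-statistic conjunction: a voxel is commonly activated only if
# its smallest t-value across contrasts exceeds the threshold.
t_threshold = 3.0
conjunction = np.minimum.reduce([t_eyes, t_hands, t_arrows]) > t_threshold
assert conjunction.tolist() == [True, False, True, False]
```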
26
Bach P, Gunter TC, Knoblich G, Prinz W, Friederici AD. N400-like negativities in action perception reflect the activation of two components of an action representation. Soc Neurosci 2008; 4:212-32. [PMID: 19023701 DOI: 10.1080/17470910802362546] [Citation(s) in RCA: 60] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Abstract
The understanding of actions of tool use depends on the motor act that is performed and on the function of the objects involved in the action. We used event-related potentials (ERPs) to investigate the processes that derive both kinds of information in a task in which inserting actions had to be judged. The actions were presented as two consecutive frames, one showing an effector/instrument and the other showing a potential target object of the action. Two mismatches were possible. An orientation mismatch occurred when the spatial object properties were not consistent with a motor act of insertion being performed (i.e., different orientations of insert and slot). A functional mismatch happened when the instrument (e.g., screwdriver) would usually not be applied to the target object (e.g., keyhole). The order in which instrument and target object were presented was also varied. The two kinds of mismatch gave rise to similar but not identical negativities in the latency range of the N400 followed by a positive modulation. The results indicate that the motor act and the function of the objects are derived by two at least partially different subprocesses and become integrated into a common representation of the observed action.
Affiliation(s)
- Patric Bach
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
27
Lui F, Buccino G, Duzzi D, Benuzzi F, Crisi G, Baraldi P, Nichelli P, Porro CA, Rizzolatti G. Neural substrates for observing and imagining non-object-directed actions. Soc Neurosci 2008; 3:261-75. [DOI: 10.1080/17470910701458551] [Citation(s) in RCA: 71] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
28
Nan Y, Knösche TR, Zysset S, Friederici AD. Cross-cultural music phrase processing: an fMRI study. Hum Brain Mapp 2008; 29:312-28. [PMID: 17497646 PMCID: PMC6871102 DOI: 10.1002/hbm.20390] [Citation(s) in RCA: 51] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022] Open
Abstract
The current study used functional magnetic resonance imaging (fMRI) to investigate the neural basis of musical phrase boundary processing during the perception of music from native and non-native cultures. German musicians performed a cultural categorization task while listening to phrased Western (native) and Chinese (non-native) musical excerpts as well as modified versions of these, where the impression of phrasing has been reduced by removing the phrase boundary marking pause (henceforth called "unphrased"). Bilateral planum temporale was found to be associated with an increased difficulty of identifying phrase boundaries in unphrased Western melodies. A network involving frontal and parietal regions showed increased activation for the phrased condition with the orbital part of left inferior frontal gyrus presumably reflecting working memory aspects of the temporal integration between phrases, and the middle frontal gyrus and intraparietal sulcus probably reflecting attention processes. Areas more active in the culturally familiar, native (Western) condition included, in addition to the left planum temporale and right ventro-medial prefrontal cortex, mainly the bilateral motor regions. These latter results are interpreted in light of sensorimotor integration. Regions with increased signal for the unfamiliar, non-native music style (Chinese) included a right lateralized network of angular gyrus and the middle frontal gyrus, possibly reflecting higher demands on attention systems, and the right posterior insula suggesting higher loads on basic auditory processing.
Affiliation(s)
- Yun Nan
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- Thomas R. Knösche
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Stefan Zysset
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
29
Molnar-Szakacs I, Wu AD, Robles FJ, Iacoboni M. Do you see what I mean? Corticospinal excitability during observation of culture-specific gestures. PLoS One 2007; 2:e626. [PMID: 17637842 PMCID: PMC1913205 DOI: 10.1371/journal.pone.0000626] [Citation(s) in RCA: 87] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/20/2007] [Accepted: 06/15/2007] [Indexed: 11/18/2022] Open
Abstract
People all over the world use their hands to communicate expressively. Autonomous gestures, also known as emblems, are highly social in nature, and convey conventionalized meaning without accompanying speech. To study the neural bases of cross-cultural social communication, we used single pulse transcranial magnetic stimulation (TMS) to measure corticospinal excitability (CSE) during observation of culture-specific emblems. Foreign Nicaraguan and familiar American emblems as well as meaningless control gestures were performed by both a Euro-American and a Nicaraguan actor. Euro-American participants demonstrated higher CSE during observation of the American compared to the Nicaraguan actor. This motor resonance phenomenon may reflect ethnic and cultural ingroup familiarity effects. However, participants also demonstrated a nearly significant (p = 0.053) actor by emblem interaction whereby both Nicaraguan and American emblems performed by the American actor elicited similar CSE, whereas Nicaraguan emblems performed by the Nicaraguan actor yielded higher CSE than American emblems. The latter result cannot be interpreted simply as an effect of ethnic ingroup familiarity. Thus, a likely explanation of these findings is that motor resonance is modulated by interacting biological and cultural factors.
Affiliation(s)
- Istvan Molnar-Szakacs
- Center for the Biology of Creativity, Semel Institute for Neuroscience and Human Behavior, University of California at Los Angeles, Los Angeles, California, United States of America
- Ahmanson-Lovelace Brain Mapping Center, University of California at Los Angeles, Los Angeles, California, United States of America
- FPR-UCLA Center for Culture, Brain and Development, University of California at Los Angeles, Los Angeles, California, United States of America
- Department of Psychiatry and Biobehavioral Sciences, University of California at Los Angeles, Los Angeles, California, United States of America
- Allan D. Wu
- Ahmanson-Lovelace Brain Mapping Center, University of California at Los Angeles, Los Angeles, California, United States of America
- Department of Neurology, University of California at Los Angeles, Los Angeles, California, United States of America
- Francisco J. Robles
- Ahmanson-Lovelace Brain Mapping Center, University of California at Los Angeles, Los Angeles, California, United States of America
- Marco Iacoboni
- Ahmanson-Lovelace Brain Mapping Center, University of California at Los Angeles, Los Angeles, California, United States of America
- FPR-UCLA Center for Culture, Brain and Development, University of California at Los Angeles, Los Angeles, California, United States of America
- Department of Psychiatry and Biobehavioral Sciences, University of California at Los Angeles, Los Angeles, California, United States of America
- Brain Research Institute, David Geffen School of Medicine, University of California at Los Angeles, Los Angeles, California, United States of America
30
Redcay E. The superior temporal sulcus performs a common function for social and speech perception: implications for the emergence of autism. Neurosci Biobehav Rev 2007; 32:123-42. [PMID: 17706781 DOI: 10.1016/j.neubiorev.2007.06.004] [Citation(s) in RCA: 221] [Impact Index Per Article: 13.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2006] [Revised: 03/08/2007] [Accepted: 06/12/2007] [Indexed: 10/23/2022]
Abstract
Within the cognitive neuroscience literature, discussion of the functional role of the superior temporal sulcus (STS) has traditionally been divided into two domains; one focuses on its activity during language processing while the other emphasizes its role in biological motion and social attention, such as eye gaze processing. I will argue that a common process underlying both of these functional domains is performed by the STS, namely analyzing changing sequences of input, either in the auditory or visual domain, and interpreting the communicative significance of those inputs. From a developmental perspective, the fact that these two domains share an anatomical substrate suggests the acquisition of social and speech perception may be linked. In addition, I will argue that because of the STS' role in interpreting social and speech input, impairments in STS function may underlie many of the social and language abnormalities seen in autism.
Affiliation(s)
- Elizabeth Redcay
- Department of Psychology, University of California, San Diego, 8110 La Jolla Shores Dr., Suite 201, La Jolla, CA 92037, USA.
31
Willems RM, Hagoort P. Neural evidence for the interplay between language, gesture, and action: a review. BRAIN AND LANGUAGE 2007; 101:278-89. [PMID: 17416411 DOI: 10.1016/j.bandl.2007.03.004] [Citation(s) in RCA: 189] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/16/2006] [Revised: 02/20/2007] [Accepted: 03/04/2007] [Indexed: 05/14/2023]
Abstract
Co-speech gestures embody a form of manual action that is tightly coupled to the language system. As such, the co-occurrence of speech and co-speech gestures is an excellent example of the interplay between language and action. There are, however, other ways in which language and action can be thought of as closely related. In this paper we will give an overview of studies in cognitive neuroscience that examine the neural underpinnings of links between language and action. Topics include neurocognitive studies of motor representations of speech sounds, action-related language, sign language and co-speech gestures. It will be concluded that there is strong evidence on the interaction between speech and gestures in the brain. This interaction however shares general properties with other domains in which there is interplay between language and action.
Affiliation(s)
- Roel M Willems
- F. C. Donders Centre for Cognitive Neuroimaging, Radboud University Nijmegen, P.O. Box 9101, 6500 HB Nijmegen, The Netherlands.
32
Kelly SD, Ward S, Creigh P, Bartolotti J. An intentional stance modulates the integration of gesture and speech during comprehension. BRAIN AND LANGUAGE 2007; 101:222-33. [PMID: 16997367 DOI: 10.1016/j.bandl.2006.07.008] [Citation(s) in RCA: 41] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/13/2006] [Revised: 07/03/2006] [Accepted: 07/25/2006] [Indexed: 05/11/2023]
Abstract
The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when integrating the two modalities at comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous and incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator), and in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bi-lateral frontal and central N400 effect to words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture--that is, when gesture and speech were not intentionally meant to go together--the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.
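The N400 effect reported here is a difference measure: the incongruous-minus-congruous ERP difference wave, where more negative values indicate a larger effect. A minimal sketch with invented amplitudes (not the study's data):

```python
import numpy as np

# Invented single-channel ERP mean amplitudes (µV) in the N400 window
# for the two semantic conditions, at four successive time points.
erp_congruous = np.array([-1.0, -1.5, -2.0, -1.8])
erp_incongruous = np.array([-2.5, -3.5, -4.5, -4.0])

# The N400 effect is the incongruous-minus-congruous difference wave;
# more negative values indicate a larger N400 effect.
difference_wave = erp_incongruous - erp_congruous
assert (difference_wave < 0).all()   # incongruous words are more negative
assert difference_wave.min() == -2.5
```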
Affiliation(s)
- Spencer D Kelly
- Neuroscience Program, Department of Psychology, 13 Oak Dr, Colgate University, Hamilton, NY 13346, USA.
33
Kobayashi C, Glover GH, Temple E. Children's and adults' neural bases of verbal and nonverbal 'theory of mind'. Neuropsychologia 2007; 45:1522-32. [PMID: 17208260 PMCID: PMC1868677 DOI: 10.1016/j.neuropsychologia.2006.11.017] [Citation(s) in RCA: 106] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2006] [Revised: 11/24/2006] [Accepted: 11/25/2006] [Indexed: 10/23/2022]
Abstract
Theory of mind (ToM) - our ability to predict behaviors of others in terms of their underlying intentions - has been examined through verbal and nonverbal false-belief (FB) tasks. Previous brain imaging studies of ToM in adults have implicated medial prefrontal cortex (mPFC) and temporo-parietal junction (TPJ) for adults' ToM ability. To examine age and modality related differences and similarities in neural correlates of ToM, we tested 16 adults (18-40 years old) and 12 children (8-12 years old) with verbal (story) and nonverbal (cartoon) FB tasks, using functional magnetic resonance imaging (fMRI). Both age groups showed significant activity in the TPJ bilaterally and right inferior parietal lobule (IPL) in a modality-independent manner, indicating that these areas are important for ToM during both adulthood and childhood, regardless of modality. We also found significant age-related differences in the ToM condition-specific activity for the story and cartoon tasks in the left inferior frontal gyrus (IFG) and left TPJ. These results suggest that depending on the modality adults may utilize different brain regions from children in understanding ToM.
34
Willems RM, Ozyürek A, Hagoort P. When Language Meets Action: The Neural Integration of Gesture and Speech. Cereb Cortex 2006; 17:2322-33. [PMID: 17159232 DOI: 10.1093/cercor/bhl141] [Citation(s) in RCA: 205] [Impact Index Per Article: 11.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
Although generally studied in isolation, language and action often co-occur in everyday life. Here we investigated one particular form of simultaneous language and action, namely speech and gestures that speakers use in everyday communication. In a functional magnetic resonance imaging study, we identified the neural networks involved in the integration of semantic information from speech and gestures. Verbal and/or gestural content could be integrated easily or less easily with the content of the preceding part of speech. Premotor areas involved in action observation (Brodmann area [BA] 6) were found to be specifically modulated by action information "mismatching" to a language context. Importantly, an increase in integration load of both verbal and gestural information into prior speech context activated Broca's area and adjacent cortex (BA 45/47). A classical language area, Broca's area, is not only recruited for language-internal processing but also when action observation is integrated with speech. These findings provide direct evidence that action and language processing share a high-level neural integration system.
Affiliation(s)
- Roel M Willems
- F. C. Donders Centre for Cognitive Neuroimaging, Radboud University Nijmegen, 6500 HB Nijmegen, The Netherlands.
35
Maess B, Herrmann CS, Hahne A, Nakamura A, Friederici AD. Localizing the distributed language network responsible for the N400 measured by MEG during auditory sentence processing. Brain Res 2006; 1096:163-72. [PMID: 16769041 DOI: 10.1016/j.brainres.2006.04.037] [Citation(s) in RCA: 98] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/04/2005] [Revised: 04/06/2006] [Accepted: 04/10/2006] [Indexed: 11/30/2022]
Abstract
We studied auditory sentence comprehension using magnetoencephalography while subjects listened to sentences whose correctness they had to judge subsequently. The localization and the time course of brain electrical activity during processing of correct and semantically incorrect sentences were estimated by computing a brain surface current density within a cortical layer for both conditions. Finally, a region of interest (ROI) analysis was conducted to determine the time course of specific locations. A magnetic N400 was present in six spatially different ROIs. Semantic anomalies caused an exclusive involvement of the ventral portion of the left inferior frontal gyrus (BA 47) and left pars triangularis (BA 45). The anterior parts of the superior (BA 22) and inferior (BA 20/21) temporal gyri bilaterally were activated by both conditions. The activation for the correct condition, however, peaked earlier in both left temporal regions (by approximately 32 ms). In general, activation due to semantic violations was more pronounced, started later, and lasted longer than that for correct sentences. The findings reveal a clear left-hemispheric dominance during language processing, indicated firstly by the mere number of activated regions (four in the left vs. two in the right hemisphere) and secondly by the observed specificity of the left inferior frontal ROIs to semantic violations. The temporal advantage observed for the correct condition in the left temporal regions supports the notion that the established context eases the processing of the final word. Semantically incorrect words that do not fit into the context result in longer integration times.
Affiliation(s)
- Burkhard Maess
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
36
Abstract
Most neuropsychological research on the perception of emotion concerns the perception of faces. Yet in everyday life, hand actions are also modulated by our affective state, revealing it, in turn, to the observer. We used functional magnetic resonance imaging (fMRI) to identify brain regions engaged during the observation of hand actions performed either in a neutral or an angry way. We also asked whether these are the same regions as those involved in perceiving expressive faces. During the passive observation of emotionally neutral hand movements, the fMRI signal increased significantly in dorsal and ventral premotor cortices, with the exact location of the 'peaks' distinct from those induced by face observation. Various areas in the extrastriate visual cortex were also engaged, overlapping with the face-related activity. When the observed hand action was performed with emotion, additional regions were recruited, including the right dorsal premotor cortex, the right medial prefrontal cortex, the left anterior insula and a region in the rostral part of the supramarginal gyrus bilaterally. These regions, except for the supramarginal gyrus, were also activated during the perception of angry faces. These results complement the wealth of studies on the perception of affect from faces and provide further insights into the processes involved in the perception of others that may underlie social constructs such as empathy.
Affiliation(s)
- Marie-Hélène Grosbras
- Cognitive Neuroscience Unit, Montreal Neurological Institute, McGill University, Montreal, Canada
37
Knösche TR, Maess B, Nakamura A, Friederici AD. Human communication investigated with magnetoencephalography: speech, music, and gestures. Int Rev Neurobiol 2005; 68:79-120. [PMID: 16443011 DOI: 10.1016/s0074-7742(05)68004-x] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/06/2023]
Affiliation(s)
- Thomas R Knösche
- Max-Planck-Institute for Human Cognitive and Brain Sciences, 04103 Leipzig, Germany