1
Corsini A, Tomassini A, Pastore A, Delis I, Fadiga L, D'Ausilio A. Speech perception difficulty modulates theta-band encoding of articulatory synergies. J Neurophysiol 2024;131:480-491. PMID: 38323331. DOI: 10.1152/jn.00388.2023.
Abstract
The human brain tracks available speech acoustics and extrapolates missing information such as the speaker's articulatory patterns. However, the extent to which articulatory reconstruction supports speech perception remains unclear. This study explores the relationship between articulatory reconstruction and task difficulty. Participants listened to sentences and performed a speech-rhyming task. Real kinematic data of the speaker's vocal tract were recorded via electromagnetic articulography (EMA) and aligned to the corresponding acoustic outputs. We extracted articulatory synergies from the EMA data with principal component analysis (PCA) and employed partial information decomposition (PID) to separate the electroencephalographic (EEG) encoding of acoustic and articulatory features into unique, redundant, and synergistic atoms of information. We median-split sentences into easy (ES) and hard (HS) based on participants' performance and found that greater task difficulty involved greater encoding of unique articulatory information in the theta band. We conclude that fine-grained articulatory reconstruction plays a complementary role in the encoding of speech acoustics, lending further support to the claim that motor processes support speech perception.
NEW & NOTEWORTHY: Top-down processes originating from the motor system contribute to speech perception through the reconstruction of the speaker's articulatory movements. This study investigates the role of such articulatory simulation under variable task difficulty. We show that more challenging listening tasks lead to increased encoding of articulatory kinematics in the theta band and suggest that, in such situations, fine-grained articulatory reconstruction complements acoustic encoding.
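As a rough illustration of the synergy-extraction step described above, the sketch below runs PCA (via SVD) on a simulated stand-in for multichannel EMA data. The channel count, latent dimensionality, and noise level are arbitrary assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for EMA recordings: time samples x articulator
# channels (e.g., x/y coordinates of tongue, lip, and jaw sensors).
# Three latent "synergies" drive twelve channels, plus sensor noise.
n_samples, n_channels, n_synergies = 2000, 12, 3
latent = rng.standard_normal((n_samples, n_synergies))
mixing = rng.standard_normal((n_synergies, n_channels))
ema = latent @ mixing + 0.1 * rng.standard_normal((n_samples, n_channels))

# PCA via SVD of the mean-centered data matrix.
centered = ema - ema.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)

# Each synergy is a weight pattern over channels (a row of Vt) together
# with its time course (the projection of the data onto that pattern).
scores = centered @ Vt.T
print("variance explained by the first 3 PCs:", round(explained[:3].sum(), 3))
```

In an analysis like the one summarized above, the leading components (weight patterns plus their time courses) would play the role of the articulatory synergies entered into the subsequent encoding analysis.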
Affiliation(s)
- Alessandro Corsini
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Department of Neuroscience and Rehabilitation, Università di Ferrara, Ferrara, Italy
- Alice Tomassini
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Department of Neuroscience and Rehabilitation, Università di Ferrara, Ferrara, Italy
- Aldo Pastore
- Laboratorio NEST, Scuola Normale Superiore, Pisa, Italy
- Ioannis Delis
- School of Biomedical Sciences, University of Leeds, Leeds, United Kingdom
- Luciano Fadiga
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Department of Neuroscience and Rehabilitation, Università di Ferrara, Ferrara, Italy
- Alessandro D'Ausilio
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Department of Neuroscience and Rehabilitation, Università di Ferrara, Ferrara, Italy
2
Liang B, Li Y, Zhao W, Du Y. Bilateral human laryngeal motor cortex in perceptual decision of lexical tone and voicing of consonant. Nat Commun 2023;14:4710. PMID: 37543659. PMCID: PMC10404239. DOI: 10.1038/s41467-023-40445-0.
Abstract
Speech perception is believed to recruit the left motor cortex. However, the exact roles of the laryngeal subregion and of its right-hemisphere counterpart in speech perception, as well as their temporal patterns of involvement, remain unclear. To address these questions, we conducted a hypothesis-driven study, applying transcranial magnetic stimulation to the left or right dorsal laryngeal motor cortex (dLMC) while participants made perceptual decisions on Mandarin lexical tone or consonant voicing presented with or without noise. We used psychometric functions and a hierarchical drift-diffusion model to disentangle perceptual sensitivity from dynamic decision-making parameters. Results showed that bilateral dLMCs were engaged with effector specificity, and this engagement was left-lateralized with right upregulation in noise. Furthermore, the dLMC contributed to various decision stages depending on the hemisphere and task difficulty. These findings substantially advance our understanding of the hemispheric lateralization and temporal dynamics of bilateral dLMC in sensorimotor integration during speech perceptual decision-making.
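For intuition about the decision parameters mentioned above, the following sketch simulates a basic (non-hierarchical) drift-diffusion process with a drift rate, a boundary separation, and a non-decision time. The parameter values are illustrative assumptions only and do not reproduce the study's hierarchical fit.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ddm(v, a, t0, n_trials=1000, dt=0.001, noise=1.0):
    """Simulate a basic drift-diffusion process: evidence starts midway
    between the boundaries (0 and a) and accumulates with drift v; the
    decision time plus non-decision time t0 gives the reaction time."""
    choices, rts = [], []
    for _ in range(n_trials):
        x, t = a / 2.0, 0.0
        while 0.0 < x < a:
            x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        choices.append(x >= a)  # upper boundary = correct response
        rts.append(t0 + t)
    return np.array(choices), np.array(rts)

# A lower drift rate stands in for a harder, noisier stimulus: accuracy
# drops and responses slow down, as in the noise conditions above.
easy_acc, easy_rt = simulate_ddm(v=2.0, a=1.0, t0=0.3)
hard_acc, hard_rt = simulate_ddm(v=0.5, a=1.0, t0=0.3)
print(f"easy: acc={easy_acc.mean():.2f}, mean RT={easy_rt.mean():.2f}s")
print(f"hard: acc={hard_acc.mean():.2f}, mean RT={hard_rt.mean():.2f}s")
```

Fitting such a model to behavioral data recovers which stage (evidence accumulation vs. non-decision processes) a manipulation such as TMS affects, which is the logic of the analysis summarized above.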
Affiliation(s)
- Baishen Liang
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Yanchang Li
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, Chinese Academy of Sciences, Beijing, 100101, China
- Wanying Zhao
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, Chinese Academy of Sciences, Beijing, 100101, China
- Yi Du
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- CAS Center for Excellence in Brain Science and Intelligence Technology, Shanghai, 200031, China
- Chinese Institute for Brain Research, Beijing, 102206, China
3
Bono D, Belyk M, Longo MR, Dick F. Beyond language: The unspoken sensory-motor representation of the tongue in non-primates, non-human and human primates. Neurosci Biobehav Rev 2022;139:104730. PMID: 35691470. DOI: 10.1016/j.neubiorev.2022.104730.
Abstract
The English idiom "on the tip of my tongue" commonly acknowledges that something is known but cannot be immediately brought to mind. This phrase aptly describes the sensorimotor functions of the tongue, which are fundamental for many tongue-related behaviors (e.g., speech) but are often neglected by scientific research. Here, we review a wide range of studies conducted on non-primates and on non-human and human primates, with the aim of providing a comprehensive description of the cortical representation of the tongue's somatosensory inputs and motor outputs across different phylogenetic domains. First, we summarize how the properties of passive, non-noxious mechanical stimuli are encoded in the putative somatosensory tongue area, which has a conserved location in the ventral portion of the somatosensory cortex across mammals. Second, we review how complex self-generated actions involving the tongue are represented in more anterior regions of the putative somato-motor tongue area. Finally, we describe the multisensory response properties of the primate and non-primate tongue area and outline how the cytoarchitecture of this area is affected by experience and deafferentation.
Affiliation(s)
- Davide Bono
- Birkbeck/UCL Centre for Neuroimaging, 26 Bedford Way, London WC1H0AP, UK; Department of Experimental Psychology, UCL Division of Psychology and Language Sciences, 26 Bedford Way, London WC1H0AP, UK.
- Michel Belyk
- Department of Speech, Hearing, and Phonetic Sciences, UCL Division of Psychology and Language Sciences, 2 Wakefield Street, London WC1N 1PJ, UK
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck College, University of London, Malet St, London WC1E7HX, UK
- Frederic Dick
- Birkbeck/UCL Centre for Neuroimaging, 26 Bedford Way, London WC1H0AP, UK; Department of Experimental Psychology, UCL Division of Psychology and Language Sciences, 26 Bedford Way, London WC1H0AP, UK; Department of Psychological Sciences, Birkbeck College, University of London, Malet St, London WC1E7HX, UK
4
Abstract
Ten years ago, Perspectives in Psychological Science published the Mirror Neuron Forum, in which authors debated the role of mirror neurons in action understanding, speech, imitation, and autism and asked whether mirror neurons are acquired through visual-motor learning. Subsequent research on these themes has made significant advances, which should encourage further, more systematic research. For action understanding, multivoxel pattern analysis, patient studies, and brain stimulation suggest that mirror-neuron brain areas contribute to low-level processing of observed actions (e.g., distinguishing types of grip) but not to high-level action interpretation (e.g., inferring actors’ intentions). In the area of speech perception, although it remains unclear whether mirror neurons play a specific, causal role in speech perception, there is compelling evidence for the involvement of the motor system in the discrimination of speech in perceptually noisy conditions. For imitation, there is strong evidence from patient, brain-stimulation, and brain-imaging studies that mirror-neuron brain areas play a causal role in copying of body movement topography. In the area of autism, studies using behavioral and neurological measures have tried and failed to find evidence supporting the “broken-mirror theory” of autism. Furthermore, research on the origin of mirror neurons has confirmed the importance of domain-general visual-motor associative learning rather than canalized visual-motor learning, or motor learning alone.
Affiliation(s)
- Cecilia Heyes
- All Souls College, University of Oxford; Department of Experimental Psychology, University of Oxford
- Caroline Catmur
- Department of Psychology, Institute of Psychiatry, Psychology, and Neuroscience, King's College London
5
Kaczmarczyk I, Rawji V, Rothwell JC, Hodson-Tole E, Sharma N. Comparison between surface electrodes and ultrasound monitoring to measure TMS evoked muscle contraction. Muscle Nerve 2021;63:724-729. PMID: 33533504. DOI: 10.1002/mus.27192.
Abstract
INTRODUCTION: Transcranial magnetic stimulation (TMS) is widely used to explore cortical physiology in health and disease. Surface electromyography (sEMG) is appropriate for superficial muscles but cannot easily be applied to less accessible muscles. Muscle ultrasound (mUS) may provide an elegant solution to this problem, but fundamental questions remain. We explore the relationship between TMS-evoked muscle potentials and TMS-evoked muscle contractions measured with mUS. METHODS: In 10 participants, we acquired TMS recruitment curves, simultaneously measuring motor evoked potentials (MEPs) and mUS in biceps (BI), first dorsal interosseous (FDI), tibialis anterior (TA), and the tongue (TO). RESULTS: Resting motor threshold (RMT) measurements and recruitment curves were consistent across sEMG and mUS. DISCUSSION: This work supports the use of TMS-US to study less accessible muscles. The implications are broad and could include the study of a new range of muscles in disorders such as amyotrophic lateral sclerosis.
Affiliation(s)
- Isabella Kaczmarczyk
- Department of Clinical and Movement Neuroscience, UCL Queen Square Institute of Neurology, London, UK
- Vishal Rawji
- Department of Clinical and Movement Neuroscience, UCL Queen Square Institute of Neurology, London, UK
- John C Rothwell
- Department of Clinical and Movement Neuroscience, UCL Queen Square Institute of Neurology, London, UK
- Emma Hodson-Tole
- Musculoskeletal Sciences and Sports Medicine Research Centre, Manchester Metropolitan University, Manchester, UK
- Nikhil Sharma
- Department of Clinical and Movement Neuroscience, UCL Queen Square Institute of Neurology, London, UK
6
Hilt PM, Cardellicchio P, Dolfini E, Pozzo T, Fadiga L, D'Ausilio A. Motor Recruitment during Action Observation: Effect of Interindividual Differences in Action Strategy. Cereb Cortex 2020;30:3910-3920. PMID: 32043124. PMCID: PMC7264692. DOI: 10.1093/cercor/bhaa006.
Abstract
Visual processing of others' actions is supported by sensorimotor brain activations. Access to sensorimotor representations may, in principle, provide the top-down signal required to bias the search and selection of critical visual features. For this to happen, a stable one-to-one mapping must exist between observed kinematics and the underlying motor commands. However, due to the inherent redundancy of the human musculoskeletal system, this is hardly the case for multijoint actions, where each individual has their own moving style (individual motor signature, IMS). Here, we investigated the influence of observers' IMS on their motor excitability during the observation of an actor achieving the same goal by adopting two different IMSs. Despite a clear dissociation in kinematic and electromyographic patterns between the two actions, we found no group-level modulation of corticospinal excitability (CSE) in observers. Rather, we found a negative relationship between CSE and actor-observer IMS distance, already at the single-subject level. Thus, sensorimotor activity during action observation does not slavishly replicate the motor plan implemented by the actor, but rather reflects the distance between what is canonical according to one's own motor template and the observed movements performed by other individuals.
Affiliation(s)
- P M Hilt
- IIT@UniFe Center for Translational Neurophysiology, Istituto Italiano di Tecnologia, 44121, Ferrara, Italy
- P Cardellicchio
- IIT@UniFe Center for Translational Neurophysiology, Istituto Italiano di Tecnologia, 44121, Ferrara, Italy
- E Dolfini
- IIT@UniFe Center for Translational Neurophysiology, Istituto Italiano di Tecnologia, 44121, Ferrara, Italy
- T Pozzo
- IIT@UniFe Center for Translational Neurophysiology, Istituto Italiano di Tecnologia, 44121, Ferrara, Italy; INSERM U1093, Université de Bourgogne Franche-Comté, 21000, Dijon, France
- L Fadiga
- IIT@UniFe Center for Translational Neurophysiology, Istituto Italiano di Tecnologia, 44121, Ferrara, Italy; Section of Human Physiology, Università di Ferrara, 44121, Ferrara, Italy
- A D'Ausilio
- IIT@UniFe Center for Translational Neurophysiology, Istituto Italiano di Tecnologia, 44121, Ferrara, Italy; Section of Human Physiology, Università di Ferrara, 44121, Ferrara, Italy
7
Abstract
Recent evidence suggests that the motor system may have a facilitatory role in speech perception during noisy listening conditions. Studies clearly show an association between activity in auditory and motor speech systems, but also hint at a causal role for the motor system in noisy speech perception. However, in the most compelling "causal" studies performance was only measured at a single signal-to-noise ratio (SNR). If listening conditions must be noisy to invoke causal motor involvement, then effects will be contingent on the SNR at which they are tested. We used articulatory suppression to disrupt motor-speech areas while measuring phonemic identification across a range of SNRs. As controls, we also measured phoneme identification during passive listening, mandible gesturing, and foot-tapping conditions. Two-parameter (threshold, slope) psychometric functions were fit to the data in each condition. Our findings indicate: (1) no effect of experimental task on psychometric function slopes; (2) a small effect of articulatory suppression, in particular, on psychometric function thresholds. The size of the latter effect was 1 dB (~5% correct) on average, suggesting, at best, a minor modulatory role of the speech motor system in perception.
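The two-parameter psychometric function described above can be sketched as a logistic in SNR with a threshold and a slope. The data points below are fabricated for illustration (chosen so that a 1 dB threshold shift corresponds to roughly 5% correct, in line with the effect size reported above), and the logit-space linear fit is a simple stand-in for whatever fitting procedure the authors used.

```python
import numpy as np

# Two-parameter psychometric function (threshold alpha, slope beta), fit here
# by linear regression in logit space; the data are made up for illustration.
snrs = np.array([-12.0, -9.0, -6.0, -3.0, 0.0, 3.0])
p_correct = np.array([0.23, 0.35, 0.50, 0.65, 0.77, 0.86])

logits = np.log(p_correct / (1 - p_correct))
beta_hat, intercept = np.polyfit(snrs, logits, 1)  # logit(p) = beta*(snr - alpha)
alpha_hat = -intercept / beta_hat

def psychometric(snr):
    return 1.0 / (1.0 + np.exp(-beta_hat * (snr - alpha_hat)))

print(f"threshold ≈ {alpha_hat:.1f} dB SNR, slope ≈ {beta_hat:.2f} per dB")

# Near threshold, the logistic is steepest, with slope beta/4 per dB, so a
# 1 dB threshold shift changes accuracy by roughly beta/4 (~5% here).
print(f"accuracy change per dB at threshold: {beta_hat / 4:.3f}")
```

This makes concrete why a threshold effect must be interpreted relative to the slope: the same 1 dB shift implies a larger or smaller accuracy change depending on how steep the fitted function is.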
Affiliation(s)
- Ryan C Stokes
- Department of Cognitive Sciences Social and Behavioral Sciences Gateway, University of California - Irvine, Irvine, CA, 92697-5100, USA.
- Jonathan H Venezia
- Department of Cognitive Sciences Social and Behavioral Sciences Gateway, University of California - Irvine, Irvine, CA, 92697-5100, USA
- Gregory Hickok
- Department of Cognitive Sciences Social and Behavioral Sciences Gateway, University of California - Irvine, Irvine, CA, 92697-5100, USA
8
Burgess JD, Major BP, McNeel C, Clark GM, Lum JAG, Enticott PG. Learning to Expect: Predicting Sounds During Movement Is Related to Sensorimotor Association During Listening. Front Hum Neurosci 2019;13:215. PMID: 31333431. PMCID: PMC6624421. DOI: 10.3389/fnhum.2019.00215.
Abstract
Sensory experiences, such as sounds, often result from our motor actions. Over time, repeated sound-producing performance can generate sensorimotor associations. However, it is not clear how sensory and motor information become associated. Here, we explore whether sensory prediction is associated with the formation of sensorimotor associations during a learning task. We recorded event-related potentials (ERPs) while participants produced index- and little-finger swipes on a bespoke device, generating novel sounds. ERPs were also obtained as participants heard those sounds played back, and peak suppression was compared to assess sensory prediction. Additionally, transcranial magnetic stimulation (TMS) was applied during listening to elicit motor evoked potentials (MEPs) in the fingers. MEPs were recorded before and after training upon hearing these sounds, and then compared to reveal sensorimotor associations. Finally, we explored the relationship between these components. Results demonstrated that an increased positive-going peak (e.g., P2) and a suppressed negative-going peak (e.g., N2) were recorded during action, revealing some sensory prediction outcomes (P2: p = 0.050, ηp² = 0.208; N2: p = 0.001, ηp² = 0.474). Increased MEPs were also observed upon hearing sounds congruent with a given finger compared with incongruent sounds, demonstrating precise sensorimotor associations that were not present before learning (index finger: p < 0.001, ηp² = 0.614; little finger: p < 0.001, ηp² = 0.529). Consistent with our broad hypotheses, a negative association was observed between MEPs in one finger during listening and ERPs during performance with the other (index-finger MEPs and Fz N1 action ERPs: r = −0.655, p = 0.003). Overall, the data suggest that predictive mechanisms are associated with the fine-tuning of sensorimotor associations.
Affiliation(s)
- Jed D Burgess
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, VIC, Australia
- Brendan P Major
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, VIC, Australia
- Claire McNeel
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, VIC, Australia
- Gillian M Clark
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, VIC, Australia
- Jarrad A G Lum
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, VIC, Australia
- Peter G Enticott
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, VIC, Australia
9
Schmitz J, Bartoli E, Maffongelli L, Fadiga L, Sebastian-Galles N, D'Ausilio A. Motor cortex compensates for lack of sensory and motor experience during auditory speech perception. Neuropsychologia 2019;128:290-296. DOI: 10.1016/j.neuropsychologia.2018.01.006.
10
Vainio L. Connection between movements of mouth and hand: Perspectives on development and evolution of speech. Neurosci Biobehav Rev 2019;100:211-223. PMID: 30871957. DOI: 10.1016/j.neubiorev.2019.03.005.
Abstract
Mounting evidence shows interactions between manipulative hand movements and movements of the tongue, lips, and mouth in both vocal and non-vocal contexts. The current article reviews this evidence and discusses its contribution to perspectives on the development and evolution of speech. In particular, the article aims to present novel insight into how the processes controlling the two primary grasp components of manipulative hand movements, the precision and power grips, might be systematically connected to the motor processes involved in producing certain articulatory gestures. This view assumes that, owing to these motor overlaps between grasping and articulation, the development of these grip types in infancy can facilitate the development of specific articulatory gestures. In addition, hand-mouth connections might even have boosted the evolution of some articulatory gestures. This account also proposes that some semantic sound-symbolic pairings between a speech sound and a referent concept might be partially based on these hand-mouth interactions.
Affiliation(s)
- Lari Vainio
- University of Helsinki, Helsinki Collegium for Advanced Studies, P.O. Box 4 (Fabianinkatu 24), FIN 00014, Finland; Perception, Action & Cognition Research Group, Department of Psychology and Logopedics, Faculty of Medicine, University of Helsinki, Haartmaninkatu 8, 00014, Finland; Phonetics and Speech Synthesis Research Group, Department of Digital Humanities, University of Helsinki, Unioninkatu 40, 00014, Finland.
11
Mukherjee S, Badino L, Hilt PM, Tomassini A, Inuggi A, Fadiga L, Nguyen N, D'Ausilio A. The neural oscillatory markers of phonetic convergence during verbal interaction. Hum Brain Mapp 2018;40:187-201. PMID: 30240542. DOI: 10.1002/hbm.24364.
Abstract
During a conversation, the neural processes supporting speech production and perception overlap in time and, depending on context, expectations, and the dynamics of the interaction, are continuously modulated in real time. Recently, the growing interest in the neural dynamics underlying interactive tasks, particularly in the language domain, has mainly tackled the temporal aspects of turn-taking in dialogs. Besides temporal coordination, an under-investigated phenomenon is the implicit convergence of speakers toward a shared phonetic space. Here, we used dual electroencephalography (dual-EEG) to record brain signals from subjects involved in a relatively constrained interactive task in which they took turns chaining words according to a phonetic rhyming rule. We quantified participants' initial phonetic fingerprints and tracked their phonetic convergence during the interaction via a robust and automatic speaker-verification technique. Results show that phonetic convergence is associated with left frontal alpha/low-beta desynchronization during speech preparation and with high-beta suppression before and during listening to speech in right centro-parietal and left frontal sectors, respectively. With this work, we provide evidence that mutual adaptation of phonetic targets correlates with specific alpha and beta oscillatory dynamics. These dynamics may index the coordination of "when" as well as "how" speech interaction takes place, reinforcing the suggestion that perception and production processes are highly interdependent and co-constructed during a conversation.
Affiliation(s)
- Sankar Mukherjee
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Leonardo Badino
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Pauline M Hilt
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Alice Tomassini
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy
- Alberto Inuggi
- Center for Human Technologies, Istituto Italiano di Tecnologia, Genoa, Italy
- Luciano Fadiga
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy; Section of Human Physiology, University of Ferrara, Ferrara, Italy
- Noël Nguyen
- CNRS, LPL, Aix Marseille University, Aix-en-Provence, France
- Alessandro D'Ausilio
- Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Ferrara, Italy; Section of Human Physiology, University of Ferrara, Ferrara, Italy
12
Wermelinger S, Gampe A, Behr J, Daum MM. Interference of action perception on action production increases across the adult life span. Exp Brain Res 2017;236:577-586. PMID: 29249051. DOI: 10.1007/s00221-017-5157-3.
Abstract
Action perception and action production are assumed to be based on an internal simulation process that involves the sensorimotor system. This system undergoes changes across the life span and is assumed to become less precise with age. In the current study, we investigated how increasing age affects the magnitude of interference in action production during simultaneous action perception. In a task adapted from Brass et al. (Brain Cogn 44(2):124-143, 2000), we asked participants (aged 20-80 years) to respond to a visually presented finger movement and/or symbolic cue by executing a previously defined finger movement. Action production was assessed via participants' reaction times. Results show that participants were slower in trials in which they were asked to ignore an incongruent finger movement compared to trials in which they had to ignore an incongruent symbolic cue. Moreover, advancing age was shown to accentuate this effect. We suggest that the internal simulation of the action becomes less precise with age making the sensorimotor system more susceptible to perturbations such as the interference of a concurrent action perception.
Affiliation(s)
- Stephanie Wermelinger
- Department of Psychology, University of Zurich, Binzmuehlestrasse 14, 8050, Zurich, Switzerland.
- Anja Gampe
- Department of Psychology, University of Zurich, Binzmuehlestrasse 14, 8050, Zurich, Switzerland
- Jannis Behr
- Department of Psychology, University of Zurich, Binzmuehlestrasse 14, 8050, Zurich, Switzerland
- Moritz M Daum
- Department of Psychology, University of Zurich, Binzmuehlestrasse 14, 8050, Zurich, Switzerland; Neuroscience Center Zurich, University of Zurich and ETH Zurich, Zurich, Switzerland
13
Ruch H, Zürcher Y, Burkart JM. The function and mechanism of vocal accommodation in humans and other primates. Biol Rev Camb Philos Soc 2017;93:996-1013. PMID: 29111610. DOI: 10.1111/brv.12382.
Abstract
The study of non-human animals, in particular primates, can provide essential insights into language evolution. A critical element of language is vocal production learning, i.e. learning how to produce calls. In contrast to other lineages such as songbirds, vocal production learning of completely new signals is strikingly rare in non-human primates. An increasing body of research, however, suggests that various species of non-human primates engage in vocal accommodation and adjust the structure of their calls in response to environmental noise or conspecific vocalizations. To date it is unclear what role vocal accommodation may have played in language evolution, in particular because it summarizes a variety of heterogeneous phenomena which are potentially achieved by different mechanisms. In contrast to non-human primates, accommodation research in humans has a long tradition in psychology and linguistics. Based on theoretical models from these research traditions, we provide a new framework which allows comparing instances of accommodation across species, and studying them according to their underlying mechanism and ultimate biological function. We found that at the mechanistic level, many cases of accommodation can be explained with an automatic perception-production link, but some instances arguably require higher levels of vocal control. Functionally, both human and non-human primates use social accommodation to signal social closeness or social distance to a partner or social group. Together, this indicates that not only some vocal control, but also the communicative function of vocal accommodation to signal social closeness and distance must have evolved prior to the emergence of language, rather than being the result of it. Vocal accommodation as found in other primates has thus endowed our ancestors with pre-adaptations that may have paved the way for language evolution.
Affiliation(s)
- Hanna Ruch
- University Research Priority Program Language and Space, University of Zurich, 8032, Zürich, Switzerland
- Yvonne Zürcher
- Department of Anthropology, University of Zurich, 8057, Zürich, Switzerland
- Judith M Burkart
- Department of Anthropology, University of Zurich, 8057, Zürich, Switzerland
14
Echoes on the motor network: how internal motor control structures afford sensory experience. Brain Struct Funct 2017;222:3865-3888. DOI: 10.1007/s00429-017-1484-1.
15
Beta rhythm modulation by speech sounds: somatotopic mapping in somatosensory cortex. Sci Rep 2016;6:31182. PMID: 27499204. PMCID: PMC4976318. DOI: 10.1038/srep31182.
Abstract
During speech listening, motor regions are somatotopically activated, resembling the activity that subtends actual speech production and suggesting that motor commands can be retrieved from sensory inputs. Crucially, efficient motor control of the articulators relies on accurate anticipation of the somatosensory reafference. Nevertheless, evidence about somatosensory activity elicited by auditory speech processing is sparse. The present work looked for specific interactions between auditory speech presentation and somatosensory cortical information processing. We used an auditory speech identification task with sounds having different places of articulation (bilabials and dentals). We tested whether coupling the auditory task with peripheral electrical stimulation of the lips would affect the pattern of sensorimotor electroencephalographic rhythms. Peripheral electrical stimulation elicits a series of spectral perturbations, of which the beta rebound reflects the return-to-baseline stage of somatosensory processing. We show a left-lateralized and selective reduction in the beta rebound following lip somatosensory stimulation when listening to speech sounds produced with the lips (i.e., bilabials). Thus, somatosensory processing could not return to baseline because the same neural resources were still recruited by the speech stimuli. Our results are a clear demonstration that heard speech sounds are somatotopically mapped onto somatosensory cortices according to place of articulation.
16
Nuttall HE, Kennedy-Higgins D, Hogan J, Devlin JT, Adank P. The effect of speech distortion on the excitability of articulatory motor cortex. Neuroimage 2016; 128:218-226. [DOI: 10.1016/j.neuroimage.2015.12.038] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2015] [Revised: 10/30/2015] [Accepted: 12/21/2015] [Indexed: 11/30/2022] Open
17
Finisguerra A, Maffongelli L, Bassolino M, Jacono M, Pozzo T, D'Ausilio A. Generalization of motor resonance during the observation of hand, mouth, and eye movements. J Neurophysiol 2015; 114:2295-304. [PMID: 26289463 DOI: 10.1152/jn.00433.2015] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/11/2015] [Accepted: 08/18/2015] [Indexed: 11/22/2022] Open
Abstract
Transcranial magnetic stimulation (TMS) of the motor cortex shows that hand action observation (AO) modulates corticospinal excitability (CSE). CSE modulation alternatively maps low-level kinematic characteristics or higher-level features, such as object-directed action goals. However, action execution is achieved through the control of muscle synergies, coordinated patterns of muscular activity during natural movements, rather than of single muscles or object-directed goals. This synergistic organization of action execution also underlies the ability to produce the same functional output (i.e., grasping an object) using different effectors. We hypothesized that motor system activation during AO may rely on similar principles. To investigate this issue, we recorded both hand CSE and TMS-evoked finger movements, which provide a much more complete description of coordinated patterns of muscular activity. Subjects passively watched hand, mouth, and eyelid opening or closing movements, all non-object-directed (intransitive) actions. Hand and mouth share the same potential to grasp objects, whereas the eyelid does not allow object-directed (transitive) actions. Hand CSE modulation generalized to all effectors, whereas TMS-evoked finger movements were modulated only by mouth AO. This dissociation suggests that the two techniques may have different sensitivities to the fine motor modulations induced by AO. Unlike evoked movements, which are sensitive to the possibility of achieving an object-directed action, CSE is generically modulated by "opening" versus "closing" movements, independently of which effector was observed. We propose that motor activity during AO might exploit the same synergistic mechanisms shown for the neural control of movement and organized around a limited set of motor primitives.
Affiliation(s)
- Alessandra Finisguerra
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Genova, Italy; Dipartimento di Scienze Umane, Università degli Studi di Udine, Udine, Italy
- Laura Maffongelli
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Genova, Italy
- Michela Bassolino
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Genova, Italy; Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Marco Jacono
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Genova, Italy
- Thierry Pozzo
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Genova, Italy; IUF, INSERM U1093 Cognition, Action et Plasticité Sensorimotrice, Université de Bourgogne, Dijon, France
- Alessandro D'Ausilio
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Genova, Italy
18
D'Ausilio A, Bartoli E, Maffongelli L. Motor control may support mirror neuron research with new hypotheses and methods: reply to comments on "Grasping synergies: a motor-control approach to the mirror neuron mechanism". Phys Life Rev 2015; 12:133-7. [PMID: 25792432 DOI: 10.1016/j.plrev.2015.02.005] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/12/2015] [Accepted: 02/12/2015] [Indexed: 11/26/2022]
19
Grasping synergies: A motor-control approach to the mirror neuron mechanism. Phys Life Rev 2015; 12:91-103. [DOI: 10.1016/j.plrev.2014.11.002] [Citation(s) in RCA: 46] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/31/2014] [Accepted: 11/10/2014] [Indexed: 11/21/2022]
20
Abstract
All spoken languages express words by sound patterns, and certain patterns (e.g., blog) are systematically preferred to others (e.g., lbog). What principles account for such preferences: does the language system encode abstract rules banning syllables like lbog, or does their dislike reflect the increased motor demands associated with speech production? More generally, we ask whether linguistic knowledge is fully embodied or whether some linguistic principles could potentially be abstract. To address this question, here we gauge the sensitivity of English speakers to the putative universal syllable hierarchy (e.g., blif ≻ bnif ≻ bdif ≻ lbif) while undergoing transcranial magnetic stimulation (TMS) over the cortical motor representation of the left orbicularis oris muscle. If syllable preferences reflect motor simulation, then worse-formed syllables (e.g., lbif) should (i) elicit more errors; (ii) engage motor brain areas more strongly; and (iii) elicit stronger effects of TMS on these motor regions. In line with the motor account, we found that repetitive TMS pulses impaired participants' global sensitivity to the number of syllables, and functional MRI confirmed that the cortical stimulation site was sensitive to the syllable hierarchy. Contrary to the motor account, however, ill-formed syllables were least likely to engage the lip sensorimotor area and were least impaired by TMS. These results suggest that speech perception automatically triggers motor action, but this effect is not causally linked to the computation of linguistic structure. We conclude that the language and motor systems are intimately linked, yet distinct. Language is designed to optimize motor action, but its knowledge includes principles that are disembodied and potentially abstract.
21
Cattaneo L. Granularity within the mirror system is not informative on action perception: comment on "Grasping synergies: a motor-control approach to the mirror neuron mechanism" by D'Ausilio et al. Phys Life Rev 2015; 12:123-5. [PMID: 25637139 DOI: 10.1016/j.plrev.2015.01.014] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/01/2015] [Accepted: 01/09/2015] [Indexed: 11/15/2022]
Affiliation(s)
- Luigi Cattaneo
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Via delle Regole 101, 38123, Trento, Italy.
22
Naming a Lego world. The role of language in the acquisition of abstract concepts. PLoS One 2015; 10:e0114615. [PMID: 25629816 PMCID: PMC4309617 DOI: 10.1371/journal.pone.0114615] [Citation(s) in RCA: 36] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2014] [Accepted: 11/11/2014] [Indexed: 11/19/2022] Open
Abstract
While embodied approaches to cognition have proved successful in explaining concrete concepts and words, they have more difficulty accounting for abstract concepts and words, and several proposals have been put forward. This work aims to test the Words As Tools proposal, according to which both abstract and concrete concepts are grounded in perception, action, and emotional systems, but linguistic information is more important for the representation of abstract than of concrete concepts, due to the different ways they are acquired: while linguistic information might play a role in the acquisition of concrete concepts, it is crucial for the acquisition of abstract ones. We investigated the acquisition of concrete and abstract concepts and words and verified its impact on conceptual representation. In Experiment 1, participants explored and categorized novel concrete and abstract entities and were taught a novel label for each category. Later they performed a categorical recognition task and an image-word matching task to verify a) whether and how the introduction of language changed the previously formed categories, b) whether language carried greater weight for the representation of abstract than of concrete words, and c) whether this difference had consequences on bodily responses. The results confirm that, even though both concrete and abstract concepts are grounded, language facilitates the acquisition of the latter and plays a major role in their representation, resulting in faster responses with the mouth, an effector typically associated with language production. Experiment 2 was a rating test aiming to verify whether the findings of Experiment 1 were simply due to heterogeneity, i.e., to the fact that the members of abstract categories were more heterogeneous than those of concrete categories. The results confirmed the effectiveness of our operationalization, showing that abstract concepts are more associated with the mouth and concrete ones with the hand, independently of heterogeneity.
23
Schomers MR, Kirilina E, Weigand A, Bajbouj M, Pulvermüller F. Causal Influence of Articulatory Motor Cortex on Comprehending Single Spoken Words: TMS Evidence. Cereb Cortex 2014; 25:3894-902. [PMID: 25452575 PMCID: PMC4585521 DOI: 10.1093/cercor/bhu274] [Citation(s) in RCA: 51] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022] Open
Abstract
Classic wisdom held that motor and premotor cortex contribute to motor execution but not to higher cognition and language comprehension. In contrast, mounting evidence from neuroimaging, patient research, and transcranial magnetic stimulation (TMS) suggests sensorimotor interaction and, specifically, that the articulatory motor cortex is important for classifying meaningless speech sounds into phonemic categories. However, whether these findings speak to the comprehension issue is unclear, because language comprehension does not require explicit phonemic classification, and previous results may therefore relate to factors alien to semantic understanding. We here used the standard psycholinguistic test of spoken word comprehension, the word-to-picture-matching task, with concordant TMS to articulatory motor cortex. TMS pulses were applied to the primary motor cortex controlling either the lips or the tongue as subjects heard critical word stimuli starting with bilabial lip-related or alveolar tongue-related stop consonants (e.g., "pool" or "tool"). A significant cross-over interaction showed that articulatory motor cortex stimulation delayed comprehension responses for phonologically incongruent words relative to congruent ones (i.e., lip-area TMS delayed "tool" relative to "pool" responses). As local TMS to articulatory motor areas differentially delays the comprehension of phonologically incongruent spoken words, we conclude that motor systems can take a causal role in semantic comprehension and, hence, higher cognition.
Affiliation(s)
- Malte R Schomers
- Brain Language Laboratory, Department of Philosophy and Humanities, Freie Universität Berlin, 14195 Berlin, Germany; Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, 10099 Berlin, Germany
- Evgeniya Kirilina
- Dahlem Institute for Neuroimaging of Emotion, Freie Universität Berlin, 14195 Berlin, Germany
- Anne Weigand
- Dahlem Institute for Neuroimaging of Emotion, Freie Universität Berlin, 14195 Berlin, Germany; Department of Psychiatry, Charité Universitätsmedizin Berlin, 14050 Berlin, Germany; Berenson-Allen Center for Noninvasive Brain Stimulation, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA 02215, USA
- Malek Bajbouj
- Dahlem Institute for Neuroimaging of Emotion, Freie Universität Berlin, 14195 Berlin, Germany; Department of Psychiatry, Charité Universitätsmedizin Berlin, 14050 Berlin, Germany
- Friedemann Pulvermüller
- Brain Language Laboratory, Department of Philosophy and Humanities, Freie Universität Berlin, 14195 Berlin, Germany; Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, 10099 Berlin, Germany
24
Komeilipoor N, Vicario CM, Daffertshofer A, Cesari P. Talking hands: tongue motor excitability during observation of hand gestures associated with words. Front Hum Neurosci 2014; 8:767. [PMID: 25324761 PMCID: PMC4179693 DOI: 10.3389/fnhum.2014.00767] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2014] [Accepted: 09/10/2014] [Indexed: 11/15/2022] Open
Abstract
Perception of speech and gestures engages common brain areas. Neural regions involved in speech perception overlap with those involved in speech production in an articulator-specific manner. Yet it is unclear whether motor cortex also has a role in processing communicative actions such as gesture and sign language. We asked whether the mere observation of hand gestures, paired or not paired with words, results in changes in the excitability of the hand and tongue areas of motor cortex. Using single-pulse transcranial magnetic stimulation (TMS), we measured motor excitability in the tongue and hand areas of left primary motor cortex while participants viewed video sequences of bimanual hand movements associated or not associated with nouns. We found higher motor excitability in the tongue area during the presentation of meaningful (noun-associated) gestures as opposed to meaningless ones, while the excitability of the hand motor area was not differentially affected by gesture observation. Our results suggest that observing gestures associated with a word activates the articulatory motor network that accompanies speech production.
Affiliation(s)
- Naeem Komeilipoor
- Department of Neurological and Movement Sciences, University of Verona, Verona, Italy; MOVE Research Institute Amsterdam, VU University Amsterdam, Amsterdam, Netherlands
- Paola Cesari
- Department of Neurological and Movement Sciences, University of Verona, Verona, Italy
25
Ferrari PF, Rizzolatti G. Mirror neuron research: the past and the future. Philos Trans R Soc Lond B Biol Sci 2014; 369:20130169. [PMID: 24778369 DOI: 10.1098/rstb.2013.0169] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open