1. Barnaveli I, Viganò S, Reznik D, Haggard P, Doeller CF. Hippocampal-entorhinal cognitive maps and cortical motor system represent action plans and their outcomes. Nat Commun 2025; 16:4139. PMID: 40319012; PMCID: PMC12049502; DOI: 10.1038/s41467-025-59153-y.
Abstract
Efficiently interacting with the environment requires weighing and selecting among multiple alternative actions based on their associated outcomes. However, the neural mechanisms underlying these processes are still debated. We show that forming relations between arbitrary action-outcome associations involves building a cognitive map. Using an immersive virtual reality paradigm, participants learned 2D abstract motor action-outcome associations and later compared action combinations while their brain activity was monitored with fMRI. We observe a hexadirectional modulation of activity in the entorhinal cortex while participants compared different action plans. Furthermore, hippocampal activity scales with the 2D similarity between the outcomes of these action plans. Conversely, the supplementary motor area represents individual actions, showing a stronger response to overlapping action plans. Crucially, the connectivity between the hippocampus and the supplementary motor area is modulated by the similarity between the action plans, suggesting complementary roles in action evaluation. These findings provide evidence for the role of cognitive maps in action selection, challenging classical models of memory taxonomy and its neural bases.
Affiliation(s)
- Irina Barnaveli: Department of Psychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Simone Viganò: Department of Psychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Center for Mind/Brain Sciences, University of Trento, Rovereto, Italy
- Daniel Reznik: Department of Psychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Patrick Haggard: Institute of Cognitive Neuroscience, University College London, London, UK
- Christian F Doeller: Department of Psychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Kavli Institute for Systems Neuroscience, NTNU, Trondheim, Norway
2. Endo N, Vilain C, Nakazawa K, Ito T. Somatosensory influence on auditory cortical response of self-generated sound. Neuropsychologia 2025; 211:109103. PMID: 40021117; DOI: 10.1016/j.neuropsychologia.2025.109103.
Abstract
Motor execution that generates a sound attenuates the cortical response to that self-generated sound. This attenuation has been explained as a result of motor-related processing. The current study shows that corresponding somatosensory inputs can also change the auditory processing of a self-generated sound. We recorded auditory event-related potentials (ERPs) in response to self-generated sounds and assessed how the amount of auditory attenuation changed according to the somatosensory inputs. The sound stimuli were generated by a finger movement that pressed on a virtual object produced by a haptic robotic device. Somatosensory inputs were modulated by changing the stiffness of this virtual object (low and high) in an unpredictable manner. For comparison, we carried out the same test with a computer keyboard, which is conventionally used to induce auditory attenuation of self-generated sound. While N1 and P2 attenuation was clearly induced in the control condition with the keyboard, as in previous studies, with the robotic device the amplitude of N1 varied according to the stiffness of the virtual object: the N1 amplitude in the low-stiffness condition was similar to that found with the keyboard, but the amplitude in the high-stiffness condition was not. In addition, P2 attenuation did not differ between stiffness conditions. The waveforms of the auditory ERP after 200 ms also differed according to stiffness condition. The estimated source of the N1 attenuation was located in the right parietal area. These results suggest that somatosensory inputs during movement can modify the auditory processing of self-generated sound. The auditory processing of self-generated sound may represent self-referenced processing, such as an embodied process or an action-perception mechanism.
Affiliation(s)
- Nozomi Endo: Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, 3-8-1 Komaba, Meguro-ku, Tokyo 153-8902, Japan
- Coriandre Vilain: Univ. Grenoble Alpes, CNRS, Grenoble-INP, GIPSA-lab, 11 rue des Mathématiques, Grenoble Campus BP46, F-38402 Saint Martin d'Heres CEDEX, France
- Kimitaka Nakazawa: Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, 3-8-1 Komaba, Meguro-ku, Tokyo 153-8902, Japan
- Takayuki Ito: Univ. Grenoble Alpes, CNRS, Grenoble-INP, GIPSA-lab, 11 rue des Mathématiques, Grenoble Campus BP46, F-38402 Saint Martin d'Heres CEDEX, France
3. Ghio M, Haegert K, Seidel A, Suchan B, Thoma P, Bellebaum C. The prediction of auditory consequences of own and observed actions: a brain decoding multivariate pattern study. Cereb Cortex 2025; 35:bhaf091. PMID: 40298443; PMCID: PMC12038811; DOI: 10.1093/cercor/bhaf091.
Abstract
Evidence from the auditory domain suggests that sounds generated by self-performed as well as observed actions are processed differently from external sounds. This study aimed to investigate which brain regions are involved in the processing of auditory stimuli generated by actions, addressing the question of whether cerebellar forward models, which are thought to predict the sensory consequences of self-performed actions, similarly underlie predictions for action observation. We measured brain activity with functional magnetic resonance imaging (fMRI) while participants elicited a sound via button press, observed another person performing this action, or listened to external sounds. By applying multivariate pattern analysis (MVPA), we found evidence for altered processing in the right auditory cortex for sounds following both self-performed and observed actions relative to external sounds. Evidence for the prediction of auditory action consequences was found in the bilateral cerebellum and the right supplementary motor area, but only for self-performed actions. Our results suggest that cerebellar forward models contribute to predictions of sensory consequences only for action performance. While predictions are also generated for action observation, the underlying mechanisms remain to be elucidated.
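The decoding logic behind MVPA can be illustrated with synthetic data: a classifier is trained on multi-voxel activity patterns and tested on held-out trials, and above-chance cross-validated accuracy is taken as evidence that a region distinguishes the conditions. A minimal sketch (simulated patterns and a generic linear classifier, not the authors' pipeline):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 50

# Simulated single-trial voxel patterns for two sound conditions
# (e.g., self-generated vs. external); a small subset of "informative"
# voxels carries a condition effect
labels = np.repeat([0, 1], n_trials // 2)
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1, :10] += 0.8

# Cross-validated decoding accuracy; chance level is 0.5
accuracy = cross_val_score(LinearSVC(), patterns, labels, cv=5).mean()
```

Searchlight MVPA, as used in the study, repeats this procedure in a small sphere around every voxel to map where in the brain the conditions are decodable.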
Affiliation(s)
- Marta Ghio: Faculty of Mathematics and Natural Sciences, Heinrich Heine University Düsseldorf, Universitätsstrasse 1, 40225 Düsseldorf, Germany
- Karolin Haegert: Faculty of Mathematics and Natural Sciences, Heinrich Heine University Düsseldorf, Universitätsstrasse 1, 40225 Düsseldorf, Germany
- Alexander Seidel: Faculty of Mathematics and Natural Sciences, Heinrich Heine University Düsseldorf, Universitätsstrasse 1, 40225 Düsseldorf, Germany
- Boris Suchan: Neuropsychological Therapy Centre, Faculty of Psychology, Ruhr University Bochum, Universitätsstr. 150, 44801 Bochum, Germany
- Patrizia Thoma: Neuropsychological Therapy Centre, Faculty of Psychology, Ruhr University Bochum, Universitätsstr. 150, 44801 Bochum, Germany
- Christian Bellebaum: Faculty of Mathematics and Natural Sciences, Heinrich Heine University Düsseldorf, Universitätsstrasse 1, 40225 Düsseldorf, Germany
4. Kiepe F, Hesselmann G. Sensory attenuation of self-initiated tactile feedback is modulated by stimulus strength and temporal delay in a virtual reality environment. Q J Exp Psychol (Hove) 2025:17470218251330237. PMID: 40087903; DOI: 10.1177/17470218251330237.
Abstract
Despite extensive research across various modalities, the precise mechanisms of sensory attenuation (SA) remain debated. Specifically, it remains unclear to what extent SA is influenced by stimulus predictability alone, as opposed to the distinct impact of self-generated actions. Forward models suggest that efference copies of motor commands enable the brain to predict and distinguish anticipated changes in self-initiated sensory input. Predictive processing proposes that predictions about upcoming changes in sensory input are not based solely on efference copies but are generated by a generative model that also integrates external, contextual factors. This study investigated the underlying mechanisms of SA in the tactile domain, specifically examining self-initiation and temporal predictions within a virtual reality (VR) framework. This setup allowed precise control over sensory feedback in response to movement. Participants (N = 33) engaged in an active condition, moving their hands to elicit a virtual touch. Importantly, visual perception was modified in VR so that participants touched their rendered, but not their physical, hands. The virtual touch triggered test vibrations on a touch controller (intensities: 0.2, 0.35, 0.5, 0.65, 0.8, in arbitrary units), whose intensity was then compared to that of a standard stimulus (intensity: 0.5). In the passive condition, vibrations were presented without movement and were preceded by a visual cue. Further, test vibrations appeared either immediately or after a variable onset delay (700-800 ms). Our results revealed a significant effect of the factor "onset delay" on perceived vibration intensity. In addition, we observed interactions between the factors "agency" and "test vibration intensity" and between the factors "agency" and "onset delay", with attenuation effects for immediate vibrations at high intensities and enhancement effects for delayed vibrations at low intensities. These findings emphasize the impact of external, contextual factors and support the notion of a broader, attention-oriented predictive mechanism for the perception of self-initiated stimuli.
Affiliation(s)
- Fabian Kiepe: Department of General and Biological Psychology, Psychologische Hochschule Berlin (PHB), Berlin, Germany
- Guido Hesselmann: Department of General and Biological Psychology, Psychologische Hochschule Berlin (PHB), Berlin, Germany
5. Tast V, Schröger E, Widmann A. Suppression and omission effects in auditory predictive processing - Two of the same? Eur J Neurosci 2024; 60:4049-4062. PMID: 38764129; DOI: 10.1111/ejn.16393.
Abstract
Recent theories describe perception as an inferential process based on internal predictive models that are adjusted by prediction violations (prediction error). Two different modulations of the auditory N1 event-related brain potential component are often discussed as an expression of auditory predictive processing. The sound-related N1 component is attenuated for self-generated sounds compared to the N1 elicited by externally generated sounds (N1 suppression). An omission-related component in the N1 time range is elicited when the self-generated sounds are occasionally omitted (omission N1). Both phenomena have been explained by action-related forward modelling, which takes place when the sensory input is predictable: prediction error signals are reduced when predicted sensory input is presented (N1 suppression) and elicited when predicted sensory input is omitted (omission N1). This common theoretical account is appealing but has not yet been directly tested. We manipulated the predictability of a sound in a self-generation paradigm in which, in two conditions, either 80% or 50% of the button presses generated a sound, inducing a strong or a weak expectation for the occurrence of the sound. Consistent with the forward modelling account, an omission N1 was observed in the 80% condition but not in the 50% condition. However, N1 suppression was highly similar in both conditions. Thus, our results demonstrate a clear effect of predictability on the omission N1 but not on N1 suppression. These results imply that the two phenomena rely (at least in part) on different mechanisms and challenge prediction-related accounts of N1 suppression.
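One hedged way to see why a stronger expectation should yield a larger omission response is to quantify an omission's surprise as -log p(omission), a standard information-theoretic proxy for prediction error. This is an illustrative calculation, not a model from the study itself:

```python
import math

# Surprise (in nats) of an omitted sound, -log p(omission),
# where a button press produces a sound with probability p
surprise = {p: -math.log(1 - p) for p in (0.8, 0.5)}
# 80% condition: -log(0.2) ≈ 1.61 nats; 50% condition: -log(0.5) ≈ 0.69 nats
```

Under this reading, an omission in the 80% condition is roughly 2.3 times more surprising than in the 50% condition, in line with an omission N1 appearing only when the sound is strongly expected.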
Affiliation(s)
- Valentina Tast: Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Erich Schröger: Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Andreas Widmann: Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
6. Dercksen TT, Widmann A, Noesselt T, Wetzel N. Somatosensory omissions reveal action-related predictive processing. Hum Brain Mapp 2024; 45:e26550. PMID: 38050773; PMCID: PMC10915725; DOI: 10.1002/hbm.26550.
Abstract
The intricate relation between action and somatosensory perception has been studied extensively in the past decades. Generally, a forward model is thought to predict the somatosensory consequences of an action. Such models propose that when an action is reliably coupled to a tactile stimulus, the unexpected absence of the stimulus should elicit a prediction error. Although such omission responses have been demonstrated in the auditory modality, it remains unknown whether this mechanism generalizes across modalities. This study therefore aimed to record action-induced somatosensory omission responses using EEG in humans. Self-paced button presses were coupled to somatosensory stimuli in either 88% of trials (allowing a prediction) or 50% of trials (not allowing one). In the 88% condition, stimulus omission resulted in a neural response consisting of multiple components, as revealed by temporal principal component analysis. The oN1 response suggests sensory sources similar to those of stimulus-evoked activity, but an origin outside primary cortex. Subsequent oN2 and oP3 responses, as previously observed in the auditory domain, likely reflect modality-unspecific higher-order processes. Together, the findings straightforwardly demonstrate somatosensory predictions during action and provide evidence for a partially amodal mechanism of prediction error generation.
Affiliation(s)
- Tjerk T. Dercksen: Research Group Neurocognitive Development, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany
- Andreas Widmann: Research Group Neurocognitive Development, Leibniz Institute for Neurobiology, Magdeburg, Germany; Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Tömme Noesselt: Center for Behavioral Brain Sciences, Magdeburg, Germany; Department of Biological Psychology, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Nicole Wetzel: Research Group Neurocognitive Development, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; University of Applied Sciences Magdeburg-Stendal, Stendal, Germany
7. Landelle C, Caron-Guyon J, Nazarian B, Anton J, Sein J, Pruvost L, Amberg M, Giraud F, Félician O, Danna J, Kavounoudias A. Beyond sense-specific processing: decoding texture in the brain from touch and sonified movement. iScience 2023; 26:107965. PMID: 37810223; PMCID: PMC10551894; DOI: 10.1016/j.isci.2023.107965.
Abstract
Texture, a fundamental object attribute, is perceived through multisensory information, including touch and auditory cues. Coherent perception may rely on texture representations shared across the different senses in the brain. To test this hypothesis, we delivered haptic textures coupled with a sound synthesizer that generated real-time textural sounds. Participants completed roughness estimation tasks with haptic, auditory, or bimodal cues in an MRI scanner. Somatosensory, auditory, and visual cortices were all activated during haptic and auditory exploration, challenging the traditional view that primary sensory cortices are sense-specific. Furthermore, audio-tactile integration was found in the secondary somatosensory (S2) and primary auditory cortices. Multivariate analyses revealed shared spatial activity patterns in primary motor and somatosensory cortices for discriminating texture across both modalities. This study indicates that primary areas and S2 have a versatile representation of multisensory textures, which has significant implications for how the brain processes multisensory cues to interact more efficiently with our environment.
Affiliation(s)
- C. Landelle: McGill University, McConnell Brain Imaging Centre, Department of Neurology and Neurosurgery, Montreal Neurological Institute, Montreal, QC, Canada; Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- J. Caron-Guyon: Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France; University of Louvain, Institute for Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, Crossmodal Perception and Plasticity Laboratory, Louvain-la-Neuve, Belgium
- B. Nazarian: Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J.L. Anton: Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J. Sein: Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- L. Pruvost: Aix-Marseille Université, CNRS, Perception, Représentations, Image, Son, Musique, PRISM UMR 7061, Marseille, France
- M. Amberg: Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- F. Giraud: Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- O. Félician: Aix Marseille Université, INSERM, Institut des Neurosciences des Systèmes, INS UMR 1106, Marseille, France
- J. Danna: Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France; Université de Toulouse, CNRS, Laboratoire Cognition, Langues, Langage, Ergonomie, CLLE UMR5263, Toulouse, France
- A. Kavounoudias: Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
8. Vivaldo CA, Lee J, Shorkey M, Keerthy A, Rothschild G. Auditory cortex ensembles jointly encode sound and locomotion speed to support sound perception during movement. PLoS Biol 2023; 21:e3002277. PMID: 37651461; PMCID: PMC10499203; DOI: 10.1371/journal.pbio.3002277.
Abstract
The ability to process and act upon incoming sounds during locomotion is critical for survival and adaptive behavior. Despite the established role that the auditory cortex (AC) plays in behavior- and context-dependent sound processing, previous studies have found that auditory cortical activity is on average suppressed during locomotion as compared to immobility. While suppression of auditory cortical responses to self-generated sounds results from corollary discharge, which weakens responses to predictable sounds, the functional role of weaker responses to unpredictable external sounds during locomotion remains unclear. In particular, whether suppression of external sound-evoked responses during locomotion reflects reduced involvement of the AC in sound processing or whether it results from masking by an alternative neural computation in this state remains unresolved. Here, we tested the hypothesis that rather than simple inhibition, reduced sound-evoked responses during locomotion reflect a tradeoff with the emergence of explicit and reliable coding of locomotion velocity. To test this hypothesis, we first used neural inactivation in behaving mice and found that the AC plays a critical role in sound-guided behavior during locomotion. To investigate the nature of this processing, we used two-photon calcium imaging of local excitatory auditory cortical neural populations in awake mice. We found that locomotion had diverse influences on activity of different neurons, with a net suppression of baseline-subtracted sound-evoked responses and neural stimulus detection, consistent with previous studies. Importantly, we found that the net inhibitory effect of locomotion on baseline-subtracted sound-evoked responses was strongly shaped by elevated ongoing activity that compressed the response dynamic range, and that rather than reflecting enhanced "noise," this ongoing activity reliably encoded the animal's locomotion speed. Decoding analyses revealed that locomotion speed and sound are robustly co-encoded by auditory cortical ensemble activity. Finally, we found consistent patterns of joint coding of sound and locomotion speed in electrophysiologically recorded activity in freely moving rats. Together, our data suggest that rather than being suppressed by locomotion, auditory cortical ensembles explicitly encode it alongside sound information to support sound perception during locomotion.
Affiliation(s)
- Carlos Arturo Vivaldo: Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Joonyeup Lee: Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- MaryClaire Shorkey: Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Ajay Keerthy: Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Gideon Rothschild: Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America; Kresge Hearing Research Institute and Department of Otolaryngology-Head and Neck Surgery, University of Michigan, Ann Arbor, Michigan, United States of America
9. Rineau AL, Bringoux L, Sarrazin JC, Berberian B. Being active over one's own motion: Considering predictive mechanisms in self-motion perception. Neurosci Biobehav Rev 2023; 146:105051. PMID: 36669748; DOI: 10.1016/j.neubiorev.2023.105051.
Abstract
Self-motion perception is a key element guiding pilots' behavior. Its importance is mostly revealed when it is impaired, which in most cases leads to spatial disorientation, still a major factor in accident occurrence today. Self-motion perception is known to be based mainly on visuo-vestibular integration and can be modulated by the physical properties of the environment with which humans interact. For instance, several studies have shown that the respective weights of visual and vestibular information depend on their reliability. More recently, it has been suggested that the internal state of an operator can also modulate multisensory integration. Interestingly, system automation can interfere with this internal state through the loss of the intentional nature of movements (i.e., loss of agency) and the modulation of associated predictive mechanisms. In this context, one of the new challenges is to better understand the relationship between automation and self-motion perception. The present review explains how linking the concepts of agency and self-motion is a first approach to addressing this issue.
Affiliation(s)
- Anne-Laure Rineau: Information Processing and Systems, ONERA, Salon de Provence, Base Aérienne 701, France
- Bruno Berberian: Information Processing and Systems, ONERA, Salon de Provence, Base Aérienne 701, France
10. Musso M, Altenmüller E, Reisert M, Hosp J, Schwarzwald R, Blank B, Horn J, Glauche V, Kaller C, Weiller C, Schumacher M. Speaking in gestures: Left dorsal and ventral frontotemporal brain systems underlie communication in conducting. Eur J Neurosci 2023; 57:324-350. PMID: 36509461; DOI: 10.1111/ejn.15883.
Abstract
Conducting constitutes a well-structured system of signs anticipating information concerning the rhythm and dynamics of a musical piece. Conductors communicate the musical tempo to the orchestra, unifying the individual instrumental voices to form an expressive musical Gestalt. In a functional magnetic resonance imaging (fMRI) experiment, 12 professional conductors and 16 instrumentalists conducted, in real time, novel pieces of diverse complexity in orchestration and rhythm. As a control, participants either listened to the stimuli or performed beat patterns, setting the time of a metronome or of complex rhythms played by a drum. Activation of the left superior temporal gyrus (STG), supplementary and premotor cortex, and Broca's pars opercularis (F3op) was shared by both musician groups and separated conducting from the other conditions. Compared to instrumentalists, conductors activated Broca's pars triangularis (F3tri) and the STG, which differentiated conducting from time beating and reflected the increase in complexity during conducting. In comparison to conductors, instrumentalists activated F3op and F3tri when distinguishing complex from simple rhythm processing. Fibre selection from a normative human connectome database, constructed using a global tractography approach, showed that F3op and the STG are connected via the arcuate fasciculus, whereas F3tri and the STG are connected via the extreme capsule. As in language, the anatomical framework characterising conducting gestures is located in the left dorsal system centred on F3op. This system reflected the sensorimotor mapping for structuring gestures to musical tempo. The ventral system centred on F3tri may reflect the conductor's art of setting this musical tempo for the individual orchestral voices in a global, holistic way.
Affiliation(s)
- Mariacristina Musso: Department of Neurology and Clinical Neuroscience, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Eckart Altenmüller: Institute of Music Physiology and Musician's Medicine, Hannover University of Music Drama and Media, Hannover, Germany
- Marco Reisert: Department of Medical Physics, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Jonas Hosp: Department of Neurology and Clinical Neuroscience, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Ralf Schwarzwald: Department of Neuroradiology, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Bettina Blank: Department of Neurology and Clinical Neuroscience, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Julian Horn: Department of Neurology and Clinical Neuroscience, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Volkmar Glauche: Department of Neurology and Clinical Neuroscience, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Christoph Kaller: Department of Medical Physics, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Cornelius Weiller: Department of Neurology and Clinical Neuroscience, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Martin Schumacher: Department of Neuroradiology, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg, Germany
11. Rineau AL, Berberian B, Sarrazin JC, Bringoux L. Active self-motion control and the role of agency under ambiguity. Front Psychol 2023; 14:1148793. PMID: 37151332; PMCID: PMC10158821; DOI: 10.3389/fpsyg.2023.1148793.
Abstract
Purpose: Self-motion perception is a key factor in daily behaviours such as driving a car or piloting an aircraft. It is mainly based on visuo-vestibular integration, whose weighting mechanisms are modulated by the reliability properties of sensory inputs. Recently, it has been shown that the internal state of the operator can also modulate multisensory integration and may sharpen the representation of relevant inputs. In line with the concept of agency, it thus appears relevant to evaluate the impact of being in control of our own actions on self-motion perception. Methodology: Here, we tested two conditions of motion control (active/manual trigger versus passive/observer), asking participants to discriminate between two consecutive longitudinal movements by identifying the larger displacement (the displacement of higher intensity). We also tested motion discrimination under two levels of ambiguity by applying acceleration ratios that differed from our two "standard" displacements (i.e., 3 s; 0.012 m.s-2 and 0.030 m.s-2). Results: We found an effect of control condition, but not of level of ambiguity, on the way participants perceived the standard displacement, i.e., the perceptual bias (Point of Subjective Equality, PSE). We also found a significant interaction between the active condition and the level of ambiguity on the ability to discriminate between displacements, i.e., sensitivity (Just Noticeable Difference, JND). Originality: Being in control of one's own motion through a manual, intentional trigger of self-displacement maintains overall motion sensitivity when ambiguity increases.
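The two reported measures can be recovered from a fitted psychometric function: the PSE is the comparison intensity judged larger than the standard 50% of the time, and the JND is conventionally half the distance between the 25% and 75% points of the fit. A sketch with made-up response proportions (the intensities and data below are illustrative, not the study's):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, sigma):
    # P("comparison larger than standard") as a cumulative Gaussian
    return norm.cdf(x, loc=pse, scale=sigma)

# Illustrative comparison accelerations (m.s-2) around a 0.030 standard
intensity = np.array([0.018, 0.022, 0.026, 0.030, 0.034, 0.038, 0.042])
p_larger = np.array([0.05, 0.12, 0.30, 0.55, 0.78, 0.93, 0.98])

(pse, sigma), _ = curve_fit(psychometric, intensity, p_larger, p0=[0.030, 0.005])

# JND: half the interquartile spread, i.e. sigma scaled by the 75% quantile
jnd = sigma * norm.ppf(0.75)
```

A smaller JND means finer discrimination; the study's interaction implies the JND grows less with ambiguity when participants trigger the motion themselves.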
Affiliation(s)
- Anne-Laure Rineau
- ONERA, Information Processing and Systems Department (DTIS), Salon-de-Provence, France
- Correspondence: Anne-Laure Rineau
- Bruno Berberian
- ONERA, Information Processing and Systems Department (DTIS), Salon-de-Provence, France

12
Preisig BC, Riecke L, Hervais-Adelman A. Speech sound categorization: The contribution of non-auditory and auditory cortical regions. Neuroimage 2022; 258:119375. [PMID: 35700949] [DOI: 10.1016/j.neuroimage.2022.119375] [Received: 02/18/2022] [Revised: 05/13/2022] [Accepted: 06/10/2022] [Indexed: 11/26/2022] Open
Abstract
Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (the third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports and to identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners' syllable report irrespective of stimulus acoustics. Most of these regions lie outside what is traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
Affiliation(s)
- Basil C Preisig
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, 6500 HB Nijmegen, The Netherlands; Max Planck Institute for Psycholinguistics, 6525 XD Nijmegen, The Netherlands; Department of Psychology, Neurolinguistics, University of Zurich, 8050 Zurich, Switzerland; Department of Comparative Language Science, Evolutionary Neuroscience of Language, University of Zurich, 8050 Zurich, Switzerland; Neuroscience Center Zurich, University of Zurich and Eidgenössische Technische Hochschule Zurich, 8057 Zurich, Switzerland
- Lars Riecke
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, 6229 ER Maastricht, The Netherlands
- Alexis Hervais-Adelman
- Department of Psychology, Neurolinguistics, University of Zurich, 8050 Zurich, Switzerland; Neuroscience Center Zurich, University of Zurich and Eidgenössische Technische Hochschule Zurich, 8057 Zurich, Switzerland

13
Abstract
The last decade has seen the emergence of new theoretical frameworks to explain pathological fatigue, a much-neglected yet highly significant symptom across a wide range of diseases. While the new models of fatigue provide new hypotheses to test, they also raise a number of questions. The primary purpose of this essay is to examine the predictions of three recently proposed models of fatigue, the overlap and differences between them, and the evidence from diseases that may lend support to the models of fatigue. I also present expansions of the sensory attenuation model of fatigue. Further questions examined here are the following: What are the neural substrates of fatigue? How can sensory attenuation, which underpins agency, also explain fatigue? Are fatigue and agency related?
Affiliation(s)
- Annapoorna Kuppuswamy
- Department of Clinical and Movement Neuroscience, Institute of Neurology, University College London, London, UK

14
Dercksen TT, Widmann A, Scharf F, Wetzel N. Sound omission related brain responses in children. Dev Cogn Neurosci 2022; 53:101045. [PMID: 34923314] [PMCID: PMC8688889] [DOI: 10.1016/j.dcn.2021.101045] [Received: 02/04/2021] [Revised: 10/21/2021] [Accepted: 12/09/2021] [Indexed: 11/03/2022] Open
Abstract
Action is an important way for children to learn about the world. Recent theories suggest that action is inherently accompanied by sensory prediction of its effects. Such predictions can be revealed by rarely omitting the expected sensory consequence of the action, resulting in an omission response that is observable in the EEG. Although prediction errors play an important role in models of learning and development, little is known about omission-related brain responses in children. This study used a motor-auditory omission paradigm, testing a group of 6-8-year-old children and an adult group (N = 31 each). In an identity-specific condition, the sound coupled to the motor action was predictable, while in an identity-unspecific condition the sound was unpredictable. Results of a temporal principal component analysis revealed that sound-related brain responses underlying the N1-complex differed considerably between age groups. Despite these developmental differences, omission responses (oN1) were similar between age groups. Two subcomponents of the oN1 were differentially affected by specific and unspecific predictions. Results demonstrate that children, independent of the maturation of sound processing mechanisms, can implement specific and unspecific predictions as flexibly as adults. This supports theories that regard action and prediction error as important drivers of cognitive development.
Affiliation(s)
- Tjerk T Dercksen
- Leibniz Institute for Neurobiology, Brenneckestraße 6, 39118 Magdeburg, Germany; Center for Behavioral Brain Sciences, Universitätsplatz 2, D-39106 Magdeburg, Germany
- Andreas Widmann
- Leibniz Institute for Neurobiology, Brenneckestraße 6, 39118 Magdeburg, Germany; Leipzig University, Neumarkt 9-19, D-04109 Leipzig, Germany
- Florian Scharf
- University of Münster, Fliednerstraße 21, 48149 Münster, Germany
- Nicole Wetzel
- Leibniz Institute for Neurobiology, Brenneckestraße 6, 39118 Magdeburg, Germany; Center for Behavioral Brain Sciences, Universitätsplatz 2, D-39106 Magdeburg, Germany; University of Applied Sciences Magdeburg-Stendal, Osterburgerstraße 25, 39576 Stendal, Germany

15
Reznik D, Guttman N, Buaron B, Zion-Golumbic E, Mukamel R. Action-locked Neural Responses in Auditory Cortex to Self-generated Sounds. Cereb Cortex 2021; 31:5560-5569. [PMID: 34185837] [DOI: 10.1093/cercor/bhab179] [Received: 03/24/2021] [Revised: 05/24/2021] [Accepted: 05/25/2021] [Indexed: 11/14/2022] Open
Abstract
Sensory perception is a product of interactions between the internal state of an organism and the physical attributes of a stimulus. It has been shown across the animal kingdom that perception and sensory-evoked physiological responses are modulated depending on whether or not the stimulus is the consequence of voluntary actions. These phenomena are often attributed to motor signals sent to relevant sensory regions that convey information about upcoming sensory consequences. However, the neurophysiological signature of action-locked modulations in sensory cortex, and their relationship with perception, is still unclear. In the current study, we recorded neurophysiological (magnetoencephalography) and behavioral responses from 16 healthy subjects performing a faint-tone auditory detection task. Tones were either generated by subjects' voluntary button presses or occurred predictably following a visual cue. By introducing a constant temporal delay between button press/cue and tone delivery, and applying source-level analysis, we decoupled action-locked and auditory-locked activity in auditory cortex. We show action-locked evoked responses in auditory cortex following sound-triggering actions and preceding sound onset. Such evoked responses were not found for button presses that were not coupled with sounds, or for sounds delivered following a predictive visual cue. Our results provide evidence for efferent signals in human auditory cortex that are locked to voluntary actions coupled with future auditory consequences.
Affiliation(s)
- Daniel Reznik
- Max Planck Institute for Human Cognitive and Brain Sciences, Psychology Department, Leipzig, 04103, Germany
- Noa Guttman
- The Gonda Center for Multidisciplinary Brain Research, Bar-Ilan University, Ramat Gan, 5290002, Israel
- Batel Buaron
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, 69978, Israel
- Elana Zion-Golumbic
- The Gonda Center for Multidisciplinary Brain Research, Bar-Ilan University, Ramat Gan, 5290002, Israel
- Roy Mukamel
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, 69978, Israel

16
Gale DJ, Areshenkoff CN, Honda C, Johnsrude IS, Flanagan JR, Gallivan JP. Motor Planning Modulates Neural Activity Patterns in Early Human Auditory Cortex. Cereb Cortex 2021; 31:2952-2967. [PMID: 33511976] [PMCID: PMC8107793] [DOI: 10.1093/cercor/bhaa403] [Received: 08/11/2020] [Revised: 12/14/2020] [Accepted: 12/14/2020] [Indexed: 11/13/2022] Open
Abstract
It is well established that movement planning recruits motor-related cortical brain areas in preparation for the forthcoming action. Given that an integral component to the control of action is the processing of sensory information throughout movement, we predicted that movement planning might also modulate early sensory cortical areas, readying them for sensory processing during the unfolding action. To test this hypothesis, we performed 2 human functional magnetic resonance imaging studies involving separate delayed movement tasks and focused on premovement neural activity in early auditory cortex, given the area's direct connections to the motor system and evidence that it is modulated by motor cortex during movement in rodents. We show that effector-specific information (i.e., movements of the left vs. right hand in Experiment 1 and movements of the hand vs. eye in Experiment 2) can be decoded, well before movement, from neural activity in early auditory cortex. We find that this motor-related information is encoded in a separate subregion of auditory cortex than sensory-related information and is present even when movements are cued visually instead of auditorily. These findings suggest that action planning, in addition to preparing the motor system for movement, involves selectively modulating primary sensory areas based on the intended action.
Affiliation(s)
- Daniel J Gale
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Corson N Areshenkoff
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada; Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Claire Honda
- Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Ingrid S Johnsrude
- Department of Psychology, University of Western Ontario, London, Ontario, N6A 3K7, Canada; School of Communication Sciences and Disorders, University of Western Ontario, London, Ontario, N6A 3K7, Canada; Brain and Mind Institute, University of Western Ontario, London, Ontario, N6A 3K7, Canada
- J Randall Flanagan
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada; Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Jason P Gallivan
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada; Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada; Department of Biomedical and Molecular Sciences, Queen’s University, Kingston, Ontario K7L 3N6, Canada

17
Morán I, Perez-Orive J, Melchor J, Figueroa T, Lemus L. Auditory decisions in the supplementary motor area. Prog Neurobiol 2021; 202:102053. [PMID: 33957182] [DOI: 10.1016/j.pneurobio.2021.102053] [Received: 12/22/2020] [Revised: 04/06/2021] [Accepted: 04/20/2021] [Indexed: 10/21/2022]
Abstract
In human speech, as in communication across various species, recognizing and categorizing sounds is fundamental for selecting appropriate behaviors. But how does the brain decide which action to perform based on sounds? We explored whether the supplementary motor area (SMA), responsible for linking sensory information to motor programs, also accounts for auditory-driven decision making. To this end, we trained two rhesus monkeys to discriminate between numerous naturalistic sounds and words learned as target (T) or non-target (nT) categories. We found that, at the single-neuron and population levels, the SMA performs decision-related computations that transition from auditory to movement representations in this task. Moreover, we demonstrated that the neural population is organized orthogonally during the auditory and movement periods, implying that the SMA performs different computations in each period. In conclusion, our results suggest that the SMA integrates acoustic information in order to form categorical signals that drive behavior.
Affiliation(s)
- Isaac Morán
- Department of Cognitive Neuroscience, Institute of Cell Physiology, Universidad Nacional Autónoma de México (UNAM), 04510, Mexico City, Mexico
- Javier Perez-Orive
- Instituto Nacional de Rehabilitacion "Luis Guillermo Ibarra Ibarra", Mexico City, Mexico
- Jonathan Melchor
- Department of Cognitive Neuroscience, Institute of Cell Physiology, Universidad Nacional Autónoma de México (UNAM), 04510, Mexico City, Mexico
- Tonatiuh Figueroa
- Department of Cognitive Neuroscience, Institute of Cell Physiology, Universidad Nacional Autónoma de México (UNAM), 04510, Mexico City, Mexico
- Luis Lemus
- Department of Cognitive Neuroscience, Institute of Cell Physiology, Universidad Nacional Autónoma de México (UNAM), 04510, Mexico City, Mexico

18
Seidel A, Ghio M, Studer B, Bellebaum C. Illusion of control affects ERP amplitude reductions for auditory outcomes of self-generated actions. Psychophysiology 2021; 58:e13792. [PMID: 33604896] [DOI: 10.1111/psyp.13792] [Received: 03/30/2020] [Revised: 01/28/2021] [Accepted: 01/29/2021] [Indexed: 11/27/2022]
Abstract
The reduction of neural responses to self-generated stimuli compared to external stimuli is thought to result from the matching of motor-based sensory predictions with sensory reafferences and to serve the identification of changes in the environment as caused by oneself. The amplitude of the auditory event-related potential (ERP) component N1 seems to closely reflect this matching process, while the later positive component (P2/P3a) has been associated with judgments of agency, which are also sensitive to contextual top-down information. In this study, we examined the effect of perceived control over sound production on the processing of self-generated and external stimuli, as reflected in these components. We used a new version of a classic two-button choice task to induce different degrees of the illusion of control (IoC) and recorded ERPs for the processing of self-generated and external sounds in a subsequent task. N1 amplitudes were reduced for self-generated compared to external sounds, but not significantly affected by IoC. P2/P3a amplitudes were affected by IoC: we found reduced P2/P3a amplitudes after a high compared to a low IoC induction training, but only for self-generated, not for external sounds. These findings suggest that prior contextual belief information induced by an IoC affects later processing as reflected in the P2/P3a, possibly for the formation of agency judgments, while early processing reflecting motor-based predictions is not affected.
Affiliation(s)
- Alexander Seidel
- Institute of Experimental Psychology, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany
- Marta Ghio
- CIMeC - Center for Mind/Brain Sciences, University of Trento, Trento, Italy
- Bettina Studer
- Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, Germany; Department of Neurology, Mauritius Hospital Meerbusch, Meerbusch, Germany
- Christian Bellebaum
- Institute of Experimental Psychology, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany

19
Cannon JJ, Patel AD. How Beat Perception Co-opts Motor Neurophysiology. Trends Cogn Sci 2020; 25:137-150. [PMID: 33353800] [DOI: 10.1016/j.tics.2020.11.002] [Received: 04/20/2020] [Revised: 11/06/2020] [Accepted: 11/12/2020] [Indexed: 02/08/2023]
Abstract
Beat perception offers cognitive scientists an exciting opportunity to explore how cognition and action are intertwined in the brain even in the absence of movement. Many believe the motor system predicts the timing of beats, yet current models of beat perception do not specify how this is neurally implemented. Drawing on recent insights into the neurocomputational properties of the motor system, we propose that beat anticipation relies on action-like processes consisting of precisely patterned neural time-keeping activity in the supplementary motor area (SMA), orchestrated and sequenced by activity in the dorsal striatum. In addition to synthesizing recent advances in cognitive science and motor neuroscience, our framework provides testable predictions to guide future work.
Affiliation(s)
- Jonathan J Cannon
- Department of Brain and Cognitive Science, Massachusetts Institute of Technology, Cambridge, MA, USA
- Aniruddh D Patel
- Department of Psychology, Tufts University, Medford, MA, USA; Program in Brain, Mind, and Consciousness, Canadian Institute for Advanced Research (CIFAR), Toronto, Canada

20
Pinheiro AP, Schwartze M, Gutiérrez-Domínguez F, Kotz SA. Real and imagined sensory feedback have comparable effects on action anticipation. Cortex 2020; 130:290-301. [PMID: 32698087] [DOI: 10.1016/j.cortex.2020.04.030] [Received: 11/14/2019] [Revised: 03/23/2020] [Accepted: 04/13/2020] [Indexed: 01/08/2023]
Abstract
The forward model monitors the success of sensory feedback to an action and links it to an efference copy originating in the motor system. The Readiness Potential (RP) of the electroencephalogram has been denoted as a neural signature of the efference copy. An open question is whether imagined sensory feedback works similarly to real sensory feedback. We investigated the RP to audible and imagined sounds in a button-press paradigm and assessed the role of sound complexity (vocal vs. non-vocal sound). Sensory feedback (both audible and imagined) in response to a voluntary action modulated the RP amplitude time-locked to the button press. The RP amplitude increase was larger for actions with expected sensory feedback (audible and imagined) than those without sensory feedback, and associated with N1 suppression for audible sounds. Further, the early RP phase was increased when actions elicited an imagined vocal (self-voice) compared to non-vocal sound. Our results support the notion that sensory feedback is anticipated before voluntary actions. This is the case for both audible and imagined sensory feedback and confirms a role of overt and covert feedback in the forward model.
Affiliation(s)
- Ana P Pinheiro
- CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisbon, Portugal; Faculty of Psychology and Neuroscience, University of Maastricht, Maastricht, The Netherlands
- Michael Schwartze
- Faculty of Psychology and Neuroscience, University of Maastricht, Maastricht, The Netherlands
- Sonja A Kotz
- Faculty of Psychology and Neuroscience, University of Maastricht, Maastricht, The Netherlands

21
Buaron B, Reznik D, Gilron R, Mukamel R. Voluntary Actions Modulate Perception and Neural Representation of Action-Consequences in a Hand-Dependent Manner. Cereb Cortex 2020; 30:6097-6107. [PMID: 32607565] [DOI: 10.1093/cercor/bhaa156] [Received: 02/10/2020] [Revised: 05/17/2020] [Accepted: 05/18/2020] [Indexed: 12/13/2022] Open
Abstract
Evoked neural activity in sensory regions and perception of sensory stimuli are modulated when the stimuli are the consequence of voluntary movement, as opposed to an external source. It has been suggested that such modulations are due to motor commands that are sent to relevant sensory regions during voluntary movement. However, given the anatomical-functional laterality bias of the motor system, it is plausible that the pattern of such behavioral and neural modulations will also exhibit a similar bias, depending on the effector triggering the stimulus (e.g., right/left hand). Here, we examined this issue in the visual domain using behavioral and neural measures (fMRI). Healthy participants judged the relative brightness of identical visual stimuli that were either self-triggered (using right/left hand button presses), or triggered by the computer. Stimuli were presented either in the right or left visual field. Despite identical physical properties of the visual consequences, we found stronger perceptual modulations when the triggering hand was ipsi- (rather than contra-) lateral to the stimulated visual field. Additionally, fMRI responses in visual cortices differentiated between stimuli triggered by right/left hand. Our findings support a model in which voluntary actions induce sensory modulations that follow the anatomical-functional bias of the motor system.
Affiliation(s)
- Batel Buaron
- Sagol School of Neuroscience, School of Psychological Sciences, Tel-Aviv University, Tel Aviv 69978, Israel
- Daniel Reznik
- Department of Psychology, Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Ro'ee Gilron
- Department of Neurological Surgery, UCSF School of Medicine, UCSF, San Francisco, CA 94115, USA
- Roy Mukamel
- Sagol School of Neuroscience, School of Psychological Sciences, Tel-Aviv University, Tel Aviv 69978, Israel

22
van Kemenade BM, Arikan BE, Podranski K, Steinsträter O, Kircher T, Straube B. Distinct Roles for the Cerebellum, Angular Gyrus, and Middle Temporal Gyrus in Action-Feedback Monitoring. Cereb Cortex 2020; 29:1520-1531. [PMID: 29912297] [DOI: 10.1093/cercor/bhy048] [Received: 02/09/2017] [Revised: 01/29/2018] [Accepted: 02/14/2018] [Indexed: 12/27/2022] Open
Abstract
Action-feedback monitoring is essential to ensure meaningful interactions with the external world. This process involves generating efference copy-based sensory predictions and comparing these with the actual action-feedback. Previous fMRI studies have reported heterogeneous neural correlates of this comparator process, including the cerebellum, the angular gyrus, and the middle temporal gyrus. However, these studies usually comprised only self-generated actions. Therefore, they might have induced not only action-based prediction errors, but also general sensory mismatch errors. Here, we aimed to disentangle these processes using a custom-made fMRI-compatible movement device generating active and passive hand movements with identical sensory feedback. Online visual feedback of the hand was presented with a variable delay. Participants had to judge whether the feedback was delayed. Activity in the right cerebellum correlated more positively with delay in active than in passive trials. Interestingly, we also observed activation in the angular and middle temporal gyri, but across both active and passive conditions. This suggests that the cerebellum is a comparator area specific to voluntary action, whereas the angular and middle temporal gyri seem to detect more general intersensory conflict. Correlations with behavior and cerebellar activity nevertheless suggest involvement of these temporoparietal areas in processing and awareness of temporal discrepancies in action-feedback monitoring.
Affiliation(s)
- Bianca M van Kemenade
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Straße 8, Marburg, Germany
- B Ezgi Arikan
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Straße 8, Marburg, Germany
- Kornelius Podranski
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Straße 8, Marburg, Germany; Technische Hochschule Mittelhessen, Wiesenstraße 14, Gießen, Germany
- Olaf Steinsträter
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Straße 8, Marburg, Germany
- Tilo Kircher
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Straße 8, Marburg, Germany
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Straße 8, Marburg, Germany

23
Heins N, Pomp J, Kluger DS, Trempler I, Zentgraf K, Raab M, Schubotz RI. Incidental or Intentional? Different Brain Responses to One's Own Action Sounds in Hurdling vs. Tap Dancing. Front Neurosci 2020; 14:483. [PMID: 32477059] [PMCID: PMC7237737] [DOI: 10.3389/fnins.2020.00483] [Received: 01/17/2020] [Accepted: 04/20/2020] [Indexed: 12/20/2022] Open
Abstract
Most human actions produce concomitant sounds. Action sounds can be either part of the action goal (GAS, goal-related action sounds), as for instance in tap dancing, or a mere by-product of the action (BAS, by-product action sounds), as for instance in hurdling. It is currently unclear whether these two types of action sounds, intentional and incidental, differ in their neural representation, and whether their impact on the performance evaluation of an action diverges. We here examined whether, during the observation of tap dancing compared to hurdling, auditory information is a more important factor for positive action quality ratings. Moreover, we tested whether observation of tap dancing vs. hurdling led to stronger attenuation in primary auditory cortex, and a stronger mismatch signal when sounds do not match our expectations. We recorded individual point-light videos of newly trained participants performing tap dancing and hurdling. In the subsequent functional magnetic resonance imaging (fMRI) session, participants were presented with the videos that displayed their own actions, including corresponding action sounds, and were asked to rate the quality of their performance. Videos were either in their original form or scrambled regarding the visual modality, the auditory modality, or both. As hypothesized, behavioral results showed significantly lower rating scores in the GAS condition compared to the BAS condition when the auditory modality was scrambled. Functional MRI contrasts between BAS and GAS actions revealed higher activation of primary auditory cortex in the BAS condition, speaking in favor of stronger attenuation in GAS, as well as stronger activation of the posterior superior temporal gyri and the supplementary motor area in GAS. Results suggest that the processing of self-generated action sounds depends on whether or not we intend to produce a sound with our action, and that action sounds may be more prone to be used as sensory feedback when they are part of the explicit action goal. Our findings contribute to a better understanding of the function of action sounds for learning and controlling sound-producing actions.
Affiliation(s)
- Nina Heins
- Department of Psychology, University of Muenster, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
- Jennifer Pomp
- Department of Psychology, University of Muenster, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
- Daniel S. Kluger
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany; Institute for Biomagnetism and Biosignalanalysis, University of Muenster, Münster, Germany
- Ima Trempler
- Department of Psychology, University of Muenster, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
- Karen Zentgraf
- Department of Movement Science and Training in Sports, Institute of Sport Sciences, Goethe University Frankfurt, Frankfurt, Germany
- Markus Raab
- Department of Performance Psychology, Institute of Psychology, German Sport University Cologne, Cologne, Germany; School of Applied Sciences, London South Bank University, London, United Kingdom
- Ricarda I. Schubotz
- Department of Psychology, University of Muenster, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany

24
Krala M, van Kemenade B, Straube B, Kircher T, Bremmer F. Predictive coding in a multisensory path integration task: An fMRI study. J Vis 2020; 19:13. [PMID: 31561251] [DOI: 10.1167/19.11.13] [Indexed: 11/24/2022] Open
Abstract
During self-motion through an environment, our sensory systems are confronted with a constant flow of information from different modalities. To successfully navigate, self-induced sensory signals have to be dissociated from externally induced sensory signals. Previous studies have suggested that the processing of self-induced sensory information is modulated by means of predictive coding mechanisms. However, the neural correlates of processing self-induced sensory information from different modalities during self-motion are largely unknown. Here, we asked if and how the processing of visually simulated self-motion and/or associated auditory stimuli is modulated by self-controlled action. Participants were asked to actively reproduce a previously observed simulated self-displacement (path integration). Blood oxygen level-dependent (BOLD) activation during this path integration was compared with BOLD activation during a condition in which we passively replayed the exact sensory stimulus that had been produced by the participants in previous trials. We found supramodal BOLD suppression in parietal and frontal regions. Remarkably, BOLD contrast in sensory areas was enhanced in a modality-specific manner. We conclude that the effect of action on sensory processing is strictly dependent on the respective behavioral task and its relevance.
Affiliation(s)
- Milosz Krala
- Department of Neurophysics, University of Marburg, Marburg, Germany
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
- Bianca van Kemenade
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
- Translational Neuroimaging Marburg, Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Benjamin Straube
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
- Translational Neuroimaging Marburg, Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Tilo Kircher
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
- Translational Neuroimaging Marburg, Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Frank Bremmer
- Department of Neurophysics, University of Marburg, Marburg, Germany
- Center for Mind, Brain and Behavior-CMBB, University of Marburg and Justus-Liebig-University Giessen, Germany
25
Role of the supplementary motor area in auditory sensory attenuation. Brain Struct Funct 2019; 224:2577-2586. [DOI: 10.1007/s00429-019-01920-x] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/28/2019] [Accepted: 07/06/2019] [Indexed: 11/25/2022]
26
Kral A, Dorman MF, Wilson BS. Neuronal Development of Hearing and Language: Cochlear Implants and Critical Periods. Annu Rev Neurosci 2019; 42:47-65. [DOI: 10.1146/annurev-neuro-080317-061513] [Citation(s) in RCA: 71] [Impact Index Per Article: 11.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
The modern cochlear implant (CI) is the most successful neural prosthesis developed to date. CIs provide hearing to the profoundly hearing impaired and allow the acquisition of spoken language in children born deaf. Results from studies enabled by the CI have provided new insights into (a) minimal representations at the periphery for speech reception, (b) brain mechanisms for decoding speech presented in quiet and in acoustically adverse conditions, (c) the developmental neuroscience of language and hearing, and (d) the mechanisms and time courses of intramodal and cross-modal plasticity. Additionally, the results have underscored the interconnectedness of brain functions and the importance of top-down processes in perception and learning. The findings are described in this review with emphasis on the developing brain and the acquisition of hearing and spoken language.
Affiliation(s)
- Andrej Kral
- Institute of AudioNeuroTechnology and Department of Experimental Otology, ENT Clinics, Hannover Medical University, 30625 Hannover, Germany
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Dallas, Texas 75080, USA
- School of Medicine and Health Sciences, Macquarie University, Sydney, New South Wales 2109, Australia
- Michael F. Dorman
- Department of Speech and Hearing Science, Arizona State University, Tempe, Arizona 85287, USA
- Blake S. Wilson
- School of Behavioral and Brain Sciences, The University of Texas at Dallas, Dallas, Texas 75080, USA
- School of Medicine and Pratt School of Engineering, Duke University, Durham, North Carolina 27708, USA
27
Jiang M, Wen Z, Long L, Wong CW, Ye N, Zee C, Chen BT. Assessing Cerebral White Matter Microstructure in Children With Congenital Sensorineural Hearing Loss: A Tract-Based Spatial Statistics Study. Front Neurosci 2019; 13:597. [PMID: 31293368 PMCID: PMC6598398 DOI: 10.3389/fnins.2019.00597] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2019] [Accepted: 05/27/2019] [Indexed: 12/16/2022] Open
Abstract
Objectives: To assess the microstructural properties of cerebral white matter in children with congenital sensorineural hearing loss (CSNHL). Methods: Children (>4 years of age) with profound CSNHL and healthy controls with normal hearing (the control group) were enrolled and underwent brain magnetic resonance imaging (MRI) scans with diffusion tensor imaging (DTI). DTI parameters including fractional anisotropy, mean diffusivity, axial diffusivity, and radial diffusivity were obtained from a whole-brain tract-based spatial statistics analysis and were compared between the two groups. In addition, a region-of-interest (ROI) approach focusing on auditory cortex, i.e., Heschl’s gyrus, using visual cortex, i.e., forceps major, as an internal control was performed. Correlations between mean DTI values and age were obtained with the ROI method. Results: The study cohort consisted of 23 children with CSNHL (11 boys and 12 girls; mean age ± SD: 7.21 ± 2.67 years; range: 4.1–13.5 years) and 18 children in the control group (11 boys and 7 girls; mean age ± SD: 10.86 ± 3.56 years; range: 4.5–15.3 years). We found that axial diffusivity values were significantly greater in the left anterior thalamic radiation, right corticospinal tract, and corpus callosum in the CSNHL group than in the control group (p < 0.05). Significantly higher radial diffusivity values in the white matter tracts were also noted in the CSNHL group compared with the control group (p < 0.05). Fractional anisotropy values in Heschl’s gyrus were lower in the CSNHL group than in the control group (p = 0.0015). There was a significant negative correlation between mean fractional anisotropy values in Heschl’s gyrus and age among children in the CSNHL group younger than 7 years (r = −0.59, p = 0.004). Conclusion: Our study showed higher axial and radial diffusivities in the children affected by CSNHL compared with the normal-hearing children. We also found lower fractional anisotropy values in Heschl’s gyrus in the CSNHL group. Furthermore, we identified a negative correlation between fractional anisotropy values and age up to 7 years in the children born deaf. Our study findings suggest that myelination and axonal structure may be affected by acoustic deprivation. This information may help to monitor hearing rehabilitation in deaf children.
Affiliation(s)
- Muliang Jiang
- Department of Diagnostic Radiology, City of Hope National Medical Center, Duarte, CA, United States
- Zuguang Wen
- Department of Radiology, First Affiliated Hospital of Guangxi Medical University, Nanning, China
- Liling Long
- Department of Radiology, First Affiliated Hospital of Guangxi Medical University, Nanning, China
- Chi Wah Wong
- Center for Informatics, City of Hope National Medical Center, Duarte, CA, United States
- Ningrong Ye
- Department of Diagnostic Radiology, City of Hope National Medical Center, Duarte, CA, United States
- Chishing Zee
- Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, United States
- Bihong T Chen
- Department of Diagnostic Radiology, City of Hope National Medical Center, Duarte, CA, United States
28
Winkowski DE, Nagode DA, Donaldson KJ, Yin P, Shamma SA, Fritz JB, Kanold PO. Orbitofrontal Cortex Neurons Respond to Sound and Activate Primary Auditory Cortex Neurons. Cereb Cortex 2019; 28:868-879. [PMID: 28069762 DOI: 10.1093/cercor/bhw409] [Citation(s) in RCA: 63] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/12/2016] [Accepted: 12/21/2016] [Indexed: 12/13/2022] Open
Abstract
Sensory environments change over a wide dynamic range and sensory processing can change rapidly to facilitate stable perception. While rapid changes may occur throughout the sensory processing pathway, cortical changes are believed to profoundly influence perception. Prior stimulation studies showed that orbitofrontal cortex (OFC) can modify receptive fields and sensory coding in primary auditory cortex (A1), but the engagement of OFC during listening and the pathways mediating OFC influences on A1 are unknown. We show in mice that OFC neurons respond to sounds, consistent with a role of OFC in audition. We then show in vitro that OFC axons are present in A1 and excite pyramidal and GABAergic cells in all layers of A1 via glutamatergic synapses. Optogenetic stimulation of OFC terminals in A1 in vivo evokes short-latency neural activity in A1, and pairing activation of OFC projections in A1 with sounds alters sound-evoked A1 responses. Together, our results identify a direct connection from OFC to A1 that can excite A1 neurons at the earliest stage of cortical processing, and thereby sculpt A1 receptive fields. These results are consistent with a role for OFC in adjusting to the changing behavioral relevance of sensory inputs and modulating A1 receptive fields to enhance sound processing.
Affiliation(s)
- Daniel E Winkowski
- Institute for Systems Research, University of Maryland, College Park, MD 20742, USA
- Department of Biology, University of Maryland, College Park, MD 20742, USA
- Daniel A Nagode
- Department of Biology, University of Maryland, College Park, MD 20742, USA
- Kevin J Donaldson
- Institute for Systems Research, University of Maryland, College Park, MD 20742, USA
- Pingbo Yin
- Institute for Systems Research, University of Maryland, College Park, MD 20742, USA
- Shihab A Shamma
- Institute for Systems Research, University of Maryland, College Park, MD 20742, USA
- Laboratoire des Systèmes Perceptifs, École Normale Supérieure, 75005 Paris, France
- Jonathan B Fritz
- Institute for Systems Research, University of Maryland, College Park, MD 20742, USA
- Patrick O Kanold
- Institute for Systems Research, University of Maryland, College Park, MD 20742, USA
- Department of Biology, University of Maryland, College Park, MD 20742, USA
29
Abstract
Coordinated movement depends on constant interaction between neural circuits that produce motor output and those that report sensory consequences. Fundamental to this process are mechanisms for controlling the influence that sensory signals have on motor pathways - for example, reducing feedback gains when they are disruptive and increasing gains when advantageous. Sensory gain control comes in many forms and serves diverse purposes - in some cases sensory input is attenuated to maintain movement stability and filter out irrelevant or self-generated signals, or enhanced to facilitate salient signals for improved movement execution and adaptation. The ubiquitous presence of sensory gain control across species at multiple levels of the nervous system reflects the importance of tuning the impact that feedback information has on behavioral output.
30
Motor output, neural states and auditory perception. Neurosci Biobehav Rev 2019; 96:116-126. [DOI: 10.1016/j.neubiorev.2018.10.021] [Citation(s) in RCA: 20] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2018] [Revised: 10/26/2018] [Accepted: 10/29/2018] [Indexed: 12/12/2022]
31
Reznik D, Simon S, Mukamel R. Predicted sensory consequences of voluntary actions modulate amplitude of preceding readiness potentials. Neuropsychologia 2018; 119:302-307. [PMID: 30172828 DOI: 10.1016/j.neuropsychologia.2018.08.028] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2018] [Revised: 08/20/2018] [Accepted: 08/29/2018] [Indexed: 10/28/2022]
Abstract
Self-generated, voluntary actions are preceded by a slow negativity in the scalp electroencephalography (EEG) signal recorded from frontal regions (termed the 'readiness potential'; RP). This signal, and its lateralized subcomponent (LRP), is mainly regarded as preparatory motor activity associated with the forthcoming voluntary motor act. However, it is not clear whether this neural signature is associated with preparatory motor activity, expectation of its associated sensory consequences, or both. Here we recorded EEG data from 14 healthy subjects while they performed self-paced button presses with their right index and middle fingers. Button presses with one finger triggered a sound (motor+sound condition), while button presses with the other finger did not (motor-only condition). Additionally, subjects listened to externally generated sounds delivered at expected times (sound-only condition). We found that the RP amplitude (locked to the time of button press) was significantly more negative in the motor+sound condition than in the motor-only condition. Importantly, no signal negativity was observed prior to expected sound delivery in the sound-only condition. Thus, the differences in RP amplitude between the motor+sound and motor-only conditions go beyond differences in mere expectation of a forthcoming auditory sound. Our results suggest that information regarding expected auditory consequences is represented in the RP preceding voluntary action execution.
Affiliation(s)
- Daniel Reznik
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel
- Shiri Simon
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel
- Roy Mukamel
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel
32
Abstract
Hearing is often viewed as a passive process: Sound enters the ear, triggers a cascade of activity through the auditory system, and culminates in an auditory percept. In contrast to a passive process, motor-related signals strongly modulate the auditory system from the eardrum to the cortex. The motor modulation of auditory activity is most well documented during speech and other vocalizations but also can be detected during a wide variety of other sound-generating behaviors. An influential idea is that these motor-related signals suppress neural responses to predictable movement-generated sounds, thereby enhancing sensitivity to environmental sounds during movement while helping to detect errors in learned acoustic behaviors, including speech and musicianship. Findings in humans, monkeys, songbirds, and mice provide new insights into the circuits that convey motor-related signals to the auditory system, while lending support to the idea that these signals function predictively to facilitate hearing and vocal learning.
Affiliation(s)
- David M Schneider
- Department of Neurobiology, Duke University, Durham, North Carolina 27710, USA
- Current affiliation: Center for Neural Science, New York University, New York, New York 10003, USA
- Richard Mooney
- Department of Neurobiology, Duke University, Durham, North Carolina 27710, USA
33
Stark‐Inbar A, Dayan E. Preferential encoding of movement amplitude and speed in the primary motor cortex and cerebellum. Hum Brain Mapp 2017; 38:5970-5986. [PMID: 28885740 PMCID: PMC6867018 DOI: 10.1002/hbm.23802] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2016] [Revised: 07/04/2017] [Accepted: 08/27/2017] [Indexed: 11/06/2022] Open
Abstract
Voluntary movements require control of multiple kinematic parameters, a task carried out by a distributed brain architecture. However, it remains unclear whether regions along the motor system encode single, or rather a mixture of, kinematic parameters during action execution. Here, rapid event-related functional magnetic resonance imaging was used to differentiate brain activity along the motor system during the encoding of movement amplitude, duration, and speed. We present cumulative evidence supporting preferential encoding of kinematic parameters along the motor system, based on the blood-oxygenation-level-dependent signal recorded in a well-controlled single-joint wrist-flexion task. Whereas activity in the left primary motor cortex (M1) showed preferential encoding of movement amplitude, the anterior lobe of the right cerebellum (primarily lobule V) showed preferential encoding of movement speed. Conversely, activity in the left supplementary motor area (SMA), basal ganglia (putamen), and anterior intraparietal sulcus was not preferentially modulated by any specific parameter. We found no preference in peak activation for duration encoding in any of the tested regions. Electromyographic data were mainly modulated by movement amplitude, limiting the distinction between amplitude and muscle-force encoding. Together, these results suggest that during single-joint movements, distinct kinematic parameters are controlled by largely distinct brain regions that work together to produce and control precise movements.
Affiliation(s)
- Alit Stark‐Inbar
- Department of Psychology, University of California, Berkeley, California
- Eran Dayan
- Department of Radiology, Biomedical Research Imaging Center and Neuroscience Curriculum, University of North Carolina at Chapel Hill, North Carolina
34
Li Q, Liu G, Wei D, Guo J, Yuan G, Wu S. The spatiotemporal pattern of pure tone processing: A single-trial EEG-fMRI study. Neuroimage 2017; 187:184-191. [PMID: 29191479 DOI: 10.1016/j.neuroimage.2017.11.059] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2017] [Revised: 11/23/2017] [Accepted: 11/26/2017] [Indexed: 12/12/2022] Open
Abstract
Although considerable research has been published on pure tone processing, its spatiotemporal pattern is not well understood. Specifically, the link between neural activity in the auditory pathway measured by functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) markers of pure tone processing in the P1, N1, P2, and N4 components is not well established. In this study, we used single-trial EEG-fMRI as a multi-modal fusion approach to integrate concurrently acquired EEG and fMRI data, in order to understand the spatial and temporal aspects of the pure tone processing pathway. Data were recorded from 33 subjects who were presented with stochastically alternating pure tone sequences with two different frequencies: 200 and 6400 Hz. A brain network correlated with trial-to-trial variability of the task-discriminating EEG amplitude was identified. We found that neural responses to pure tones are spatially distributed along the auditory pathway and temporally divided into three stages: (1) the early stage (P1), wherein activation occurs in the midbrain, which constitutes a part of the low-level auditory pathway; (2) the middle stage (N1, P2), wherein correlates were found in areas associated with the posterodorsal auditory pathway, including the primary auditory cortex and the motor cortex; (3) the late stage (N4), wherein correlation was found in the motor cortex. This indicates that trial-by-trial variation in neural activity in the P1, N1, P2, and N4 components reflects the sequential engagement of low- and high-level parts of the auditory pathway for pure tone processing. Our results demonstrate that during simple pure tone listening tasks, regions associated with the auditory pathway transiently correlate with trial-to-trial variability of the EEG amplitude, and they do so on a millisecond timescale with a distinct temporal ordering.
Affiliation(s)
- Qiang Li
- College of Electronic and Information Engineering, Southwest University, No. 2, TianSheng Street, Beibei, Chongqing 400715, China
- Guangyuan Liu
- College of Electronic and Information Engineering, Southwest University, No. 2, TianSheng Street, Beibei, Chongqing 400715, China
- Dongtao Wei
- Department of Psychology, Southwest University, No. 2, TianSheng Street, Beibei, Chongqing 400715, China
- Jing Guo
- College of Electronic and Information Engineering, Southwest University, No. 2, TianSheng Street, Beibei, Chongqing 400715, China
- Guangjie Yuan
- College of Electronic and Information Engineering, Southwest University, No. 2, TianSheng Street, Beibei, Chongqing 400715, China
- Shifu Wu
- College of Electronic and Information Engineering, Southwest University, No. 2, TianSheng Street, Beibei, Chongqing 400715, China
35
Abstract
In behavior, action and perception are inherently interdependent. However, the actual mechanistic contributions of the motor system to sensory processing are unknown. We present neurophysiological evidence that the motor system is involved in predictive timing, a brain function that aligns temporal fluctuations of attention with the timing of events in a task-relevant stream, thus facilitating sensory selection and optimizing behavior. In a magnetoencephalography experiment involving auditory temporal attention, participants had to disentangle two streams of sound on the unique basis of endogenous temporal cues. We show that temporal predictions are encoded by interdependent delta and beta neural oscillations originating from the left sensorimotor cortex, and directed toward auditory regions. We also found that overt rhythmic movements improved the quality of temporal predictions and sharpened the temporal selection of relevant auditory information. This latter behavioral and functional benefit was associated with increased signaling of temporal predictions in right-lateralized frontoparietal associative regions. In sum, this study points at a covert form of auditory active sensing. Our results emphasize the key role of motor brain areas in providing contextual temporal information to sensory regions, driving perceptual and behavioral selection.
36
Andoh J, Ferreira M, Leppert I, Matsushita R, Pike B, Zatorre R. How restful is it with all that noise? Comparison of Interleaved silent steady state (ISSS) and conventional imaging in resting-state fMRI. Neuroimage 2017; 147:726-735. [DOI: 10.1016/j.neuroimage.2016.11.065] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2016] [Revised: 11/03/2016] [Accepted: 11/26/2016] [Indexed: 01/24/2023] Open
37
Lima CF, Krishnan S, Scott SK. Roles of Supplementary Motor Areas in Auditory Processing and Auditory Imagery. Trends Neurosci 2016; 39:527-542. [PMID: 27381836 PMCID: PMC5441995 DOI: 10.1016/j.tins.2016.06.003] [Citation(s) in RCA: 154] [Impact Index Per Article: 17.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2016] [Revised: 05/26/2016] [Accepted: 06/09/2016] [Indexed: 11/28/2022]
Abstract
Although the supplementary and pre-supplementary motor areas have been intensely investigated in relation to their motor functions, they are also consistently reported in studies of auditory processing and auditory imagery. This involvement is commonly overlooked, in contrast to lateral premotor and inferior prefrontal areas. We argue here for the engagement of supplementary motor areas across a variety of sound categories, including speech, vocalizations, and music, and we discuss how our understanding of auditory processes in these regions relates to findings and hypotheses from the motor literature. We suggest that supplementary and pre-supplementary motor areas play a role in facilitating spontaneous motor responses to sound, and in supporting a flexible engagement of sensorimotor processes to enable imagery and to guide auditory perception.
- Hearing and imagining sounds, including speech, vocalizations, and music, can recruit SMA and pre-SMA, which are normally discussed in relation to their motor functions.
- Emerging research indicates that individual differences in the structure and function of SMA and pre-SMA can predict performance in auditory perception and auditory imagery tasks.
- Responses during auditory processing primarily peak in pre-SMA and in the boundary area between pre-SMA and SMA; this boundary area is crucially involved in the control of speech and vocal production, suggesting that sounds engage this region in an effector-specific manner.
- Activating sound-related motor representations in SMA and pre-SMA might facilitate behavioral responses to sounds, and might also support a flexible generation of sensory predictions based on previous experience to enable imagery and guide perception.
Affiliation(s)
- César F Lima
- Institute of Cognitive Neuroscience, University College London, London, UK
- Saloni Krishnan
- Department of Experimental Psychology, University of Oxford, Oxford, UK
- Sophie K Scott
- Institute of Cognitive Neuroscience, University College London, London, UK
38
Reznik D, Henkin Y, Levy O, Mukamel R. Perceived loudness of self-generated sounds is differentially modified by expected sound intensity. PLoS One 2015; 10:e0127651. [PMID: 25992603 PMCID: PMC4436370 DOI: 10.1371/journal.pone.0127651] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2015] [Accepted: 04/17/2015] [Indexed: 11/30/2022] Open
Abstract
Performing actions with sensory consequences modifies physiological and behavioral responses relative to otherwise identical sensory input perceived in a passive manner. It is assumed that such modifications occur through an efference copy sent from motor cortex to sensory regions during performance of voluntary actions. In the auditory domain, most behavioral studies report attenuated perceived loudness of self-generated auditory action-consequences. However, several recent behavioral and physiological studies report enhanced responses to such consequences. Here we manipulated the intensity of self-generated and externally-generated sounds and examined the type of perceptual modification (enhancement vs. attenuation) reported by healthy human subjects. We found that when the intensity of self-generated sounds was low, perceived loudness was enhanced. Conversely, when the intensity of self-generated sounds was high, perceived loudness was attenuated. These results might reconcile some of the apparent discrepancies in the reported literature and suggest that efference copies can adapt perception according to the differential sensory context of voluntary actions.
Affiliation(s)
- Daniel Reznik
- School of Psychological Sciences, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Yael Henkin
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Department of Communication Disorders, Sackler Faculty of Medicine, Tel Aviv University, Tel Aviv, Israel
- Hearing, Speech, and Language Center, Sheba Medical Center, Tel Hashomer, Ramat Gan, Israel
- Osnat Levy
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Roy Mukamel
- School of Psychological Sciences, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel