1
Ahveninen J, Lee HJ, Yu HY, Lee CC, Chou CC, Ahlfors SP, Kuo WJ, Jääskeläinen IP, Lin FH. Visual Stimuli Modulate Local Field Potentials But Drive No High-Frequency Activity in Human Auditory Cortex. J Neurosci 2024; 44:e0890232023. PMID: 38129133; PMCID: PMC10869150; DOI: 10.1523/jneurosci.0890-23.2023.
Abstract
Neuroimaging studies suggest cross-sensory visual influences in human auditory cortices (ACs). Whether these influences reflect active visual processing in human ACs, which drives neuronal firing and concurrent broadband high-frequency activity (BHFA; >70 Hz), or whether they merely modulate sound processing remains debated. Here, we presented auditory, visual, and audiovisual stimuli to 16 participants (7 women, 9 men) with stereo-EEG depth electrodes implanted near ACs for presurgical monitoring. Anatomically normalized group analyses were facilitated by inverse modeling of intracranial source currents. Analyses of intracranial event-related potentials (iERPs) suggested cross-sensory responses to visual stimuli in ACs, which lagged the earliest auditory responses by several tens of milliseconds. Visual stimuli also modulated the phase of intrinsic low-frequency oscillations and triggered 15-30 Hz event-related desynchronization in ACs. However, BHFA, a putative correlate of neuronal firing, was not significantly increased in ACs after visual stimuli, not even when they coincided with auditory stimuli. The intracranial recordings thus demonstrate cross-sensory modulations, but no indication of active visual processing, in human ACs.
Affiliation(s)
- Jyrki Ahveninen
- Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, Massachusetts 02129
- Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115
- Hsin-Ju Lee
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada
- Department of Medical Biophysics, University of Toronto, Toronto, Ontario M5G 1L7, Canada
- Hsiang-Yu Yu
- Department of Epilepsy, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Cheng-Chia Lee
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Department of Neurosurgery, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- Chien-Chen Chou
- Department of Epilepsy, Neurological Institute, Taipei Veterans General Hospital, Taipei 11217, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Seppo P Ahlfors
- Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, Massachusetts 02129
- Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115
- Wen-Jui Kuo
- Institute of Neuroscience, National Yang Ming Chiao Tung University, Taipei 112304, Taiwan
- Iiro P Jääskeläinen
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, FI-00076 AALTO, Finland
- International Laboratory of Social Neurobiology, Institute of Cognitive Neuroscience, Higher School of Economics, Moscow 101000, Russia
- Fa-Hsuan Lin
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada
- Department of Medical Biophysics, University of Toronto, Toronto, Ontario M5G 1L7, Canada
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Espoo, FI-00076 AALTO, Finland
2
Zuo Y, Wang Z. Neural Oscillations and Multisensory Processing. Adv Exp Med Biol 2024; 1437:121-137. PMID: 38270857; DOI: 10.1007/978-981-99-7611-9_8.
Abstract
Neural oscillations play a role in sensory processing by coordinating synchronized neuronal activity. Synchronization of gamma oscillations supports local computation of feedforward signals, whereas synchronization of alpha-beta oscillations supports feedback processing across long-range areas. These spatially and spectrally segregated bidirectional signals may be integrated through cross-frequency coupling. Synchronization of neural oscillations has also been proposed as a mechanism for integrating information across multiple sensory modalities. A transient or rhythmic stimulus in one modality can phase-align ongoing neural oscillations in multiple sensory cortices, through a mechanism of cross-modal phase reset or cross-modal neural entrainment. Synchronized activity in multiple sensory cortices is then more likely to drive stronger activity in downstream areas. By contrast, desynchronized oscillations may impede signal processing and may contribute to sensory selection by setting the oscillations in the target-related cortex and those in the distractor-related cortex to opposite phases.
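The cross-modal phase reset described in this abstract is typically quantified with inter-trial phase coherence (ITC): if a stimulus in one modality resets ongoing oscillations in another modality's cortex, phases across trials cluster at a fixed post-stimulus latency. A minimal sketch of the metric, with simulated phases (the function name and simulation parameters are illustrative, not from the chapter):

```python
import numpy as np

def inter_trial_phase_coherence(phases):
    """ITC at one time point: length of the mean resultant vector of
    trial phases (0 = uniform phases, 1 = perfectly aligned phases)."""
    phases = np.asarray(phases)
    return float(np.abs(np.mean(np.exp(1j * phases))))

rng = np.random.default_rng(0)
# After a cross-modal phase reset: trial phases cluster near a common angle.
reset = rng.normal(loc=0.0, scale=0.3, size=200)
# Without a reset: trial phases are uniform over the circle.
no_reset = rng.uniform(-np.pi, np.pi, size=200)

print(inter_trial_phase_coherence(reset))     # close to 1
print(inter_trial_phase_coherence(no_reset))  # close to 0
```

In practice the phases would come from a time-frequency decomposition (e.g., a Hilbert transform of band-passed signals), and ITC would be computed per channel, frequency, and time point, then compared against a pre-stimulus baseline.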
Affiliation(s)
- Yanfang Zuo
- Department of Neurology, Guangzhou First People's Hospital, School of Medicine, South China University of Technology, Guangzhou, China
- Center for Medical Research on Innovation and Translation, Institute of Clinical Medicine, Guangzhou First People's Hospital, School of Medicine, South China University of Technology, Guangzhou, China
- Zuoren Wang
- Institute of Neuroscience, State Key Laboratory of Neuroscience, CAS Center for Excellence in Brain Science & Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- University of Chinese Academy of Sciences, Beijing, China
3
Li Y, Anumanchipalli GK, Mohamed A, Chen P, Carney LH, Lu J, Wu J, Chang EF. Dissecting neural computations in the human auditory pathway using deep neural networks for speech. Nat Neurosci 2023; 26:2213-2225. PMID: 37904043; PMCID: PMC10689246; DOI: 10.1038/s41593-023-01468-4.
Abstract
The human auditory system extracts rich linguistic abstractions from speech signals. Traditional approaches to understanding this complex process have used linear feature-encoding models, with limited success. Artificial neural networks excel in speech recognition tasks and offer promising computational models of speech processing. We used speech representations in state-of-the-art deep neural network (DNN) models to investigate neural coding from the auditory nerve to the speech cortex. Representations in hierarchical layers of the DNN correlated well with the neural activity throughout the ascending auditory system. Unsupervised speech models performed at least as well as other purely supervised or fine-tuned models. Deeper DNN layers were better correlated with the neural activity in the higher-order auditory cortex, with computations aligned with phonemic and syllabic structures in speech. Accordingly, DNN models trained on either English or Mandarin predicted cortical responses in native speakers of each language. These results reveal convergence between DNN model representations and the biological auditory pathway, offering new approaches for modeling neural coding in the auditory cortex.
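The layer-to-brain comparison described in this abstract is commonly implemented as an encoding model: regress a neural response onto a DNN layer's activations and score the fit on held-out data. A hedged sketch with simulated data (the variable names, ridge penalty, and single train/test split are illustrative, not the authors' pipeline):

```python
import numpy as np

def layer_brain_correlation(layer_acts, neural, alpha=1.0):
    """Ridge-regress one neural channel onto DNN layer activations and
    return the Pearson correlation on the held-out second half."""
    n = layer_acts.shape[0]
    half = n // 2
    X_tr, X_te = layer_acts[:half], layer_acts[half:]
    y_tr, y_te = neural[:half], neural[half:]
    # Closed-form ridge solution: w = (X^T X + alpha*I)^-1 X^T y
    d = X_tr.shape[1]
    w = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(d), X_tr.T @ y_tr)
    pred = X_te @ w
    return float(np.corrcoef(pred, y_te)[0, 1])

rng = np.random.default_rng(1)
acts = rng.normal(size=(400, 20))                          # simulated layer activations
w_true = rng.normal(size=20)
informative = acts @ w_true + 0.5 * rng.normal(size=400)   # response driven by the layer
unrelated = rng.normal(size=400)                           # response unrelated to the layer
```

Repeating this per layer and per recording site yields the layer-depth profiles the abstract refers to: a site is "better correlated" with a deeper layer when that layer's held-out prediction correlation is higher.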
Affiliation(s)
- Yuanning Li
- Department of Neurological Surgery, University of California, San Francisco, San Francisco, CA, USA
- School of Biomedical Engineering & State Key Laboratory of Advanced Medical Materials and Devices, ShanghaiTech University, Shanghai, China
- Gopala K Anumanchipalli
- Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, USA
- Department of Electrical Engineering and Computer Science, University of California, Berkeley, Berkeley, CA, USA
- Peili Chen
- School of Biomedical Engineering & State Key Laboratory of Advanced Medical Materials and Devices, ShanghaiTech University, Shanghai, China
- Laurel H Carney
- Department of Biomedical Engineering, University of Rochester, Rochester, NY, USA
- Junfeng Lu
- Neurologic Surgery Department, Huashan Hospital, Shanghai Medical College, Fudan University, Shanghai, China
- Brain Function Laboratory, Neurosurgical Institute, Fudan University, Shanghai, China
- Jinsong Wu
- Neurologic Surgery Department, Huashan Hospital, Shanghai Medical College, Fudan University, Shanghai, China
- Brain Function Laboratory, Neurosurgical Institute, Fudan University, Shanghai, China
- Edward F Chang
- Department of Neurological Surgery, University of California, San Francisco, San Francisco, CA, USA
- Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, USA
4
Delli Pizzi S, Chiacchiaretta P, Sestieri C, Ferretti A, Tullo MG, Della Penna S, Martinotti G, Onofrj M, Roseman L, Timmermann C, Nutt DJ, Carhart-Harris RL, Sensi SL. LSD-induced changes in the functional connectivity of distinct thalamic nuclei. Neuroimage 2023; 283:120414. PMID: 37858906; DOI: 10.1016/j.neuroimage.2023.120414.
Abstract
The role of the thalamus in mediating the effects of lysergic acid diethylamide (LSD) was recently proposed in a model of communication and corroborated by imaging studies. However, a detailed analysis of LSD effects on nuclei-resolved thalamocortical connectivity is still missing. Here, in a group of healthy volunteers, we evaluated whether LSD intake alters thalamocortical coupling in a nucleus-specific manner. Structural and resting-state functional Magnetic Resonance Imaging (MRI) data were acquired in a placebo-controlled study of subjects exposed to acute LSD administration. Structural MRI was used to parcel the thalamus into its constituent nuclei based on individual anatomy. Nucleus-specific changes in resting-state functional MRI (rs-fMRI) connectivity were mapped using a seed-based approach. LSD intake selectively increased the thalamocortical functional connectivity (FC) of the ventral complex, pulvinar, and non-specific nuclei. Functional coupling was increased between these nuclei and sensory cortices, including the somatosensory and auditory networks. The ventral and pulvinar nuclei also exhibited increased FC with parts of the associative cortex that are dense in serotonin type 2A receptors; these areas are hyperactive and hyper-connected upon LSD intake. At the subcortical level, LSD increased the functional coupling among the thalamus's ventral, pulvinar, and non-specific nuclei, but decreased striatal-thalamic connectivity. These findings help unravel the effects of LSD on subcortical-cortical circuits and associated behavioral outputs.
Affiliation(s)
- Stefano Delli Pizzi
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy; Molecular Neurology Unit, Center for Advanced Studies and Technology (CAST), University "G. d'Annunzio" of Chieti-Pescara, Italy
- Piero Chiacchiaretta
- Department of Innovative Technologies in Medicine and Dentistry, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Carlo Sestieri
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy; Institute for Advanced Biomedical Technologies (ITAB), "G. d'Annunzio" University, Chieti-Pescara, Italy
- Antonio Ferretti
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy; Institute for Advanced Biomedical Technologies (ITAB), "G. d'Annunzio" University, Chieti-Pescara, Italy; UdA-TechLab, Research Center, University "G. d'Annunzio" of Chieti-Pescara, 66100 Chieti, Italy
- Maria Giulia Tullo
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy
- Stefania Della Penna
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy; Institute for Advanced Biomedical Technologies (ITAB), "G. d'Annunzio" University, Chieti-Pescara, Italy
- Giovanni Martinotti
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy
- Marco Onofrj
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy
- Leor Roseman
- Centre for Psychedelic Research, Faculty of Medicine, Imperial College London, London, United Kingdom
- Christopher Timmermann
- Centre for Psychedelic Research, Faculty of Medicine, Imperial College London, London, United Kingdom
- David J Nutt
- Centre for Psychedelic Research, Faculty of Medicine, Imperial College London, London, United Kingdom
- Robin L Carhart-Harris
- Centre for Psychedelic Research, Faculty of Medicine, Imperial College London, London, United Kingdom; Psychedelics Division, Neuroscape, Neurology, University of California San Francisco
- Stefano L Sensi
- Department of Neuroscience, Imaging, and Clinical Sciences, University "G. d'Annunzio" of Chieti-Pescara, Italy; Molecular Neurology Unit, Center for Advanced Studies and Technology (CAST), University "G. d'Annunzio" of Chieti-Pescara, Italy; Institute for Advanced Biomedical Technologies (ITAB), "G. d'Annunzio" University, Chieti-Pescara, Italy
5
Landelle C, Caron-Guyon J, Nazarian B, Anton J, Sein J, Pruvost L, Amberg M, Giraud F, Félician O, Danna J, Kavounoudias A. Beyond sense-specific processing: decoding texture in the brain from touch and sonified movement. iScience 2023; 26:107965. PMID: 37810223; PMCID: PMC10551894; DOI: 10.1016/j.isci.2023.107965.
Abstract
Texture, a fundamental object attribute, is perceived through multisensory information including touch and auditory cues. Coherent perception may rely on texture representations shared across the senses in the brain. To test this hypothesis, we delivered haptic textures coupled with a sound synthesizer that generated real-time textural sounds. Participants completed roughness estimation tasks with haptic, auditory, or bimodal cues in an MRI scanner. Somatosensory, auditory, and visual cortices were all activated during both haptic and auditory exploration, challenging the traditional view that primary sensory cortices are sense-specific. Furthermore, audio-tactile integration was found in the secondary somatosensory (S2) and primary auditory cortices. Multivariate analyses revealed shared spatial activity patterns in primary motor and somatosensory cortices for discriminating texture across both modalities. This study indicates that primary areas and S2 hold a versatile representation of multisensory textures, which has significant implications for how the brain processes multisensory cues to interact more efficiently with our environment.
Affiliation(s)
- C. Landelle
- McGill University, McConnell Brain Imaging Centre, Department of Neurology and Neurosurgery, Montreal Neurological Institute, Montreal, QC, Canada
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- J. Caron-Guyon
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- University of Louvain, Institute for Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, Crossmodal Perception and Plasticity Laboratory, Louvain-la-Neuve, Belgium
- B. Nazarian
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J.L. Anton
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J. Sein
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- L. Pruvost
- Aix-Marseille Université, CNRS, Perception, Représentations, Image, Son, Musique, PRISM UMR 7061, Marseille, France
- M. Amberg
- Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- F. Giraud
- Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- O. Félician
- Aix-Marseille Université, INSERM, Institut des Neurosciences des Systèmes, INS UMR 1106, Marseille, France
- J. Danna
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- Université de Toulouse, CNRS, Laboratoire Cognition, Langues, Langage, Ergonomie, CLLE UMR5263, Toulouse, France
- A. Kavounoudias
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
6
Scott SK, Jasmin K. Rostro-caudal networks for sound processing in the primate brain. Front Neurosci 2022; 16:1076374. PMID: 36590301; PMCID: PMC9797816; DOI: 10.3389/fnins.2022.1076374.
Abstract
Sound is processed in primate brains along anatomically and functionally distinct streams; this pattern is seen in both human and non-human primates. We have previously proposed a general auditory processing framework in which these distinct streams are associated with different computational characteristics. In this paper, we consider how recent work supports our framework.
Affiliation(s)
- Sophie K. Scott
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- Kyle Jasmin
- Department of Psychology, Royal Holloway, University of London, Egham, United Kingdom
7
Franken MK, Liu BC, Ostry DJ. Towards a somatosensory theory of speech perception. J Neurophysiol 2022; 128:1683-1695. PMID: 36416451; PMCID: PMC9762980; DOI: 10.1152/jn.00381.2022.
Abstract
Speech perception is known to be a multimodal process, relying not only on auditory input but also on the visual system and possibly on the motor system as well. To date, there has been little work on the potential involvement of the somatosensory system in speech perception. In the present review, we identify the somatosensory system as another contributor to speech perception. First, we argue that evidence in favor of a motor contribution to speech perception can just as easily be interpreted as showing somatosensory involvement. Second, physiological and neuroanatomical evidence for auditory-somatosensory interactions across the auditory hierarchy indicates the availability of a neural infrastructure that supports somatosensory involvement in auditory processing in general. Third, there is accumulating evidence for somatosensory involvement in the context of speech specifically. In particular, tactile stimulation modifies speech perception, and auditory speech input elicits activity in somatosensory cortical areas. Moreover, speech sounds can be decoded from activity in somatosensory cortex; lesions to this region affect perception, and vowels can be identified based on somatic input alone. We suggest that the somatosensory involvement in speech perception derives from the somatosensory-auditory pairing that occurs during speech production and learning. By bringing together findings from a set of studies that have not been previously linked, the present article identifies the somatosensory system as a presently unrecognized contributor to speech perception.
Affiliation(s)
- David J Ostry
- McGill University, Montreal, Quebec, Canada
- Haskins Laboratories, New Haven, Connecticut
8
Gao C, Green JJ, Yang X, Oh S, Kim J, Shinkareva SV. Audiovisual integration in the human brain: a coordinate-based meta-analysis. Cereb Cortex 2022; 33:5574-5584. PMID: 36336347; PMCID: PMC10152097; DOI: 10.1093/cercor/bhac443.
Abstract
People can seamlessly integrate a vast array of information from what they see and hear in the noisy and uncertain world. However, the neural underpinnings of audiovisual integration continue to be a topic of debate. Using strict inclusion criteria, we performed an activation likelihood estimation meta-analysis on 121 neuroimaging experiments with a total of 2,092 participants. We found that audiovisual integration is linked with the coexistence of multiple integration sites, including early cortical, subcortical, and higher association areas. Although activity was consistently found within the superior temporal cortex, different portions of this cortical region were identified depending on the analytical contrast used, complexity of the stimuli, and modality within which attention was directed. The context-dependent neural activity related to audiovisual integration suggests a flexible rather than fixed neural pathway for audiovisual integration. Together, our findings highlight a flexible multiple pathways model for audiovisual integration, with superior temporal cortex as the central node in these neural assemblies.
Affiliation(s)
- Chuanji Gao
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Jessica J Green
- Department of Psychology, Institute for Mind and Brain, University of South Carolina, Columbia, SC 29201, USA
- Xuan Yang
- Department of Psychology, Institute for Mind and Brain, University of South Carolina, Columbia, SC 29201, USA
- Sewon Oh
- Department of Psychology, Institute for Mind and Brain, University of South Carolina, Columbia, SC 29201, USA
- Jongwan Kim
- Department of Psychology, Jeonbuk National University, Jeonju, South Korea
- Svetlana V Shinkareva
- Department of Psychology, Institute for Mind and Brain, University of South Carolina, Columbia, SC 29201, USA
9
Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022; 16:1010211. PMID: 36330342; PMCID: PMC9622781; DOI: 10.3389/fnins.2022.1010211.
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
10
Enhancement of speech-in-noise comprehension through vibrotactile stimulation at the syllabic rate. Proc Natl Acad Sci U S A 2022; 119:e2117000119. PMID: 35312362; PMCID: PMC9060510; DOI: 10.1073/pnas.2117000119.
Abstract
Syllables are important building blocks of speech. They occur at a rate between 4 and 8 Hz, corresponding to the theta frequency range of neural activity in the cerebral cortex. When listening to speech, the theta activity becomes aligned to the syllabic rhythm, presumably aiding in parsing a speech signal into distinct syllables. However, this neural activity can be influenced not only by sound but also by somatosensory information. Here, we show that the presentation of vibrotactile signals at the syllabic rate can enhance the comprehension of speech in background noise. We further provide evidence that this multisensory enhancement of speech comprehension reflects the multisensory integration of auditory and tactile information in the auditory cortex.

Speech unfolds over distinct temporal scales, in particular those related to the rhythm of phonemes, syllables, and words. When a person listens to continuous speech, the syllabic rhythm is tracked by neural activity in the theta frequency range. The tracking plays a functional role in speech processing: influencing the theta activity through transcranial current stimulation, for instance, can impact speech perception. The theta-band activity in the auditory cortex can also be modulated through the somatosensory system, but the effect on speech processing has remained unclear. Here, we show that vibrotactile feedback presented at the rate of syllables can modulate and, in fact, enhance the comprehension of a speech signal in background noise. The enhancement occurs when vibrotactile pulses occur at the perceptual center of the syllables, whereas a temporal delay between the vibrotactile signals and the speech stream can lead to a lower level of speech comprehension. We further investigate the neural mechanisms underlying the audiotactile integration through electroencephalographic (EEG) recordings. We find that the audiotactile stimulation modulates the neural response to the speech rhythm, as well as the neural response to the vibrotactile pulses. The modulations of these neural activities reflect the behavioral effects on speech comprehension. Moreover, we demonstrate that speech comprehension can be predicted by particular aspects of the neural responses. Our results evidence a role of vibrotactile information in speech processing and may have applications in future auditory prostheses.
11
Hamilton LS, Oganian Y, Hall J, Chang EF. Parallel and distributed encoding of speech across human auditory cortex. Cell 2021; 184:4626-4639.e13. PMID: 34411517; DOI: 10.1016/j.cell.2021.07.019.
Abstract
Speech perception is thought to rely on a cortical feedforward serial transformation of acoustic into linguistic representations. Using intracranial recordings across the entire human auditory cortex, electrocortical stimulation, and surgical ablation, we show that cortical processing across areas is not consistent with a serial hierarchical organization. Instead, response latency and receptive field analyses demonstrate parallel and distinct information processing in the primary and nonprimary auditory cortices. This functional dissociation was also observed with stimulation: stimulating the primary auditory cortex evoked auditory hallucinations but did not distort or interfere with speech perception, whereas opposite effects were observed during stimulation of nonprimary cortex in the superior temporal gyrus. Ablation of the primary auditory cortex did not affect speech perception. These results establish a distributed functional organization of parallel information processing throughout the human auditory cortex and demonstrate an essential, independent role for nonprimary auditory cortex in speech processing.
Affiliation(s)
- Liberty S Hamilton
- Department of Neurological Surgery, University of California, San Francisco, 675 Nelson Rising Lane, San Francisco, CA 94158, USA
- Yulia Oganian
- Department of Neurological Surgery, University of California, San Francisco, 675 Nelson Rising Lane, San Francisco, CA 94158, USA
- Jeffery Hall
- Department of Neurology and Neurosurgery, McGill University Montreal Neurological Institute, Montreal, QC, H3A 2B4, Canada
- Edward F Chang
- Department of Neurological Surgery, University of California, San Francisco, 675 Nelson Rising Lane, San Francisco, CA 94158, USA
| |
12
Bauer AKR, van Ede F, Quinn AJ, Nobre AC. Rhythmic Modulation of Visual Perception by Continuous Rhythmic Auditory Stimulation. J Neurosci 2021; 41:7065-7075. PMID: 34261698; PMCID: PMC8372019; DOI: 10.1523/jneurosci.2980-20.2021.
Abstract
At any given moment, our sensory systems receive multiple, often rhythmic, inputs from the environment. Processing of temporally structured events in one sensory modality can, in principle, guide both behavioral and neural processing of events in other sensory modalities, but whether this occurs remains unclear. Here, we used human electroencephalography (EEG) to test the cross-modal influences of a continuous auditory frequency-modulated (FM) sound on visual perception and visual cortical activity. We report systematic fluctuations in perceptual discrimination of brief visual stimuli in line with the phase of the FM sound. We further show that this rhythmic modulation in visual perception is related to an accompanying rhythmic modulation of neural activity recorded over visual areas. Importantly, in our task, perceptual and neural visual modulations occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. As such, the results provide a critical validation for the existence and functional role of cross-modal entrainment and demonstrate its utility for organizing the perception of multisensory stimulation in the natural environment. SIGNIFICANCE STATEMENT: Our sensory environment is filled with rhythmic structures that are often multisensory in nature. Here, we show that the alignment of neural activity to the phase of an auditory frequency-modulated (FM) sound has cross-modal consequences for vision, yielding systematic fluctuations in perceptual discrimination of brief visual stimuli that are mediated by an accompanying rhythmic modulation of neural activity recorded over visual areas. These cross-modal effects on visual neural activity and perception occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. The current work shows that continuous auditory fluctuations in the natural environment can provide a pacing signal for neural activity and perception across the senses.
Affiliation(s)
- Anna-Katharina R Bauer
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, United Kingdom
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
- Freek van Ede
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
- Institute for Brain and Behavior Amsterdam, Department of Experimental and Applied Psychology, Vrije Universiteit Amsterdam, Amsterdam 1081BT, The Netherlands
- Andrew J Quinn
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, United Kingdom
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
- Anna C Nobre
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, United Kingdom
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom

13
Chai Y, Liu TT, Marrett S, Li L, Khojandi A, Handwerker DA, Alink A, Muckli L, Bandettini PA. Topographical and laminar distribution of audiovisual processing within human planum temporale. Prog Neurobiol 2021; 205:102121. [PMID: 34273456 DOI: 10.1016/j.pneurobio.2021.102121] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2021] [Revised: 05/20/2021] [Accepted: 07/13/2021] [Indexed: 10/20/2022]
Abstract
The brain is capable of integrating signals from multiple sensory modalities. Such multisensory integration can occur in areas that are commonly considered unisensory, such as the planum temporale (PT), representing the auditory association cortex. However, the roles of different afferents (feedforward vs. feedback) to PT in multisensory processing are not well understood. Our study aims to address this by examining laminar activity patterns in different topographical subfields of human PT under unimodal and multisensory stimuli. To this end, we adopted an advanced mesoscopic (sub-millimeter) fMRI methodology at 7 T by acquiring BOLD (blood-oxygen-level-dependent contrast, which has higher sensitivity) and VAPER (integrated blood volume and perfusion contrast, which has superior laminar specificity) signals concurrently, and performed all analyses in native fMRI space, benefiting from identical acquisition of functional and anatomical images. We found a division of function between visual and auditory processing in PT and distinct feedback mechanisms in different subareas. Specifically, anterior PT was activated more by auditory inputs and received feedback modulation in superficial layers. This feedback depended on task performance and likely arose from top-down influences from higher-order multimodal areas. In contrast, posterior PT was preferentially activated by visual inputs and received visual feedback in both superficial and deep layers, which is likely projected directly from the early visual cortex. Together, these findings provide novel insights into the mechanism of multisensory interaction in human PT at the mesoscopic spatial scale.
Affiliation(s)
- Yuhui Chai
- Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA.
- Tina T Liu
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Sean Marrett
- Functional MRI Core, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Linqing Li
- Functional MRI Core, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Arman Khojandi
- Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Daniel A Handwerker
- Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Arjen Alink
- University Medical Centre Hamburg-Eppendorf, Department of Systems Neuroscience, Hamburg, Germany
- Lars Muckli
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
- Peter A Bandettini
- Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA; Functional MRI Core, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA

14
Honnorat N, Saranathan M, Sullivan EV, Pfefferbaum A, Pohl KM, Zahr NM. Performance ramifications of abnormal functional connectivity of ventral posterior lateral thalamus with cerebellum in abstinent individuals with Alcohol Use Disorder. Drug Alcohol Depend 2021; 220:108509. [PMID: 33453503 PMCID: PMC7889734 DOI: 10.1016/j.drugalcdep.2021.108509] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/06/2020] [Revised: 12/03/2020] [Accepted: 12/07/2020] [Indexed: 01/06/2023]
Abstract
The extant literature supports the involvement of the thalamus in the cognitive and motor impairment associated with chronic alcohol consumption, but clear structure/function relationships remain elusive. Alcohol effects on specific nuclei rather than the entire thalamus may provide the basis for differential cognitive and motor decline in Alcohol Use Disorder (AUD). This functional MRI (fMRI) study was conducted in 23 abstinent individuals with AUD and 27 healthy controls to test the hypothesis that functional connectivity between anterior thalamus and hippocampus would be compromised in those with an AUD diagnosis and related to mnemonic deficits. Functional connectivity between 7 thalamic structures [5 thalamic nuclei: anterior ventral (AV), mediodorsal (MD), pulvinar (Pul), ventral lateral posterior (VLP), and ventral posterior lateral (VPL); ventral thalamus; the entire thalamus] and 14 "functional regions" was evaluated. Relative to controls, the AUD group exhibited different VPL-based functional connectivity: an anticorrelation between VPL and a bilateral middle temporal lobe region observed in controls became a positive correlation in the AUD group; an anticorrelation between the VPL and the cerebellum was stronger in the AUD than control group. AUD-associated altered connectivity between anterior thalamus and hippocampus as a substrate of memory compromise was not supported; instead, connectivity differences from controls selective to VPL and cerebellum demonstrated a relationship with impaired balance. These preliminary findings support substructure-level evaluation in future studies focused on discerning the role of the thalamus in AUD-associated cognitive and motor deficits.
Affiliation(s)
- Nicolas Honnorat
- Neuroscience Program, SRI International, 333 Ravenswood Ave., Menlo Park, CA, 94025, USA.
- Manojkumar Saranathan
- Department of Medical Imaging, University of Arizona College of Medicine, 1501 N. Campbell Ave., Tucson, AZ, 85724, USA.
- Edith V Sullivan
- Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Rd., Stanford, CA, 94305, USA.
- Adolf Pfefferbaum
- Neuroscience Program, SRI International, 333 Ravenswood Ave., Menlo Park, CA, 94025, USA; Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Rd., Stanford, CA, 94305, USA.
- Kilian M Pohl
- Neuroscience Program, SRI International, 333 Ravenswood Ave., Menlo Park, CA, 94025, USA; Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Rd., Stanford, CA, 94305, USA.
- Natalie M Zahr
- Neuroscience Program, SRI International, 333 Ravenswood Ave., Menlo Park, CA, 94025, USA; Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Rd., Stanford, CA, 94305, USA.

15
A multisensory perspective onto primate pulvinar functions. Neurosci Biobehav Rev 2021; 125:231-243. [PMID: 33662442 DOI: 10.1016/j.neubiorev.2021.02.043] [Citation(s) in RCA: 33] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2020] [Revised: 02/18/2021] [Accepted: 02/25/2021] [Indexed: 02/08/2023]
Abstract
Perception in ambiguous environments relies on the combination of sensory information from various sources. Most associative and primary sensory cortical areas are involved in this active multisensory integration process; as a result, the entire cortex appears heavily multisensory. In this review, we focus on the contribution of the pulvinar to multisensory integration. This subcortical thalamic nucleus plays a central role in visual detection and selection at a fast time scale, as well as in the regulation of visual processes at a much slower time scale. However, the pulvinar is also densely connected to cortical areas involved in multisensory integration. In spite of this, little is known about its multisensory properties and its contribution to multisensory perception. Here, we review the anatomical and functional organization of multisensory input to the pulvinar. We describe how visual, auditory, somatosensory, pain, proprioceptive and olfactory projections are differentially organized across the main subdivisions of the pulvinar, and we show that topography is central to the organization of this complex nucleus. We propose that the pulvinar combines multiple sources of sensory information to enhance fast responses to the environment, while also playing the role of a general regulation hub for adaptive and flexible cognition.
16
Kaas JH. Comparative Functional Anatomy of Marmoset Brains. ILAR J 2021; 61:260-273. [PMID: 33550381 PMCID: PMC9214571 DOI: 10.1093/ilar/ilaa026] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/18/2020] [Revised: 10/09/2020] [Accepted: 10/23/2020] [Indexed: 12/23/2022] Open
Abstract
Marmosets and closely related tamarins have become popular models for understanding aspects of human brain organization and function because they are small, reproduce and mature rapidly, and have few cortical fissures so that more cortex is visible and accessible on the surface. They are well suited for studies of development and aging. Because marmosets are highly social primates with extensive vocal communication, marmoset studies can inform theories of the evolution of language in humans. Most importantly, marmosets share basic features of major sensory and motor systems with other primates, including those of macaque monkeys and humans with larger and more complex brains. The early stages of sensory processing, including subcortical nuclei and several cortical levels for the visual, auditory, somatosensory, and motor systems, are highly similar across primates, and thus results from marmosets are relevant for making inferences about how these systems are organized and function in humans. Nevertheless, the structures in these systems are not identical across primate species, and homologous structures are much bigger and therefore function somewhat differently in human brains. In particular, the large human brain has more cortical areas that add to the complexity of information processing and storage, as well as decision-making, while making new abilities possible, such as language. Thus, inferences about human brains based on studies on marmoset brains alone should be made with a bit of caution.
Affiliation(s)
- Jon H Kaas
- Corresponding Author: Jon H. Kaas, PhD, Department of Psychology, Vanderbilt University, 301 Wilson Hall, 111 21st Ave. S., Nashville, TN 37203, USA. E-mail:

17

18
Cheng L, Guo ZY, Qu YL. Cross-modality modulation of auditory midbrain processing of intensity information. Hear Res 2020; 395:108042. [PMID: 32810721 DOI: 10.1016/j.heares.2020.108042] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/03/2020] [Revised: 06/12/2020] [Accepted: 07/08/2020] [Indexed: 02/03/2023]
Abstract
In nature, animals constantly receive a multitude of sensory stimuli, such as visual, auditory, and somatosensory input. Integration across sensory modalities supports the precise processing of sensory inputs that is essential for animal survival. Although some principles of cross-modality integration have been revealed by many studies, little insight has been gained into its functional potential. In this study, the functional influence of cross-modality modulation on auditory processing of intensity information was investigated by recording neuronal activity in the auditory midbrain (i.e., the inferior colliculus, IC) under visual, auditory, and audiovisual stimulation. Results demonstrated that combined audiovisual stimuli either enhanced or suppressed the responses of IC neurons compared to auditory stimuli alone, even though the same visual stimuli alone induced no response. Audiovisual modulation appeared to be strongest when the combined audiovisual stimuli were located at a neuron's best auditory azimuth, as well as when presented at near-threshold intensity levels. Additionally, the rate-intensity function of IC neurons to auditory stimuli was expanded or compressed by audiovisual modulation, which was highly dependent on the minimal threshold (MT) of neurons: the lower the MT, the greater the audiovisual modulation, indicating an intensity-specific enhancement of auditory intensity sensitivity by cross-modality modulation. Overall, evidence suggests a potential functional role of cross-modality modulation in the IC that serves to instruct adaptive plasticity to enhance the auditory perception of intensity information.
Affiliation(s)
- Liang Cheng
- School of Psychology & Key Laboratory of Adolescent Cyberpsychology and Behavior (CCNU) of Ministry of Education, Central China Normal University, Wuhan, 430079, China; School of Life Sciences & Hubei Key Lab of Genetic Regulation and Integrative Biology, Central China Normal University, Wuhan, 430079, China.
- Zhao-Yang Guo
- School of Psychology & Key Laboratory of Adolescent Cyberpsychology and Behavior (CCNU) of Ministry of Education, Central China Normal University, Wuhan, 430079, China
- Yi-Li Qu
- School of Psychology & Key Laboratory of Adolescent Cyberpsychology and Behavior (CCNU) of Ministry of Education, Central China Normal University, Wuhan, 430079, China

19
Zumer JM, White TP, Noppeney U. The neural mechanisms of audiotactile binding depend on asynchrony. Eur J Neurosci 2020; 52:4709-4731. [PMID: 32725895 DOI: 10.1111/ejn.14928] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2020] [Revised: 07/06/2020] [Accepted: 07/24/2020] [Indexed: 11/30/2022]
Abstract
Asynchrony is a critical cue informing the brain whether sensory signals are caused by a common source and should be integrated or segregated. This psychophysics-electroencephalography (EEG) study investigated the influence of asynchrony on how the brain binds audiotactile (AT) signals to enable faster responses in a redundant target paradigm. Human participants actively responded (psychophysics) or passively attended (EEG) to noise bursts, "taps-to-the-face", and their AT combinations at seven AT asynchronies: 0, ±20, ±70 and ±500 ms. Behaviourally, observers were faster at detecting AT than unisensory stimuli within a temporal integration window: the redundant target effect was maximal for synchronous stimuli and declined within a ≤70 ms AT asynchrony. EEG revealed a cascade of AT interactions that relied on different neural mechanisms depending on AT asynchrony. At small (≤20 ms) asynchronies, AT interactions arose for event-related potentials (ERPs) at 110 ms and ~400 ms post-stimulus. Selectively at ±70 ms asynchronies, AT interactions were observed for the P200 ERP, theta-band inter-trial coherence (ITC) and power at ~200 ms post-stimulus. In conclusion, AT binding was mediated by distinct neural mechanisms depending on the asynchrony of the AT signals. Early AT interactions in ERPs and theta-band ITC and power were critical for the behavioural response facilitation within a ≤±70 ms temporal integration window.
Affiliation(s)
- Johanna M Zumer
- School of Psychology, University of Birmingham, Birmingham, UK
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Centre for Human Brain Health, University of Birmingham, Birmingham, UK
- School of Life and Health Sciences, Aston University, Birmingham, UK
- Thomas P White
- School of Psychology, University of Birmingham, Birmingham, UK
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney
- School of Psychology, University of Birmingham, Birmingham, UK
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Centre for Human Brain Health, University of Birmingham, Birmingham, UK
- Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, The Netherlands

20
Bauer AKR, Debener S, Nobre AC. Synchronisation of Neural Oscillations and Cross-modal Influences. Trends Cogn Sci 2020; 24:481-495. [PMID: 32317142 PMCID: PMC7653674 DOI: 10.1016/j.tics.2020.03.003] [Citation(s) in RCA: 41] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2019] [Revised: 02/20/2020] [Accepted: 03/14/2020] [Indexed: 01/23/2023]
Abstract
At any given moment, we receive multiple signals from our different senses. Prior research has shown that signals in one sensory modality can influence neural activity and behavioural performance associated with another sensory modality. Recent human and nonhuman primate studies suggest that such cross-modal influences in sensory cortices are mediated by the synchronisation of ongoing neural oscillations. In this review, we consider two mechanisms proposed to facilitate cross-modal influences on sensory processing, namely cross-modal phase resetting and neural entrainment. We consider how top-down processes may further influence cross-modal processing in a flexible manner, and we highlight fruitful directions for further research.
Affiliation(s)
- Anna-Katharina R Bauer
- Department of Experimental Psychology, Brain and Cognition Lab, Oxford Centre for Human Brain Activity, Department of Psychiatry, Wellcome Centre for Integrative Neuroimaging, University of Oxford, UK.
- Stefan Debener
- Department of Psychology, Neuropsychology Lab, Cluster of Excellence Hearing4All, University of Oldenburg, Germany
- Anna C Nobre
- Department of Experimental Psychology, Brain and Cognition Lab, Oxford Centre for Human Brain Activity, Department of Psychiatry, Wellcome Centre for Integrative Neuroimaging, University of Oxford, UK

21
Scott SK. From speech and talkers to the social world: The neural processing of human spoken language. Science 2020; 366:58-62. [PMID: 31604302 DOI: 10.1126/science.aax0288] [Citation(s) in RCA: 25] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
Human speech perception is a paradigm example of the complexity of human linguistic processing; speech is also the dominant way of expressing vocal identity and is critically important for social interactions. Here, I review the ways that speech, the talker, and the social nature of speech interact and how this may be computed in the human brain, using models and approaches from nonhuman primate studies. I explore the extent to which domain-general approaches may be able to account for some of these neural findings. Finally, I address the importance of extending these findings into a better understanding of the social use of speech in conversations.
Affiliation(s)
- Sophie K Scott
- Institute of Cognitive Neuroscience, University College London, London, UK

22
Abstract
There are functional and anatomical distinctions between the neural systems involved in the recognition of sounds in the environment and those involved in the sensorimotor guidance of sound production and the spatial processing of sound. Evidence for the separation of these processes has historically come from disparate literatures on the perception and production of speech, music and other sounds. More recent evidence indicates that there are computational distinctions between the rostral and caudal primate auditory cortex that may underlie functional differences in auditory processing. These functional differences may originate from differences in the response times and temporal profiles of neurons in the rostral and caudal auditory cortex, suggesting that computational accounts of primate auditory pathways should focus on the implications of these temporal response differences.
23
Xu X, Hanganu-Opatz IL, Bieler M. Cross-Talk of Low-Level Sensory and High-Level Cognitive Processing: Development, Mechanisms, and Relevance for Cross-Modal Abilities of the Brain. Front Neurorobot 2020; 14:7. [PMID: 32116637 PMCID: PMC7034303 DOI: 10.3389/fnbot.2020.00007] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2019] [Accepted: 01/27/2020] [Indexed: 12/18/2022] Open
Abstract
The emergence of cross-modal learning capabilities requires the interaction of neural areas accounting for sensory and cognitive processing. Convergence of multiple sensory inputs is observed in low-level sensory cortices including primary somatosensory (S1), visual (V1), and auditory cortex (A1), as well as in high-level areas such as prefrontal cortex (PFC). Evidence shows that local neural activity and functional connectivity between sensory cortices participate in cross-modal processing. However, little is known about the functional interplay between neural areas underlying sensory and cognitive processing required for cross-modal learning capabilities across life. Here we review our current knowledge on the interdependence of low- and high-level cortices for the emergence of cross-modal processing in rodents. First, we summarize the mechanisms underlying the integration of multiple senses and how cross-modal processing in primary sensory cortices might be modified by top-down modulation of the PFC. Second, we examine the critical factors and developmental mechanisms that account for the interaction between neuronal networks involved in sensory and cognitive processing. Finally, we discuss the applicability and relevance of cross-modal processing for brain-inspired intelligent robotics. An in-depth understanding of the factors and mechanisms controlling cross-modal processing might inspire the refinement of robotic systems by better mimicking neural computations.
Affiliation(s)
- Xiaxia Xu
- Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Ileana L Hanganu-Opatz
- Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Malte Bieler
- Laboratory for Neural Computation, Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway

24
Bravo F, Cross I, Hopkins C, Gonzalez N, Docampo J, Bruno C, Stamatakis EA. Anterior cingulate and medial prefrontal cortex response to systematically controlled tonal dissonance during passive music listening. Hum Brain Mapp 2019; 41:46-66. [PMID: 31512332 PMCID: PMC7268082 DOI: 10.1002/hbm.24786] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2019] [Revised: 07/18/2019] [Accepted: 08/26/2019] [Indexed: 12/14/2022] Open
Abstract
Several studies have attempted to investigate how the brain codes emotional value when processing music of contrasting levels of dissonance; however, the lack of control over specific musical structural characteristics (i.e., dynamics, rhythm, melodic contour or instrumental timbre), which are known to affect perceived dissonance, rendered results difficult to interpret. To account for this, we used functional imaging with an optimized control of the musical structure to obtain a finer characterization of brain activity in response to tonal dissonance. Behavioral findings supported previous evidence for an association between increased dissonance and negative emotion. Results further demonstrated that the manipulation of tonal dissonance through systematically controlled changes in interval content elicited contrasting valence ratings but no significant effects on either arousal or potency. Neuroscientific findings showed an engagement of the left medial prefrontal cortex (mPFC) and the left rostral anterior cingulate cortex (ACC) while participants listened to dissonant compared to consonant music, converging with studies that have proposed a core role of these regions during conflict monitoring (detection and resolution), and in the appraisal of negative emotion and fear-related information. Both the left and right primary auditory cortices showed stronger functional connectivity with the ACC during the dissonant portion of the task, implying a demand for greater information integration when processing negatively valenced musical stimuli. This study demonstrated that the systematic control of musical dissonance could be applied to isolate valence from the arousal dimension, facilitating a novel access to the neural representation of negative emotion.
Affiliation(s)
- Fernando Bravo
- Centre for Music and Science, University of Cambridge, Cambridge, UK
- TU Dresden, Institut für Kunst- und Musikwissenschaft, Dresden, Germany
- Cognition and Consciousness Imaging Group, Division of Anaesthesia, Department of Medicine, University of Cambridge, Cambridge, UK
- Ian Cross
- Centre for Music and Science, University of Cambridge, Cambridge, UK
- Nadia Gonzalez
- Department of Neuroimaging, Fundación Científica del Sur Imaging Centre, Buenos Aires, Argentina
- Jorge Docampo
- Department of Neuroimaging, Fundación Científica del Sur Imaging Centre, Buenos Aires, Argentina
- Claudio Bruno
- Department of Neuroimaging, Fundación Científica del Sur Imaging Centre, Buenos Aires, Argentina
- Emmanuel A Stamatakis
- Cognition and Consciousness Imaging Group, Division of Anaesthesia, Department of Medicine, University of Cambridge, Cambridge, UK

25
Retter TL, Webster MA, Jiang F. Directional Visual Motion Is Represented in the Auditory and Association Cortices of Early Deaf Individuals. J Cogn Neurosci 2019; 31:1126-1140. [PMID: 30726181 PMCID: PMC6599583 DOI: 10.1162/jocn_a_01378] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Individuals who are deaf since early life may show enhanced performance at some visual tasks, including discrimination of directional motion. The neural substrates of such behavioral enhancements remain difficult to identify in humans, although neural plasticity has been shown for early deaf people in the auditory and association cortices, including the primary auditory cortex (PAC) and STS region, respectively. Here, we investigated whether neural responses in auditory and association cortices of early deaf individuals are reorganized to be sensitive to directional visual motion. To capture direction-selective responses, we recorded fMRI responses frequency-tagged to the 0.1-Hz presentation of central directional (100% coherent random dot) motion persisting for 2 sec contrasted with nondirectional (0% coherent) motion for 8 sec. We found direction-selective responses in the STS region in both deaf and hearing participants, but the extent of activation in the right STS region was 5.5 times larger for deaf participants. Minimal but significant direction-selective responses were also found in the PAC of deaf participants, both at the group level and in five of six individuals. In response to stimuli presented separately in the right and left visual fields, the relative activation across the right and left hemispheres was similar in both the PAC and STS region of deaf participants. Notably, the enhanced right-hemisphere activation could support the right visual field advantage reported previously in behavioral studies. Taken together, these results show that the reorganized auditory cortices of early deaf individuals are sensitive to directional motion. Speculatively, these results suggest that auditory and association regions can be remapped to support enhanced visual performance.
26
Delong P, Aller M, Giani AS, Rohe T, Conrad V, Watanabe M, Noppeney U. Invisible Flashes Alter Perceived Sound Location. Sci Rep 2018; 8:12376. [PMID: 30120294 PMCID: PMC6098122 DOI: 10.1038/s41598-018-30773-3]
Abstract
Information integration across the senses is fundamental for effective interactions with our environment. The extent to which signals from different senses can interact in the absence of awareness is controversial. Combining the spatial ventriloquist illusion and dynamic continuous flash suppression (dCFS), we investigated in two experiments whether visual signals that observers do not consciously perceive can influence spatial perception of sounds. Importantly, dCFS obliterated visual awareness only on a fraction of trials, allowing us to compare spatial ventriloquism for physically identical flashes that were judged as visible or invisible. Our results show a stronger ventriloquist effect for visible than invisible flashes. Critically, a robust ventriloquist effect emerged also for invisible flashes, even when participants were at chance when locating the flash. Collectively, our findings demonstrate that signals that we are not aware of in one sensory modality can alter spatial perception of signals in another sensory modality.
Affiliation(s)
- Patrycja Delong
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Máté Aller
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Anette S Giani
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Tim Rohe
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Verena Conrad
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Masataka Watanabe
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Uta Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
27
Koelsch S, Skouras S, Lohmann G. The auditory cortex hosts network nodes influential for emotion processing: An fMRI study on music-evoked fear and joy. PLoS One 2018; 13:e0190057. [PMID: 29385142 PMCID: PMC5791961 DOI: 10.1371/journal.pone.0190057]
Abstract
Sound is a potent elicitor of emotions. Auditory core, belt and parabelt regions have anatomical connections to a large array of limbic and paralimbic structures which are involved in the generation of affective activity. However, little is known about the functional role of auditory cortical regions in emotion processing. Using functional magnetic resonance imaging and music stimuli that evoke joy or fear, our study reveals that anterior and posterior regions of auditory association cortex have emotion-characteristic functional connectivity with limbic/paralimbic (insula, cingulate cortex, and striatum), somatosensory, visual, motor-related, and attentional structures. We found that these regions have remarkably high emotion-characteristic eigenvector centrality, revealing that they have influential positions within emotion-processing brain networks with “small-world” properties. By contrast, primary auditory fields showed surprisingly strong emotion-characteristic functional connectivity with intra-auditory regions. Our findings demonstrate that the auditory cortex hosts regions that are influential within networks underlying the affective processing of auditory information. We anticipate our results to incite research specifying the role of the auditory cortex—and sensory systems in general—in emotion processing, beyond the traditional view that sensory cortices have merely perceptual functions.
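Eigenvector centrality, the network measure this study uses to identify influential nodes, scores a node highly when it is connected to other high-scoring nodes; mathematically it is the leading eigenvector of the (nonnegative) connectivity matrix. A minimal sketch using power iteration on a toy connectivity matrix (values are hypothetical, not data from the paper):

```python
import numpy as np

def eigenvector_centrality(adj: np.ndarray, iters: int = 200) -> np.ndarray:
    """Leading eigenvector of a nonnegative symmetric connectivity
    matrix, computed by power iteration; entries rank node influence."""
    v = np.ones(adj.shape[0])
    for _ in range(iters):
        v = adj @ v              # spread scores along connections
        v /= np.linalg.norm(v)   # renormalize each step
    return v

# Toy functional-connectivity matrix: node 0 is strongly coupled to the
# other well-connected nodes, node 3 is weakly coupled to everything.
adj = np.array([
    [0.0, 0.9, 0.8, 0.1],
    [0.9, 0.0, 0.7, 0.1],
    [0.8, 0.7, 0.0, 0.2],
    [0.1, 0.1, 0.2, 0.0],
])
c = eigenvector_centrality(adj)
```

Here node 0 receives the highest centrality, not merely because it has the largest summed connectivity, but because its strong connections are themselves to central nodes; this is the "influential position" property the abstract refers to.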
Affiliation(s)
- Stefan Koelsch
- Department of Biological and Medical Psychology, University of Bergen, Bergen, Norway
- Stavros Skouras
- Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
- Gabriele Lohmann
- Department of Biomedical Magnetic Resonance, University Clinic Tübingen, Tübingen, Germany
- Magnetic Resonance Center, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
28
Scott BH, Leccese PA, Saleem KS, Kikuchi Y, Mullarkey MP, Fukushima M, Mishkin M, Saunders RC. Intrinsic Connections of the Core Auditory Cortical Regions and Rostral Supratemporal Plane in the Macaque Monkey. Cereb Cortex 2018; 27:809-840. [PMID: 26620266 DOI: 10.1093/cercor/bhv277]
Abstract
In the ventral stream of the primate auditory cortex, cortico-cortical projections emanate from the primary auditory cortex (AI) along 2 principal axes: one mediolateral, the other caudorostral. Connections in the mediolateral direction from core, to belt, to parabelt, have been well described, but less is known about the flow of information along the supratemporal plane (STP) in the caudorostral dimension. Neuroanatomical tracers were injected throughout the caudorostral extent of the auditory core and rostral STP by direct visualization of the cortical surface. Auditory cortical areas were distinguished by SMI-32 immunostaining for neurofilament, in addition to established cytoarchitectonic criteria. The results describe a pathway comprising step-wise projections from AI through the rostral and rostrotemporal fields of the core (R and RT), continuing to the recently identified rostrotemporal polar field (RTp) and the dorsal temporal pole. Each area was strongly and reciprocally connected with the areas immediately caudal and rostral to it, though deviations from strictly serial connectivity were observed. In RTp, inputs converged from core, belt, parabelt, and the auditory thalamus, as well as higher order cortical regions. The results support a rostrally directed flow of auditory information with complex and recurrent connections, similar to the ventral stream of macaque visual cortex.
Affiliation(s)
- Brian H Scott
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
- Paul A Leccese
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
- Kadharbatcha S Saleem
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
- Yukiko Kikuchi
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA. Present address: Institute of Neuroscience, Newcastle University Medical School, Newcastle Upon Tyne NE2 4HH, UK
- Matthew P Mullarkey
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
- Makoto Fukushima
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
- Mortimer Mishkin
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
- Richard C Saunders
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, MD 20892, USA
29
Kimura A, Imbe H. Robust Subthreshold Cross-modal Modulation of Auditory Response by Cutaneous Electrical Stimulation in First- and Higher-order Auditory Thalamic Nuclei. Neuroscience 2018; 372:161-180. [PMID: 29309880 DOI: 10.1016/j.neuroscience.2017.12.051]
Abstract
Conventional extracellular recording has revealed cross-modal alterations of auditory cell activities by cutaneous electrical stimulation of the hindpaw in first- and higher-order auditory thalamic nuclei (Donishi et al., 2011). Juxta-cellular recording and labeling techniques were used in the present study to examine the cross-modal alterations in detail, focusing on possible nucleus and/or cell type-related distinctions in modulation. Recordings were obtained from 80 cells of anesthetized rats. Cutaneous electrical stimulation, which did not elicit unit discharges, i.e., subthreshold effects, modulated early (onset) and/or late auditory responses of first- (64%) and higher-order nucleus cells (77%) with regard to response magnitude, latency and/or burst spiking. Attenuation predominated in the modulation of response magnitude and burst spiking, and delay predominated in the modulation of response time. Striking alterations of burst spiking took place in higher-order nucleus cells, which had the potential to exhibit higher propensities for burst spiking as compared to first-order nucleus cells. A subpopulation of first-order nucleus cells showing modulation in early response magnitude in the caudal domain of the nucleus had larger cell bodies and higher propensities for burst spiking as compared to cells showing no modulation. These findings suggest that somatosensory influence is incorporated into parallel channels in auditory thalamic nuclei to impose distinct impacts on cortical and subcortical sensory processing. Further, cutaneous electrical stimulation given after early auditory responses modulated late responses. Somatosensory influence is likely to affect ongoing auditory processing at any time without being coincident with sound onset in a narrow temporal window.
Affiliation(s)
- Akihisa Kimura
- Department of Physiology, Wakayama Medical University, Wakayama Kimiidera 811-1, 641-8509, Japan
- Hiroki Imbe
- Department of Physiology, Wakayama Medical University, Wakayama Kimiidera 811-1, 641-8509, Japan
30
Brefczynski-Lewis JA, Lewis JW. Auditory object perception: A neurobiological model and prospective review. Neuropsychologia 2017; 105:223-242. [PMID: 28467888 PMCID: PMC5662485 DOI: 10.1016/j.neuropsychologia.2017.04.034]
Abstract
Interaction with the world is a multisensory experience, but most of what is known about the neural correlates of perception comes from studying vision. Auditory inputs enter the cortex with their own set of unique qualities and support oral communication, speech, music, and the understanding of the emotional and intentional states of others, all of which are central to the human experience. To better understand how the auditory system develops, recovers after injury, and how it may have transitioned in its functions over the course of hominin evolution, advances are needed in models of how the human brain is organized to process real-world natural sounds and "auditory objects". This review presents a simple fundamental neurobiological model of hearing perception at a category level that incorporates principles of bottom-up signal processing together with top-down constraints of grounded cognition theories of knowledge representation. Though mostly derived from the human neuroimaging literature, this theoretical framework highlights rudimentary principles of real-world sound processing that may apply to most if not all mammalian species with hearing and acoustic communication abilities. The model encompasses three basic categories of sound-source: (1) action sounds (non-vocalizations) produced by 'living things', with human (conspecific) and non-human animal sources representing two subcategories; (2) action sounds produced by 'non-living things', including environmental sources and human-made machinery; and (3) vocalizations ('living things'), with human versus non-human animals as two subcategories therein. The model is presented in the context of cognitive architectures relating to multisensory, sensory-motor, and spoken language organizations. The model's predictive value is further discussed in the context of anthropological theories of oral communication evolution and the neurodevelopment of spoken language proto-networks in infants/toddlers. These phylogenetic and ontogenetic frameworks both entail cortical network maturations that are proposed to be organized, at least in part, around a number of universal acoustic-semantic signal attributes of natural sounds, which are addressed herein.
Affiliation(s)
- Julie A Brefczynski-Lewis
- Blanchette Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA; Department of Physiology, Pharmacology, & Neuroscience, West Virginia University, PO Box 9229, Morgantown, WV 26506, USA
- James W Lewis
- Blanchette Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA; Department of Physiology, Pharmacology, & Neuroscience, West Virginia University, PO Box 9229, Morgantown, WV 26506, USA
31
Scott BH, Saleem KS, Kikuchi Y, Fukushima M, Mishkin M, Saunders RC. Thalamic connections of the core auditory cortex and rostral supratemporal plane in the macaque monkey. J Comp Neurol 2017; 525:3488-3513. [PMID: 28685822 DOI: 10.1002/cne.24283]
Abstract
In the primate auditory cortex, information flows serially in the mediolateral dimension from core, to belt, to parabelt. In the caudorostral dimension, stepwise serial projections convey information through the primary, rostral, and rostrotemporal (AI, R, and RT) core areas on the supratemporal plane, continuing to the rostrotemporal polar area (RTp) and adjacent auditory-related areas of the rostral superior temporal gyrus (STGr) and temporal pole. In addition to this cascade of corticocortical connections, the auditory cortex receives parallel thalamocortical projections from the medial geniculate nucleus (MGN). Previous studies have examined the projections from MGN to auditory cortex, but most have focused on the caudal core areas AI and R. In this study, we investigated the full extent of connections between MGN and AI, R, RT, RTp, and STGr using retrograde and anterograde anatomical tracers. Both AI and R received nearly 90% of their thalamic inputs from the ventral subdivision of the MGN (MGv; the primary/lemniscal auditory pathway). By contrast, RT received only ∼45% from MGv, and an equal share from the dorsal subdivision (MGd). Area RTp received ∼25% of its inputs from MGv, but received additional inputs from multisensory areas outside the MGN (30% in RTp vs. 1-5% in core areas). The MGN input to RTp distinguished this rostral extension of auditory cortex from the adjacent auditory-related cortex of the STGr, which received 80% of its thalamic input from multisensory nuclei (primarily medial pulvinar). Anterograde tracers identified complementary descending connections by which highly processed auditory information may modulate thalamocortical inputs.
Affiliation(s)
- Brian H Scott
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, Maryland
- Kadharbatcha S Saleem
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, Maryland
- Yukiko Kikuchi
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, Maryland
- Makoto Fukushima
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, Maryland
- Mortimer Mishkin
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, Maryland
- Richard C Saunders
- Laboratory of Neuropsychology, National Institute of Mental Health, National Institutes of Health (NIMH/NIH), Bethesda, Maryland
32
Nourski KV, Banks MI, Steinschneider M, Rhone AE, Kawasaki H, Mueller RN, Todd MM, Howard MA. Electrocorticographic delineation of human auditory cortical fields based on effects of propofol anesthesia. Neuroimage 2017; 152:78-93. [PMID: 28254512 PMCID: PMC5432407 DOI: 10.1016/j.neuroimage.2017.02.061]
Abstract
The functional organization of human auditory cortex remains incompletely characterized. While the posteromedial two thirds of Heschl's gyrus (HG) is generally considered to be part of core auditory cortex, additional subdivisions of HG remain speculative. To further delineate the hierarchical organization of human auditory cortex, we investigated regional heterogeneity in the modulation of auditory cortical responses under varying depths of anesthesia induced by propofol. Non-invasive studies have shown that propofol differentially affects auditory cortical activity, with a greater impact on non-core areas. Subjects were neurosurgical patients undergoing removal of intracranial electrodes placed to identify epileptic foci. Stimuli were 50-Hz click trains, presented continuously during an awake baseline period, and subsequently, while propofol infusion was incrementally titrated to induce general anesthesia. Electrocorticographic recordings were made with depth electrodes implanted in HG and subdural grid electrodes implanted over superior temporal gyrus (STG). Depth of anesthesia was monitored using spectral entropy. Averaged evoked potentials (AEPs), frequency-following responses (FFRs) and high gamma (70-150 Hz) event-related band power were used to characterize auditory cortical activity. Based on the changes in AEPs and FFRs during the induction of anesthesia, posteromedial HG could be divided into two subdivisions. In the most posteromedial aspect of the gyrus, the earliest AEP deflections were preserved and FFRs increased during induction. In contrast, the remainder of the posteromedial HG exhibited attenuation of both the AEP and the FFR. The anterolateral HG exhibited weaker activation characterized by broad, low-voltage AEPs and the absence of FFRs. Lateral STG exhibited limited activation by click trains, and FFRs there diminished during induction. Sustained high gamma activity was attenuated in the most posteromedial portion of HG, and was absent in all other regions. These differential patterns of auditory cortical activity during the induction of anesthesia may serve as useful physiological markers for field delineation. In this study, the posteromedial HG could be parcellated into at least two subdivisions. Preservation of the earliest AEP deflections and FFRs in the posteromedial HG likely reflects the persistence of feedforward synaptic activity generated by inputs from subcortical auditory pathways, including the medial geniculate nucleus.
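Spectral entropy, the depth-of-anesthesia index mentioned above, can be sketched as the normalized Shannon entropy of a signal's power spectrum: near 0 for a narrowband oscillation, near 1 for broadband noise. This is a simplified illustration with assumed signal parameters; the clinical entropy monitor's exact algorithm differs in detail.

```python
import numpy as np

def spectral_entropy(x: np.ndarray) -> float:
    """Normalized Shannon entropy of the power spectrum
    (0 = single spectral line, 1 = perfectly flat spectrum)."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()          # spectrum as a probability distribution
    n_bins = p.size
    p = p[p > 0]                 # 0 * log(0) -> 0 by convention
    return float(-(p * np.log(p)).sum() / np.log(n_bins))

rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)               # 10 s at 100 Hz (assumed rate)
tone = np.sin(2 * np.pi * 5 * t)         # narrowband: 5-Hz oscillation
noise = rng.normal(0.0, 1.0, t.size)     # broadband noise
```

Applied to EEG, a drop in this index reflects power concentrating into slow, regular oscillations, which is why it tracks deepening anesthesia.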
Affiliation(s)
- Kirill V Nourski
- Department of Neurosurgery, The University of Iowa, Iowa City, IA, USA
- Matthew I Banks
- Department of Anesthesiology, University of Wisconsin - Madison, Madison, WI, USA
- Mitchell Steinschneider
- Departments of Neurology and Neuroscience, Albert Einstein College of Medicine, Bronx, NY, USA
- Ariane E Rhone
- Department of Neurosurgery, The University of Iowa, Iowa City, IA, USA
- Hiroto Kawasaki
- Department of Neurosurgery, The University of Iowa, Iowa City, IA, USA
- Rashmi N Mueller
- Department of Anesthesia, The University of Iowa, Iowa City, IA, USA
- Michael M Todd
- Department of Anesthesia, The University of Iowa, Iowa City, IA, USA; Department of Anesthesiology, University of Minnesota, Minneapolis, MN, USA
- Matthew A Howard
- Department of Neurosurgery, The University of Iowa, Iowa City, IA, USA; Pappajohn Biomedical Institute, The University of Iowa, Iowa City, IA, USA; Iowa Neuroscience Institute, The University of Iowa, Iowa City, IA, USA
33
Webster PJ, Skipper-Kallal LM, Frum CA, Still HN, Ward BD, Lewis JW. Divergent Human Cortical Regions for Processing Distinct Acoustic-Semantic Categories of Natural Sounds: Animal Action Sounds vs. Vocalizations. Front Neurosci 2017; 10:579. [PMID: 28111538 PMCID: PMC5216875 DOI: 10.3389/fnins.2016.00579]
Abstract
A major gap in our understanding of natural sound processing is knowledge of where or how in a cortical hierarchy differential processing leads to categorical perception at a semantic level. Here, using functional magnetic resonance imaging (fMRI) we sought to determine if and where cortical pathways in humans might diverge for processing action sounds vs. vocalizations as distinct acoustic-semantic categories of real-world sound when matched for duration and intensity. This was tested by using relatively less semantically complex natural sounds produced by non-conspecific animals rather than humans. Our results revealed a striking double-dissociation of activated networks bilaterally. This included a previously well described pathway preferential for processing vocalization signals directed laterally from functionally defined primary auditory cortices to the anterior superior temporal gyri, and a less well-described pathway preferential for processing animal action sounds directed medially to the posterior insulae. We additionally found that some of these regions and associated cortical networks showed parametric sensitivity to high-order quantifiable acoustic signal attributes and/or to perceptual features of the natural stimuli, such as the degree of perceived recognition or intentional understanding. Overall, these results supported a neurobiological theoretical framework for how the mammalian brain may be fundamentally organized to process acoustically and acoustic-semantically distinct categories of ethologically valid, real-world sounds.
Affiliation(s)
- Paula J. Webster
- Blanchette Rockefeller Neurosciences Institute, Department of Neurobiology & Anatomy, West Virginia University, Morgantown, WV, USA
- Laura M. Skipper-Kallal
- Blanchette Rockefeller Neurosciences Institute, Department of Neurobiology & Anatomy, West Virginia University, Morgantown, WV, USA
- Department of Neurology, Georgetown University Medical Campus, Washington, DC, USA
- Chris A. Frum
- Department of Physiology and Pharmacology, West Virginia University, Morgantown, WV, USA
- Hayley N. Still
- Blanchette Rockefeller Neurosciences Institute, Department of Neurobiology & Anatomy, West Virginia University, Morgantown, WV, USA
- B. Douglas Ward
- Department of Biophysics, Medical College of Wisconsin, Milwaukee, WI, USA
- James W. Lewis
- Blanchette Rockefeller Neurosciences Institute, Department of Neurobiology & Anatomy, West Virginia University, Morgantown, WV, USA
34
Araneda R, Renier L, Ebner-Karestinos D, Dricot L, De Volder AG. Hearing, feeling or seeing a beat recruits a supramodal network in the auditory dorsal stream. Eur J Neurosci 2016; 45:1439-1450. [PMID: 27471102 DOI: 10.1111/ejn.13349]
Abstract
Hearing a beat recruits a wide neural network that involves the auditory cortex and motor planning regions. Perceiving a beat can potentially be achieved via vision or even touch, but it is currently not clear whether a common neural network underlies beat processing. Here, we used functional magnetic resonance imaging (fMRI) to test to what extent the neural network involved in beat processing is supramodal, that is, the same in the different sensory modalities. Brain activity changes in 27 healthy volunteers were monitored while they were attending to the same rhythmic sequences (with and without a beat) in audition, vision and the vibrotactile modality. We found a common neural network for beat detection in the three modalities that involved parts of the auditory dorsal pathway. Within this network, only the putamen and the supplementary motor area (SMA) showed specificity to the beat, while brain activity in the putamen covaried with beat detection speed. These results highlighted the implication of the auditory dorsal stream in beat detection, confirmed the important role played by the putamen in beat detection and indicated that the neural network for beat detection is mostly supramodal. This constitutes a new example of convergence of the same functional attributes into one centralized representation in the brain.
Affiliation(s)
- Rodrigo Araneda
- Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200, Brussels, Belgium
- Laurent Renier
- Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200, Brussels, Belgium
- Laurence Dricot
- Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200, Brussels, Belgium
- Anne G De Volder
- Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200, Brussels, Belgium
35
Rosenblum LD, Dias JW, Dorsi J. The supramodal brain: implications for auditory perception. J Cogn Psychol 2016. [DOI: 10.1080/20445911.2016.1181691]
36
Bizley JK, Maddox RK, Lee AKC. Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms. Trends Neurosci 2016; 39:74-85. [PMID: 26775728 PMCID: PMC4738154 DOI: 10.1016/j.tins.2015.12.007]
Abstract
Crossmodal integration is a term applicable to many phenomena in which one sensory modality influences task performance or perception in another sensory modality. We distinguish the term binding as one that should be reserved specifically for the process that underpins perceptual object formation. To unambiguously differentiate binding from other types of integration, behavioral and neural studies must investigate perception of a feature orthogonal to the features that link the auditory and visual stimuli. We argue that supporting true perceptual binding (as opposed to other processes such as decision-making) is one role for cross-sensory influences in early sensory cortex. These early multisensory interactions may therefore form a physiological substrate for the bottom-up grouping of auditory and visual stimuli into auditory-visual (AV) objects. Crossmodal integration and binding have been treated as synonymous in the literature, with no clear delineation between perceptual changes and other interactions such as decision-making. Crossmodal binding is proposed as a distinct form of integration leading to multisensory object formation. Multisensory stimuli are most beneficial in noisy situations, but few studies use stimulus competition to investigate the processes underpinning multisensory integration. Evidence suggests that both visual and auditory attention is object-based: all features within an object are enhanced, and there is a cost to attending features across versus within objects. Multisensory interactions can be observed throughout the brain, including early sensory cortex. The role of early sensory cortex in multisensory integration is unknown, but may underlie crossmodal binding.
Affiliation(s)
- Jennifer K Bizley
- University College London (UCL) Ear Institute, 332 Gray's Inn Road, London, WC1X 8EE, UK
- Ross K Maddox
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
- Adrian K C Lee
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA; Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA
37
Pantev C, Paraskevopoulos E, Kuchenbuch A, Lu Y, Herholz SC. Musical expertise is related to neuroplastic changes of multisensory nature within the auditory cortex. Eur J Neurosci 2015; 41:709-17. [PMID: 25728187 DOI: 10.1111/ejn.12788]
Abstract
Recent neuroscientific evidence indicates that multisensory integration does not only occur in higher level association areas of the cortex as the hierarchical models of sensory perception assumed, but also in regions traditionally thought of as unisensory, such as the auditory cortex. Nevertheless, it is not known whether expertise-induced neuroplasticity can alter the multisensory processing that occurs in these low-level regions. The present study used magnetoencephalography to investigate whether musical training may induce neuroplastic changes of multisensory processing within the human auditory cortex. Magnetoencephalography data of four different experiments were used to demonstrate the effect of long-term and short-term musical training on the integration of auditory, somatosensory and visual stimuli in the auditory cortex. The cross-sectional design of three of the experiments allowed us to infer that long-term musical training is related to a significantly different way of processing multisensory information within the auditory cortex, whereas the short-term training design of the fourth experiment allowed us to causally infer that multisensory music reading training affects the multimodal processing within the auditory cortex.
Affiliation(s)
- Christo Pantev
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
- Evangelos Paraskevopoulos
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
- Faculty of Health Sciences, School of Medicine, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Anja Kuchenbuch
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
- Yao Lu
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
38
Zelic G, Mottet D, Lagarde J. Perceptuo-motor compatibility governs multisensory integration in bimanual coordination dynamics. Exp Brain Res 2015; 234:463-74. [DOI: 10.1007/s00221-015-4476-5] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/10/2014] [Accepted: 10/15/2015] [Indexed: 11/30/2022]
39
Talsma D. Predictive coding and multisensory integration: an attentional account of the multisensory mind. Front Integr Neurosci 2015; 9:19. [PMID: 25859192 PMCID: PMC4374459 DOI: 10.3389/fnint.2015.00019] [Citation(s) in RCA: 111] [Impact Index Per Article: 12.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/24/2014] [Accepted: 03/03/2015] [Indexed: 11/13/2022] Open
Abstract
Multisensory integration involves a host of different cognitive processes, occurring at different stages of sensory processing. Here I argue that, despite recent insights suggesting that multisensory interactions can occur at very early latencies, the actual integration of individual sensory traces into an internally consistent mental representation is dependent on both top–down and bottom–up processes. Moreover, I argue that this integration is not limited to just sensory inputs, but that internal cognitive processes also shape the resulting mental representation. Studies showing that memory recall is affected by the initial multisensory context in which the stimuli were presented will be discussed, as well as several studies showing that mental imagery can affect multisensory illusions. This empirical evidence will be discussed from a predictive coding perspective, in which a central top–down attentional process is proposed to play a central role in coordinating the integration of all these inputs into a coherent mental representation.
Affiliation(s)
- Durk Talsma
- Department of Experimental Psychology, Ghent University Ghent, Belgium
40
Dewey RS, Hartley DEH. Cortical cross-modal plasticity following deafness measured using functional near-infrared spectroscopy. Hear Res 2015; 325:55-63. [PMID: 25819496 DOI: 10.1016/j.heares.2015.03.007] [Citation(s) in RCA: 58] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/05/2015] [Revised: 03/17/2015] [Accepted: 03/17/2015] [Indexed: 10/23/2022]
Abstract
Evidence from functional neuroimaging studies suggests that the auditory cortex can become more responsive to visual and somatosensory stimulation following deafness, and that this occurs predominantly in the right hemisphere. Extensive cross-modal plasticity in prospective cochlear implant (CI) recipients is correlated with poor speech outcomes following implantation, highlighting the potential impact of central auditory plasticity on subsequent aural rehabilitation. Conversely, the effects of hearing restoration with a cochlear implant on cortical plasticity are less well understood, since the use of most neuroimaging techniques in CI recipients is either unsafe or problematic due to the electromagnetic artefacts generated by CI stimulation. Additionally, techniques such as functional magnetic resonance imaging (fMRI) are confounded by acoustic noise produced by the scanner, which will be perceived more by hearing than by deaf individuals. Consequently, it is conceivable that auditory responses to the acoustic noise produced by the MR scanner may mask auditory cortical responses to non-auditory stimulation and render inter-group comparisons less significant. Uniquely, functional near-infrared spectroscopy (fNIRS) is a silent neuroimaging technique that is non-invasive and completely unaffected by the presence of a CI. Here, we used fNIRS to study temporal-lobe responses to auditory, visual and somatosensory stimuli in thirty profoundly-deaf participants and thirty normally-hearing controls. Compared with silence, acoustic noise stimuli elicited a significant group fNIRS response in the temporal region of normally-hearing individuals, which was not seen in profoundly-deaf participants. Visual motion elicited a larger group response within the right temporal lobe of profoundly-deaf participants, compared with normally-hearing controls. However, bilateral temporal-lobe fNIRS activation to somatosensory stimulation was comparable in both groups. These fNIRS results confirm that auditory deprivation is associated with cross-modal plasticity of visual inputs to auditory cortex. Although we found no evidence for plasticity of somatosensory inputs, it is possible that our recordings included activation of somatosensory cortex that masked any group differences in auditory cortical responses, given the limited spatial resolution of fNIRS.
Affiliation(s)
- Rebecca S Dewey
- Otology and Hearing Group, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, NG7 2UH, UK; National Institute for Health Research (NIHR) Nottingham Hearing Biomedical Research Unit, 113 The Ropewalk, Nottingham, NG1 5DU, UK.
- Douglas E H Hartley
- Otology and Hearing Group, Division of Clinical Neuroscience, School of Medicine, University of Nottingham, Nottingham, NG7 2UH, UK; National Institute for Health Research (NIHR) Nottingham Hearing Biomedical Research Unit, 113 The Ropewalk, Nottingham, NG1 5DU, UK; MRC Institute of Hearing Research, University Park, Nottingham, NG7 2RD, UK.
41
Götz T, Milde T, Curio G, Debener S, Lehmann T, Leistritz L, Witte OW, Witte H, Haueisen J. Primary somatosensory contextual modulation is encoded by oscillation frequency change. Clin Neurophysiol 2015; 126:1769-79. [PMID: 25670344 DOI: 10.1016/j.clinph.2014.12.028] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2014] [Revised: 11/14/2014] [Accepted: 12/01/2014] [Indexed: 10/24/2022]
Abstract
OBJECTIVE This study characterized thalamo-cortical communication by assessing the effect of context-dependent modulation on very early somatosensory evoked high-frequency oscillations (HF oscillations). METHODS We applied electrical stimuli to the median nerve together with an auditory oddball paradigm, in which standard and deviant target tones provided differential cognitive contexts for the constantly repeated electrical stimulation. Median nerve stimulation without auditory stimulation served as a unimodal control. RESULTS A model consisting of one subcortical (near thalamus) and two cortical (Brodmann areas 1 and 3b) dipolar sources explained the measured HF oscillations. At both the subcortical and cortical levels, HF oscillations were significantly smaller during bimodal (somatosensory plus auditory) than unimodal (somatosensory only) stimulation. A delay differential equation model was developed to investigate interactions within the 3-node thalamo-cortical network. Importantly, a significant change in the eigenfrequency of Brodmann area 3b was related to the context-dependent modulation, whereas there was no change in the network coupling. CONCLUSION This model strongly suggests cortico-thalamic feedback from both cortical Brodmann areas 1 and 3b to the thalamus; with the 3-node network model, thalamo-cortical feedback could be described. SIGNIFICANCE Frequency encoding plays an important role in contextual modulation in the somatosensory thalamo-cortical network.
Affiliation(s)
- T Götz
- Biomagnetic Center, Hans Berger Department of Neurology, Jena University Hospital, Erlanger Allee 101, 07747 Jena, Germany; Center for Sepsis Control and Care, Jena University Hospital, Erlanger Allee 101, 07747 Jena, Germany
- T Milde
- Institute of Medical Statistics, Computer Sciences and Documentation, Jena University Hospital, Bachstrasse 18, 07740 Jena, Germany
- G Curio
- Neurophysics Group, Department of Neurology, Campus Benjamin Franklin, Charité - University Medicine Berlin, Hindenburgdamm 30, 12200 Berlin, Germany
- S Debener
- Faculty VI, Department of Psychology, Neuropsychology Lab, University of Oldenburg, 26111 Oldenburg, Germany
- T Lehmann
- Institute of Medical Statistics, Computer Sciences and Documentation, Jena University Hospital, Bachstrasse 18, 07740 Jena, Germany
- L Leistritz
- Institute of Medical Statistics, Computer Sciences and Documentation, Jena University Hospital, Bachstrasse 18, 07740 Jena, Germany
- O W Witte
- Hans Berger Department of Neurology, Jena University Hospital, Erlanger Allee 101, 07747 Jena, Germany; Center for Sepsis Control and Care, Jena University Hospital, Erlanger Allee 101, 07747 Jena, Germany
- H Witte
- Institute of Medical Statistics, Computer Sciences and Documentation, Jena University Hospital, Bachstrasse 18, 07740 Jena, Germany
- J Haueisen
- Biomagnetic Center, Hans Berger Department of Neurology, Jena University Hospital, Erlanger Allee 101, 07747 Jena, Germany; Institute of Biomedical Engineering and Informatics, Faculty of Computer Science and Automation, Technical University Ilmenau, Gustav-Kirchhoff-Straße 2, 98693 Ilmenau, Germany.
42
Abstract
The auditory cortex is a network of areas in the part of the brain that receives inputs from the subcortical auditory pathways in the brainstem and thalamus. Through an elaborate network of intrinsic and extrinsic connections, the auditory cortex is thought to bring about the conscious perception of sound and provide a basis for the comprehension and production of meaningful utterances. In this chapter, the organization of auditory cortex is described with an emphasis on its anatomic features and the flow of information within the network. These features are then used to introduce key neurophysiologic concepts that are being intensively studied in humans and animal models. The discussion is presented in the context of our working model of the primate auditory cortex and extensions to humans. The material is presented in the context of six underlying principles, which reflect distinct, but related, aspects of anatomic and physiologic organization: (1) the division of auditory cortex into regions; (2) the subdivision of regions into areas; (3) tonotopic organization of areas; (4) thalamocortical connections; (5) serial and parallel organization of connections; and (6) topographic relationships between auditory and auditory-related areas. Although the functional roles of the various components of this network remain poorly defined, a more complete understanding is emerging from ongoing studies that link auditory behavior to its anatomic and physiologic substrates.
Affiliation(s)
- Troy A Hackett
- Department of Hearing and Speech Sciences, Vanderbilt University School of Medicine and Department of Psychology, Vanderbilt University, Nashville, TN, USA.
43
Albouy P, Lévêque Y, Hyde KL, Bouchet P, Tillmann B, Caclin A. Boosting pitch encoding with audiovisual interactions in congenital amusia. Neuropsychologia 2014; 67:111-20. [PMID: 25499145 DOI: 10.1016/j.neuropsychologia.2014.12.006] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2014] [Revised: 12/03/2014] [Accepted: 12/05/2014] [Indexed: 11/19/2022]
Abstract
The combination of information across senses can enhance perception, as revealed for example by decreased reaction times or improved stimulus detection. Interestingly, these facilitatory effects have been shown to be maximal when responses to unisensory modalities are weak. The present study investigated whether audiovisual facilitation can be observed in congenital amusia, a music-specific disorder primarily ascribed to impairments of pitch processing. Amusic individuals and their matched controls performed two tasks. In Task 1, they were required to detect auditory, visual, or audiovisual stimuli as rapidly as possible. In Task 2, they were required to detect as accurately and as rapidly as possible a pitch change within an otherwise monotonic 5-tone sequence that was presented either only auditorily (A condition), or simultaneously with a temporally congruent, but otherwise uninformative visual stimulus (AV condition). Results of Task 1 showed that amusics exhibit typical auditory and visual detection, and typical audiovisual integration capacities: both amusics and controls exhibited shorter response times for audiovisual stimuli than for either auditory stimuli or visual stimuli. Results of Task 2 revealed that both groups benefited from simultaneous uninformative visual stimuli to detect pitch changes: accuracy was higher and response times shorter in the AV condition than in the A condition. The audiovisual improvements of response times were observed for different pitch interval sizes depending on the group. These results suggest that both typical listeners and amusic individuals can benefit from multisensory integration to improve their pitch processing abilities and that this benefit varies as a function of task difficulty. 
These findings are a first step toward exploiting multisensory paradigms to reduce pitch-related deficits in congenital amusia, notably by suggesting that audiovisual paradigms are effective within an appropriate range of unimodal performance.
Affiliation(s)
- Philippe Albouy
- Lyon Neuroscience Research Center, Brain Dynamics and Cognition Team & Auditory Cognition and Psychoacoustics Team, CRNL, INSERM U1028, CNRS UMR5292, Lyon, F-69000, France; University Lyon 1, Lyon F-69000, France; Montreal Neurological Institute, McGill University, 3801 University Street Montreal, QC, Canada H3A2B4; International Laboratory for Brain Music and Sound Research, University of Montreal and McGill University, Canada.
- Yohana Lévêque
- Lyon Neuroscience Research Center, Brain Dynamics and Cognition Team & Auditory Cognition and Psychoacoustics Team, CRNL, INSERM U1028, CNRS UMR5292, Lyon, F-69000, France; University Lyon 1, Lyon F-69000, France
- Krista L Hyde
- International Laboratory for Brain Music and Sound Research, University of Montreal and McGill University, Canada
- Patrick Bouchet
- Lyon Neuroscience Research Center, Brain Dynamics and Cognition Team & Auditory Cognition and Psychoacoustics Team, CRNL, INSERM U1028, CNRS UMR5292, Lyon, F-69000, France; University Lyon 1, Lyon F-69000, France
- Barbara Tillmann
- University Lyon 1, Lyon F-69000, France
- Anne Caclin
- Lyon Neuroscience Research Center, Brain Dynamics and Cognition Team & Auditory Cognition and Psychoacoustics Team, CRNL, INSERM U1028, CNRS UMR5292, Lyon, F-69000, France; University Lyon 1, Lyon F-69000, France
44
Tyll S, Budinger E, Noesselt T. Thalamic influences on multisensory integration. Commun Integr Biol 2014. [DOI: 10.4161/cib.15222] [Citation(s) in RCA: 57] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
45
Zilber N, Ciuciu P, Gramfort A, Azizi L, van Wassenhove V. Supramodal processing optimizes visual perceptual learning and plasticity. Neuroimage 2014; 93 Pt 1:32-46. [DOI: 10.1016/j.neuroimage.2014.02.017] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2013] [Revised: 02/05/2014] [Accepted: 02/13/2014] [Indexed: 11/25/2022] Open
46
van Atteveldt N, Murray MM, Thut G, Schroeder CE. Multisensory integration: flexible use of general operations. Neuron 2014; 81:1240-1253. [PMID: 24656248 DOI: 10.1016/j.neuron.2014.02.044] [Citation(s) in RCA: 176] [Impact Index Per Article: 17.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 02/27/2014] [Indexed: 11/25/2022]
Abstract
Research into the anatomical substrates and "principles" for integrating inputs from separate sensory surfaces has yielded divergent findings. This suggests that multisensory integration is flexible and context dependent and underlines the need for dynamically adaptive neuronal integration mechanisms. We propose that flexible multisensory integration can be explained by a combination of canonical, population-level integrative operations, such as oscillatory phase resetting and divisive normalization. These canonical operations subsume multisensory integration into a fundamental set of principles as to how the brain integrates all sorts of information, and they are being used proactively and adaptively. We illustrate this proposition by unifying recent findings from different research themes such as timing, behavioral goal, and experience-related differences in integration.
Affiliation(s)
- Nienke van Atteveldt
- Neuroimaging & Neuromodeling group, Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences, Meibergdreef 47, 1105 BA Amsterdam, The Netherlands; Department of Educational Neuroscience, Faculty of Psychology & Education and Institute LEARN!, VU University Amsterdam, van der Boechorststraat 1, 1081 BT Amsterdam, The Netherlands; Department of Cognitive Neuroscience, Faculty of Psychology & Neuroscience, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands.
- Micah M Murray
- The Laboratory for Investigative Neurophysiology (the LINE), Neuropsychology and Neurorehabilitation Service and Radiodiagnostic Service, University Hospital Center and University of Lausanne, Avenue Pierre Decker 5, 1011 Lausanne, Switzerland; EEG Brain Mapping Core, Centre for Biomedical Imaging (CIBM), Rue du Bugnon 46, 1011 Lausanne, Switzerland
- Gregor Thut
- Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, Glasgow, G12 8QB, UK
- Charles E Schroeder
- Columbia University, Department Psychiatry, and the New York State Psychiatric Institute, 1051 Riverside Drive, New York, NY 10032, USA; Nathan S. Kline Institute, Cognitive Neuroscience & Schizophrenia Program, 140 Old Orangeburg Road, Orangeburg, NY 10962, USA.
47
Hackett TA, de la Mothe LA, Camalier CR, Falchier A, Lakatos P, Kajikawa Y, Schroeder CE. Feedforward and feedback projections of caudal belt and parabelt areas of auditory cortex: refining the hierarchical model. Front Neurosci 2014; 8:72. [PMID: 24795550 PMCID: PMC4001064 DOI: 10.3389/fnins.2014.00072] [Citation(s) in RCA: 47] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2014] [Accepted: 03/25/2014] [Indexed: 12/21/2022] Open
Abstract
Our working model of the primate auditory cortex recognizes three major regions (core, belt, parabelt), subdivided into thirteen areas. The connections between areas are topographically ordered in a manner consistent with information flow along two major anatomical axes: core-belt-parabelt and caudal-rostral. Remarkably, most of the connections supporting this model were revealed using retrograde tracing techniques. Little is known about laminar circuitry, as anterograde tracing of axon terminations has rarely been used. The purpose of the present study was to examine the laminar projections of three areas of auditory cortex, pursuant to analysis of all areas. The selected areas were: middle lateral belt (ML); caudomedial belt (CM); and caudal parabelt (CPB). Injections of anterograde tracers yielded data consistent with major features of our model, and also new findings that compel modifications. Results supporting the model were: (1) feedforward projection from ML and CM terminated in CPB; (2) feedforward projections from ML and CPB terminated in rostral areas of the belt and parabelt; and (3) feedback projections typified inputs to the core region from belt and parabelt. At odds with the model was the convergence of feedforward inputs into rostral medial belt from ML and CPB. This was unexpected since CPB is at a higher stage of the processing hierarchy, with mainly feedback projections to all other belt areas. Lastly, extending the model, feedforward projections from CM, ML, and CPB overlapped in the temporal parietal occipital area (TPO) in the superior temporal sulcus, indicating significant auditory influence on sensory processing in this region. The combined results refine our working model and highlight the need to complete studies of the laminar inputs to all areas of auditory cortex. Their documentation is essential for developing informed hypotheses about the neurophysiological influences of inputs to each layer and area.
Affiliation(s)
- Troy A Hackett
- Department of Hearing and Speech Sciences, Vanderbilt University School of Medicine, Nashville, TN, USA
- Corrie R Camalier
- Department of Hearing and Speech Sciences, Vanderbilt University School of Medicine, Nashville, TN, USA; Laboratory of Neuropsychology, National Institute of Mental Health, Bethesda, MD, USA
- Arnaud Falchier
- Cognitive Neuroscience and Schizophrenia Program, Nathan Kline Institute, Orangeburg, NY, USA; Department of Psychiatry, Columbia University College of Physicians and Surgeons, New York, NY, USA
- Peter Lakatos
- Cognitive Neuroscience and Schizophrenia Program, Nathan Kline Institute, Orangeburg, NY, USA; Department of Psychiatry, Columbia University College of Physicians and Surgeons, New York, NY, USA
- Yoshinao Kajikawa
- Cognitive Neuroscience and Schizophrenia Program, Nathan Kline Institute, Orangeburg, NY, USA; Department of Psychiatry, Columbia University College of Physicians and Surgeons, New York, NY, USA
- Charles E Schroeder
- Cognitive Neuroscience and Schizophrenia Program, Nathan Kline Institute, Orangeburg, NY, USA; Department of Psychiatry, Columbia University College of Physicians and Surgeons, New York, NY, USA
48
Kusmierek P, Rauschecker JP. Selectivity for space and time in early areas of the auditory dorsal stream in the rhesus monkey. J Neurophysiol 2014; 111:1671-85. [PMID: 24501260 DOI: 10.1152/jn.00436.2013] [Citation(s) in RCA: 40] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
The respective roles of ventral and dorsal cortical processing streams are still under discussion in both vision and audition. We characterized neural responses in the caudal auditory belt cortex, an early dorsal stream region of the macaque. We found fast neural responses with elevated temporal precision as well as neurons selective to sound location. These populations were partly segregated: Neurons in a caudomedial area more precisely followed temporal stimulus structure but were less selective to spatial location. Response latencies in this area were even shorter than in primary auditory cortex. Neurons in a caudolateral area showed higher selectivity for sound source azimuth and elevation, but responses were slower and matching to temporal sound structure was poorer. In contrast to the primary area and other regions studied previously, latencies in the caudal belt neurons were not negatively correlated with best frequency. Our results suggest that two functional substreams may exist within the auditory dorsal stream.
Affiliation(s)
- Pawel Kusmierek
- Department of Neuroscience, Georgetown University Medical Center, Washington, District of Columbia
49
van Atteveldt NM, Peterson BS, Schroeder CE. Contextual control of audiovisual integration in low-level sensory cortices. Hum Brain Mapp 2013; 35:2394-411. [PMID: 23982946 DOI: 10.1002/hbm.22336] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2013] [Revised: 05/07/2013] [Accepted: 05/15/2013] [Indexed: 11/06/2022] Open
Abstract
Potential sources of multisensory influences on low-level sensory cortices include direct projections from sensory cortices of different modalities, as well as more indirect feedback inputs from higher order multisensory cortical regions. These multiple architectures may be functionally complementary, but the exact roles and inter-relationships of the circuits are unknown. Using a fully balanced context manipulation, we tested the hypotheses that: (1) feedforward and lateral pathways subserve speed functions, such as detecting peripheral stimuli. Multisensory integration effects in this context are predicted in peripheral fields of low-level sensory cortices. (2) Slower feedback pathways underpin accuracy functions, such as object discrimination. Integration effects in this context are predicted in higher-order association cortices and central/foveal fields of low-level sensory cortex. We used functional magnetic resonance imaging to compare the effects of central versus peripheral stimulation on audiovisual integration, while varying speed and accuracy requirements for behavioral responses. We found that interactions of task demands and stimulus eccentricity in low-level sensory cortices are more complex than would be predicted by a simple dichotomy such as our hypothesized peripheral/speed and foveal/accuracy functions. Additionally, our findings point to individual differences in integration that may be related to skills and strategy. Overall, our findings suggest that instead of using fixed, specialized pathways, the exact circuits and mechanisms that are used for low-level multisensory integration are much more flexible and contingent upon both individual and contextual factors than previously assumed.
Affiliation(s)
- Nienke M van Atteveldt
- Department of Cognitive Neuroscience, Maastricht University, Maastricht, The Netherlands; Neuroimaging and Neuromodeling Group, Netherlands Institute for Neuroscience, Amsterdam, The Netherlands; Department of Psychiatry, New York State Psychiatric Institute, Columbia University, New York, New York
50
Bolognini N, Convento S, Rossetti A, Merabet LB. Multisensory processing after a brain damage: Clues on post-injury crossmodal plasticity from neuropsychology. Neurosci Biobehav Rev 2013; 37:269-78. [DOI: 10.1016/j.neubiorev.2012.12.006] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2012] [Revised: 10/03/2012] [Accepted: 12/09/2012] [Indexed: 11/28/2022]