1
Gartside SE, Olthof BM, Rees A. Motor, somatosensory, and executive cortical areas elicit monosynaptic and polysynaptic neuronal activity in the auditory midbrain. Hear Res 2024; 447:109009. [PMID: 38670009] [DOI: 10.1016/j.heares.2024.109009]
Abstract
We recently reported that the central nucleus of the inferior colliculus (the auditory midbrain) is innervated by glutamatergic pyramidal cells originating not only in auditory cortex (AC), but also in multiple 'non-auditory' regions of the cerebral cortex. Here, in anaesthetised rats, we used optogenetics and electrical stimulation, combined with recording in the inferior colliculus, to determine the functional influence of these descending connections. Specifically, we determined the extent of monosynaptic excitation and the influence of these descending connections on spontaneous activity in the inferior colliculus. A retrograde virus encoding both green fluorescent protein (GFP) and channelrhodopsin (ChR2) injected into the central nucleus of the inferior colliculus (ICc) resulted in GFP expression in discrete groups of cells in multiple areas of the cerebral cortex. Light stimulation of AC and primary motor cortex (M1) caused local activation of cortical neurones and increased the firing rate of neurones in ICc, indicating a direct excitatory input from AC and M1 to ICc with a restricted distribution. In naïve animals, electrical stimulation at multiple different sites within M1, secondary motor, somatosensory, and prefrontal cortices increased firing rate in ICc. However, it was notable that stimulation at some adjacent sites failed to influence firing at the recording site in ICc. Responses in ICc comprised singular spikes of constant shape and size which occurred with a short, fixed latency (∼5 ms), consistent with monosynaptic excitation of individual ICc units. Increasing the stimulus current decreased the latency of these spikes, suggesting more rapid depolarization of cortical neurones, and increased the number of (usually adjacent) channels on which a monosynaptic spike was seen, suggesting recruitment of increasing numbers of cortical neurones. Electrical stimulation of cortical regions also evoked longer latency, longer duration increases in firing activity, comprising multiple units with spikes occurring with significant temporal jitter, consistent with polysynaptic excitation. Increasing the stimulus current increased the number of spikes in these polysynaptic responses and increased the number of channels on which the responses were observed, although the magnitude of the responses always diminished away from the most activated channels. Together, our findings indicate that descending connections from motor, somatosensory, and executive cortical regions directly activate small numbers of ICc neurones and that this in turn leads to extensive polysynaptic activation of local circuits within the ICc.
Affiliation(s)
- Sarah E Gartside
- Centre for Transformative Neuroscience and Biosciences Institute, Newcastle University, Newcastle upon Tyne, NE2 4HH, United Kingdom
- Bas MJ Olthof
- Centre for Transformative Neuroscience and Biosciences Institute, Newcastle University, Newcastle upon Tyne, NE2 4HH, United Kingdom
- Adrian Rees
- Centre for Transformative Neuroscience and Biosciences Institute, Newcastle University, Newcastle upon Tyne, NE2 4HH, United Kingdom
2
Pennartz CMA, Oude Lohuis MN, Olcese U. How 'visual' is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220336. [PMID: 37545313] [PMCID: PMC10404929] [DOI: 10.1098/rstb.2022.0336]
Abstract
The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude the possibility that the visual cortex processes information other than that of retinal origin, or that other brain structures contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this non-visual information is poorly understood. Here we review recent evidence, coming primarily from studies in rodents, arguing that non-visual and motor effects in visual cortex play a role in visual processing itself, for instance by disentangling direct auditory effects on visual cortex from effects of sound-evoked orofacial movement. These findings are placed in a broader framework casting vision in terms of predictive processing under control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is exclusively constructed by the visual cortical system, we propose that visual percepts are generated by a larger network - the extended visual system - spanning other sensory cortices, supramodal areas and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Cyriel M. A. Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Matthijs N. Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Champalimaud Research, Champalimaud Foundation, 1400-038 Lisbon, Portugal
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
3
Coen P, Sit TPH, Wells MJ, Carandini M, Harris KD. Mouse frontal cortex mediates additive multisensory decisions. Neuron 2023; 111:2432-2447.e13. [PMID: 37295419] [PMCID: PMC10957398] [DOI: 10.1016/j.neuron.2023.05.008]
Abstract
The brain can combine auditory and visual information to localize objects. However, the cortical substrates underlying audiovisual integration remain uncertain. Here, we show that mouse frontal cortex combines auditory and visual evidence; that this combination is additive, mirroring behavior; and that it evolves with learning. We trained mice in an audiovisual localization task. Inactivating frontal cortex impaired responses to either sensory modality, while inactivating visual or parietal cortex affected only visual stimuli. Recordings from >14,000 neurons indicated that after task learning, activity in the anterior part of frontal area MOs (secondary motor cortex) additively encodes visual and auditory signals, consistent with the mice's behavioral strategy. An accumulator model applied to these sensory representations reproduced the observed choices and reaction times. These results suggest that frontal cortex adapts through learning to combine evidence across sensory cortices, providing a signal that is transformed into a binary decision by a downstream accumulator.
Affiliation(s)
- Philip Coen
- UCL Queen Square Institute of Neurology, University College London, London, UK; UCL Institute of Ophthalmology, University College London, London, UK
- Timothy P H Sit
- Sainsbury-Wellcome Center, University College London, London, UK
- Miles J Wells
- UCL Queen Square Institute of Neurology, University College London, London, UK
- Matteo Carandini
- UCL Institute of Ophthalmology, University College London, London, UK
- Kenneth D Harris
- UCL Queen Square Institute of Neurology, University College London, London, UK
4
Danieli K, Guyon A, Bethus I. Episodic Memory formation: A review of complex Hippocampus input pathways. Prog Neuropsychopharmacol Biol Psychiatry 2023; 126:110757. [PMID: 37086812] [DOI: 10.1016/j.pnpbp.2023.110757]
Abstract
Memories of everyday experiences involve the encoding of a rich and dynamic representation of present objects and their contextual features. Traditionally, the resulting mnemonic trace is referred to as Episodic Memory, i.e. the "what", "where" and "when" of a lived episode. The journey for such memory trace encoding begins with the perceptual data of an experienced episode handled in sensory brain regions. The information is then streamed to cortical areas located in the ventral Medial Temporal Lobe, which produces multi-modal representations concerning either the objects (in the Perirhinal cortex) or the spatial and contextual features (in the parahippocampal region) of the episode. Then, this high-level data is gated through the Entorhinal Cortex and forwarded to the Hippocampal Formation, where all the pieces get bound together. Eventually, the resulting encoded neural pattern is relayed back to the Neocortex for stable consolidation. This review will detail these different stages and provide a systematic overview of the major cortical streams toward the Hippocampus relevant for Episodic Memory encoding.
Affiliation(s)
- Alice Guyon
- Université Côte d'Azur, Neuromod Institute, France; Université Côte d'Azur, CNRS UMR 7275, IPMC, Valbonne, France
- Ingrid Bethus
- Université Côte d'Azur, Neuromod Institute, France; Université Côte d'Azur, CNRS UMR 7275, IPMC, Valbonne, France
5
Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022; 16:1010211. [PMID: 36330342] [PMCID: PMC9622781] [DOI: 10.3389/fnins.2022.1010211]
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
6
Bigelow J, Morrill RJ, Olsen T, Hasenstaub AR. Visual modulation of firing and spectrotemporal receptive fields in mouse auditory cortex. Current Research in Neurobiology 2022; 3:100040. [DOI: 10.1016/j.crneur.2022.100040]
7
Rezaul Karim AKM, Proulx MJ, de Sousa AA, Likova LT. Neuroplasticity and Crossmodal Connectivity in the Normal, Healthy Brain. Psychology & Neuroscience 2021; 14:298-334. [PMID: 36937077] [PMCID: PMC10019101] [DOI: 10.1037/pne0000258]
Abstract
Objective: Neuroplasticity enables the brain to establish new crossmodal connections or reorganize old connections which are essential to perceiving a multisensorial world. The intent of this review is to identify and summarize the current developments in neuroplasticity and crossmodal connectivity, and deepen understanding of how crossmodal connectivity develops in the normal, healthy brain, highlighting novel perspectives about the principles that guide this connectivity. Methods: To the above end, a narrative review is carried out. The data documented in prior relevant studies in neuroscience, psychology and other related fields available in a wide range of prominent electronic databases are critically assessed, synthesized, interpreted with qualitative rather than quantitative elements, and linked together to form new propositions and hypotheses about neuroplasticity and crossmodal connectivity. Results: Three major themes are identified. First, it appears that neuroplasticity operates by following eight fundamental principles and crossmodal integration operates by following three principles. Second, two different forms of crossmodal connectivity, namely direct crossmodal connectivity and indirect crossmodal connectivity, are suggested to operate in both unisensory and multisensory perception. Third, three principles possibly guide the development of crossmodal connectivity into adulthood. These are labeled as the principle of innate crossmodality, the principle of evolution-driven 'neuromodular' reorganization and the principle of multimodal experience. These principles are combined to develop a three-factor interaction model of crossmodal connectivity. Conclusions: The hypothesized principles and the proposed model together advance understanding of neuroplasticity, the nature of crossmodal connectivity, and how such connectivity develops in the normal, healthy brain.
8
Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well. Expected final online publication date for the Annual Review of Vision Science, Volume 7 is September 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Affiliation(s)
- Meredith N Schmehl
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA; Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA; Department of Psychology & Neuroscience, Duke University, Durham, North Carolina 27708, USA; Department of Computer Science, Duke University, Durham, North Carolina 27708, USA; Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA; Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
9
Munoz-Montoya F, Juan MC, Mendez-Lopez M, Molla R, Abad F, Fidalgo C. SLAM-based augmented reality for the assessment of short-term spatial memory. A comparative study of visual versus tactile stimuli. PLoS One 2021; 16:e0245976. [PMID: 33539369] [PMCID: PMC7861452] [DOI: 10.1371/journal.pone.0245976]
Abstract
The assessment of human spatial short-term memory has mainly been performed using visual stimuli and less frequently using auditory stimuli. This paper presents a framework for the development of SLAM-based Augmented Reality applications for the assessment of spatial memory. Using this framework, an AR mobile application was developed for this type of assessment involving visual and tactile stimuli. The task to be carried out with the AR application is divided into two phases: 1) a learning phase, in which participants physically walk around a room and have to remember the location of simple geometrical shapes; and 2) an evaluation phase, in which the participants are asked to recall the location of the shapes. A study comparing the performance outcomes using visual and tactile stimuli was carried out. Fifty-three participants performed the task under the two conditions (Tactile vs Visual), separated by more than two months (within-subject design). The number of shapes placed correctly was similar for both conditions. However, the group that used the tactile stimulus spent significantly more time completing the task and required significantly more attempts. The performance outcomes were independent of gender. Some significant correlations among variables related to the performance outcomes and other tests were found. The following significant correlations among variables related to the performance outcomes using visual stimuli and the participants' subjective variables were also found: 1) the greater the number of correctly placed shapes, the greater the perceived competence; 2) the more attempts required, the less the perceived competence. We also found that perceived enjoyment was higher when a higher sense of presence was induced. Our results suggest that tactile stimuli are valid stimuli to exploit for the assessment of the ability to memorize spatial-tactile associations, but that the ability to memorize spatial-visual associations is dominant. Our results also show that gender does not affect these types of memory tasks.
Affiliation(s)
- Francisco Munoz-Montoya
- Instituto Universitario de Automática e Informática Industrial, Universitat Politècnica de València, Valencia, Spain
- M.-Carmen Juan
- Instituto Universitario de Automática e Informática Industrial, Universitat Politècnica de València, Valencia, Spain
- Magdalena Mendez-Lopez
- Departamento de Psicología y Sociología, IIS Aragón, Universidad de Zaragoza, Teruel, Spain
- Ramon Molla
- Instituto Universitario de Automática e Informática Industrial, Universitat Politècnica de València, Valencia, Spain
- Francisco Abad
- Instituto Universitario de Automática e Informática Industrial, Universitat Politècnica de València, Valencia, Spain
- Camino Fidalgo
- Departamento de Psicología y Sociología, IIS Aragón, Universidad de Zaragoza, Teruel, Spain
10
11
Differential Rapid Plasticity in Auditory and Visual Responses in the Primarily Multisensory Orbitofrontal Cortex. eNeuro 2020; 7:ENEURO.0061-20.2020. [PMID: 32424057] [PMCID: PMC7294472] [DOI: 10.1523/eneuro.0061-20.2020]
Abstract
Given the connectivity of orbitofrontal cortex (OFC) with the sensory areas and areas involved in goal execution, it is likely that OFC, along with its function in reward processing, also has a role to play in perception-based multisensory decision-making. To understand mechanisms involved in multisensory decision-making, it is important to first know the encoding of different sensory stimuli in single neurons of the mouse OFC. Ruling out effects of behavioral state, memory, and other factors, we studied the anesthetized mouse OFC responses to auditory, visual, and audiovisual/multisensory stimuli, multisensory associations and sensory-driven input organization to the OFC. Almost all OFC single neurons were found to be multisensory in nature, with sublinear to supralinear integration of the component unisensory stimuli. With a novel multisensory oddball stimulus set, we show that the OFC receives both unisensory and multisensory inputs, a finding further corroborated by retrograde tracers showing labeling in secondary auditory and visual cortices, which we find to also have similar multisensory integration and responses. With long audiovisual pairing/association, we show rapid plasticity in OFC single neurons, with a strong visual bias, leading to a strong depression of auditory responses and effective enhancement of visual responses. Such rapid multisensory association-driven plasticity is absent in the auditory and visual cortices, suggesting its emergence in the OFC. Based on the above results, we propose a hypothetical local circuit model in the OFC that integrates auditory and visual information and participates in computing stimulus value in dynamic multisensory environments.
12
Chikara RK, Lo WC, Ko LW. Exploration of Brain Connectivity during Human Inhibitory Control Using Inter-Trial Coherence. Sensors 2020; 20:1722. [PMID: 32204504] [PMCID: PMC7147711] [DOI: 10.3390/s20061722]
Abstract
Inhibitory control is a cognitive process that inhibits a response. It is used in everyday activities, such as driving a motorcycle, driving a car and playing a game. The effect of this process can be compared to a red traffic light in the real world. In this study, we investigated brain connectivity under human inhibitory control using the phase lag index and inter-trial coherence (ITC). Human brain connectivity gives a more accurate representation of the functional neural network. Results from electroencephalography (EEG) data sets, recorded from twelve healthy subjects during left- and right-hand inhibition using an auditory stop-signal task, showed that the inter-trial coherence in the delta (1-4 Hz) and theta (4-7 Hz) bands increased over the frontal and temporal lobes of the brain. These EEG delta and theta band activities are neural markers that have been related to human inhibition in the frontal lobe. In addition, inter-trial coherence in the delta-theta and alpha (8-12 Hz) bands increased at the occipital lobe with visual stimulation. Moreover, the highest brain connectivity under inhibitory control was observed in the frontal lobe, between channels F3 and F4, compared to the temporal and occipital lobes. The greater EEG coherence and phase lag index in the frontal lobe are associated with human response inhibition. These findings reveal new insights into the neural network of brain connectivity and the underlying mechanisms during human response inhibition.
Affiliation(s)
- Rupesh Kumar Chikara
- Department of Biological Science and Technology, College of Biological Science and Technology, National Chiao Tung University, Hsinchu 300, Taiwan
- Center For Intelligent Drug Systems and Smart Bio-devices (IDS2B), National Chiao Tung University, Hsinchu 300, Taiwan
- Wei-Cheng Lo
- Department of Biological Science and Technology, College of Biological Science and Technology, National Chiao Tung University, Hsinchu 300, Taiwan
- Institute of Bioinformatics and Systems Biology, National Chiao Tung University, Hsinchu 300, Taiwan
- Li-Wei Ko
- Department of Biological Science and Technology, College of Biological Science and Technology, National Chiao Tung University, Hsinchu 300, Taiwan
- Center For Intelligent Drug Systems and Smart Bio-devices (IDS2B), National Chiao Tung University, Hsinchu 300, Taiwan
- Institute of Bioinformatics and Systems Biology, National Chiao Tung University, Hsinchu 300, Taiwan
- The Drug Development and Value Creation Research Center, Kaohsiung Medical University, Kaohsiung 80708, Taiwan
13
Stein BE, Rowland BA. Using superior colliculus principles of multisensory integration to reverse hemianopia. Neuropsychologia 2020; 141:107413. [PMID: 32113921] [DOI: 10.1016/j.neuropsychologia.2020.107413]
Abstract
The diversity of our senses conveys many advantages; it enables them to compensate for one another when needed, and the information they provide about a common event can be integrated to facilitate its processing and, ultimately, adaptive responses. These cooperative interactions are produced by multisensory neurons. A well-studied model in this context is the multisensory neuron in the output layers of the superior colliculus (SC). These neurons integrate and amplify their cross-modal (e.g., visual-auditory) inputs, thereby enhancing the physiological salience of the initiating event and the probability that it will elicit SC-mediated detection, localization, and orientation behavior. Repeated experience with the same visual-auditory stimulus can also increase the neuron's sensitivity to these individual inputs. This observation raised the possibility that such plasticity could be engaged to restore visual responsiveness when compromised. For example, unilateral lesions of visual cortex compromise the visual responsiveness of neurons in the multisensory output layers of the ipsilesional SC and produce profound contralesional blindness (hemianopia). The possibility that multisensory plasticity could restore the visual responses of these neurons, and reverse blindness, was tested in the cat model of hemianopia. Hemianopic subjects were repeatedly presented with spatiotemporally congruent visual-auditory stimulus pairs in the blinded hemifield on a daily or weekly basis. After several weeks of this multisensory exposure paradigm, visual responsiveness was restored in SC neurons and behavioral responses were elicited by visual stimuli in the previously blind hemifield. The constraints on the effectiveness of this procedure proved to be the same as those constraining SC multisensory plasticity: whereas repetitions of a congruent visual-auditory stimulus were highly effective, neither exposure to its individual component stimuli nor to these stimuli in non-congruent configurations was effective. The restored visual responsiveness proved to be robust, highly competitive with that in the intact hemifield, and sufficient to support visual discrimination.
Affiliation(s)
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Blvd, Winston-Salem, NC, 27157, USA
- Benjamin A Rowland
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Blvd, Winston-Salem, NC, 27157, USA
14
Kumpik DP, Campbell C, Schnupp JWH, King AJ. Re-weighting of Sound Localization Cues by Audiovisual Training. Front Neurosci 2019; 13:1164. [PMID: 31802997] [PMCID: PMC6873890] [DOI: 10.3389/fnins.2019.01164]
Abstract
Sound localization requires the integration in the brain of auditory spatial cues generated by interactions with the external ears, head and body. Perceptual learning studies have shown that the relative weighting of these cues can change in a context-dependent fashion if their relative reliability is altered. One factor that may influence this process is vision, which tends to dominate localization judgments when both modalities are present and induces a recalibration of auditory space if they become misaligned. It is not known, however, whether vision can alter the weighting of individual auditory localization cues. Using virtual acoustic space stimuli, we measured changes in subjects’ sound localization biases and binaural localization cue weights after ∼50 min of training on audiovisual tasks in which visual stimuli were either informative or not about the location of broadband sounds. Four different spatial configurations were used in which we varied the relative reliability of the binaural cues: interaural time differences (ITDs) and frequency-dependent interaural level differences (ILDs). In most subjects and experiments, ILDs were weighted more highly than ITDs before training. When visual cues were spatially uninformative, some subjects showed a reduction in auditory localization bias and the relative weighting of ILDs increased after training with congruent binaural cues. ILDs were also upweighted if they were paired with spatially-congruent visual cues, and the largest group-level improvements in sound localization accuracy occurred when both binaural cues were matched to visual stimuli. These data suggest that binaural cue reweighting reflects baseline differences in the relative weights of ILDs and ITDs, but is also shaped by the availability of congruent visual stimuli. Training subjects with consistently misaligned binaural and visual cues produced the ventriloquism aftereffect, i.e., a corresponding shift in auditory localization bias, without affecting the inter-subject variability in sound localization judgments or their binaural cue weights. Our results show that the relative weighting of different auditory localization cues can be changed by training in ways that depend on their reliability as well as the availability of visual spatial information, with the largest improvements in sound localization likely to result from training with fully congruent audiovisual information.
Affiliation(s)
- Daniel P Kumpik
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Connor Campbell
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Jan W H Schnupp
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Andrew J King
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
15
Macharadze T, Budinger E, Brosch M, Scheich H, Ohl FW, Henschke JU. Early Sensory Loss Alters the Dendritic Branching and Spine Density of Supragranular Pyramidal Neurons in Rodent Primary Sensory Cortices. Front Neural Circuits 2019; 13:61. [PMID: 31611778] [PMCID: PMC6773815] [DOI: 10.3389/fncir.2019.00061]
Abstract
Multisensory integration in primary auditory (A1), visual (V1), and somatosensory cortex (S1) is substantially mediated by their direct interconnections and by thalamic inputs across the sensory modalities. We have previously shown in rodents (Mongolian gerbils) that during postnatal development, the anatomical and functional strengths of these crossmodal and also of sensory matched connections are determined by early auditory, somatosensory, and visual experience. Because supragranular layer III pyramidal neurons are major targets of corticocortical and thalamocortical connections, we investigated in this follow-up study how the loss of early sensory experience changes their dendritic morphology. Gerbils were sensory deprived early in development by either bilateral sciatic nerve transection at postnatal day (P) 5, ototoxic inner hair cell damage at P10, or eye enucleation at P10. Sholl and branch order analyses of Golgi-stained layer III pyramidal neurons at P28, which demarcates the end of the sensory critical period in this species, revealed that visual and somatosensory deprivation leads to a general increase of apical and basal dendritic branching in A1, V1, and S1. In contrast, dendritic branching, particularly of apical dendrites, decreased in all three areas following auditory deprivation. Generally, the number of spines, and consequently spine density, along the apical and basal dendrites decreased in both sensory deprived and non-deprived cortical areas. Therefore, we conclude that the loss of early sensory experience induces a refinement of corticocortical crossmodal and other cortical and thalamic connections by pruning of dendritic spines at the end of the critical period. Based on present and previous own results and on findings from the literature, we propose a scenario for multisensory development following early sensory loss.
Affiliation(s)
- Tamar Macharadze
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany; Clinic for Anesthesiology and Intensive Care Medicine, Otto von Guericke University Hospital, Magdeburg, Germany
- Eike Budinger
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany
- Michael Brosch
- Center for Behavioral Brain Sciences, Magdeburg, Germany; Special Lab Primate Neurobiology, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Henning Scheich
- Center for Behavioral Brain Sciences, Magdeburg, Germany; Emeritus Group Lifelong Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Frank W Ohl
- Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Institute for Biology, Otto von Guericke University, Magdeburg, Germany
- Julia U Henschke
- Institute of Cognitive Neurology and Dementia Research (IKND), Otto von Guericke University, Magdeburg, Germany
16
Modulation of the Visual to Auditory Human Inhibitory Brain Network: An EEG Dipole Source Localization Study. Brain Sci 2019; 9:216. [PMID: 31461954] [PMCID: PMC6770157] [DOI: 10.3390/brainsci9090216]
Abstract
Auditory alarms are used to direct people's attention to critical events in complicated environments. The capacity to identify auditory alarms and take the right action in our daily life is critical. In this work, we investigate how auditory alarms affect the neural networks of human inhibition. We used a well-established stop-signal (go/no-go) task to measure the effect of visual stimuli and auditory alarms on the human brain. In this experiment, go trials used visual stimulation, via a square or circle symbol, and stop trials used auditory stimulation, via an auditory alarm. Electroencephalography (EEG) signals from twelve subjects were acquired and analyzed using an advanced EEG dipole source localization method via independent component analysis (ICA) and EEG-coherence analysis. Behaviorally, the visual stimulus elicited a significantly higher accuracy rate (96.35%) than the auditory stimulus (57.07%) during inhibitory control. EEG theta and beta band power increases in the right middle frontal gyrus (rMFG) were associated with human inhibitory control. In addition, delta, theta, alpha, and beta band increases in the right cingulate gyrus (rCG) and delta band increases in both right superior temporal gyrus (rSTG) and left superior temporal gyrus (lSTG) were associated with the network changes induced by auditory alarms. We further observed that theta-alpha and beta bands between the lSTG-rMFG and lSTG-rSTG pathways had higher connectivity magnitudes in the brain network when the task switched from performing the visual task to receiving the auditory alarms. These findings could be useful for further understanding the human brain in realistic environments.
17
Visual input shapes the auditory frequency responses in the inferior colliculus of mouse. Hear Res 2019; 381:107777. [PMID: 31430633] [DOI: 10.1016/j.heares.2019.107777]
Abstract
The integration of visual and auditory information is important for humans and animals to build an accurate and coherent perception of the external world. Although some principles of audiovisual integration have been established, little insight has been gained into its functional purpose. In this study, we investigated the functional influence of dynamic visual input on auditory frequency processing by recording single unit activity in the central nucleus of the inferior colliculus (ICc). Results showed that the auditory responses of ICc neurons to sound frequencies could be enhanced or suppressed by visual stimuli, even though the same visual stimuli induced no neural responses when presented alone. For each ICc neuron, the most effective visual stimuli were located at the same azimuth as the auditory stimuli and preceded them by ∼20 ms. Additionally, visual stimuli could steepen or flatten the frequency tuning curves (FTCs) of ICc neurons through various visual effects at each responsive frequency. The degree of modulation of auditory FTCs depended on the minimal thresholds (MTs) of ICc neurons: as MTs increased, the degree of modulation decreased. Due to the non-homogeneous distribution of MTs, which was lowest at 10 kHz, visual modulation of auditory FTCs was frequency-specific: the closer the characteristic frequency (CF) was to 10 kHz, the greater the modulation. Thus, visual modulation of auditory frequency responses in ICc depends not only on the visual stimulus but also on the auditory characteristics of ICc neurons. These results suggest a moment-to-moment visual modulation of auditory frequency responses that increases auditory frequency sensitivity to audiovisual stimuli in real time. Furthermore, in the long term such modulation could serve to instruct auditory adaptive plasticity to maintain necessary and accurate auditory detection and perceptual behavior.
18
Huang Y, Heil P, Brosch M. Associations between sounds and actions in early auditory cortex of nonhuman primates. eLife 2019; 8:43281. [PMID: 30946010] [PMCID: PMC6467566] [DOI: 10.7554/elife.43281]
Abstract
An individual may need to take different actions to the same stimulus in different situations to achieve a given goal. The selection of the appropriate action hinges on the previously learned associations between stimuli, actions, and outcomes in the situations. Here, using a go/no-go paradigm and a symmetrical reward, we show that early auditory cortex of nonhuman primates represents such associations, in both the spiking activity and the local field potentials. Sound-evoked neuronal responses changed with sensorimotor associations shortly after sound onset, and the neuronal responses were largest when the sound signaled that a no-go response was required in a trial to obtain a reward. Our findings suggest that association processes take place in the auditory system and do not necessarily rely on association cortex. Thus, auditory cortex may contribute to a rapid selection of the appropriate motor responses to sounds during goal-directed behavior.
Affiliation(s)
- Ying Huang
- Special Lab Primate Neurobiology, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany
- Peter Heil
- Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany; Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Michael Brosch
- Special Lab Primate Neurobiology, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany
19
Meijer GT, Mertens PEC, Pennartz CMA, Olcese U, Lansink CS. The circuit architecture of cortical multisensory processing: Distinct functions jointly operating within a common anatomical network. Prog Neurobiol 2019; 174:1-15. [PMID: 30677428] [DOI: 10.1016/j.pneurobio.2019.01.004]
Abstract
Our perceptual systems continuously process sensory inputs from different modalities and organize these streams of information such that our subjective representation of the outside world is a unified experience. By doing so, they also enable further cognitive processing and behavioral action. While cortical multisensory processing has been extensively investigated in terms of psychophysics and mesoscale neural correlates, an in depth understanding of the underlying circuit-level mechanisms is lacking. Previous studies on circuit-level mechanisms of multisensory processing have predominantly focused on cue integration, i.e. the mechanism by which sensory features from different modalities are combined to yield more reliable stimulus estimates than those obtained by using single sensory modalities. In this review, we expand the framework on the circuit-level mechanisms of cortical multisensory processing by highlighting that multisensory processing is a family of functions - rather than a single operation - which involves not only the integration but also the segregation of modalities. In addition, multisensory processing not only depends on stimulus features, but also on cognitive resources, such as attention and memory, as well as behavioral context, to determine the behavioral outcome. We focus on rodent models as a powerful instrument to study the circuit-level bases of multisensory processes, because they enable combining cell-type-specific recording and interventional techniques with complex behavioral paradigms. We conclude that distinct multisensory processes share overlapping anatomical substrates, are implemented by diverse neuronal micro-circuitries that operate in parallel, and are flexibly recruited based on factors such as stimulus features and behavioral constraints.
Affiliation(s)
- Guido T Meijer
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands
- Paul E C Mertens
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands
- Cyriel M A Pennartz
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands
- Umberto Olcese
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands
- Carien S Lansink
- Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands; Research Priority Program Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, the Netherlands
20
Cai Y, Chen G, Zhong X, Yu G, Mo H, Jiang J, Chen X, Zhao F, Zheng Y. Influence of Audiovisual Training on Horizontal Sound Localization and Its Related ERP Response. Front Hum Neurosci 2018; 12:423. [PMID: 30405377] [PMCID: PMC6206041] [DOI: 10.3389/fnhum.2018.00423]
Abstract
The objective was to investigate the influence of audiovisual training on horizontal sound localization and the underlying neurological mechanisms using a combination of psychoacoustic and electrophysiological (i.e., event-related potential, ERP) measurements on sound localization. Audiovisual stimuli were used in the training group, whilst the control group was trained using auditory stimuli only. Training sessions were undertaken once per day for three consecutive days. Sound localization accuracy was evaluated daily after training, using psychoacoustic tests. ERP responses were measured on the first and last day of tasks. Sound localization was significantly improved in the audiovisual training group when compared to the control group. Moreover, a significantly greater reduction in front-back confusion ratio for both trained and untrained angles was found between pre- and post-test in the audiovisual training group. ERP measurement showed a decrease in N1 amplitude and an increase in P2 amplitude in both groups. However, changes in late components were only found in the audiovisual training group, with an increase in P400 amplitude and decrease in N500 amplitude. These results suggest that the interactive effect of audiovisual localization training is likely to be mediated at a relatively late cognitive processing stage.
Affiliation(s)
- Yuexin Cai
- Department of Otolaryngology, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China; Institute of Hearing and Speech-Language Science, Sun Yat-sen University, Guangzhou, China
- Guisheng Chen
- Department of Otolaryngology, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China; Institute of Hearing and Speech-Language Science, Sun Yat-sen University, Guangzhou, China
- Xiaoli Zhong
- Acoustic Laboratory, Physics Department, South China University of Technology, Guangzhou, China
- Guangzheng Yu
- Acoustic Laboratory, Physics Department, South China University of Technology, Guangzhou, China
- Hanjie Mo
- Department of Otolaryngology, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China; Institute of Hearing and Speech-Language Science, Sun Yat-sen University, Guangzhou, China
- Jiajia Jiang
- Department of Otolaryngology, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China; Institute of Hearing and Speech-Language Science, Sun Yat-sen University, Guangzhou, China
- Xiaoting Chen
- Department of Otolaryngology, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China; Institute of Hearing and Speech-Language Science, Sun Yat-sen University, Guangzhou, China
- Fei Zhao
- Department of Speech Language Therapy and Hearing Science, Cardiff Metropolitan University, Cardiff, United Kingdom; Department of Hearing and Speech Science, Xinhua College, Sun Yat-sen University, Guangzhou, China
- Yiqing Zheng
- Department of Otolaryngology, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China; Institute of Hearing and Speech-Language Science, Sun Yat-sen University, Guangzhou, China
21
Chaplin TA, Allitt BJ, Hagan MA, Rosa MGP, Rajan R, Lui LL. Auditory motion does not modulate spiking activity in the middle temporal and medial superior temporal visual areas. Eur J Neurosci 2018; 48:2013-2029. [PMID: 30019438] [DOI: 10.1111/ejn.14071]
Abstract
The integration of multiple sensory modalities is a key aspect of brain function, allowing animals to take advantage of concurrent sources of information to make more accurate perceptual judgments. For many years, multisensory integration in the cerebral cortex was deemed to occur only in high-level "polysensory" association areas. However, more recent studies have suggested that cross-modal stimulation can also influence neural activity in areas traditionally considered to be unimodal. In particular, several human neuroimaging studies have reported that extrastriate areas involved in visual motion perception are also activated by auditory motion, and may integrate audiovisual motion cues. However, the exact nature and extent of the effects of auditory motion on the visual cortex have not been studied at the single neuron level. We recorded the spiking activity of neurons in the middle temporal (MT) and medial superior temporal (MST) areas of anesthetized marmoset monkeys upon presentation of unimodal stimuli (moving auditory or visual patterns), as well as bimodal stimuli (concurrent audiovisual motion). Despite robust, direction selective responses to visual motion, none of the sampled neurons responded to auditory motion stimuli. Moreover, concurrent moving auditory stimuli had no significant effect on the ability of single MT and MST neurons, or populations of simultaneously recorded neurons, to discriminate the direction of motion of visual stimuli (moving random dot patterns with varying levels of motion noise). Our findings do not support the hypothesis that direct interactions between MT, MST and areas low in the hierarchy of auditory areas underlie audiovisual motion integration.
Affiliation(s)
- Tristan A Chaplin
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Benjamin J Allitt
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia
- Maureen A Hagan
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Marcello G P Rosa
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Ramesh Rajan
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Leo L Lui
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
22
Bimbard C, Demene C, Girard C, Radtke-Schuller S, Shamma S, Tanter M, Boubenec Y. Multi-scale mapping along the auditory hierarchy using high-resolution functional UltraSound in the awake ferret. eLife 2018; 7:35028. [PMID: 29952750] [PMCID: PMC6039176] [DOI: 10.7554/elife.35028]
Abstract
A major challenge in neuroscience is to longitudinally monitor whole brain activity across multiple spatial scales in the same animal. Functional UltraSound (fUS) is an emerging technology that offers images of cerebral blood volume over large brain portions. Here we show for the first time its capability to resolve the functional organization of sensory systems at multiple scales in awake animals, both within small structures by precisely mapping and differentiating sensory responses, and between structures by elucidating the connectivity scheme of top-down projections. We demonstrate that fUS provides stable (over days), yet rapid, highly-resolved 3D tonotopic maps in the auditory pathway of awake ferrets, thus revealing its unprecedented functional resolution (100/300 µm). This was performed in four different brain regions, including very small (1–2 mm³ in size), deeply situated subcortical (8 mm deep) and previously undescribed structures in the ferret. Furthermore, we used fUS to map long-distance projections from frontal cortex, a key source of sensory response modulation, to auditory cortex.
Affiliation(s)
- Célian Bimbard: Audition Team, Laboratoire des Systèmes Perceptifs CNRS UMR 8248, École Normale Supérieure, PSL Research University, Paris, France
- Charlie Demene: Institut Langevin, ESPCI ParisTech, INSERM U979, CNRS UMR 7587, PSL Research University, Paris, France
- Constantin Girard: Audition Team, Laboratoire des Systèmes Perceptifs CNRS UMR 8248, École Normale Supérieure, PSL Research University, Paris, France
- Susanne Radtke-Schuller: Audition Team, Laboratoire des Systèmes Perceptifs CNRS UMR 8248, École Normale Supérieure, PSL Research University, Paris, France
- Shihab Shamma: Audition Team, Laboratoire des Systèmes Perceptifs CNRS UMR 8248, École Normale Supérieure, PSL Research University, Paris, France; Institute for Systems Research, Department of Electrical and Computer Engineering, University of Maryland College Park, Maryland, United States
- Mickael Tanter: Institut Langevin, ESPCI ParisTech, INSERM U979, CNRS UMR 7587, PSL Research University, Paris, France
- Yves Boubenec: Audition Team, Laboratoire des Systèmes Perceptifs CNRS UMR 8248, École Normale Supérieure, PSL Research University, Paris, France

23
Cuppini C, Shams L, Magosso E, Ursino M. A biologically inspired neurocomputational model for audiovisual integration and causal inference. Eur J Neurosci 2018; 46:2481-2498. [PMID: 28949035 DOI: 10.1111/ejn.13725] [Citation(s) in RCA: 32] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2016] [Revised: 09/18/2017] [Accepted: 09/19/2017] [Indexed: 11/28/2022]
Abstract
Recently, experimental and theoretical research has focused on the brain's abilities to extract information from a noisy sensory environment and how cross-modal inputs are processed to solve the causal inference problem to provide the best estimate of external events. Despite the empirical evidence suggesting that the nervous system uses a statistically optimal and probabilistic approach in addressing these problems, little is known about the brain's architecture needed to implement these computations. The aim of this work was to realize a mathematical model, based on physiologically plausible hypotheses, to analyze the neural mechanisms underlying multisensory perception and causal inference. The model consists of three topologically organized layers: two encode auditory and visual stimuli separately, are reciprocally connected via excitatory synapses, and send excitatory connections to the third, downstream layer. This synaptic organization realizes two mechanisms of cross-modal interaction: the first is responsible for the sensory representation of the external stimuli, while the second solves the causal inference problem. We tested the network by comparing its results to behavioral data reported in the literature. Among others, the network can account for the ventriloquism illusion, the pattern of sensory bias and the percept of unity as a function of the spatial auditory-visual distance, and the dependence of the auditory error on the causal inference. Finally, simulation results are consistent with probability matching as the perceptual strategy used in auditory-visual spatial localization tasks, agreeing with the behavioral data. The model makes untested predictions that can be investigated in future behavioral experiments.
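As a normative reference point for the network model described above, the causal-inference computation it approximates can be written in a few lines. The sketch below uses a standard Gaussian formulation (all noise and prior parameters are assumed values, not taken from the paper): it returns the posterior probability that an auditory and a visual cue arose from a common source, which a probability-matching observer would then use to decide whether to report the fused or the unisensory estimate.

import numpy as np

def common_cause_posterior(xa, xv, sigma_a=4.0, sigma_v=1.5, sigma_p=10.0, p_common=0.5):
    """Posterior probability that auditory (xa) and visual (xv) cues share one cause.
    Gaussian likelihoods and a zero-mean spatial prior; all parameters are assumed."""
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2
    d = var_a * var_v + var_a * var_p + var_v * var_p
    # Likelihood of the pair under a single common source
    l_c1 = np.exp(-((xa - xv)**2 * var_p + xa**2 * var_v + xv**2 * var_a) / (2 * d)) / \
           (2 * np.pi * np.sqrt(d))
    # Likelihood under two independent sources
    l_c2 = (np.exp(-xa**2 / (2 * (var_a + var_p))) / np.sqrt(2 * np.pi * (var_a + var_p)) *
            np.exp(-xv**2 / (2 * (var_v + var_p))) / np.sqrt(2 * np.pi * (var_v + var_p)))
    return l_c1 * p_common / (l_c1 * p_common + l_c2 * (1 - p_common))

# Example: a 6 deg audiovisual discrepancy still favours a common cause slightly
xa, xv = 6.0, 0.0
print(f"p(common cause | xa={xa}, xv={xv}) = {common_cause_posterior(xa, xv):.2f}")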
Affiliation(s)
- Cristiano Cuppini: Department of Electrical, Electronic and Information Engineering, University of Bologna, Viale Risorgimento 2, I40136, Bologna, Italy
- Ladan Shams: Department of Psychology, Department of BioEngineering, Interdepartmental Neuroscience Program, University of California, Los Angeles, CA, USA
- Elisa Magosso: Department of Electrical, Electronic and Information Engineering, University of Bologna, Viale Risorgimento 2, I40136, Bologna, Italy
- Mauro Ursino: Department of Electrical, Electronic and Information Engineering, University of Bologna, Viale Risorgimento 2, I40136, Bologna, Italy

24
Freeman LCA, Wood KC, Bizley JK. Multisensory stimuli improve relative localisation judgments compared to unisensory auditory or visual stimuli. THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA 2018; 143:EL516. [PMID: 29960438 PMCID: PMC6018061 DOI: 10.1121/1.5042759] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/20/2018] [Revised: 04/25/2018] [Accepted: 05/29/2018] [Indexed: 06/08/2023]
Abstract
Observers performed a relative localisation task in which they reported whether the second of two sequentially presented signals occurred to the left or right of the first. Stimuli were detectability-matched auditory, visual, or auditory-visual signals and the goal was to compare changes in performance with eccentricity across modalities. Visual performance was superior to auditory at the midline, but inferior in the periphery, while auditory-visual performance exceeded both at all locations. No such advantage was seen when performance for auditory-only trials was contrasted with trials in which the first stimulus was auditory-visual and the second auditory only.
Affiliation(s)
- Laura C A Freeman: Ear Institute, University College London, 332 Gray's Inn Road, London, WC1X 8EE, United Kingdom
- Katherine C Wood: Ear Institute, University College London, 332 Gray's Inn Road, London, WC1X 8EE, United Kingdom
- Jennifer K Bizley: Ear Institute, University College London, 332 Gray's Inn Road, London, WC1X 8EE, United Kingdom

25
Bach EC, Vaughan JW, Stein BE, Rowland BA. Pulsed Stimuli Elicit More Robust Multisensory Enhancement than Expected. Front Integr Neurosci 2018; 11:40. [PMID: 29354037 PMCID: PMC5758560 DOI: 10.3389/fnint.2017.00040] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2017] [Accepted: 12/15/2017] [Indexed: 11/28/2022] Open
Abstract
Neurons in the superior colliculus (SC) integrate cross-modal inputs to generate responses that are more robust than to either input alone, and are frequently greater than their sum (superadditive enhancement). Previously, the principles of a real-time multisensory transform were identified and used to accurately predict a neuron's responses to combinations of brief flashes and noise bursts. However, environmental stimuli frequently have more complex temporal structures that elicit very different response dynamics than previously examined. The present study tested whether such stimuli (i.e., pulsed) would be treated similarly by the multisensory transform. Pulsed visual and auditory stimuli elicited responses composed of higher discharge rates that had multiple peaks temporally aligned to the stimulus pulses. Combinations of pulsed cues elicited multiple peaks of superadditive enhancement within the response window. Measured over the entire response, this resulted in larger enhancements than expected given those elicited by non-pulsed (“sustained”) stimuli. However, as with sustained stimuli, the dynamics of multisensory responses to pulsed stimuli were highly related to the temporal dynamics of the unisensory inputs. This suggests that the specific characteristics of the multisensory transform are not determined by the external features of the cross-modal stimulus configuration; rather, the temporal structure and alignment of the unisensory inputs are the dominant factors driving the magnitude of the multisensory product.
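The superadditivity criterion used in this line of work can be illustrated with a short calculation. The snippet below uses made-up spike counts (not data from the study) to compare a measured multisensory response against the additive prediction, i.e. the sum of the two unisensory responses.

import numpy as np

vis = np.array([12, 15, 11, 14])   # spikes per trial, visual alone (hypothetical)
aud = np.array([8, 10, 9, 7])      # spikes per trial, auditory alone (hypothetical)
av  = np.array([28, 31, 27, 30])   # spikes per trial, combined stimulation (hypothetical)

additive_prediction = vis.mean() + aud.mean()
superadditivity = 100 * (av.mean() - additive_prediction) / additive_prediction
print(f"predicted (sum): {additive_prediction:.1f} spikes, "
      f"observed: {av.mean():.1f} spikes, "
      f"superadditive enhancement: {superadditivity:.0f}%")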
Affiliation(s)
- Eva C Bach: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, United States
- John W Vaughan: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, United States
- Barry E Stein: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, United States
- Benjamin A Rowland: Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, United States

26
Shrem T, Murray MM, Deouell LY. Auditory-visual integration modulates location-specific repetition suppression of auditory responses. Psychophysiology 2017; 54:1663-1675. [PMID: 28752567 DOI: 10.1111/psyp.12955] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2016] [Revised: 05/10/2017] [Accepted: 06/03/2017] [Indexed: 11/28/2022]
Abstract
Space is a dimension shared by different modalities, but at what stage spatial encoding is affected by multisensory processes is unclear. Early studies observed attenuation of N1/P2 auditory evoked responses following repetition of sounds from the same location. Here, we asked whether this effect is modulated by audiovisual interactions. In two experiments, using a repetition-suppression paradigm, we presented pairs of tones in free field, where the test stimulus was a tone presented at a fixed lateral location. Experiment 1 established a neural index of auditory spatial sensitivity, by comparing the degree of attenuation of the response to test stimuli when they were preceded by an adapter sound at the same location versus 30° or 60° away. We found that the degree of attenuation at the P2 latency was inversely related to the spatial distance between the test stimulus and the adapter stimulus. In Experiment 2, the adapter stimulus was a tone presented from the same location or a more medial location than the test stimulus. The adapter stimulus was accompanied by a simultaneous flash displayed orthogonally from one of the two locations. Sound-flash incongruence reduced accuracy in a same-different location discrimination task (i.e., the ventriloquism effect) and reduced the location-specific repetition-suppression at the P2 latency. Importantly, this multisensory effect included topographic modulations, indicative of changes in the relative contribution of underlying sources across conditions. Our findings suggest that the auditory response at the P2 latency is affected by spatially selective brain activity, which is affected crossmodally by visual information.
Affiliation(s)
- Talia Shrem: Human Cognitive Neuroscience Lab, Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
- Micah M Murray: Laboratory for Investigative Neurophysiology (The LINE), Department of Radiology, and Neuropsychology and Neurorehabilitation Service, University Hospital Center and University of Lausanne, Lausanne, Switzerland; EEG Brain Mapping Core, Center for Biomedical Imaging (CIBM), Lausanne, Switzerland; Department of Ophthalmology, University of Lausanne, Jules Gonin Eye Hospital, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, Tennessee, USA
- Leon Y Deouell: Human Cognitive Neuroscience Lab, Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel; The Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel

27
Town SM, Brimijoin WO, Bizley JK. Egocentric and allocentric representations in auditory cortex. PLoS Biol 2017; 15:e2001878. [PMID: 28617796 PMCID: PMC5472254 DOI: 10.1371/journal.pbio.2001878] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2016] [Accepted: 05/08/2017] [Indexed: 11/18/2022] Open
Abstract
A key function of the brain is to provide a stable representation of an object's location in the world. In hearing, sound azimuth and elevation are encoded by neurons throughout the auditory system, and auditory cortex is necessary for sound localization. However, the coordinate frame in which neurons represent sound space remains undefined: classical spatial receptive fields in head-fixed subjects can be explained either by sensitivity to sound source location relative to the head (egocentric) or relative to the world (allocentric encoding). This coordinate frame ambiguity can be resolved by studying freely moving subjects; here we recorded spatial receptive fields in the auditory cortex of freely moving ferrets. We found that most spatially tuned neurons represented sound source location relative to the head across changes in head position and direction. In addition, we also recorded a small number of neurons in which sound location was represented in a world-centered coordinate frame. We used measurements of spatial tuning across changes in head position and direction to explore the influence of sound source distance and speed of head movement on auditory cortical activity and spatial tuning. Modulation depth of spatial tuning increased with distance for egocentric but not allocentric units, whereas, for both populations, modulation was stronger at faster movement speeds. Our findings suggest that early auditory cortex primarily represents sound source location relative to ourselves but that a minority of cells can represent sound location in the world independent of our own position.
Affiliation(s)
- Stephen M. Town: Ear Institute, University College London, London, United Kingdom
- W. Owen Brimijoin: MRC/CSO Institute of Hearing Research – Scottish Section, Glasgow, United Kingdom

28
Integrating Spatial Working Memory and Remote Memory: Interactions between the Medial Prefrontal Cortex and Hippocampus. Brain Sci 2017; 7:brainsci7040043. [PMID: 28420200 PMCID: PMC5406700 DOI: 10.3390/brainsci7040043] [Citation(s) in RCA: 52] [Impact Index Per Article: 7.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2017] [Revised: 04/11/2017] [Accepted: 04/14/2017] [Indexed: 12/22/2022] Open
Abstract
In recent years, two separate research streams have focused on information sharing between the medial prefrontal cortex (mPFC) and hippocampus (HC). Research into spatial working memory has shown that successful execution of many types of behaviors requires synchronous activity in the theta range between the mPFC and HC, whereas studies of memory consolidation have shown that shifts in area dependency may be temporally modulated. While the nature of the information that is being communicated is still unclear, spatial working memory and remote memory recall are reliant on interactions between these two areas. This review will present recent evidence showing that these two processes are not as separate as they first appeared. We will also present a novel conceptualization of the nature of the medial prefrontal representation and how this might help explain this area’s role in spatial working memory and remote memory recall.

29
Hammond‐Kenny A, Bajo VM, King AJ, Nodal FR. Behavioural benefits of multisensory processing in ferrets. Eur J Neurosci 2017; 45:278-289. [PMID: 27740711 PMCID: PMC5298019 DOI: 10.1111/ejn.13440] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2016] [Revised: 09/22/2016] [Accepted: 10/10/2016] [Indexed: 12/29/2022]
Abstract
Enhanced detection and discrimination, along with faster reaction times, are the most typical behavioural manifestations of the brain's capacity to integrate multisensory signals arising from the same object. In this study, we examined whether multisensory behavioural gains are observable across different components of the localization response that are potentially under the command of distinct brain regions. We measured the ability of ferrets to localize unisensory (auditory or visual) and spatiotemporally coincident auditory-visual stimuli of different durations that were presented from one of seven locations spanning the frontal hemifield. During the localization task, we recorded the head movements made following stimulus presentation, as a metric for assessing the initial orienting response of the ferrets, as well as the subsequent choice of which target location to approach to receive a reward. Head-orienting responses to auditory-visual stimuli were more accurate and faster than those made to visual but not auditory targets, suggesting that these movements were guided principally by sound alone. In contrast, approach-to-target localization responses were more accurate and faster to spatially congruent auditory-visual stimuli throughout the frontal hemifield than to either visual or auditory stimuli alone. Race model inequality analysis of head-orienting reaction times and approach-to-target response times indicates that different processes, probability summation and neural integration, respectively, are likely to be responsible for the effects of multisensory stimulation on these two measures of localization behaviour.
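The race model inequality analysis mentioned above asks whether multisensory reaction times are faster than statistical facilitation alone can explain: for every time t, the race model requires P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t). A minimal sketch with simulated reaction times follows (all distributions and parameter values are assumed, not the authors' data).

import numpy as np

rng = np.random.default_rng(1)
rt_a  = rng.normal(320, 40, 500)   # auditory-only reaction times (ms, simulated)
rt_v  = rng.normal(350, 45, 500)   # visual-only reaction times (ms, simulated)
rt_av = rng.normal(290, 35, 500)   # audiovisual reaction times (ms, simulated)

t = np.linspace(150, 500, 200)
def ecdf(rt):
    # Empirical cumulative distribution of reaction times evaluated at the times in t
    return np.searchsorted(np.sort(rt), t, side="right") / rt.size

violation = ecdf(rt_av) - np.minimum(ecdf(rt_a) + ecdf(rt_v), 1.0)
print(f"maximum race-model violation: {violation.max():.3f}")
# Positive values indicate the inequality is violated, i.e. evidence for neural integration
# rather than probability summation.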
Affiliation(s)
- Amy Hammond‐Kenny: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, UK
- Victoria M. Bajo: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, UK
- Andrew J. King: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, UK
- Fernando R. Nodal: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, UK

30
Butler BE, Chabot N, Lomber SG. Quantifying and comparing the pattern of thalamic and cortical projections to the posterior auditory field in hearing and deaf cats. J Comp Neurol 2016; 524:3042-63. [DOI: 10.1002/cne.24005] [Citation(s) in RCA: 26] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2015] [Revised: 03/21/2016] [Accepted: 03/24/2016] [Indexed: 11/08/2022]
Affiliation(s)
- Blake E. Butler: Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7
- Nicole Chabot: Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7
- Stephen G. Lomber: Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Department of Psychology, University of Western Ontario, London, Ontario, Canada N6A 5C2; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7; National Centre for Audiology, University of Western Ontario, London, Ontario, Canada N6G 1H1

31
Bizley JK, Maddox RK, Lee AKC. Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms. Trends Neurosci 2016; 39:74-85. [PMID: 26775728 PMCID: PMC4738154 DOI: 10.1016/j.tins.2015.12.007] [Citation(s) in RCA: 47] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2015] [Revised: 12/03/2015] [Accepted: 12/11/2015] [Indexed: 11/30/2022]
Abstract
Crossmodal integration is a term applicable to many phenomena in which one sensory modality influences task performance or perception in another sensory modality. We distinguish the term binding as one that should be reserved specifically for the process that underpins perceptual object formation. To unambiguously differentiate binding from other types of integration, behavioral and neural studies must investigate perception of a feature orthogonal to the features that link the auditory and visual stimuli. We argue that supporting true perceptual binding (as opposed to other processes such as decision-making) is one role for cross-sensory influences in early sensory cortex. These early multisensory interactions may therefore form a physiological substrate for the bottom-up grouping of auditory and visual stimuli into auditory-visual (AV) objects. Crossmodal integration and binding have been treated as synonymous in the literature, with no clear delineation between perceptual changes and other interactions such as decision-making. Crossmodal binding is proposed as a distinct form of integration leading to multisensory object formation. Multisensory stimuli are most beneficial in noisy situations, but few studies use stimulus competition to investigate the processes underpinning multisensory integration. Evidence suggests that both visual and auditory attention are object-based – all features within an object are enhanced and there is a cost to attending features across versus within objects. Multisensory interactions can be observed throughout the brain, including early sensory cortex. The role of early sensory cortex in multisensory integration is unknown, but may underlie crossmodal binding.
Affiliation(s)
- Jennifer K Bizley: University College London (UCL) Ear Institute, 332 Gray's Inn Road, London, WC1X 8EE, UK
- Ross K Maddox: Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
- Adrian K C Lee: Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA; Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA

32
DeLoss DJ, Andersen GJ. Aging, Spatial Disparity, and the Sound-Induced Flash Illusion. PLoS One 2015; 10:e0143773. [PMID: 26619352 PMCID: PMC4664268 DOI: 10.1371/journal.pone.0143773] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/16/2015] [Accepted: 11/09/2015] [Indexed: 11/18/2022] Open
Abstract
The present study examined age-related differences in multisensory integration and the effect of spatial disparity on the sound-induced flash illusion, an illusion used in previous research to assess age-related differences in multisensory integration. Prior to participation in the study, both younger and older participants demonstrated their ability to detect 1-2 visual flashes and 1-2 auditory beeps presented unimodally. After passing the pre-test, participants were then presented 1-2 flashes paired with 0-2 beeps that originated from one of five speakers positioned equidistantly 100 cm from the participant. One speaker was positioned directly below the screen, two speakers were positioned 50 cm to the left and right of the center of the screen, and two more speakers were positioned 100 cm to the left and right of the center of the screen. Participants were told to report the number of flashes presented and to ignore the beeps. Both age groups showed a significant effect of the beeps on the perceived number of flashes. However, neither younger nor older individuals showed any significant effect of spatial disparity on the sound-induced flash illusion. The presence of a congruent number of beeps increased accuracy for both older and younger individuals. Reaction time data were also analyzed. As expected, older individuals showed significantly longer reaction times than younger individuals. In addition, both older and younger individuals showed a significant increase in reaction time for fusion trials, where two flashes and one beep are perceived as a single flash, as compared to congruent single-flash trials. This increase in reaction time was not found for fission trials, where one flash and two beeps were perceived as two flashes. This suggests that processing may differ between the two forms of illusion, fission as compared to fusion.
Affiliation(s)
- Denton J. DeLoss: Department of Psychology, University of California Riverside, Riverside, California, United States of America
- George J. Andersen: Department of Psychology, University of California Riverside, Riverside, California, United States of America

33
Becoming a mother-circuit plasticity underlying maternal behavior. Curr Opin Neurobiol 2015; 35:49-56. [PMID: 26143475 DOI: 10.1016/j.conb.2015.06.007] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/16/2015] [Accepted: 06/15/2015] [Indexed: 11/20/2022]
Abstract
The transition to motherhood is a dramatic event during the lifetime of many animals. In mammals, motherhood is accompanied by hormonal changes in the brain that start during pregnancy, followed by experience dependent plasticity after parturition. Together, these changes prime the nervous system of the mother for efficient nurturing of her offspring. Recent work has described how neural circuits are modified during the transition to motherhood. Here we discuss changes in the auditory cortex during motherhood as a model for maternal plasticity in sensory systems. We compare classical plasticity paradigms with changes that arise naturally in mothers, highlighting current efforts to establish a mechanistic understanding of plasticity and its different components in the context of maternal behavior.

34
Bizley JK, Bajo VM, Nodal FR, King AJ. Cortico-Cortical Connectivity Within Ferret Auditory Cortex. J Comp Neurol 2015; 523:2187-210. [PMID: 25845831 PMCID: PMC4737260 DOI: 10.1002/cne.23784] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/06/2014] [Revised: 03/29/2015] [Accepted: 04/01/2015] [Indexed: 12/29/2022]
Abstract
Despite numerous studies of auditory cortical processing in the ferret (Mustela putorius), very little is known about the connections between the different regions of the auditory cortex that have been characterized cytoarchitectonically and physiologically. We examined the distribution of retrograde and anterograde labeling after injecting tracers into one or more regions of ferret auditory cortex. Injections of different tracers at frequency‐matched locations in the core areas, the primary auditory cortex (A1) and anterior auditory field (AAF), of the same animal revealed the presence of reciprocal connections with overlapping projections to and from discrete regions within the posterior pseudosylvian and suprasylvian fields (PPF and PSF), suggesting that these connections are frequency specific. In contrast, projections from the primary areas to the anterior dorsal field (ADF) on the anterior ectosylvian gyrus were scattered and non‐overlapping, consistent with the non‐tonotopic organization of this field. The relative strength of the projections originating in each of the primary fields differed, with A1 predominantly targeting the posterior bank fields PPF and PSF, which in turn project to the ventral posterior field, whereas AAF projects more heavily to the ADF, which then projects to the anteroventral field and the pseudosylvian sulcal cortex. These findings suggest that parallel anterior and posterior processing networks may exist, although the connections between different areas often overlap and interactions were present at all levels. J. Comp. Neurol. 523:2187–2210, 2015. © 2015 Wiley Periodicals, Inc.
Affiliation(s)
- Jennifer K Bizley: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, United Kingdom; Ear Institute, University College London, London, WC1X 8EE, United Kingdom
- Victoria M Bajo: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, United Kingdom
- Andrew J King: Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, OX1 3PT, United Kingdom

35
Chabot N, Butler BE, Lomber SG. Differential modification of cortical and thalamic projections to cat primary auditory cortex following early- and late-onset deafness. J Comp Neurol 2015; 523:2297-320. [DOI: 10.1002/cne.23790] [Citation(s) in RCA: 43] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2014] [Revised: 04/07/2015] [Accepted: 04/08/2015] [Indexed: 12/26/2022]
Affiliation(s)
- Nicole Chabot: Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7
- Blake E. Butler: Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7
- Stephen G. Lomber: Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Psychology, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7; National Centre for Audiology, University of Western Ontario, London, Ontario, Canada N6A 1H1

36
van Wassenhove V, Grzeczkowski L. Visual-induced expectations modulate auditory cortical responses. Front Neurosci 2015; 9:11. [PMID: 25705174 PMCID: PMC4319385 DOI: 10.3389/fnins.2015.00011] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2014] [Accepted: 01/11/2015] [Indexed: 11/13/2022] Open
Abstract
Active sensing has important consequences on multisensory processing (Schroeder et al., 2010). Here, we asked whether in the absence of saccades, the position of the eyes and the timing of transient color changes of visual stimuli could selectively affect the excitability of auditory cortex by predicting the “where” and the “when” of a sound, respectively. Human participants were recorded with magnetoencephalography (MEG) while maintaining the position of their eyes on the left, right, or center of the screen. Participants counted color changes of the fixation cross while neglecting sounds which could be presented to the left, right, or both ears. First, clear alpha power increases were observed in auditory cortices, consistent with participants' attention directed to visual inputs. Second, color changes elicited robust modulations of auditory cortex responses (“when” prediction) seen as ramping activity, early alpha phase-locked responses, and enhanced high-gamma band responses in the contralateral side of sound presentation. Third, no modulations of auditory evoked or oscillatory activity were found to be specific to eye position. Altogether, our results suggest that visual transience can automatically elicit a prediction of “when” a sound will occur by changing the excitability of auditory cortices irrespective of the attended modality, eye position or spatial congruency of auditory and visual events. To the contrary, auditory cortical responses were not significantly affected by eye position suggesting that “where” predictions may require active sensing or saccadic reset to modulate auditory cortex responses, notably in the absence of spatial orientation to sounds.
Affiliation(s)
- Virginie van Wassenhove: CEA, DSV/I2BM, NeuroSpin; INSERM, Cognitive Neuroimaging Unit, U992; Université Paris-Sud, Gif-sur-Yvette, France
- Lukasz Grzeczkowski: CEA, DSV/I2BM, NeuroSpin; INSERM, Cognitive Neuroimaging Unit, U992; Université Paris-Sud, Gif-sur-Yvette, France; Laboratory of Psychophysics, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland

37
Multisensory training improves auditory spatial processing following bilateral cochlear implantation. J Neurosci 2014; 34:11119-30. [PMID: 25122908 DOI: 10.1523/jneurosci.4767-13.2014] [Citation(s) in RCA: 50] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022] Open
Abstract
Cochlear implants (CIs) partially restore hearing to the deaf by directly stimulating the inner ear. In individuals fitted with CIs, lack of auditory experience due to loss of hearing before language acquisition can adversely impact outcomes. For example, adults with early-onset hearing loss generally do not integrate inputs from both ears effectively when fitted with bilateral CIs (BiCIs). Here, we used an animal model to investigate the effects of long-term deafness on auditory localization with BiCIs and approaches for promoting the use of binaural spatial cues. Ferrets were deafened either at the age of hearing onset or as adults. All animals were implanted in adulthood, either unilaterally or bilaterally, and were subsequently assessed for their ability to localize sound in the horizontal plane. The unilaterally implanted animals were unable to perform this task, regardless of the duration of deafness. Among animals with BiCIs, early-onset hearing loss was associated with poor auditory localization performance, compared with late-onset hearing loss. However, performance in the early-deafened group with BiCIs improved significantly after multisensory training with interleaved auditory and visual stimuli. We demonstrate a possible neural substrate for this by showing a training-induced improvement in the responsiveness of auditory cortical neurons and in their sensitivity to interaural level differences, the principal localization cue available to BiCI users. Importantly, our behavioral and physiological evidence demonstrates a facilitative role for vision in restoring auditory spatial processing following potential cross-modal reorganization. These findings support investigation of a similar training paradigm in human CI users.

38
Stein BE, Stanford TR, Rowland BA. Development of multisensory integration from the perspective of the individual neuron. Nat Rev Neurosci 2014; 15:520-35. [PMID: 25158358 DOI: 10.1038/nrn3742] [Citation(s) in RCA: 211] [Impact Index Per Article: 21.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/18/2022]
Abstract
The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain’s use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn’s brain are not capable of multisensory integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in such a way that optimizes multisensory integrative capabilities for the environment in which the animal will function.

39
Associations of regional GABA and glutamate with intrinsic and extrinsic neural activity in humans—a review of multimodal imaging studies. Neurosci Biobehav Rev 2014; 47:36-52. [PMID: 25066091 DOI: 10.1016/j.neubiorev.2014.07.016] [Citation(s) in RCA: 148] [Impact Index Per Article: 14.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/22/2014] [Revised: 06/30/2014] [Accepted: 07/17/2014] [Indexed: 01/04/2023]
Abstract
The integration of multiple imaging modalities is becoming an increasingly well used research strategy for studying the human brain. The neurotransmitters glutamate and GABA particularly lend themselves to such studies. This is because these transmitters are ubiquitous throughout the cortex, where they are the key constituents of the inhibition/excitation balance, and because they can be easily measured in vivo through magnetic resonance spectroscopy, as well as with select positron emission tomography approaches. How these transmitters underlie functional responses measured with techniques such as fMRI and EEG remains unclear, however, and was the target of this review. Consistently shown in the literature was a negative correlation between GABA concentrations and stimulus-induced activity within the measured region. Also consistently found was a positive correlation between glutamate concentrations and inter-regional activity relationships, both during tasks and at rest. These findings are outlined along with results from populations with mental disorders to give an overview of what brain imaging has suggested to date about the biochemical underpinnings of functional activity in health and disease. We conclude that the combination of functional and biochemical imaging in humans is an increasingly informative approach that does, however, require a number of key methodological and interpretive issues to be addressed before it can meet its potential.

40
Identifying and quantifying multisensory integration: a tutorial review. Brain Topogr 2014; 27:707-30. [PMID: 24722880 DOI: 10.1007/s10548-014-0365-7] [Citation(s) in RCA: 133] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2013] [Accepted: 03/26/2014] [Indexed: 12/19/2022]
Abstract
We process information from the world through multiple senses, and the brain must decide what information belongs together and what information should be segregated. One challenge in studying such multisensory integration is how to quantify the multisensory interactions, a challenge that is amplified by the host of methods that are now used to measure neural, behavioral, and perceptual responses. Many of the measures that have been developed to quantify multisensory integration (and which have been derived from single unit analyses), have been applied to these different measures without much consideration for the nature of the process being studied. Here, we provide a review focused on the means with which experimenters quantify multisensory processes and integration across a range of commonly used experimental methodologies. We emphasize the most commonly employed measures, including single- and multiunit responses, local field potentials, functional magnetic resonance imaging, and electroencephalography, along with behavioral measures of detection, accuracy, and response times. In each section, we will discuss the different metrics commonly used to quantify multisensory interactions, including the rationale for their use, their advantages, and the drawbacks and caveats associated with them. Also discussed are possible alternatives to the most commonly used metrics.
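Most of the single-unit measures surveyed in this tutorial derive from the classic multisensory enhancement index, which expresses the multisensory response relative to the best unisensory response. A minimal illustration with hypothetical mean firing rates (the function name and values are illustrative only):

def multisensory_enhancement(resp_av, resp_a, resp_v):
    """Percent enhancement of the multisensory response over the best unisensory response.
    Inputs are mean responses, e.g. spikes per trial; negative values indicate depression."""
    best_unisensory = max(resp_a, resp_v)
    return 100 * (resp_av - best_unisensory) / best_unisensory

print(multisensory_enhancement(resp_av=18.0, resp_a=6.0, resp_v=9.0))  # -> 100.0 (% enhancement)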

41
Kimura A. Diverse subthreshold cross-modal sensory interactions in the thalamic reticular nucleus: implications for new pathways of cross-modal attentional gating function. Eur J Neurosci 2014; 39:1405-18. [PMID: 24646412 DOI: 10.1111/ejn.12545] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2013] [Revised: 01/26/2014] [Accepted: 02/03/2014] [Indexed: 11/30/2022]
Abstract
Our attention to a sensory cue of a given modality interferes with attention to a sensory cue of another modality. However, an object emitting various sensory cues attracts attention more effectively. The thalamic reticular nucleus (TRN) could play a pivotal role in such cross-modal modulation of attention given that cross-modal sensory interaction takes place in the TRN, because the TRN occupies a highly strategic position to function in the control of gain and/or gating of sensory processing in the thalamocortical loop. In the present study cross-modal interactions between visual and auditory inputs were examined in single TRN cells of anesthetised rats using juxta-cellular recording and labeling techniques. Visual or auditory responses were modulated by subthreshold sound or light stimuli, respectively, in the majority of recordings (46 of 54 visual and 60 of 73 auditory cells). However, few bimodal sensory cells were found. Cells showing modulation of the sensory response were distributed in the whole visual and auditory sectors of the TRN. Modulated cells sent axonal projections to first-order or higher-order thalamic nuclei. Suppression predominated in modulation that took place not only in primary responses but also in late responses repeatedly evoked after sensory stimulation. Combined sensory stimulation also evoked de-novo responses, and modulated response latency and burst spiking. These results indicate that the TRN incorporates sensory inputs of different modalities into single cell activity to function in sensory processing in the lemniscal and non-lemniscal systems. This raises the possibility that the TRN constitutes neural pathways involved in cross-modal attentional gating.
Affiliation(s)
- Akihisa Kimura: Department of Physiology, Wakayama Medical University, Wakayama Kimiidera 811-1, Wakayama, 641-8509, Japan

42
Coding the meaning of sounds: contextual modulation of auditory responses in the basolateral amygdala. J Neurosci 2013; 33:17538-48. [PMID: 24174686 DOI: 10.1523/jneurosci.2205-13.2013] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Female mice emit a low-frequency harmonic (LFH) call in association with distinct behavioral contexts: mating and physical threat or pain. Here we report the results of acoustic, behavioral, and neurophysiological studies of the contextual analysis of these calls in CBA/CaJ mice. We first show that the acoustical features of the LFH call do not differ between contexts. We then show that male mice avoid the LFH call in the presence of a predator cue (cat fur) but are more attracted to the same exemplar of the call in the presence of a mating cue (female urine). The males thus use nonauditory cues to determine the meaning of the LFH call, but these cues do not generalize to noncommunication sounds, such as noise bursts. We then characterized neural correlates of contextual meaning of the LFH call in responses of basolateral amygdala (BLA) neurons from awake, freely moving mice. There were two major findings. First, BLA neurons typically displayed early excitation to all tested behaviorally aversive stimuli. Second, the nonauditory context modulates the BLA population response to the LFH call but not to the noncommunication sound. These results suggest that the meaning of communication calls is reflected in the spike discharge patterns of BLA neurons.

43
Relearning auditory spectral cues for locations inside and outside the visual field. J Assoc Res Otolaryngol 2013; 15:249-63. [PMID: 24306277 DOI: 10.1007/s10162-013-0429-5] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2013] [Accepted: 11/17/2013] [Indexed: 11/27/2022] Open
Abstract
Previous research has demonstrated that, over a period of weeks, the auditory system accommodates to changes in the monaural spectral cues for sound locations within the frontal region of space. We were interested to determine if similar accommodation could occur for locations in the posterior regions of space, i.e. in the absence of contemporaneous visual information that indicates any mismatch between the perceived and actual location of a sound source. To distort the normal spectral cues to sound location, eight listeners wore small moulds in each ear. HRTF recordings confirmed that while the moulds substantially altered the monaural spectral cues, sufficient residual cues were retained to provide a basis for relearning. Compared to control measures, sound localization performance initially decreased significantly, with a sevenfold increase in front-back confusions and elevation errors more than doubled. Subjects wore the moulds continuously for a period of up to 60 days (median 38 days), over which time performance improved but remained significantly poorer than control levels. Sound localization performance for frontal locations (audio-visual field) was compared with that for posterior space (audio-only field), and there was no significant difference between regions in either the extent or rate of accommodation. This suggests a common mechanism for both regions of space that does not rely on contemporaneous visual information as a teacher signal for recalibration of the auditory system to modified spectral cues.

44
Lanz F, Moret V, Rouiller EM, Loquet G. Multisensory Integration in Non-Human Primates during a Sensory-Motor Task. Front Hum Neurosci 2013; 7:799. [PMID: 24319421 PMCID: PMC3837444 DOI: 10.3389/fnhum.2013.00799] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2013] [Accepted: 11/03/2013] [Indexed: 12/12/2022] Open
Abstract
Daily, our central nervous system receives inputs via several sensory modalities, processes them, and integrates the information in order to produce a suitable behavior. The remarkable aspect is that such multisensory integration brings all information into a unified percept. An approach to start investigating this property is to show that perception is better and faster when multimodal stimuli are used as compared to unimodal stimuli. This forms the first part of the present study, conducted in a non-human primate model (n = 2) engaged in a detection sensory-motor task in which visual and auditory stimuli were displayed individually or simultaneously. The measured parameters were the reaction time (RT) between stimulus and onset of arm movement, the percentages of successes and errors, and the evolution of these parameters with training. As expected, RTs were shorter when the subjects were exposed to combined stimuli. The gains for both subjects were around 20 and 40 ms, as compared with the auditory and visual stimulus alone, respectively. Moreover, the number of correct responses increased in response to bimodal stimuli. We interpreted such a multisensory advantage through a redundant signal effect, which decreases perceptual ambiguity, increases the speed of stimulus detection, and improves performance accuracy. The second part of the study presents single-unit recordings derived from the premotor cortex (PM) of the same subjects during the sensory-motor task. Response patterns to sensory/multisensory stimulation are documented and the proportions of specific response types are reported. Characterization of bimodal neurons indicates a mechanism of audio-visual integration, possibly through a decrease of inhibition. Nevertheless, the neural processing leading to faster motor responses from PM, as a polysensory association cortical area, remains unclear.
Affiliation(s)
- Florian Lanz: Domain of Physiology, Department of Medicine, Fribourg Cognition Center, University of Fribourg, Fribourg, Switzerland

45
Mao YT, Pallas SL. Cross-modal plasticity results in increased inhibition in primary auditory cortical areas. Neural Plast 2013; 2013:530651. [PMID: 24288625 PMCID: PMC3833201 DOI: 10.1155/2013/530651] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2013] [Revised: 08/15/2013] [Accepted: 08/17/2013] [Indexed: 11/26/2022] Open
Abstract
Loss of sensory input from peripheral organ damage, sensory deprivation, or brain damage can result in adaptive or maladaptive changes in sensory cortex. In previous research, we found that auditory cortical tuning and tonotopy were impaired by cross-modal invasion of visual inputs. Sensory deprivation is typically associated with a loss of inhibition. To determine whether inhibitory plasticity is responsible for this process, we measured pre- and postsynaptic changes in inhibitory connectivity in ferret auditory cortex (AC) after cross-modal plasticity. We found that blocking GABAA receptors increased responsiveness and broadened sound frequency tuning in the cross-modal group more than in the normal group. Furthermore, expression levels of glutamic acid decarboxylase (GAD) protein were increased in the cross-modal group. We also found that blocking inhibition unmasked visual responses of some auditory neurons in cross-modal AC. Overall, our data suggest a role for increased inhibition in reducing the effectiveness of the abnormal visual inputs and argue that decreased inhibition is not responsible for compromised auditory cortical function after cross-modal invasion. Our findings imply that inhibitory plasticity may play a role in reorganizing sensory cortex after cross-modal invasion, suggesting clinical strategies for recovery after brain injury or sensory deprivation.
Affiliation(s)
- Yu-Ting Mao: Department of Biology, Georgia State University, Atlanta, GA 30303, USA
- Sarah L. Pallas: Department of Biology, Georgia State University, Atlanta, GA 30303, USA; Neuroscience Institute, Georgia State University, P.O. Box 5030, Atlanta, GA 30302-5030, USA

46
Ghose D, Wallace MT. Heterogeneity in the spatial receptive field architecture of multisensory neurons of the superior colliculus and its effects on multisensory integration. Neuroscience 2013; 256:147-62. [PMID: 24183964 DOI: 10.1016/j.neuroscience.2013.10.044] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2013] [Revised: 10/08/2013] [Accepted: 10/22/2013] [Indexed: 11/15/2022]
Abstract
Multisensory integration has been widely studied in neurons of the mammalian superior colliculus (SC). This has led to the description of various determinants of multisensory integration, including those based on stimulus- and neuron-specific factors. The most widely characterized of these illustrate the importance of the spatial and temporal relationships of the paired stimuli as well as their relative effectiveness in eliciting a response in determining the final integrated output. Although these stimulus-specific factors have generally been considered in isolation (i.e., manipulating stimulus location while holding all other factors constant), they have an intrinsic interdependency that has yet to be fully elucidated. For example, changes in stimulus location will likely also impact both the temporal profile of response and the effectiveness of the stimulus. The importance of better describing this interdependency is further reinforced by the fact that SC neurons have large receptive fields, and that responses at different locations within these receptive fields are far from equivalent. To address these issues, the current study was designed to examine the interdependency between the stimulus factors of space and effectiveness in dictating the multisensory responses of SC neurons. The results show that neuronal responsiveness changes dramatically with changes in stimulus location - highlighting a marked heterogeneity in the spatial receptive fields of SC neurons. More importantly, this receptive field heterogeneity played a major role in the integrative product exhibited by stimulus pairings, such that pairings at weakly responsive locations of the receptive fields resulted in the largest multisensory interactions. Together these results provide greater insight into the interrelationship of the factors underlying multisensory integration in SC neurons, and may have important mechanistic implications for multisensory integration and the role it plays in shaping SC-mediated behaviors.
Affiliation(s)
- D Ghose: Department of Psychology, Vanderbilt University, Nashville, TN, United States; Kennedy Center for Research on Human Development, Vanderbilt University, Nashville, TN, United States
- M T Wallace: Department of Psychology, Vanderbilt University, Nashville, TN, United States; Kennedy Center for Research on Human Development, Vanderbilt University, Nashville, TN, United States; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, United States; Department of Psychiatry, Vanderbilt University, Nashville, TN, United States; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, United States

47
Gleiss S, Kayser C. Eccentricity dependent auditory enhancement of visual stimulus detection but not discrimination. Front Integr Neurosci 2013; 7:52. [PMID: 23882195 PMCID: PMC3715717 DOI: 10.3389/fnint.2013.00052] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2013] [Accepted: 07/01/2013] [Indexed: 11/13/2022] Open
Abstract
Sensory perception is enhanced by the complementary information provided by our different sensory modalities and even apparently task irrelevant stimuli in one modality can facilitate performance in another. While perception in general comprises both, the detection of sensory objects as well as their discrimination and recognition, most studies on audio-visual interactions have focused on either of these aspects. However, previous evidence, neuroanatomical projections between early sensory cortices and computational mechanisms suggest that sounds might differentially affect visual detection and discrimination and differentially at central and peripheral retinal locations. We performed an experiment to directly test this by probing the enhancement of visual detection and discrimination by auxiliary sounds at different visual eccentricities and within the same subjects. Specifically, we quantified the enhancement provided by sounds that reduce the overall uncertainty about the visual stimulus beyond basic multisensory co-stimulation. This revealed a general trend for stronger enhancement at peripheral locations in both tasks, but a statistically significant effect only for detection and only at peripheral locations. Overall this suggests that there are topographic differences in the auditory facilitation of basic visual processes and that these may differentially affect basic aspects of visual recognition.
Affiliation(s)
- Stephanie Gleiss: Max Planck Institute for Biological Cybernetics, Tübingen, Germany

48
Van Barneveld DCPBM, Van Wanrooij MM. The influence of static eye and head position on the ventriloquist effect. Eur J Neurosci 2013; 37:1501-10. [PMID: 23463919 DOI: 10.1111/ejn.12176] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2012] [Revised: 12/20/2012] [Accepted: 01/30/2013] [Indexed: 11/28/2022]
Abstract
Orienting responses to audiovisual events have shorter reaction times and better accuracy and precision when images and sounds in the environment are aligned in space and time. How the brain constructs an integrated audiovisual percept is a computational puzzle because the auditory and visual senses are represented in different reference frames: the retina encodes visual locations with respect to the eyes, whereas sound localisation cues are referenced to the head. In the well-known ventriloquist effect, the auditory spatial percept of the ventriloquist's voice is attracted toward the synchronous visual image of the dummy, but does this visual bias on sound localisation operate in a common reference frame by correctly taking into account eye and head position? Here we studied this question by independently varying initial eye and head orientations, and the amount of audiovisual spatial mismatch. Human subjects pointed head and/or gaze to auditory targets in elevation, and were instructed to ignore co-occurring visual distracters. Results demonstrate that different initial head and eye orientations are accurately and appropriately incorporated into an audiovisual response. Effectively, sounds and images are perceptually fused according to their physical locations in space, independently of the observer's point of view. Implications for neurophysiological findings and modelling efforts that aim to reconcile sensory and motor signals for goal-directed behaviour are discussed.
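The reference-frame bookkeeping the abstract alludes to can be made concrete with a minimal sketch, assuming angles in degrees along a single (elevation) axis. The function names and example numbers are hypothetical, and the reliability-weighted fusion rule at the end is a standard cue-combination model rather than necessarily the model fitted by the authors.

```python
def world_to_head_centred(target_in_world, head_in_world):
    """A location in space, expressed relative to the current head orientation."""
    return target_in_world - head_in_world

def head_to_eye_centred(target_re_head, eye_in_head):
    """A sound encoded relative to the head, expressed relative to the eyes."""
    return target_re_head - eye_in_head

# Example: sound at +20 deg elevation in the world, head pitched up 10 deg,
# eyes rotated up a further 5 deg within the head (all hypothetical numbers).
sound_world, head_world, eye_in_head = 20.0, 10.0, 5.0
sound_re_head = world_to_head_centred(sound_world, head_world)   # +10 deg
sound_re_eye  = head_to_eye_centred(sound_re_head, eye_in_head)  # +5 deg

# Standard reliability-weighted fusion of auditory and visual location estimates,
# once both are expressed in the same frame: weights are inverse variances.
def fuse(x_aud, var_aud, x_vis, var_vis):
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_aud)
    return w_vis * x_vis + (1 - w_vis) * x_aud

print(fuse(x_aud=sound_re_eye, var_aud=9.0, x_vis=2.0, var_vis=1.0))  # pulled toward the visual location
```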
Affiliation(s)
- Denise C P B M Van Barneveld
- Department of Biophysics, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, P.O. Box 9010, 6500 GL, Nijmegen, The Netherlands
49
Audiovisual integration in the primary auditory cortex of an awake rodent. Neurosci Lett 2013; 534:24-9. [DOI: 10.1016/j.neulet.2012.10.056] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2012] [Revised: 09/17/2012] [Accepted: 10/24/2012] [Indexed: 11/23/2022]
50
Jäncke L, Rogenmoser L, Meyer M, Elmer S. Pre-attentive modulation of brain responses to tones in coloured-hearing synesthetes. BMC Neurosci 2012; 13:151. [PMID: 23241212 PMCID: PMC3547775 DOI: 10.1186/1471-2202-13-151] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2012] [Accepted: 11/29/2012] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Coloured-hearing (CH) synesthesia is a perceptual phenomenon in which an acoustic stimulus (the inducer) initiates a concurrent colour perception (the concurrent). Individuals with CH synesthesia "see" colours when hearing tones, words, or music, a phenomenon suggesting a close relationship between auditory and visual representations. To date, it is still unknown whether the perception of colours is associated with a modulation of brain function in the inducing brain area, namely the auditory-related cortex and associated brain areas. In addition, there is an ongoing debate as to whether attention to the inducer is required for eliciting a visual concurrent, or whether the latter can emerge in a pre-attentive fashion. RESULTS Using EEG in a pre-attentive mismatch negativity (MMN) paradigm, we show that the binding of tones and colours in CH synesthetes is associated with increased MMN amplitudes in response to deviant tones expected to induce novel concurrent colour perceptions. Most notably, the increased MMN amplitudes in the CH synesthetes were associated with stronger intracerebral current densities originating from the auditory cortex, parietal cortex, and ventral visual areas. CONCLUSIONS The automatic binding of tones and colours in CH synesthetes is accompanied by an early pre-attentive process recruiting the auditory cortex, inferior and superior parietal lobules, and ventral occipital areas.
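By convention, the MMN is obtained as the difference between the average event-related responses to deviant and standard tones, with its amplitude read out in a post-stimulus window. The sketch below only illustrates that convention on stand-in random data with a hypothetical 100-250 ms window; it is not the authors' processing pipeline.

```python
import numpy as np

fs = 500                                  # sampling rate in Hz (hypothetical)
times = np.arange(-0.1, 0.5, 1 / fs)      # epoch from -100 to +500 ms

# Epoched EEG at one electrode: (n_trials, n_samples); random data as a stand-in
rng = np.random.default_rng(0)
standard_epochs = rng.normal(0, 1, (400, times.size))
deviant_epochs  = rng.normal(0, 1, (100, times.size))

# Mismatch negativity: deviant ERP minus standard ERP
mmn_wave = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Amplitude as the mean of the difference wave in a typical MMN window (100-250 ms)
window = (times >= 0.10) & (times <= 0.25)
mmn_amplitude = mmn_wave[window].mean()
print(f"MMN amplitude in 100-250 ms window: {mmn_amplitude:.3f} (arbitrary units)")
```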
Affiliation(s)
- Lutz Jäncke
- Division Neuropsychology, Institute of Psychology, University of Zurich, Binzmühlestrasse 14/25, Zurich CH-8050, Switzerland
- Center for Integrative Human Physiology, Zurich, Switzerland
- International Normal Aging and Plasticity Imaging Center (INAPIC), Zurich, Switzerland
- Research Unit “Plasticity and learning in the aging brain”, University of Zurich, Zurich, Switzerland
- Lars Rogenmoser
- Division Neuropsychology, Institute of Psychology, University of Zurich, Binzmühlestrasse 14/25, Zurich CH-8050, Switzerland
- Martin Meyer
- Center for Integrative Human Physiology, Zurich, Switzerland
- Research Unit “Plasticity and learning in the aging brain”, University of Zurich, Zurich, Switzerland
- Stefan Elmer
- Division Neuropsychology, Institute of Psychology, University of Zurich, Binzmühlestrasse 14/25, Zurich CH-8050, Switzerland