1
Keum D, Medina AE. The effect of developmental alcohol exposure on multisensory integration is larger in deeper cortical layers. Alcohol 2024;121:193-198. PMID: 38417561; PMCID: PMC11345874; DOI: 10.1016/j.alcohol.2024.02.006.
Abstract
Fetal Alcohol Spectrum Disorders (FASD) are among the most common causes of mental disability in the world. Despite efforts to increase public awareness of the risks of drinking during pregnancy, epidemiological studies indicate a prevalence of 1-6% of all births. There is growing evidence that deficits in sensory processing may contribute to the social problems observed in FASD. Multisensory (MS) integration occurs when a combination of inputs from two sensory modalities leads to enhancement or suppression of neuronal firing. MS enhancement is usually linked to processes that facilitate cognition and reaction time, whereas MS suppression has been linked to filtering unwanted sensory information. The rostral portion of the posterior parietal cortex (PPr) of the ferret is an area that shows robust visual-tactile integration and displays both MS enhancement and suppression. Recently, our lab demonstrated that ferrets exposed to alcohol during the "third trimester equivalent" of human gestation show less MS enhancement and more MS suppression in PPr than controls. Here we complement these findings by comparing in vivo electrophysiological recordings from channels located in shallow versus deep cortical layers. We observed that while the effects of alcohol (less MS enhancement and more MS suppression) were found in all layers, their magnitude was more pronounced in putative layers V-VI. These findings extend our knowledge of the sensory deficits of FASD.
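The enhancement/suppression dichotomy described in this abstract is conventionally quantified in the cortical physiology literature as the percent change of the combined-modality response relative to the best unisensory response. A minimal sketch of that convention; the function name and example firing rates are illustrative, not values from the paper:

```python
def ms_interaction_index(combined_rate, best_unisensory_rate):
    """Percent change of the multisensory response relative to the best
    unisensory response: positive = enhancement, negative = suppression."""
    return 100.0 * (combined_rate - best_unisensory_rate) / best_unisensory_rate

# Visual-tactile response exceeding the best single modality -> enhancement
assert ms_interaction_index(30.0, 20.0) == 50.0
# Combined response falling below the best single modality -> suppression
assert ms_interaction_index(15.0, 20.0) == -25.0
```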
Affiliation(s)
- Dongil Keum
- Department of Pediatrics, School of Medicine, University of Maryland, Baltimore, Maryland, United States
- Alexandre E Medina
- Department of Pediatrics, School of Medicine, University of Maryland, Baltimore, Maryland, United States
2
Billock VA, Dougherty K, Kinney MJ, Preston AM, Winterbottom MD. Multisensory-inspired modeling and neural correlates for two key binocular interactions. Sci Rep 2024;14:11269. PMID: 38760410; PMCID: PMC11101479; DOI: 10.1038/s41598-024-60926-6.
Abstract
Most binocular vision models assume that the two eyes sum incompletely. However, some facilitatory cortical neurons fire for only one eye, but amplify their firing rates if both eyes are stimulated. These 'binocular gate' neurons closely resemble subthreshold multisensory neurons. Binocular amplification for binocular gate neurons follows a power law, with a compressive exponent. Unexpectedly, this rule also applies to facilitatory true binocular neurons; although driven by either eye, binocular neurons are well modeled as gated amplifiers of their strongest monocular response, if both eyes are stimulated. Psychophysical data follows the same power law as the neural data, with a similar exponent; binocular contrast sensitivity can be modeled as a gated amplification of the more sensitive eye. These results resemble gated amplification phenomena in multisensory integration, and other non-driving modulatory interactions that affect sensory processing. Models of incomplete summation seem unnecessary for V1 facilitatory neurons or contrast sensitivity. However, binocular combination of clearly visible monocular stimuli follows Schrödinger's nonlinear magnitude-weighted average. We find that putatively suppressive binocular neurons closely follow Schrödinger's equation. Similar suppressive multisensory neurons are well documented but seldom studied. Facilitatory binocular neurons and mildly suppressive binocular neurons are likely neural correlates of binocular sensitivity and binocular appearance respectively.
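Schrödinger's nonlinear magnitude-weighted average named in this abstract is usually written with each eye's signal serving as its own weight. A sketch under that standard formulation (the function name is ours):

```python
def schrodinger_average(b1, b2):
    """Schrodinger's magnitude-weighted average of two monocular
    magnitudes: each input is weighted by its own share of the total,
    giving (b1**2 + b2**2) / (b1 + b2)."""
    return (b1 * b1 + b2 * b2) / (b1 + b2)

# Equal inputs pass through unchanged, and a dark eye barely dilutes
# a bright one, so clearly visible stimuli do not simply average.
assert schrodinger_average(10.0, 10.0) == 10.0
assert schrodinger_average(10.0, 0.0) == 10.0
```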
Grants
- 1R01EY027402-02 U.S. Department of Health & Human Services | NIH | National Eye Institute (NEI)
- T32EY007135 U.S. Department of Health & Human Services | NIH | National Eye Institute (NEI)
- P30EY008126 U.S. Department of Health & Human Services | NIH | National Eye Institute (NEI)
- US Navy Aerospace Medical Research Laboratory, Leidos, Dayton, OH, United States
- Princeton University, Princeton Neuroscience Institute, Princeton, NJ, United States
- Naval Air Warfare Center, Human Systems Engineering Department, Patuxent River, MD, United States
- Naval Aerospace Medical Research Laboratory, NAMRU-D, Vision and Acceleration, Wright-Patterson AFB
- US Air Force Research Laboratory, Wright-Patterson AFB, OH, United States
- Office of the Assistant Secretary of Defense, Dp_67.2_17_J9_1757 work unit H1814.
Affiliation(s)
- Vincent A Billock
- Leidos, Inc. at the Naval Aerospace Medical Research Laboratory, NAMRU-D, Wright-Patterson AFB, OH, USA
- Kacie Dougherty
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Micah J Kinney
- Naval Air Warfare Center, NAWCAD, Patuxent River, MD, USA
- Adam M Preston
- Naval Aerospace Medical Research Laboratory, NAMRU-D, Wright-Patterson AFB, OH, USA
- Marc D Winterbottom
- Air Force Research Laboratory, 711th Human Performance Wing, Wright-Patterson AFB, OH, USA
3
Kral A, Sharma A. Crossmodal plasticity in hearing loss. Trends Neurosci 2023;46:377-393. PMID: 36990952; PMCID: PMC10121905; DOI: 10.1016/j.tins.2023.02.004.
Abstract
Crossmodal plasticity is a textbook example of the ability of the brain to reorganize based on use. We review evidence from the auditory system showing that such reorganization has significant limits, is dependent on pre-existing circuitry and top-down interactions, and that extensive reorganization is often absent. We argue that the evidence does not support the hypothesis that crossmodal reorganization is responsible for closing critical periods in deafness, and crossmodal plasticity instead represents a neuronal process that is dynamically adaptable. We evaluate the evidence for crossmodal changes in both developmental and adult-onset deafness, which start as early as mild-moderate hearing loss and show reversibility when hearing is restored. Finally, crossmodal plasticity does not appear to affect the neuronal preconditions for successful hearing restoration. Given its dynamic and versatile nature, we describe how this plasticity can be exploited for improving clinical outcomes after neurosensory restoration.
Affiliation(s)
- Andrej Kral
- Institute of AudioNeuroTechnology and Department of Experimental Otology, Otolaryngology Clinics, Hannover Medical School, Hannover, Germany; Australian Hearing Hub, School of Medicine and Health Sciences, Macquarie University, Sydney, NSW, Australia
- Anu Sharma
- Department of Speech Language and Hearing Science, Center for Neuroscience, Institute of Cognitive Science, University of Colorado Boulder, Boulder, CO, USA
4
Villwock A, Grin K. Somatosensory processing in deaf and deafblind individuals: How does the brain adapt as a function of sensory and linguistic experience? A critical review. Front Psychol 2022;13:938842. PMID: 36324786; PMCID: PMC9618853; DOI: 10.3389/fpsyg.2022.938842.
Abstract
How do deaf and deafblind individuals process touch? This question offers a unique model to understand the prospects and constraints of neural plasticity. Our brain constantly receives and processes signals from the environment and combines them into the most reliable information content. The nervous system adapts its functional and structural organization according to the input, and perceptual processing develops as a function of individual experience. However, there are still many unresolved questions regarding the deciding factors for these changes in deaf and deafblind individuals, and so far, findings are not consistent. To date, most studies have not taken the sensory and linguistic experiences of the included participants into account. As a result, the impact of sensory deprivation vs. language experience on somatosensory processing remains inconclusive. Even less is known about the impact of deafblindness on brain development. The resulting neural adaptations could be even more substantial, but no clear patterns have yet been identified. How do deafblind individuals process sensory input? Studies on deafblindness have mostly focused on single cases or groups of late-blind individuals. Importantly, the language backgrounds of deafblind communities are highly variable and include the usage of tactile languages. So far, this kind of linguistic experience and its consequences have not been considered in studies on basic perceptual functions. Here, we will provide a critical review of the literature, aiming at identifying determinants for neuroplasticity and gaps in our current knowledge of somatosensory processing in deaf and deafblind individuals.
Affiliation(s)
- Agnes Villwock
- Sign Languages, Department of Rehabilitation Sciences, Humboldt-Universität zu Berlin, Berlin, Germany
5
Merrikhi Y, Kok MA, Carrasco A, Meredith MA, Lomber SG. Multisensory responses in a belt region of the dorsal auditory cortical pathway. Eur J Neurosci 2021;55:589-610. PMID: 34927294; DOI: 10.1111/ejn.15573.
Abstract
A basic function of the cerebral cortex is to receive and integrate information from different sensory modalities into a comprehensive percept of the environment. Neurons that demonstrate multisensory convergence occur across the neocortex, but are especially prevalent in higher-order, association areas. However, a recent study of a cat higher-order auditory area, the dorsal zone (DZ) of auditory cortex, did not observe any multisensory features. Therefore, the goal of the present investigation was to address this conflict using recording and testing methodologies that are established for exposing and studying multisensory neuronal processing. Among the 482 neurons studied, we found that 76.6% were influenced by non-auditory stimuli. Of these neurons, 99% were affected by visual stimulation, but only 11% by somatosensory stimulation. Furthermore, a large proportion of the multisensory neurons showed integrated responses to multisensory stimulation, constituted a majority of both the excitatory and inhibitory neurons encountered (as identified by their waveshape duration), and exhibited a distinct spatial distribution within DZ. These findings demonstrate that the dorsal zone of auditory cortex robustly exhibits multisensory properties, and that the proportions of multisensory neurons encountered are consistent with those identified in other higher-order cortices.
Affiliation(s)
- Yaser Merrikhi
- Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Melanie A Kok
- Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
- Andres Carrasco
- Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
- M Alex Meredith
- Department of Anatomy and Neurobiology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia, USA
- Stephen G Lomber
- Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
6
Opoku-Baah C, Schoenhaut AM, Vassall SG, Tovar DA, Ramachandran R, Wallace MT. Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review. J Assoc Res Otolaryngol 2021;22:365-386. PMID: 34014416; PMCID: PMC8329114; DOI: 10.1007/s10162-021-00789-0.
Abstract
In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although multisensory interactions of this kind shape function across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the current state of our understanding of this topic. Following a general introduction, the review is divided into five sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence on audition, making the distinction between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built based on available psychophysical data and that seek to provide greater mechanistic insights into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches toward understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception: scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
Affiliation(s)
- Collins Opoku-Baah
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Adriana M Schoenhaut
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Sarah G Vassall
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- David A Tovar
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Ramnarayan Ramachandran
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Department of Psychology, Vanderbilt University, Nashville, TN, USA
- Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, TN, USA
- Vanderbilt Vision Research Center, Nashville, TN, USA
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Department of Psychology, Vanderbilt University, Nashville, TN, USA
- Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, TN, USA
- Vanderbilt Vision Research Center, Nashville, TN, USA
- Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- Department of Pharmacology, Vanderbilt University, Nashville, TN, USA
7
Scurry AN, Huber E, Matera C, Jiang F. Increased Right Posterior STS Recruitment Without Enhanced Directional-Tuning During Tactile Motion Processing in Early Deaf Individuals. Front Neurosci 2020;14:864. PMID: 32982667; PMCID: PMC7477335; DOI: 10.3389/fnins.2020.00864.
Abstract
Upon early sensory deprivation, the remaining modalities often exhibit cross-modal reorganization, such as primary auditory cortex (PAC) recruitment for visual motion processing in early deafness (ED). Previous studies of compensatory plasticity in ED individuals have given less attention to tactile motion processing. In the current study, we aimed to examine the effects of early auditory deprivation on tactile motion processing. We simulated four directions of tactile motion on each participant's right index finger and characterized their tactile motion responses and directional-tuning profiles using population receptive field analysis. Similar tactile motion responses were found within primary (SI) and secondary (SII) somatosensory cortices between ED and hearing control groups, whereas ED individuals showed a reduced proportion of voxels with directionally tuned responses in SI contralateral to stimulation. There were also significant but minimal responses to tactile motion within PAC for both groups. While ED individuals showed significantly larger recruitment of the right posterior superior temporal sulcus (pSTS) region upon tactile motion stimulation, there was no evidence of enhanced directional tuning. Greater recruitment of the right pSTS region is consistent with prior studies reporting reorganization of multimodal areas due to sensory deprivation. The absence of increased directional tuning within the right pSTS region may suggest a more distributed population of neurons dedicated to processing tactile spatial information as a consequence of early auditory deprivation.
Affiliation(s)
- Alexandra N Scurry
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Elizabeth Huber
- Department of Speech and Hearing Sciences, Institute for Learning & Brain Sciences, University of Washington, Seattle, WA, United States
- Courtney Matera
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Fang Jiang
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
8
Meredith MA, Keniston LP, Prickett EH, Bajwa M, Cojanu A, Clemo HR, Allman BL. What is a multisensory cortex? A laminar, connectional, and functional study of a ferret temporal cortical multisensory area. J Comp Neurol 2020;528:1864-1882. PMID: 31955427; DOI: 10.1002/cne.24859.
Abstract
Now that examples of multisensory neurons have been observed across the neocortex, some confusion has arisen about the features that actually designate a region as "multisensory." While the documentation of multisensory effects within many different cortical areas is clear, often little information is available about their proportions or net functional effects. To assess the compositional and functional features that contribute to the multisensory nature of a region, the present investigation used multichannel neuronal recording and tract-tracing methods to examine a ferret temporal region: the lateral rostral suprasylvian sulcal area. Here, auditory-tactile multisensory neurons were predominant, constituted the majority of neurons across all cortical layers, and dominated the net spiking activity of the area. These results were then compared with a literature review of cortical multisensory data and were found to closely resemble the multisensory features of other, higher-order sensory areas. Collectively, these observations argue that multisensory processing presents itself in hierarchical and area-specific ways, from regions that exhibit few multisensory features to those whose composition and processes are dominated by multisensory activity. It seems logical that the former merely exhibit some multisensory features (among many others), while the latter are legitimately designated as "multisensory."
Affiliation(s)
- M Alex Meredith
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Leslie P Keniston
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Elizabeth H Prickett
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Moazzum Bajwa
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Alexandru Cojanu
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- H Ruth Clemo
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Brian L Allman
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, Virginia
9
Abstract
Over the past decade, there has been an unprecedented level of interest in, and progress toward, understanding visual processing in the brain of the deaf. Specifically, when the brain is deprived of input from one sensory modality (such as hearing), it often compensates with supranormal performance in one or more of the intact sensory systems (such as vision). Recent psychophysical, functional imaging, and reversible deactivation studies have converged to define the specific visual abilities that are enhanced in the deaf, as well as the cortical loci that undergo crossmodal plasticity in the deaf and are responsible for mediating these superior visual functions. Examination of these investigations reveals that central visual functions, such as object and facial discrimination, and peripheral visual functions, such as motion detection, visual localization, visuomotor synchronization, and Vernier acuity (measured in the periphery), are specifically enhanced in the deaf, compared with hearing participants. Furthermore, the cortical loci identified to mediate these functions reside in deaf auditory cortex: BA 41, BA 42, and BA 22, in addition to the rostral area, planum temporale, Te3, and temporal voice area in humans; primary auditory cortex, anterior auditory field, dorsal zone of auditory cortex, auditory field of the anterior ectosylvian sulcus, and posterior auditory field in cats; and primary auditory cortex and anterior auditory field in both ferrets and mice. Overall, the findings from these studies show that crossmodal reorganization in auditory cortex of the deaf is responsible for the superior visual abilities of the deaf.
10
Schormans AL, Typlt M, Allman BL. Adult-Onset Hearing Impairment Induces Layer-Specific Cortical Reorganization: Evidence of Crossmodal Plasticity and Central Gain Enhancement. Cereb Cortex 2019;29:1875-1888. PMID: 29668848; PMCID: PMC6458918; DOI: 10.1093/cercor/bhy067.
Abstract
Adult-onset hearing impairment can lead to hyperactivity in the auditory pathway (i.e., central gain enhancement) as well as increased cortical responsiveness to nonauditory stimuli (i.e., crossmodal plasticity). However, it remained unclear to what extent hearing loss-induced hyperactivity is relayed beyond the auditory cortex, and thus, whether central gain enhancement competes or coexists with crossmodal plasticity throughout the distinct layers of the audiovisual cortex. To that end, we investigated the effects of partial hearing loss on laminar processing in the auditory, visual and audiovisual cortices of adult rats using extracellular electrophysiological recordings performed 2 weeks after loud noise exposure. Current-source density analyses revealed that central gain enhancement was not relayed to the audiovisual cortex (V2L), and was instead restricted to the granular layer of the higher order auditory area, AuD. In contrast, crossmodal plasticity was evident across multiple cortical layers within V2L, and also manifested in AuD. Surprisingly, despite this coexistence of central gain enhancement and crossmodal plasticity, noise exposure did not disrupt the responsiveness of these neighboring cortical regions to combined audiovisual stimuli. Overall, we have shown for the first time that adult-onset hearing impairment causes a complex assortment of intramodal and crossmodal changes across the layers of higher order sensory cortices.
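The current-source density analyses mentioned above typically estimate CSD as the sign-inverted second spatial derivative of the local field potential across equally spaced laminar contacts. A bare-bones sketch of that standard estimator; the contact spacing and conductivity values are illustrative, not the recording parameters of this study:

```python
import numpy as np

def current_source_density(lfp, spacing_m=1e-4, conductivity=0.3):
    """Second-spatial-difference CSD estimate: csd = -sigma * d2(phi)/dz2.
    lfp has shape (channels, samples) with channels ordered by depth;
    the two edge channels are lost to the finite difference."""
    d2 = lfp[:-2] - 2.0 * lfp[1:-1] + lfp[2:]  # second difference over depth
    return -conductivity * d2 / spacing_m ** 2

# A potential peak confined to one channel yields a CSD extremum at that depth.
lfp = np.zeros((8, 4))
lfp[4, :] = 1.0
csd = current_source_density(lfp)
assert csd.shape == (6, 4)  # two edge channels dropped
```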
Affiliation(s)
- Ashley L Schormans
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada
- Marei Typlt
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada
- Brian L Allman
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada
11
Billock VA, Havig PR. A Simple Power Law Governs Many Sensory Amplifications and Multisensory Enhancements. Sci Rep 2018;8:7645. PMID: 29769622; PMCID: PMC5955996; DOI: 10.1038/s41598-018-25973-w.
Abstract
When one sensory response occurs in the presence of a different sensory stimulation, the sensory response is often amplified. The variety of sensory enhancement data tends to obscure the underlying rules, but it has long been clear that weak signals are usually amplified more than strong ones (the Principle of Inverse Effectiveness). Here we show that for many kinds of sensory amplification, the underlying law is simple and elegant: the amplified response is a power law of the unamplified response, with a compressive exponent that amplifies weak signals more than strong. For both psychophysics and cortical electrophysiology, for both humans and animals, and for both sensory integration and enhancement within a sense, gated power law amplification (amplification of one sense triggered by the presence of a different sensory signal) is often sufficient to explain sensory enhancement.
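The gated power law described above is easy to state in code; the gain and exponent below are illustrative placeholders rather than fitted values from the paper:

```python
def gated_amplification(response, cofactor_present, gain=2.0, exponent=0.85):
    """Amplified response as a compressive power law of the unamplified
    response, triggered ('gated') only when a second sensory signal is
    present. An exponent below 1 reproduces inverse effectiveness:
    weak responses gain proportionally more than strong ones."""
    if not cofactor_present:
        return response
    return gain * response ** exponent

# Principle of Inverse Effectiveness: relative amplification shrinks
# as the unamplified response grows.
weak_boost = gated_amplification(2.0, True) / 2.0
strong_boost = gated_amplification(50.0, True) / 50.0
assert weak_boost > strong_boost
```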
Affiliation(s)
- Vincent A Billock
- College of Optometry, Ohio State University, Columbus, OH 43210, USA
- Paul R Havig
- U.S. Air Force Research Laboratory, Wright-Patterson Air Force Base, OH 45433, USA
12
Meredith MA, Wallace MT, Clemo HR. Do the Different Sensory Areas Within the Cat Anterior Ectosylvian Sulcal Cortex Collectively Represent a Network Multisensory Hub? Multisens Res 2018;31:793-823. PMID: 31157160; PMCID: PMC6542292; DOI: 10.1163/22134808-20181316.
Abstract
Current theory holds that the numerous functional areas of the cerebral cortex are organized and function as a network. Using connectional databases and computational approaches, the cerebral network has been demonstrated to exhibit a hierarchical structure composed of areas, clusters and, ultimately, hubs. Hubs are highly connected, higher-order regions that also facilitate communication between different sensory modalities. One computationally identified network hub is the visual area of the Anterior Ectosylvian Sulcal cortex (AESc) of the cat. The Anterior Ectosylvian Visual area (AEV) is but one component of the AESc, which also includes auditory (Field of the Anterior Ectosylvian Sulcus - FAES) and somatosensory (Fourth somatosensory representation - SIV) areas. To better understand the nature of cortical network hubs, the present report reviews the biological features of the AESc. Within the AESc, each area has extensive external cortical connections as well as connections among one another. Each of these core representations is separated by a transition zone characterized by bimodal neurons that share the sensory properties of both adjoining core areas. Finally, the core and transition zones are underlain by a continuous sheet of layer 5 neurons that project to common output structures. Altogether, these shared properties suggest that the collective AESc region represents a multiple sensory/multisensory cortical network hub. Ultimately, such an interconnected, composite structure adds complexity and biological detail to the understanding of cortical network hubs and their function in cortical processing.
Affiliation(s)
- M. Alex Meredith
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, VA 23298, USA
- Mark T. Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37240, USA
- H. Ruth Clemo
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, VA 23298, USA
13
Cuppini C, Ursino M, Magosso E, Ross LA, Foxe JJ, Molholm S. A Computational Analysis of Neural Mechanisms Underlying the Maturation of Multisensory Speech Integration in Neurotypical Children and Those on the Autism Spectrum. Front Hum Neurosci 2017;11:518. PMID: 29163099; PMCID: PMC5670153; DOI: 10.3389/fnhum.2017.00518.
Abstract
Failure to appropriately develop multisensory integration (MSI) of audiovisual speech may affect a child's ability to attain optimal communication. Studies have shown protracted development of MSI into late-childhood and identified deficits in MSI in children with an autism spectrum disorder (ASD). Currently, the neural basis of acquisition of this ability is not well understood. Here, we developed a computational model informed by neurophysiology to analyze possible mechanisms underlying MSI maturation, and its delayed development in ASD. The model posits that strengthening of feedforward and cross-sensory connections, responsible for the alignment of auditory and visual speech sound representations in posterior superior temporal gyrus/sulcus, can explain behavioral data on the acquisition of MSI. This was simulated by a training phase during which the network was exposed to unisensory and multisensory stimuli, and projections were crafted by Hebbian rules of potentiation and depression. In its mature architecture, the network also reproduced the well-known multisensory McGurk speech effect. Deficits in audiovisual speech perception in ASD were well accounted for by fewer multisensory exposures, compatible with a lack of attention, but not by reduced synaptic connectivity or synaptic plasticity.
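The training phase described in this abstract (cross-sensory projections shaped by Hebbian potentiation and depression during repeated multisensory exposures) can be sketched in toy form. This is not the authors' published network; the layer size, learning rate, and decay rate below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 10           # units per unisensory layer (assumed size)
lr, decay = 0.1, 0.02  # potentiation and depression rates (assumed)

# Cross-sensory weights (visual -> auditory), initially absent.
W = np.zeros((n_units, n_units))

for _ in range(200):  # training phase: repeated multisensory exposures
    k = rng.integers(n_units)          # a random audiovisual token
    v = np.zeros(n_units); v[k] = 1.0  # visual representation
    a = np.zeros(n_units); a[k] = 1.0  # matching auditory representation
    # Hebbian potentiation for co-active pairs, uniform depression elsewhere
    W += lr * np.outer(a, v) - decay * W
    W = np.clip(W, 0.0, 1.0)

# Matched (diagonal) cross-sensory connections strengthen; unmatched stay weak,
# aligning the auditory and visual representations.
diag = np.diag(W).mean()
off = W[~np.eye(n_units, dtype=bool)].mean()
print(diag > off)  # True
```

With fewer training exposures (a smaller loop count), the diagonal weights remain weaker, which is the model's account of delayed multisensory maturation in ASD.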
Affiliation(s)
- Cristiano Cuppini, Department of Electric, Electronic and Information Engineering, University of Bologna, Bologna, Italy
- Mauro Ursino, Department of Electric, Electronic and Information Engineering, University of Bologna, Bologna, Italy
- Elisa Magosso, Department of Electric, Electronic and Information Engineering, University of Bologna, Bologna, Italy
- Lars A. Ross, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
- John J. Foxe, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States; Department of Neuroscience and The Del Monte Institute for Neuroscience, University of Rochester School of Medicine, Rochester, NY, United States
- Sophie Molholm, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
14
Shrem T, Murray MM, Deouell LY. Auditory-visual integration modulates location-specific repetition suppression of auditory responses. Psychophysiology 2017; 54:1663-1675. [PMID: 28752567 DOI: 10.1111/psyp.12955] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2016] [Revised: 05/10/2017] [Accepted: 06/03/2017] [Indexed: 11/28/2022]
Abstract
Space is a dimension shared by different modalities, but at what stage spatial encoding is affected by multisensory processes is unclear. Early studies observed attenuation of N1/P2 auditory evoked responses following repetition of sounds from the same location. Here, we asked whether this effect is modulated by audiovisual interactions. In two experiments, using a repetition-suppression paradigm, we presented pairs of tones in free field, where the test stimulus was a tone presented at a fixed lateral location. Experiment 1 established a neural index of auditory spatial sensitivity, by comparing the degree of attenuation of the response to test stimuli when they were preceded by an adapter sound at the same location versus 30° or 60° away. We found that the degree of attenuation at the P2 latency was inversely related to the spatial distance between the test stimulus and the adapter stimulus. In Experiment 2, the adapter stimulus was a tone presented from the same location or a more medial location than the test stimulus. The adapter stimulus was accompanied by a simultaneous flash displayed orthogonally from one of the two locations. Sound-flash incongruence reduced accuracy in a same-different location discrimination task (i.e., the ventriloquism effect) and reduced the location-specific repetition-suppression at the P2 latency. Importantly, this multisensory effect included topographic modulations, indicative of changes in the relative contribution of underlying sources across conditions. Our findings suggest that the auditory response at the P2 latency is affected by spatially selective brain activity, which is affected crossmodally by visual information.
Affiliation(s)
- Talia Shrem, Human Cognitive Neuroscience Lab, Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
- Micah M Murray, Laboratory for Investigative Neurophysiology (The LINE), Department of Radiology, and Neuropsychology and Neurorehabilitation Service, University Hospital Center and University of Lausanne, Lausanne, Switzerland; EEG Brain Mapping Core, Center for Biomedical Imaging (CIBM), Lausanne, Switzerland; Department of Ophthalmology, University of Lausanne, Jules Gonin Eye Hospital, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, Tennessee, USA
- Leon Y Deouell, Human Cognitive Neuroscience Lab, Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel; The Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel
15
Meredith MA, Clemo HR, Lomber SG. Is territorial expansion a mechanism for crossmodal plasticity? Eur J Neurosci 2017; 45:1165-1176. [PMID: 28370755 DOI: 10.1111/ejn.13564] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2016] [Revised: 02/07/2017] [Accepted: 03/13/2017] [Indexed: 01/08/2023]
Abstract
Crossmodal plasticity is the phenomenon whereby, following sensory damage or deprivation, the lost sensory function of a brain region is replaced by one of the remaining senses. One of several proposed mechanisms for this phenomenon involves the expansion of a more active brain region at the expense of another whose sensory inputs have been damaged or lost. This territorial expansion hypothesis was examined in the present study. The cat ectosylvian visual area (AEV) borders the auditory field of the anterior ectosylvian sulcus (FAES), which becomes visually reorganized in the early deaf. If this crossmodal effect in the FAES is due to the expansion of the adjoining AEV into the territory of the FAES after hearing loss, then the reorganized FAES should exhibit connectional features characteristic of the AEV. However, tracer injections revealed significantly different patterns of cortical connectivity between the AEV and the early deaf FAES, and substantial cytoarchitectonic and behavioral distinctions occur as well. Therefore, the crossmodal reorganization of the FAES cannot be mechanistically attributed to the expansion of the adjoining cortical territory of the AEV and an overwhelming number of recent studies now support unmasking of existing connections as the operative mechanism underlying crossmodal plasticity.
Affiliation(s)
- M A Meredith, Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, 1101 E. Marshall St., Sanger Hall Rm. 12-067, Richmond, VA 23298, USA
- H R Clemo, Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, 1101 E. Marshall St., Sanger Hall Rm. 12-067, Richmond, VA 23298, USA
- S G Lomber, Departments of Physiology and Pharmacology, and Psychology, University of Western Ontario, London, ON, Canada
16
Corina DP, Blau S, LaMarr T, Lawyer LA, Coffey-Corina S. Auditory and Visual Electrophysiology of Deaf Children with Cochlear Implants: Implications for Cross-modal Plasticity. Front Psychol 2017; 8:59. [PMID: 28203210 PMCID: PMC5285328 DOI: 10.3389/fpsyg.2017.00059] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2016] [Accepted: 01/10/2017] [Indexed: 11/13/2022] Open
Abstract
Deaf children who receive a cochlear implant early in life and engage in intensive oral/aural therapy often make great strides in spoken language acquisition. However, despite clinicians' best efforts, there is a great deal of variability in language outcomes. One concern is that cortical regions which normally support auditory processing may become reorganized for visual function, leaving fewer available resources for auditory language acquisition. The conditions under which these changes occur are not well understood, but we may begin investigating this phenomenon by looking for interactions between auditory and visual evoked cortical potentials in deaf children. If children with abnormal auditory responses show increased sensitivity to visual stimuli, this may indicate the presence of maladaptive cortical plasticity. We recorded evoked potentials, using both auditory and visual paradigms, from 25 typical hearing children and 26 deaf children (ages 2-8 years) with cochlear implants. An auditory oddball paradigm was used (85% /ba/ syllables vs. 15% frequency modulated tone sweeps) to elicit an auditory P1 component. Visual evoked potentials (VEPs) were recorded during presentation of an intermittent peripheral radial checkerboard while children watched a silent cartoon, eliciting a P1-N1 response. We observed reduced auditory P1 amplitudes and a lack of latency shift associated with normative aging in our deaf sample. We also observed shorter latencies in N1 VEPs to visual stimulus offset in deaf participants. While these data demonstrate cortical changes associated with auditory deprivation, we did not find evidence for a relationship between cortical auditory evoked potentials and the VEPs. This is consistent with descriptions of intra-modal plasticity within the visual systems of deaf children, but does not provide evidence for cross-modal plasticity. In addition, we note that sign language experience had no effect on deaf children's early auditory and visual ERP responses.
Affiliation(s)
- David P Corina, Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, University of California at Davis, Davis, CA, USA; Department of Linguistics, University of California at Davis, Davis, CA, USA
- Shane Blau, Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, University of California at Davis, Davis, CA, USA
- Todd LaMarr, Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, University of California at Davis, Davis, CA, USA
- Laurel A Lawyer, Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, University of California at Davis, Davis, CA, USA
- Sharon Coffey-Corina, Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, University of California at Davis, Davis, CA, USA
17
Schormans AL, Scott KE, Vo AMQ, Tyker A, Typlt M, Stolzberg D, Allman BL. Audiovisual Temporal Processing and Synchrony Perception in the Rat. Front Behav Neurosci 2017; 10:246. [PMID: 28119580 PMCID: PMC5222817 DOI: 10.3389/fnbeh.2016.00246] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2016] [Accepted: 12/16/2016] [Indexed: 11/13/2022] Open
Abstract
Extensive research on humans has improved our understanding of how the brain integrates information from our different senses, and has begun to uncover the brain regions and large-scale neural activity that contributes to an observer’s ability to perceive the relative timing of auditory and visual stimuli. In the present study, we developed the first behavioral tasks to assess the perception of audiovisual temporal synchrony in rats. Modeled after the parameters used in human studies, separate groups of rats were trained to perform: (1) a simultaneity judgment task in which they reported whether audiovisual stimuli at various stimulus onset asynchronies (SOAs) were presented simultaneously or not; and (2) a temporal order judgment task in which they reported whether they perceived the auditory or visual stimulus to have been presented first. Furthermore, using in vivo electrophysiological recordings in the lateral extrastriate visual (V2L) cortex of anesthetized rats, we performed the first investigation of how neurons in the rat multisensory cortex integrate audiovisual stimuli presented at different SOAs. As predicted, rats (n = 7) trained to perform the simultaneity judgment task could accurately (~80%) identify synchronous vs. asynchronous (200 ms SOA) trials. Moreover, the rats judged trials at 10 ms SOA to be synchronous, whereas the majority (~70%) of trials at 100 ms SOA were perceived to be asynchronous. During the temporal order judgment task, rats (n = 7) perceived the synchronous audiovisual stimuli to be “visual first” for ~52% of the trials, and calculation of the smallest timing interval between the auditory and visual stimuli that could be detected in each rat (i.e., the just noticeable difference (JND)) ranged from 77 ms to 122 ms. Neurons in the rat V2L cortex were sensitive to the timing of audiovisual stimuli, such that spiking activity was greatest during trials when the visual stimulus preceded the auditory by 20–40 ms. Ultimately, given that our behavioral and electrophysiological results were consistent with studies conducted on human participants and previous recordings made in multisensory brain regions of different species, we suggest that the rat represents an effective model for studying audiovisual temporal synchrony at both the neuronal and perceptual level.
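The JND figures quoted above are typically read off the psychometric function fitted to temporal order judgments. A minimal sketch of one common estimate, half the distance between the 25% and 75% points of the "visual first" curve, using made-up response proportions (not the study's data) and simple linear interpolation:

```python
import numpy as np

# Illustrative data (assumed, not the study's numbers): proportion of
# "visual first" reports at each SOA. Negative SOA = auditory led;
# positive SOA = visual led.
soa_ms = np.array([-200, -100, -50, 0, 50, 100, 200], dtype=float)
p_visual_first = np.array([0.05, 0.20, 0.35, 0.52, 0.70, 0.85, 0.97])

# SOAs at which the psychometric function crosses 25% and 75%, found by
# linear interpolation (np.interp requires the x-values, here the
# proportions, to be increasing).
soa_25 = np.interp(0.25, p_visual_first, soa_ms)
soa_75 = np.interp(0.75, p_visual_first, soa_ms)

jnd = (soa_75 - soa_25) / 2.0  # half the 25%-75% interval
print(round(jnd, 1))           # 75.0 ms with these assumed data
```

A full analysis would fit a cumulative Gaussian or logistic rather than interpolate, but the threshold definition is the same.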
Affiliation(s)
- Ashley L Schormans, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Kaela E Scott, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Albert M Q Vo, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Anna Tyker, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Marei Typlt, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Daniel Stolzberg, Department of Physiology and Pharmacology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Brian L Allman, Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
18
Murray MM, Lewkowicz DJ, Amedi A, Wallace MT. Multisensory Processes: A Balancing Act across the Lifespan. Trends Neurosci 2016; 39:567-579. [PMID: 27282408 PMCID: PMC4967384 DOI: 10.1016/j.tins.2016.05.003] [Citation(s) in RCA: 137] [Impact Index Per Article: 15.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2016] [Revised: 04/13/2016] [Accepted: 05/12/2016] [Indexed: 11/20/2022]
Abstract
Multisensory processes are fundamental in scaffolding perception, cognition, learning, and behavior. How and when stimuli from different sensory modalities are integrated rather than treated as separate entities is poorly understood. We review how the relative reliance on stimulus characteristics versus learned associations dynamically shapes multisensory processes. We illustrate the dynamism in multisensory function across two timescales: one long term that operates across the lifespan and one short term that operates during the learning of new multisensory relations. In addition, we highlight the importance of task contingencies. We conclude that these highly dynamic multisensory processes, based on the relative weighting of stimulus characteristics and learned associations, provide both stability and flexibility to brain functions over a wide range of temporal scales.
Affiliation(s)
- Micah M Murray, The Laboratory for Investigative Neurophysiology (The LINE), Department of Clinical Neurosciences and Department of Radiology, University Hospital Centre and University of Lausanne, Lausanne, Switzerland; Electroencephalography Brain Mapping Core, Centre for Biomedical Imaging (CIBM), Lausanne, Switzerland; Department of Ophthalmology, University of Lausanne, Jules Gonin Eye Hospital, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- David J Lewkowicz, Department of Communication Sciences and Disorders, Northeastern University, Boston, MA, USA
- Amir Amedi, Department of Medical Neurobiology, Institute for Medical Research Israel-Canada (IMRIC), Hadassah Medical School, Hebrew University of Jerusalem, Jerusalem, Israel; Interdisciplinary and Cognitive Science Program, The Edmond & Lily Safra Center for Brain Sciences (ELSC), Hebrew University of Jerusalem, Jerusalem, Israel
- Mark T Wallace, Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Kennedy Center for Research on Human Development, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
19
Butler BE, Chabot N, Lomber SG. A quantitative comparison of the hemispheric, areal, and laminar origins of sensory and motor cortical projections to the superior colliculus of the cat. J Comp Neurol 2016; 524:2623-42. [PMID: 26850989 DOI: 10.1002/cne.23980] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2015] [Revised: 02/03/2016] [Accepted: 02/03/2016] [Indexed: 11/11/2022]
Abstract
The superior colliculus (SC) is a midbrain structure central to orienting behaviors. The organization of descending projections from sensory cortices to the SC has garnered much attention; however, rarely have projections from multiple modalities been quantified and contrasted, allowing for meaningful conclusions within a single species. Here, we examine corticotectal projections from visual, auditory, somatosensory, motor, and limbic cortices via retrograde pathway tracers injected throughout the superficial and deep layers of the cat SC. As anticipated, the majority of cortical inputs to the SC originate in the visual cortex. In fact, each field implicated in visual orienting behavior makes a substantial projection. Conversely, only one area of the auditory orienting system, the auditory field of the anterior ectosylvian sulcus (fAES), and no area involved in somatosensory orienting, shows significant corticotectal inputs. Although small relative to visual inputs, the projection from the fAES is of particular interest, as it represents the only bilateral cortical input to the SC. This detailed, quantitative study allows for comparison across modalities in an animal that serves as a useful model for both auditory and visual perception. Moreover, the differences in patterns of corticotectal projections between modalities inform the ways in which orienting systems are modulated by cortical feedback. J. Comp. Neurol. 524:2623-2642, 2016. © 2016 Wiley Periodicals, Inc.
Affiliation(s)
- Blake E Butler, Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7
- Nicole Chabot, Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7
- Stephen G Lomber, Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada N6A 5C2; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario, Canada N6A 5C1; Department of Psychology, University of Western Ontario, London, Ontario, Canada N6A 5C2; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada N6A 5B7; National Centre for Audiology, University of Western Ontario, London, Ontario, Canada N6G 1H1
20
Meredith MA, Clemo HR, Corley SB, Chabot N, Lomber SG. Cortical and thalamic connectivity of the auditory anterior ectosylvian cortex of early-deaf cats: Implications for neural mechanisms of crossmodal plasticity. Hear Res 2015; 333:25-36. [PMID: 26724756 DOI: 10.1016/j.heares.2015.12.007] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/07/2015] [Revised: 11/23/2015] [Accepted: 12/01/2015] [Indexed: 01/31/2023]
Abstract
Early hearing loss leads to crossmodal plasticity in regions of the cerebrum that are dominated by acoustical processing in hearing subjects. Until recently, little has been known of the connectional basis of this phenomenon. One region whose crossmodal properties are well-established is the auditory field of the anterior ectosylvian sulcus (FAES) in the cat, where neurons are normally responsive to acoustic stimulation and its deactivation leads to the behavioral loss of accurate orienting toward auditory stimuli. However, in early-deaf cats, visual responsiveness predominates in the FAES and its deactivation blocks accurate orienting behavior toward visual stimuli. For such crossmodal reorganization to occur, it has been presumed that novel inputs or increased projections from non-auditory cortical areas must be generated, or that existing non-auditory connections were 'unmasked.' These possibilities were tested using tracer injections into the FAES of adult cats deafened early in life (and hearing controls), followed by light microscopy to localize retrogradely labeled neurons. Surprisingly, the distribution of cortical and thalamic afferents to the FAES was very similar among early-deaf and hearing animals. No new visual projection sources were identified and visual cortical connections to the FAES were comparable in projection proportions. These results support an alternate theory for the connectional basis for cross-modal plasticity that involves enhanced local branching of existing projection terminals that originate in non-auditory as well as auditory cortices.
Affiliation(s)
- M Alex Meredith, Virginia Commonwealth University School of Medicine, Department of Anatomy and Neurobiology, Richmond, VA 23298, USA
- H Ruth Clemo, Virginia Commonwealth University School of Medicine, Department of Anatomy and Neurobiology, Richmond, VA 23298, USA
- Sarah B Corley, Virginia Commonwealth University School of Medicine, Department of Anatomy and Neurobiology, Richmond, VA 23298, USA; University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Nicole Chabot, Cerebral Systems Laboratory, The Brain and Mind Institute, Natural Sciences Centre, University of Western Ontario, London, Ontario N6A 5B7, Canada
- Stephen G Lomber, Cerebral Systems Laboratory, The Brain and Mind Institute, Natural Sciences Centre, University of Western Ontario, London, Ontario N6A 5B7, Canada
21
Wu C, Stefanescu RA, Martel DT, Shore SE. Listening to another sense: somatosensory integration in the auditory system. Cell Tissue Res 2015; 361:233-50. [PMID: 25526698 PMCID: PMC4475675 DOI: 10.1007/s00441-014-2074-7] [Citation(s) in RCA: 50] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2014] [Accepted: 11/18/2014] [Indexed: 12/19/2022]
Abstract
Conventionally, sensory systems are viewed as separate entities, each with its own physiological process serving a different purpose. However, many functions require integrative inputs from multiple sensory systems and sensory intersection and convergence occur throughout the central nervous system. The neural processes for hearing perception undergo significant modulation by the two other major sensory systems, vision and somatosensation. This synthesis occurs at every level of the ascending auditory pathway: the cochlear nucleus, inferior colliculus, medial geniculate body and the auditory cortex. In this review, we explore the process of multisensory integration from (1) anatomical (inputs and connections), (2) physiological (cellular responses), (3) functional and (4) pathological aspects. We focus on the convergence between auditory and somatosensory inputs in each ascending auditory station. This review highlights the intricacy of sensory processing and offers a multisensory perspective regarding the understanding of sensory disorders.
Affiliation(s)
- Calvin Wu, Department of Otolaryngology, Kresge Hearing Research Institute, University of Michigan, Ann Arbor, MI 48109, USA
22
Abstract
Spatial ventriloquism refers to the phenomenon that a visual stimulus such as a flash can attract the perceived location of a spatially discordant but temporally synchronous sound. An analogous example of mutual attraction between audition and vision has been found in the temporal domain, where temporal aspects of a visual event, such as its onset, frequency, or duration, can be biased by a slightly asynchronous sound. In this review, we examine various manifestations of spatial and temporal attraction between the senses (both direct effects and aftereffects), and we discuss important constraints on the occurrence of these effects. Factors that potentially modulate ventriloquism-such as attention, synesthetic correspondence, and other cognitive factors-are described. We trace theories and models of spatial and temporal ventriloquism, from the traditional unity assumption and modality appropriateness hypothesis to more recent Bayesian and neural network approaches. Finally, we summarize recent evidence probing the underlying neural mechanisms of spatial and temporal ventriloquism.
23
Billock VA, Tsou BH. Bridging the divide between sensory integration and binding theory: Using a binding-like neural synchronization mechanism to model sensory enhancements during multisensory interactions. J Cogn Neurosci 2014; 26:1587-99. [PMID: 24456391 DOI: 10.1162/jocn_a_00574] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Neural information combination problems are ubiquitous in cognitive neuroscience. Two important disciplines, although conceptually similar, take radically different approaches to these problems. Sensory binding theory is largely grounded in synchronization of neurons responding to different aspects of a stimulus, resulting in a coherent percept. Sensory integration focuses more on the influences of the senses on each other and is largely grounded in the study of neurons that respond to more than one sense. It would be desirable to bridge these disciplines, so that insights gleaned from either could be harnessed by the other. To link these two fields, we used a binding-like oscillatory synchronization mechanism to simulate neurons in rattlesnake that are driven by one sense but modulated by another. Mutual excitatory coupling produces synchronized trains of action potentials with enhanced firing rates. The same neural synchronization mechanism models the behavior of a population of cells in cat visual cortex that are modulated by auditory activation. The coupling strength of the synchronizing neurons is crucial to the outcome; a criterion of strong coupling (kept weak enough to avoid seriously distorting action potential amplitude) results in intensity-dependent sensory enhancement-the principle of inverse effectiveness-a key property of sensory integration.
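The core mechanism described above, mutual excitatory coupling that synchronizes two units and raises their firing rates, can be illustrated with a toy pair of leaky integrate-and-fire neurons. This is a sketch of the principle only, not the authors' model, and every parameter value is an illustrative assumption.

```python
import numpy as np

def lif_pair(coupling, t_ms=1000.0, dt=0.1, tau=10.0, v_th=1.0, drive=0.12):
    """Two leaky integrate-and-fire units with mutual excitatory coupling:
    each spike of one unit injects `coupling` into the other. Returns the
    firing rate (spikes/s) of the first unit."""
    v = np.zeros(2)
    spikes = np.zeros(2)
    for _ in range(int(t_ms / dt)):
        fired = v >= v_th
        spikes += fired
        v[fired] = 0.0                           # reset after a spike
        cross = coupling * fired[::-1]           # excitation from the partner
        v = v + dt * (-v / tau + drive) + cross  # leaky integration (Euler)
    return float(spikes[0]) / (t_ms / 1000.0)

alone = lif_pair(coupling=0.0)     # uncoupled baseline rate
together = lif_pair(coupling=0.1)  # mutually coupled, synchronized pair
print(together > alone)  # True: coupling enhances the firing rate
```

Because both units here are identically driven, they fire in lockstep and each partner spike shortens the other's interspike interval, which is the firing-rate enhancement the abstract attributes to synchronization.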
24
Sarko DK, Ghose D, Wallace MT. Convergent approaches toward the study of multisensory perception. Front Syst Neurosci 2013; 7:81. [PMID: 24265607 PMCID: PMC3820972 DOI: 10.3389/fnsys.2013.00081] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2013] [Accepted: 10/20/2013] [Indexed: 11/13/2022] Open
Abstract
Classical analytical approaches for examining multisensory processing in individual neurons have relied heavily on changes in mean firing rate to assess the presence and magnitude of multisensory interaction. However, neurophysiological studies within individual sensory systems have illustrated that important sensory and perceptual information is encoded in forms that go beyond these traditional spike-based measures. Here we review analytical tools as they are used within individual sensory systems (auditory, somatosensory, and visual) to advance our understanding of how sensory cues are effectively integrated across modalities (e.g., audiovisual cues facilitating speech processing). Specifically, we discuss how methods used to assess response variability (Fano factor, or FF), local field potentials (LFPs), current source density (CSD), oscillatory coherence, spike synchrony, and receiver operating characteristics (ROC) represent particularly promising tools for understanding the neural encoding of multisensory stimulus features. The utility of each approach and how it might optimally be applied toward understanding multisensory processing is placed within the context of exciting new data that is just beginning to be generated. Finally, we address how underlying encoding mechanisms might shape, and be tested alongside, the known behavioral and perceptual benefits that accompany multisensory processing.
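Of the measures listed, the Fano factor is the simplest to state: the variance of trial-by-trial spike counts divided by their mean, with a value of 1 expected for a Poisson process. A minimal sketch on simulated (not recorded) spike counts:

```python
import numpy as np

def fano_factor(spike_counts):
    """Fano factor: variance of spike counts across trials divided by
    their mean. FF == 1 for a Poisson process; FF < 1 indicates more
    regular firing, FF > 1 indicates overdispersed firing."""
    counts = np.asarray(spike_counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(1)
poisson_counts = rng.poisson(lam=8.0, size=5000)  # Poisson-like trials
regular_counts = np.full(5000, 8)                 # perfectly regular trials

print(round(fano_factor(poisson_counts), 2))  # near 1.0 for Poisson data
print(fano_factor(regular_counts))            # 0.0: no trial-to-trial variability
```

In a multisensory setting, the same statistic would be computed separately for unisensory and combined-stimulus trials to ask whether multisensory stimulation changes response reliability, not just mean rate.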
Affiliation(s)
- Diana K. Sarko, Department of Anatomy, Cell Biology and Physiology, Edward Via College of Osteopathic Medicine, Spartanburg, SC, USA
- Dipanwita Ghose, Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
- Mark T. Wallace, Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA
25
Leone LM, McCourt ME. The roles of physical and physiological simultaneity in audiovisual multisensory facilitation. Iperception 2013; 4:213-28. [PMID: 24349682 PMCID: PMC3859565 DOI: 10.1068/i0532] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2013] [Revised: 12/12/2012] [Indexed: 10/27/2022] Open
Abstract
A series of experiments measured the audiovisual stimulus onset asynchrony (SOAAV) yielding facilitative multisensory integration. We evaluated (1) the range of SOAAV over which facilitation occurred when unisensory stimuli were weak; (2) whether the range of SOAAV producing facilitation supported the hypothesis that physiological simultaneity of unisensory activity governs multisensory facilitation; and (3) whether AV multisensory facilitation depended on relative stimulus intensity. We compared response-time distributions to unisensory auditory (A) and visual (V) stimuli with those to AV stimuli over a wide range of SOAAV (±300 ms, in 20 ms increments), across four conditions of varying stimulus intensity. In condition 1, the intensity of unisensory stimuli was adjusted such that d' ≈ 2. In condition 2, V stimulus intensity was increased (d' > 4), while A stimulus intensity was as in condition 1. In condition 3, A stimulus intensity was increased (d' > 4) while V stimulus intensity was as in condition 1. In condition 4, both A and V stimulus intensities were increased to clearly suprathreshold levels (d' > 4). Across all conditions of stimulus intensity, significant multisensory facilitation occurred exclusively for simultaneously presented A and V stimuli. In addition, facilitation increased as stimulus intensity increased, in disagreement with inverse effectiveness. These results indicate that the requirements for facilitative multisensory integration include both physical and physiological simultaneity.
Affiliation(s)
- Lynnette M Leone, Center for Visual and Cognitive Neuroscience, Department of Psychology, NDSU Department 2765, P.O. Box 6050, College of Science and Mathematics, North Dakota State University, Fargo, ND 58105-6050, USA
- Mark E McCourt, Center for Visual and Cognitive Neuroscience, Department of Psychology, NDSU Department 2765, P.O. Box 6050, College of Science and Mathematics, North Dakota State University, Fargo, ND 58105-6050, USA
26
Lippert MT, Takagaki K, Kayser C, Ohl FW. Asymmetric multisensory interactions of visual and somatosensory responses in a region of the rat parietal cortex. PLoS One 2013; 8:e63631. PMID: 23667650; PMCID: PMC3646793; DOI: 10.1371/journal.pone.0063631.
Abstract
Perception greatly benefits from integrating multiple sensory cues into a unified percept. To study the neural mechanisms of sensory integration, model systems are required that allow the simultaneous assessment of activity and the use of techniques to affect individual neural processes in behaving animals. While rodents meet these requirements, little is known about multisensory integration and the cortical areas that serve it in the rodent. Using optical imaging combined with laminar electrophysiological recordings, the rat parietal cortex was identified as an area where visual and somatosensory inputs converge and interact. Our results reveal similar response patterns to visual and somatosensory stimuli at the level of current source density (CSD) responses and multi-unit responses within a strip in parietal cortex. Surprisingly, a selective asymmetry was observed in multisensory interactions: when the somatosensory response preceded the visual response, supra-linear summation of CSD was observed, but the reverse stimulus order resulted in sub-linear effects in the CSD. This asymmetry was not present in multi-unit activity, however, which showed consistently sub-linear interactions. These interactions were restricted to a specific temporal window, and pharmacological tests revealed significant local intra-cortical contributions to this phenomenon. Our results highlight the rodent parietal cortex as a system for modeling the neural underpinnings of multisensory processing in behaving animals and at the cellular level.
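The current source density (CSD) analysis used above is, in its standard one-dimensional form, the negative second spatial derivative of the field potential across equally spaced laminar contacts. A minimal sketch (the contact spacing and voltage profile below are hypothetical, not data from the study):

```python
def csd(lfp, spacing):
    """One-dimensional CSD estimate: negative discrete second spatial
    derivative of the LFP across equally spaced contacts. The two edge
    contacts are dropped because they lack a neighbour on one side."""
    return [-(lfp[i + 1] - 2.0 * lfp[i] + lfp[i - 1]) / spacing ** 2
            for i in range(1, len(lfp) - 1)]

# Hypothetical laminar voltage profile (mV) at one time point,
# with contacts 0.1 mm apart; the dip suggests a mid-layer current sink.
profile = [0.0, -0.2, -0.8, -0.2, 0.0]
print(csd(profile, 0.1))  # three interior channels, strongest at the middle
```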
Affiliation(s)
- Michael T Lippert, Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany
27
Auditory-driven phase reset in visual cortex: human electrocorticography reveals mechanisms of early multisensory integration. Neuroimage 2013; 79:19-29. PMID: 23624493; DOI: 10.1016/j.neuroimage.2013.04.060.
Abstract
Findings in animal models demonstrate that activity within hierarchically early sensory cortical regions can be modulated by cross-sensory inputs through resetting of the phase of ongoing intrinsic neural oscillations. Here, subdural recordings evaluated whether phase resetting by auditory inputs would impact multisensory integration processes in human visual cortex. Results clearly showed auditory-driven phase reset in visual cortices and, in some cases, frank auditory event-related potentials (ERP) were also observed over these regions. Further, when audiovisual bisensory stimuli were presented, this led to robust multisensory integration effects which were observed in both the ERP and in measures of phase concentration. These results extend findings from animal models to human visual cortices, and highlight the impact of cross-sensory phase resetting by a non-primary stimulus on multisensory integration in ostensibly unisensory cortices.
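The "measures of phase concentration" referred to above are conventionally computed as inter-trial coherence (the phase-locking value): the magnitude of the mean unit phase vector across trials. A generic sketch of that measure, not the study's own code (the phase values are hypothetical):

```python
import cmath

def inter_trial_coherence(phases):
    """Inter-trial coherence at one time-frequency point:
    |mean of e^(i*phase) across trials|. Values near 1 indicate a
    consistent phase across trials (e.g. after a phase reset);
    values near 0 indicate random phase."""
    vectors = [cmath.exp(1j * p) for p in phases]
    return abs(sum(vectors) / len(vectors))

# Hypothetical single-trial phases (radians) at one post-stimulus latency.
print(inter_trial_coherence([0.10, -0.05, 0.20, 0.00]))  # near 1: strong reset
print(inter_trial_coherence([0.0, 1.6, 3.1, 4.7]))       # near 0: no reset
```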
28
Wong C, Chabot N, Kok MA, Lomber SG. Modified areal cartography in auditory cortex following early- and late-onset deafness. Cereb Cortex 2013; 24:1778-92. DOI: 10.1093/cercor/bht026.
29
Multisensory representation of frequency across audition and touch: high density electrical mapping reveals early sensory-perceptual coupling. J Neurosci 2012; 32:15338-44. PMID: 23115172; DOI: 10.1523/jneurosci.1796-12.2012.
Abstract
The frequency of environmental vibrations is sampled by two of the major sensory systems, audition and touch, notwithstanding that these signals are transduced through very different physical media and entirely separate sensory epithelia. Psychophysical studies have shown that manipulating frequency in audition or touch can have a significant cross-sensory impact on perceived frequency in the other sensory system, pointing to intimate links between these senses during computation of frequency. In this regard, the frequency of a vibratory event can be thought of as a multisensory perceptual construct. In turn, electrophysiological studies point to temporally early multisensory interactions that occur in hierarchically early sensory regions where convergent inputs from the auditory and somatosensory systems are to be found. A key question pertains to the level of processing at which the multisensory integration of featural information, such as frequency, occurs. Do the sensory systems calculate frequency independently before this information is combined, or is this feature calculated in an integrated fashion during preattentive sensory processing? The well characterized mismatch negativity, an electrophysiological response that indexes preattentive detection of a change within the context of a regular pattern of stimulation, served as our dependent measure. High-density electrophysiological recordings were made in humans while they were presented with separate blocks of somatosensory, auditory, and audio-somatosensory "standards" and "deviants," where the deviant differed in frequency. Multisensory effects were identified beginning at ∼200 ms, with the multisensory mismatch negativity (MMN) significantly different from the sum of the unisensory MMNs. This provides compelling evidence for preattentive coupling between the somatosensory and auditory channels in the cortical representation of frequency.
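The comparison above, of the multisensory MMN against the sum of the unisensory MMNs, is an instance of the standard additive model for ERP interactions: any residual multisensory-minus-summed-unisensory difference indexes integration. A minimal sketch with hypothetical amplitudes (µV), not data from the study:

```python
def additive_residual(multi, uni_a, uni_s):
    """Point-by-point multisensory interaction under the additive model:
    multisensory response minus the sum of the unisensory responses.
    Nonzero residuals indicate super- or sub-additive integration."""
    return [m - (a + s) for m, a, s in zip(multi, uni_a, uni_s)]

# Hypothetical MMN amplitudes at a few latencies around ~200 ms.
mmn_as = [-2.8, -3.5, -3.0]   # audio-somatosensory deviant
mmn_a  = [-1.0, -1.2, -1.1]   # auditory-only deviant
mmn_s  = [-0.9, -1.0, -0.8]   # somatosensory-only deviant
print(additive_residual(mmn_as, mmn_a, mmn_s))
```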
30
King AJ, Walker KMM. Integrating information from different senses in the auditory cortex. Biol Cybern 2012; 106:617-25. PMID: 22798035; PMCID: PMC4340563; DOI: 10.1007/s00422-012-0502-x.
Abstract
Multisensory integration was once thought to be the domain of brain areas high in the cortical hierarchy, with early sensory cortical fields devoted to unisensory processing of inputs from their given set of sensory receptors. More recently, a wealth of evidence documenting visual and somatosensory responses in auditory cortex, even as early as the primary fields, has changed this view of cortical processing. These multisensory inputs may serve to enhance responses to sounds that are accompanied by other sensory cues, effectively making them easier to hear, but may also act more selectively to shape the receptive field properties of auditory cortical neurons to the location or identity of these events. We discuss the new, converging evidence that multiplexing of neural signals may play a key role in informatively encoding and integrating signals in auditory cortex across multiple sensory modalities. We highlight some of the many open research questions that exist about the neural mechanisms that give rise to multisensory integration in auditory cortex, which should be addressed in future experimental and theoretical studies.
Affiliation(s)
- Andrew J King, Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, UK
31
Banks MI, Uhlrich DJ, Smith PH, Krause BM, Manning KA. Descending projections from extrastriate visual cortex modulate responses of cells in primary auditory cortex. Cereb Cortex 2011; 21:2620-38. PMID: 21471557; PMCID: PMC3183425; DOI: 10.1093/cercor/bhr048.
Abstract
Primary sensory cortical responses are modulated by the presence or expectation of related sensory information in other modalities, but the sources of multimodal information and the cellular locus of this integration are unclear. We investigated the modulation of neural responses in the murine primary auditory cortical area Au1 by extrastriate visual cortex (V2). Projections from V2 to Au1 terminated in a classical descending/modulatory pattern, with highest density in layers 1, 2, 5, and 6. In brain slices, whole-cell recordings revealed long latency responses to stimulation in V2L that could modulate responses to subsequent white matter (WM) stimuli at latencies of 5-20 ms. Calcium responses imaged in Au1 cell populations showed that preceding WM with V2L stimulation modulated WM responses, with both summation and suppression observed. Modulation of WM responses was most evident for near-threshold WM stimuli. These data indicate that corticocortical projections from V2 contribute to multimodal integration in primary auditory cortex.
Affiliation(s)
- Matthew I Banks, Department of Anesthesiology, University of Wisconsin, Madison, WI 53706, USA
32
33
Clemo H, Keniston L, Meredith M. Structural basis of multisensory processing. Front Neurosci 2011. DOI: 10.1201/b11092-3.
34
35
Meredith M, Allman B, Keniston L, Clemo H. Are bimodal neurons the same throughout the brain? Front Neurosci 2011. DOI: 10.1201/b11092-7.
36
Kayser C, Petkov C, Remedios R, Logothetis N. Multisensory influences on auditory processing. Front Neurosci 2011. DOI: 10.1201/9781439812174-9.
37
Meredith M, Allman B, Keniston L, Clemo H. Are bimodal neurons the same throughout the brain? Front Neurosci 2011. DOI: 10.1201/9781439812174-7.
38
Kayser C, Petkov C, Remedios R, Logothetis N. Multisensory influences on auditory processing. Front Neurosci 2011. DOI: 10.1201/b11092-9.
39
40
41
42
Meredith MA, Kryklywy J, McMillan AJ, Malhotra S, Lum-Tai R, Lomber SG. Crossmodal reorganization in the early deaf switches sensory, but not behavioral roles of auditory cortex. Proc Natl Acad Sci U S A 2011; 108:8856-61. PMID: 21555555; PMCID: PMC3102418; DOI: 10.1073/pnas.1018519108.
Abstract
It is well known that early disruption of sensory input from one modality can induce crossmodal reorganization of a deprived cortical area, resulting in compensatory abilities in the remaining senses. Compensatory effects, however, occur in selected cortical regions and it is not known whether such compensatory phenomena have any relation to the original function of the reorganized area. In the cortex of hearing cats, the auditory field of the anterior ectosylvian sulcus (FAES) is largely responsive to acoustic stimulation and its unilateral deactivation results in profound contralateral acoustic orienting deficits. Given these functional and behavioral roles, the FAES was studied in early-deafened cats to examine its crossmodal sensory properties as well as to assess the behavioral role of that reorganization. Recordings in the FAES of early-deafened adults revealed robust responses to visual stimulation as well as receptive fields that collectively represented the contralateral visual field. A second group of early-deafened cats was trained to localize visual targets in a perimetry array. In these animals, cooling loops were surgically placed on the FAES to reversibly deactivate the region, which resulted in substantial contralateral visual orienting deficits. These results demonstrate that crossmodal plasticity can substitute one sensory modality for another while maintaining the functional repertoire of the reorganized region.
Affiliation(s)
- M Alex Meredith, Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, VA 23298, USA
43
Lee CC, Winer JA. Convergence of thalamic and cortical pathways in cat auditory cortex. Hear Res 2011; 274:85-94. PMID: 20576491; PMCID: PMC2965817; DOI: 10.1016/j.heares.2010.05.008.
Abstract
Cat auditory cortex (AC) receives input from many thalamic nuclei and cortical areas. Previous connectional studies often focused on one connectional system in isolation, limiting perspectives on AC computational processes. Here we review the convergent thalamic, commissural, and corticocortical projections to thirteen AC areas in the cat. Each input differs in strength and may thus serve unique roles. We compared the convergent intrinsic and extrinsic input to each area quantitatively. The intrinsic input was almost half the total. Among extrinsic projections, ipsilateral cortical sources contributed 75%, thalamic input contributed 15%, and contralateral sources contributed 10%. The patterns of distribution support the division of AC areas into families of tonotopic, non-tonotopic, multisensory, and limbic-related areas, each with convergent input arising primarily from within its group. The connections within these areal families suggest a form of processing in which convergence of input to an area could enable new forms of integration. In contrast, the lateral connections between families could subserve integration between categorical representations, allowing otherwise independent streams to communicate and thereby coordinating operations over wide spatial and functional scales. These patterns of serial and interfamilial cooperation challenge more classical models of organization that underestimate the diversity and complexity of AC connectivity.
Affiliation(s)
- Charles C Lee, Department of Neurobiology, University of Chicago, Chicago, IL 60615, United States
44
Sub-threshold cross-modal sensory interaction in the thalamus: lemniscal auditory response in the medial geniculate nucleus is modulated by somatosensory stimulation. Neuroscience 2011; 174:200-15. DOI: 10.1016/j.neuroscience.2010.11.041.
45
Cappe C, Murray MM, Barone P, Rouiller EM. Multisensory facilitation of behavior in monkeys: effects of stimulus intensity. J Cogn Neurosci 2010; 22:2850-63. PMID: 20044892; DOI: 10.1162/jocn.2010.21423.
Abstract
Multisensory stimuli can improve performance, facilitating reaction times (RTs) on sensorimotor tasks. This benefit is referred to as the redundant signals effect (RSE) and can exceed predictions on the basis of probability summation, indicative of integrative processes. Although an RSE exceeding probability summation has been repeatedly observed in humans and nonprimate animals, there are scant and inconsistent data from nonhuman primates performing similar protocols; existing paradigms have instead focused on saccadic eye movements. Moreover, the extant results in monkeys leave unresolved how stimulus synchronicity and intensity impact performance. Two trained monkeys performed a simple detection task involving arm movements to auditory, visual, or synchronous auditory-visual multisensory pairs. RSEs in excess of predictions on the basis of probability summation were observed and must therefore follow from neural response interactions. Parametric variation of auditory stimulus intensity revealed that in both animals, RT facilitation was limited to situations where the auditory stimulus intensity was below or up to 20 dB above perceptual threshold, despite the visual stimulus always being suprathreshold. No RT facilitation, or even behavioral costs, were obtained with auditory intensities 30-40 dB above threshold. The present study demonstrates the feasibility and suitability of behaving monkeys for investigating links between psychophysical and neurophysiologic instantiations of multisensory interactions.
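The probability-summation benchmark mentioned above is conventionally formalized as Miller's race-model inequality, P(RT_AV ≤ t) ≤ P(RT_A ≤ t) + P(RT_V ≤ t); facilitation beyond that bound implies neural integration rather than mere statistical redundancy. A generic sketch of the test (the RT samples are illustrative, not the study's data):

```python
def ecdf(sample, t):
    """Empirical cumulative probability of a response by time t."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_model_violation(rt_av, rt_a, rt_v, t):
    """Positive values mean the multisensory CDF exceeds the race-model
    bound min(1, P(A<=t) + P(V<=t)), i.e. facilitation beyond
    probability summation at time t."""
    bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
    return ecdf(rt_av, t) - bound

# Illustrative reaction-time samples (ms).
rt_a  = [310, 330, 355, 380, 420]
rt_v  = [300, 340, 360, 390, 430]
rt_av = [240, 255, 270, 300, 320]
print(race_model_violation(rt_av, rt_a, rt_v, 300))  # positive: violation
```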
Affiliation(s)
- Céline Cappe, Neuropsychology and Neurorehabilitation Service and Radiology Service, Centre Hospitalier Universitaire Vaudois and University of Lausanne, Lausanne, Switzerland
46
Neural correlates of human somatosensory integration in tinnitus. Hear Res 2010; 267:78-88. DOI: 10.1016/j.heares.2010.04.006.
47
Neuroanatomical identification of crossmodal auditory inputs to interneurons in somatosensory cortex. Exp Brain Res 2010; 202:725-31. PMID: 20087577; DOI: 10.1007/s00221-010-2163-0.
Abstract
Multisensory convergence is the first, requisite step in the process that generates neural responses to events involving more than one sensory modality. Although anatomical studies have documented the merging of afferents from different sensory modalities within a given area, they do not provide insight into the architecture of connectivity at the neuronal level that underlies multisensory processing. In fact, few anatomical studies of multisensory convergence at the neuronal level have been conducted. The present study used a combination of tract-tracing, immunocytochemistry, and confocal microscopic techniques to examine the connections related to crossmodal auditory cortical inputs to somatosensory area SIV. Axons labeled from auditory cortex were found in contact with immunolabeled interneurons in SIV, some of which also colocalized vesicular glutamate transporter 1, indicating the presence of an active, glutamatergic synapse. No specific subtype of inhibitory interneuron appeared to be targeted by the crossmodal contacts. These results provide insight into the structural basis for multisensory processing at the neuronal level and offer anatomical evidence for the direct involvement of inhibitory interneurons in multisensory processing.
48
Lakatos P, O'Connell MN, Barczak A, Mills A, Javitt DC, Schroeder CE. The leading sense: supramodal control of neurophysiological context by attention. Neuron 2009; 64:419-30. PMID: 19914189; DOI: 10.1016/j.neuron.2009.10.014.
Abstract
Attending to a stimulus enhances its neuronal representation, even at the level of primary sensory cortex. Cross-modal modulation can similarly enhance a neuronal representation, and this process can also operate at the primary cortical level. Phase reset of ongoing neuronal oscillatory activity has been shown to be an important element of the underlying modulation of local cortical excitability in both cases. We investigated the influence of attention on oscillatory phase reset in primary auditory and visual cortices of macaques performing an intermodal selective attention task. In addition to responses "driven" by preferred modality stimuli, we noted that both preferred and nonpreferred modality stimuli could "modulate" local cortical excitability by phase reset of ongoing oscillatory activity, and that this effect was linked to their being attended. These findings outline a supramodal mechanism by which attention can control neurophysiological context, thus determining the representation of specific sensory content in primary sensory cortex.
Affiliation(s)
- Peter Lakatos, Cognitive Neuroscience and Schizophrenia Program, Nathan Kline Institute, Orangeburg, New York 10962, USA
49
Meredith MA, Allman BL, Keniston LP, Clemo HR. Auditory influences on non-auditory cortices. Hear Res 2009; 258:64-71. PMID: 19303926; PMCID: PMC2787633; DOI: 10.1016/j.heares.2009.03.005.
Abstract
Although responses to auditory stimuli have been extensively examined in the well-known regions of auditory cortex, there are numerous reports of acoustic sensitivity in cortical areas that are dominated by other sensory modalities. Whether in 'polysensory' cortex or in visual or somatosensory regions, auditory responses in non-auditory cortex have been described largely in terms of auditory processing. This review takes a different perspective: auditory responses in non-auditory cortex, whether arising through subthreshold multisensory or bimodal processing, provide a subtle but consistent expansion of the range of activity of the dominant modality within a given area. Thus, the features of these acoustic responses may have more to do with subtle adjustment of response gain within a given non-auditory region than with encoding of their tonal properties.
Affiliation(s)
- M Alex Meredith, Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, VA 23298, USA
50
Abstract
Multisensory neurons are now known to be widespread in low-level regions of the cortex usually thought of as being responsible for modality-specific processing. The auditory cortex provides a particularly striking example of this, exhibiting responses to both visual and somatosensory stimulation. Single-neuron recording studies in ferrets have shown that each of the auditory fields that have been characterized using physiological and anatomical criteria also receives visual inputs, with the incidence of visually sensitive neurons ranging from 15% to 20% in the primary areas to around 50% or more in higher-level areas. Although some neurons exhibit spiking responses to visual stimulation, these inputs often have subthreshold influences that modulate the responses of the cortical neurons to sound. Insights into the possible role played by the visual inputs can be obtained by examining their sources of origin and the way in which they alter the processing capabilities of neurons in the auditory cortex. These studies suggest that one of the functions of the visual input to auditory cortex is to sharpen the relatively imprecise spatial coding typically found there. Because the extent to which this happens varies between cortical fields, the investigation of multisensory interactions can also help in understanding their relative contributions to auditory perception.