1. Park WJ, Fine I. The perception of auditory motion in sighted and early blind individuals. Proc Natl Acad Sci U S A 2023; 120:e2310156120. PMID: 38015842; PMCID: PMC10710053; DOI: 10.1073/pnas.2310156120.
Abstract
Motion perception is a fundamental sensory task that plays a critical evolutionary role. In vision, motion processing is classically described using a motion energy model with spatiotemporally nonseparable filters suited for capturing the smooth continuous changes in spatial position over time afforded by moving objects. However, it is still not clear whether the filters underlying auditory motion discrimination are also continuous motion detectors or infer motion from comparing discrete sound locations over time (spatiotemporally separable). We used a psychophysical reverse correlation paradigm, where participants discriminated the direction of a motion signal in the presence of spatiotemporal noise, to determine whether the filters underlying auditory motion discrimination were spatiotemporally separable or nonseparable. We then examined whether these auditory motion filters were altered as a result of early blindness. We found that both sighted and early blind individuals have separable filters. However, early blind individuals show increased sensitivity to auditory motion, with reduced susceptibility to noise and filters that were more accurate in detecting motion onsets/offsets. Model simulations suggest that this reliance on separable filters is optimal given the limited spatial resolution of auditory input.
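The separable/nonseparable distinction at the heart of this study can be made concrete: a spatiotemporal filter is separable when it factors into a spatial profile times a temporal profile, i.e. when its space-by-time matrix has rank 1, whereas a motion-energy filter is oriented in space-time and has higher rank. A minimal sketch of this test (the Gaussian filters and grid below are illustrative, not the study's actual stimuli or analysis):

```python
import math

def is_rank_one(F, tol=1e-9):
    """A nonzero matrix has rank 1 iff every 2x2 minor vanishes."""
    rows, cols = len(F), len(F[0])
    for i in range(rows):
        for j in range(i + 1, rows):
            for a in range(cols):
                for b in range(a + 1, cols):
                    if abs(F[i][a] * F[j][b] - F[i][b] * F[j][a]) > tol:
                        return False
    return True

xs = [0.5 * x for x in range(-8, 9)]    # space (e.g. azimuth, arbitrary units)
ts = [0.05 * t for t in range(20)]      # time (s)

# Separable: spatial Gaussian times temporal decay, F(x,t) = g(x) * h(t).
separable = [[math.exp(-x * x) * math.exp(-t) for t in ts] for x in xs]

# Nonseparable (motion-energy-like): a Gaussian riding along x = v*t,
# oriented in space-time, so it cannot be factored into g(x) * h(t).
v = 4.0
nonseparable = [[math.exp(-(x - v * t) ** 2) for t in ts] for x in xs]
```

In a reverse-correlation analysis of the kind described above, the recovered filter can be checked for separability in exactly this way (in practice via SVD rather than minors).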
Affiliation(s)
- Woon Ju Park
- Department of Psychology, University of Washington, Seattle, WA 98195
- Ione Fine
- Department of Psychology, University of Washington, Seattle, WA 98195
2. Fine I, Park WJ. Do you hear what I see? How do early blind individuals experience object motion? Philos Trans R Soc Lond B Biol Sci 2023; 378:20210460. PMID: 36511418; PMCID: PMC9745882; DOI: 10.1098/rstb.2021.0460.
Abstract
One of the most important tasks for 3D vision is tracking the movement of objects in space. The ability of early blind individuals to understand motion in the environment from noisy and unreliable auditory information is an impressive example of cortical adaptation that is only just beginning to be understood. Here, we compare visual and auditory motion processing, and discuss the effect of early blindness on the perception of auditory motion. Blindness leads to cross-modal recruitment of the visual motion area hMT+ for auditory motion processing. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion. We discuss how this dramatic shift in the cortical basis of motion processing might influence the perceptual experience of motion in early blind individuals. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Ione Fine
- Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA
- Woon Ju Park
- Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA
3. Adaptive Response Behavior in the Pursuit of Unpredictably Moving Sounds. eNeuro 2021; 8:ENEURO.0556-20.2021. PMID: 33875456; PMCID: PMC8116108; DOI: 10.1523/eneuro.0556-20.2021.
Abstract
Although moving sound-sources abound in natural auditory scenes, it is not clear how the human brain processes auditory motion. Previous studies have indicated that, although ocular localization responses to stationary sounds are quite accurate, ocular smooth pursuit of moving sounds is very poor. We here demonstrate that human subjects faithfully track a sound’s unpredictable movements in the horizontal plane with smooth-pursuit responses of the head. Our analysis revealed that the stimulus–response relation was well described by an under-damped passive, second-order low-pass filter in series with an idiosyncratic, fixed, pure delay. The model contained only two free parameters: the system’s damping coefficient, and its central (resonance) frequency. We found that the latter remained constant at ∼0.6 Hz throughout the experiment for all subjects. Interestingly, the damping coefficient systematically increased with trial number, suggesting the presence of an adaptive mechanism in the auditory pursuit system (APS). This mechanism functions even for unpredictable sound-motion trajectories endowed with fixed, but covert, frequency characteristics in open-loop tracking conditions. We conjecture that the APS optimizes a trade-off between response speed and effort. Taken together, our data support the existence of a pursuit system for auditory head-tracking, which would suggest the presence of a neural representation of a spatial auditory fovea (AF).
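The fitted model above (a pure delay in series with an under-damped second-order low-pass filter, with damping coefficient and resonance frequency as the only free parameters) can be sketched numerically. The 0.6 Hz centre frequency is taken from the abstract; the damping coefficient, delay value, and simple Euler integration below are illustrative assumptions, not the authors' fitted values or code:

```python
import math

def pursuit_response(stimulus, dt, f0=0.6, zeta=0.5, delay=0.2):
    """Head-pursuit model: a pure input delay followed by a second-order
    low-pass filter with resonance frequency f0 (Hz) and damping
    coefficient zeta, i.e. y'' + 2*zeta*w0*y' + w0^2*y = w0^2*x.
    Integrated with semi-implicit Euler steps of size dt (s)."""
    w0 = 2 * math.pi * f0
    n_delay = int(round(delay / dt))
    # Pure delay (idiosyncratic and fixed per subject in the study).
    delayed = [0.0] * n_delay + list(stimulus[: len(stimulus) - n_delay])
    y, v = 0.0, 0.0          # filter state: head position and velocity
    out = []
    for x in delayed:
        a = w0 * w0 * (x - y) - 2 * zeta * w0 * v
        v += a * dt
        y += v * dt
        out.append(y)
    return out

# Step input: a target jumping to 10 deg azimuth; an under-damped filter
# (zeta < 1) overshoots before settling on the target.
dt = 0.001
resp = pursuit_response([10.0] * 5000, dt)
```

With zeta below 1 the step response overshoots and rings at roughly f0, which is the signature behaviour an under-damped second-order system predicts for head tracking.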
4. Zuk NJ, Delgutte B. Neural coding and perception of auditory motion direction based on interaural time differences. J Neurophysiol 2019; 122:1821-1842. PMID: 31461376; DOI: 10.1152/jn.00081.2019.
Abstract
While motion is important for parsing a complex auditory scene into perceptual objects, how it is encoded in the auditory system is unclear. Perceptual studies suggest that the ability to identify the direction of motion is limited by the duration of the moving sound, yet we can detect changes in interaural differences at even shorter durations. To understand the source of these distinct temporal limits, we recorded from single units in the inferior colliculus (IC) of unanesthetized rabbits in response to noise stimuli containing a brief segment with linearly time-varying interaural time difference ("ITD sweep") temporally embedded in interaurally uncorrelated noise. We also tested the ability of human listeners to either detect the ITD sweeps or identify the motion direction. Using a point-process model to separate the contributions of stimulus dependence and spiking history to single-neuron responses, we found that the neurons respond primarily by following the instantaneous ITD rather than exhibiting true direction selectivity. Furthermore, using an optimal classifier to decode the single-neuron responses, we found that neural threshold durations of ITD sweeps for both direction identification and detection overlapped with human threshold durations even though the average response of the neurons could track the instantaneous ITD beyond psychophysical limits. Our results suggest that the IC does not explicitly encode motion direction, but internal neural noise may limit the speed at which we can identify the direction of motion.

NEW & NOTEWORTHY: Recognizing motion and identifying an object's trajectory are important for parsing a complex auditory scene, but how we do so is unclear. We show that neurons in the auditory midbrain do not exhibit direction selectivity as found in the visual system but instead follow the trajectory of the motion in their temporal firing patterns. Our results suggest that the inherent variability in neural firings may limit our ability to identify motion direction at short durations.
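The stimulus used here, a binaural noise whose interaural time difference varies linearly over the segment (an "ITD sweep"), can be sketched as follows. The sample rate, ITD range, and linear-interpolation fractional delay are illustrative choices, not the study's exact synthesis code:

```python
import math
import random

def itd_sweep(dur=0.5, fs=48000, itd_start=-300e-6, itd_end=300e-6, seed=0):
    """Binaural noise whose ITD sweeps linearly from itd_start to itd_end
    (seconds; negative = left ear leads). The right ear is a copy of the
    left-ear noise read at a time-varying fractional delay, obtained by
    linear interpolation between neighbouring samples."""
    rng = random.Random(seed)
    n = int(dur * fs)
    pad = 64                 # padding so delayed reads never run off the ends
    left = [rng.gauss(0.0, 1.0) for _ in range(n + 2 * pad)]
    right = []
    for i in range(n):
        itd = itd_start + (itd_end - itd_start) * i / (n - 1)
        t = i + pad - itd * fs              # fractional read position
        k = int(math.floor(t))
        frac = t - k
        right.append(left[k] * (1 - frac) + left[k + 1] * frac)
    return left[pad:pad + n], right
```

With a zero-to-zero sweep the two channels are identical; a nonzero sweep moves the interaural delay smoothly through zero, mimicking a source crossing the midline.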
Affiliation(s)
- Nathaniel J Zuk
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, Massachusetts
- Bertrand Delgutte
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, Massachusetts; Department of Otolaryngology, Harvard Medical School, Boston, Massachusetts
5. Representation of Auditory Motion Directions and Sound Source Locations in the Human Planum Temporale. J Neurosci 2019; 39:2208-2220. PMID: 30651333; DOI: 10.1523/jneurosci.2289-18.2018.
Abstract
The ability to compute the location and direction of sounds is a crucial perceptual skill to efficiently interact with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis of motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were significantly distinct. Altogether, our results demonstrate that the hPT codes for auditory motion and location but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.

SIGNIFICANCE STATEMENT: Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study, therefore, sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.
6. Joris PX. Neural binaural sensitivity at high sound speeds: Single cell responses in cat midbrain to fast-changing interaural time differences of broadband sounds. J Acoust Soc Am 2019; 145:EL45. PMID: 30710960; PMCID: PMC7112706; DOI: 10.1121/1.5087524.
Abstract
Relative motion between the body and the outside world is a rich source of information. Neural selectivity to motion is well-established in several sensory systems, but is controversial in hearing. This study examines neural sensitivity to changes in the instantaneous interaural time difference of sounds at the two ears. Midbrain neurons track such changes up to extremely high speeds, show only a coarse dependence of firing rate on speed, and lack directional selectivity. These results argue against the presence of selectivity to auditory motion at the level of the midbrain, but reveal an acuity which enables coding of fast-fluctuating binaural cues in realistic sound environments.
Affiliation(s)
- Philip X Joris
- Laboratory of Auditory Neurophysiology, KU Leuven, Herestraat 49, B-3000 Leuven, Belgium
7. Chaplin TA, Rosa MGP, Lui LL. Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex. Front Neural Circuits 2018; 12:93. PMID: 30416431; PMCID: PMC6212655; DOI: 10.3389/fncir.2018.00093.
Abstract
The ability of animals to detect motion is critical for survival, and errors or even delays in motion perception may prove costly. In the natural world, moving objects in the visual field often produce concurrent sounds. Thus, it can be highly advantageous to detect motion elicited from sensory signals of either modality, and to integrate them to produce more reliable motion perception. A great deal of progress has been made in understanding how visual motion perception is governed by the activity of single neurons in the primate cerebral cortex, but far less progress has been made in understanding both auditory motion and audiovisual motion integration. Here we review the key cortical regions for motion processing, focussing on translational motion. We compare the representations of space and motion in the visual and auditory systems, and examine how single neurons in these two sensory systems encode the direction of motion. We also discuss the way in which humans integrate audio and visual motion cues, and the regions of the cortex that may mediate this process.
Affiliation(s)
- Tristan A Chaplin, Marcello G P Rosa, Leo L Lui
- Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, VIC, Australia; Australian Research Council (ARC) Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, VIC, Australia
8. Poirier C, Baumann S, Dheerendra P, Joly O, Hunter D, Balezeau F, Sun L, Rees A, Petkov CI, Thiele A, Griffiths TD. Auditory motion-specific mechanisms in the primate brain. PLoS Biol 2017; 15:e2001379. PMID: 28472038; PMCID: PMC5417421; DOI: 10.1371/journal.pbio.2001379.
Abstract
This work examined the mechanisms underlying auditory motion processing in the auditory cortex of awake monkeys using functional magnetic resonance imaging (fMRI). We tested to what extent auditory motion analysis can be explained by the linear combination of static spatial mechanisms, spectrotemporal processes, and their interaction. We found that the posterior auditory cortex, including A1 and the surrounding caudal belt and parabelt, is involved in auditory motion analysis. Static spatial and spectrotemporal processes were able to fully explain motion-induced activation in most parts of the auditory cortex, including A1, but not in circumscribed regions of the posterior belt and parabelt cortex. We show that in these regions motion-specific processes contribute to the activation, providing the first demonstration that auditory motion is not simply deduced from changes in static spatial location. These results demonstrate that parallel mechanisms for motion and static spatial analysis coexist within the auditory dorsal stream.
Affiliation(s)
- Colline Poirier, Simon Baumann, Pradeep Dheerendra, Olivier Joly, David Hunter, Fabien Balezeau, Li Sun, Adrian Rees, Christopher I. Petkov, Alexander Thiele, Timothy D. Griffiths
- Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, Tyne and Wear, United Kingdom
9. Wasmuht DF, Pena JL, Gutfreund Y. Stimulus-specific adaptation to visual but not auditory motion direction in the barn owl's optic tectum. Eur J Neurosci 2016; 45:610-621. PMID: 27987375; DOI: 10.1111/ejn.13505.
Abstract
Whether the auditory and visual systems use a similar coding strategy to represent motion direction is an open question. We investigated this question in the barn owl's optic tectum (OT) testing stimulus-specific adaptation (SSA) to the direction of motion. SSA, the reduction of the response to a repetitive stimulus that does not generalize to other stimuli, has been well established in OT neurons. SSA suggests a separate representation of the adapted stimulus in upstream pathways. So far, only SSA to static stimuli has been studied in the OT. Here, we examined adaptation to moving auditory and visual stimuli. SSA to motion direction was examined using repeated presentations of moving stimuli, occasionally switching motion to the opposite direction. Acoustic motion was either mimicked by varying binaural spatial cues or implemented in free field using a speaker array. While OT neurons displayed SSA to motion direction in visual space, neither stimulation paradigm elicited significant SSA to auditory motion direction. These findings show a qualitative difference in how auditory and visual motion is processed in the OT and support the existence of dedicated circuitry for representing motion direction in the early stages of visual but not the auditory system.
Affiliation(s)
- Dante F Wasmuht
- Department of Neuroscience, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Bat-Galim, Haifa, 31096, Israel
- Jose L Pena
- Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, USA
- Yoram Gutfreund
- Department of Neuroscience, The Ruth and Bruce Rappaport Faculty of Medicine and Research Institute, The Technion, Bat-Galim, Haifa, 31096, Israel
10. Neuhoff JG. Looming sounds are perceived as faster than receding sounds. Cogn Res Princ Implic 2016; 1:15. PMID: 28180166; PMCID: PMC5256440; DOI: 10.1186/s41235-016-0017-4.
Abstract
Each year thousands of people are killed by looming motor vehicles. Throughout our evolutionary history looming objects have posed a threat to survival and perceptual systems have evolved unique solutions to confront these environmental challenges. Vision provides an accurate representation of time-to-contact with a looming object and usually allows us to interact successfully with the object if required. However, audition functions as a warning system and yields an anticipatory representation of arrival time, indicating that the object has arrived when it is still some distance away. The bias provides a temporal margin of safety that allows more time to initiate defensive actions. In two studies this bias was shown to influence the perception of the speed of looming and receding sound sources. Listeners heard looming and receding sound sources and judged how fast they were moving. Listeners perceived the speed of looming sounds as faster than that of equivalent receding sounds. Listeners also showed better discrimination of the speed of looming sounds than receding sounds. Finally, close sounds were perceived as faster than distant sounds. The results suggest a prioritization of the perception of the speed of looming and receding sounds that mirrors the level of threat posed by moving objects in the environment.
Affiliation(s)
- John G Neuhoff
- Department of Psychology, The College of Wooster, Wooster, OH 44691, USA
11. Carlile S, Leung J. The Perception of Auditory Motion. Trends Hear 2016; 20:2331216516644254. PMID: 27094029; PMCID: PMC4871213; DOI: 10.1177/2331216516644254.
Abstract
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
Affiliation(s)
- Simon Carlile
- School of Medical Sciences, University of Sydney, NSW, Australia; Starkey Hearing Research Center, Berkeley, CA, USA
- Johahn Leung
- School of Medical Sciences, University of Sydney, NSW, Australia
12. Wang Y, Gutfreund Y, Peña JL. Coding space-time stimulus dynamics in auditory brain maps. Front Physiol 2014; 5:135. PMID: 24782781; PMCID: PMC3986518; DOI: 10.3389/fphys.2014.00135.
Abstract
Sensory maps are often distorted representations of the environment, where ethologically-important ranges are magnified. The implication of a biased representation extends beyond increased acuity for having more neurons dedicated to a certain range. Because neurons are functionally interconnected, non-uniform representations influence the processing of high-order features that rely on comparison across areas of the map. Among these features are time-dependent changes of the auditory scene generated by moving objects. How sensory representation affects high order processing can be approached in the map of auditory space of the owl's midbrain, where locations in the front are over-represented. In this map, neurons are selective not only to location but also to location over time. The tuning to space over time leads to direction selectivity, which is also topographically organized. Across the population, neurons tuned to peripheral space are more selective to sounds moving into the front. The distribution of direction selectivity can be explained by spatial and temporal integration on the non-uniform map of space. Thus, the representation of space can induce biased computation of a second-order stimulus feature. This phenomenon is likely observed in other sensory maps and may be relevant for behavior.
Affiliation(s)
- Yunyan Wang
- Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, USA
- Yoram Gutfreund
- The Rappaport Research Institute and Faculty of Medicine, The Technion, Haifa, Israel
- José L Peña
- Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, USA
13. Getzmann S, Lewald J. Cortical processing of change in sound location: Smooth motion versus discontinuous displacement. Brain Res 2012; 1466:119-27. DOI: 10.1016/j.brainres.2012.05.033.
14. Singheiser M, Ferger R, von Campenhausen M, Wagner H. Adaptation in the auditory midbrain of the barn owl (Tyto alba) induced by tonal double stimulation. Eur J Neurosci 2012; 35:445-56. PMID: 22288481; DOI: 10.1111/j.1460-9568.2011.07967.x.
Abstract
During hunting, the barn owl typically listens to several successive sounds as generated, for example, by rustling mice. As auditory cells exhibit adaptive coding, the earlier stimuli may influence the detection of the later stimuli. This situation was mimicked with two double-stimulus paradigms, and adaptation was investigated in neurons of the barn owl's central nucleus of the inferior colliculus. Each double-stimulus paradigm consisted of a first or reference stimulus and a second stimulus (probe). In one paradigm (second level tuning), the probe level was varied, whereas in the other paradigm (inter-stimulus interval tuning), the stimulus interval between the first and second stimulus was changed systematically. Neurons were stimulated with monaural pure tones at the best frequency, while the response was recorded extracellularly. The responses to the probe were significantly reduced when the reference stimulus and probe had the same level and the inter-stimulus interval was short. This indicated response adaptation, which could be compensated for by an increase of the probe level of 5-7 dB over the reference level, if the latter was in the lower half of the dynamic range of a neuron's rate-level function. Recovery from adaptation could be best fitted with a double exponential showing a fast (1.25 ms) and a slow (800 ms) component. These results suggest that neurons in the auditory system show dynamic coding properties to tonal double stimulation that might be relevant for faithful upstream signal propagation. Furthermore, the overall stimulus level of the masker also seems to affect the recovery capabilities of auditory neurons.
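The recovery time course reported above can be written as a double exponential. The two time constants (1.25 ms and 800 ms) are taken from the abstract; the functional form and the component weights below are illustrative assumptions, not the authors' fitted parameters:

```python
import math

TAU_FAST = 1.25e-3   # fast recovery component (s), from the study
TAU_SLOW = 800e-3    # slow recovery component (s), from the study

def recovery(t, w_fast=0.6, w_slow=0.4):
    """Fraction of the unadapted probe response recovered t seconds after
    the reference stimulus. Double-exponential form as reported; the
    component weights w_fast/w_slow are illustrative assumptions."""
    return 1.0 - (w_fast * math.exp(-t / TAU_FAST)
                  + w_slow * math.exp(-t / TAU_SLOW))

# At t = 0 the probe response is fully adapted; the fast component is
# gone within ~10 ms, while the slow component takes seconds to decay.
```

A form like this captures the observation that probe responses recover quickly at first but need inter-stimulus intervals on the order of seconds to return fully to baseline.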
Affiliation(s)
- Martin Singheiser
- Department of Zoology, RWTH Aachen University, Mies-van-der-Rohe-Strasse 15, D-52074 Aachen, Germany
15.
Abstract
The auditory system codes spatial locations in a way that deviates from the spatial representations found in other modalities. This difference is especially striking in the cortex, where neurons form topographical maps of visual and tactile space but where auditory space is represented through a population rate code. In this hemifield code, sound source location is represented in the activity of two widely tuned opponent populations, one tuned to the right and the other to the left side of auditory space. Scientists are only beginning to uncover how this coding strategy adapts to various spatial processing demands. This review presents the current understanding of auditory spatial processing in the cortex. To this end, the authors consider how various implementations of the hemifield code may exist within the auditory cortex and how these may be modulated by the stimulation and task context. As a result, a coherent set of neural strategies for auditory spatial processing emerges.
16. Getzmann S. Auditory motion perception: onset position and motion direction are encoded in discrete processing stages. Eur J Neurosci 2011; 33:1339-50. DOI: 10.1111/j.1460-9568.2011.07617.x.
17. Getzmann S, Lewald J. The effect of spatial adaptation on auditory motion processing. Hear Res 2011; 272:21-9. DOI: 10.1016/j.heares.2010.11.005.
18. Hoffmann S, Schuller G, Firzlaff U. Dynamic stimulation evokes spatially focused receptive fields in bat auditory cortex. Eur J Neurosci 2010; 31:371-85. DOI: 10.1111/j.1460-9568.2009.07051.x.
19. Getzmann S. Effect of auditory motion velocity on reaction time and cortical processes. Neuropsychologia 2009; 47:2625-33. PMID: 19467249; DOI: 10.1016/j.neuropsychologia.2009.05.012.
Abstract
The study investigated the processing of sound motion, employing a psychophysical motion discrimination task in combination with electroencephalography. Following stationary auditory stimulation from a central space position, the onset of left- and rightward motion elicited a specific cortical response that was lateralized to the hemisphere contralateral to the direction of motion. The contralaterality of the motion onset response decreased when the velocity was reduced. Higher motion velocity was associated with larger and earlier cortical responses and with shorter reaction times to motion onset. The results indicate a close correspondence of brain activity and behavioral performance in auditory motion detection.
Affiliation(s)
- Stephan Getzmann
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany.
|
20
|
Nelson PC, Smith ZM, Young ED. Wide-dynamic-range forward suppression in marmoset inferior colliculus neurons is generated centrally and accounts for perceptual masking. J Neurosci 2009; 29:2553-62. [PMID: 19244530] [PMCID: PMC2677200] [DOI: 10.1523/jneurosci.5359-08.2009]
Abstract
An organism's ability to detect and discriminate sensory inputs depends on the recent stimulus history. For example, perceptual detection thresholds for a brief tone can be elevated by as much as 50 dB when following a masking stimulus. Previous work suggests that such forward masking is not a direct result of peripheral neural adaptation; the central pathway apparently modifies the representation in a way that further attenuates the input's response to short probe signals. Here, we show that much of this transformation is complete by the level of the inferior colliculus (IC). Single-neuron extracellular responses were recorded in the central nucleus of the awake marmoset IC. The threshold for a 20 ms probe tone presented at best frequency was determined for various masker-probe delays, over a range of masker sound pressure levels (SPLs) and frequencies. The most striking aspect of the data was the increased potency of forward maskers as their SPL was increased, despite the fact that the excitatory response to the masker was often saturating or nonmonotonic over the same range of levels. This led to probe thresholds at high masker levels that were almost always higher than those observed in the auditory nerve. Probe threshold shifts were not usually caused by a persistent excitatory response to the masker; instead we propose a wide-dynamic-range inhibitory mechanism locked to sound offset as an explanation for several key aspects of the data. These findings further delineate the role of subcortical auditory processing in the generation of a context-dependent representation of ongoing acoustic scenes.
Affiliation(s)
- Paul C Nelson
- Center for Hearing and Balance, Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205, USA.
|
21
|
Malinina ES. Processing of spectral localization-informative changes in sound signals by neurons of inferior colliculus and auditory cortex of the house mouse Mus musculus. J Evol Biochem Physiol 2006. [DOI: 10.1134/s0022093006050103]
|
22
|
Witten IB, Bergan JF, Knudsen EI. Dynamic shifts in the owl's auditory space map predict moving sound location. Nat Neurosci 2006; 9:1439-45. [PMID: 17013379] [DOI: 10.1038/nn1781]
Abstract
The optic tectum of the barn owl contains a map of auditory space. We found that, in response to moving sounds, the locations of receptive fields that make up the map shifted toward the approaching sound. The magnitude of the receptive field shifts increased systematically with increasing stimulus velocity and, therefore, was appropriate to compensate for sensory and motor delays inherent to auditory orienting behavior. Thus, the auditory space map is not static, but shifts adaptively and dynamically in response to stimulus motion. We provide a computational model to account for these results. Because the model derives predictive responses from processes that are known to occur commonly in neural networks, we hypothesize that analogous predictive responses will be found to exist widely in the central nervous system. This hypothesis is consistent with perceptions of stimulus motion in humans for many sensory parameters.
Affiliation(s)
- Ilana B Witten
- Department of Neurobiology, Stanford University Medical School, Stanford, California 94305, USA
|
23
|
Porter KK, Groh JM. The "other" transformation required for visual-auditory integration: representational format. Prog Brain Res 2006; 155:313-23. [PMID: 17027396] [DOI: 10.1016/s0079-6123(06)55018-6]
Abstract
Multisensory integration of spatial signals requires not only that stimulus locations be encoded in the same spatial reference frame, but also that stimulus locations be encoded in the same representational format. Previous studies have addressed the issue of spatial reference frame, but representational format, particularly for sound location, has been relatively overlooked. We discuss here our recent findings that sound location in the primate inferior colliculus is encoded using a "rate" code, a format that differs from the place code used for representing visual stimulus locations. Possible mechanisms for transforming signals from rate-to-place or place-to-rate coding formats are considered.
|
24
|
Porter KK, Metzger RR, Groh JM. Representation of eye position in primate inferior colliculus. J Neurophysiol 2005; 95:1826-42. [PMID: 16221747] [DOI: 10.1152/jn.00857.2005]
Abstract
We studied the representation of eye-position information in the primate inferior colliculus (IC). Monkeys fixated visual stimuli at one of eight or nine locations along the horizontal meridian between -24 and 24 degrees while sounds were presented from loudspeakers at locations within that same range. Approximately 40% of our sample of 153 neurons showed statistically significant sensitivity to eye position during either the presentation of an auditory stimulus or in the absence of sound (Bonferroni corrected P < 0.05). The representation for eye position was predominantly monotonic and favored contralateral eye positions. Eye-position sensitivity was more prevalent among neurons without sound-location sensitivity: about half of neurons that were insensitive to sound location were sensitive to eye position, whereas only about one-quarter of sound-location-sensitive neurons were also sensitive to eye position. Our findings suggest that sound location and eye position are encoded using independent but overlapping rate codes at the level of the IC. The use of a common format has computational advantages for integrating these two signals. The differential distribution of eye-position sensitivity and sound-location sensitivity suggests that this process has begun by the level of the IC but is not yet complete at this stage. We discuss how these signals might fit into Groh and Sparks' vector subtraction model for coordinate transformations.
Affiliation(s)
- Kristin Kelly Porter
- Dept. of Psychological and Brain Sciences, 6207 Moore Hall, Dartmouth College, Hanover, NH 03755, USA
|
25
|
Nikitin NI, Varfolomeev AL, Kotelenko LM. Responses of cat primary auditory cortex neurons to moving stimuli with dynamically changing interaural delays. Neurosci Behav Physiol 2004; 34:949-59. [PMID: 15686141] [DOI: 10.1023/b:neab.0000042654.09989.85]
Abstract
The spike responses of individual neurons in the primary auditory cortex were studied in anesthetized cats during exposure to stationary and moving stimuli with static or dynamically changing interaural delays (deltaT). Static stimuli were tones and clicks. Dynamic stimuli were created using series of in-phase and antiphase clicks with interaural delays that changed over time. Sensitivity to changes in deltaT was predominantly present in neurons with low characteristic frequencies (less than 2.8 kHz). Changes in deltaT in moving stimuli induced responses in neurons that were sensitive to changes in deltaT in the stationary stimulus. The effect of movement could be a relationship between the level of spike activity and the direction and rate of change of deltaT, or a displacement of the tuning curve for the response to deltaT (the deltaT function) in the direction opposite to the change in deltaT. The magnitude of the movement effects depended on the position of the period of deltaT change relative to the deltaT function; the greatest effects were seen with changes in deltaT on the sloping part of the deltaT function.
Affiliation(s)
- N I Nikitin
- Auditory Physiology Group, I. P. Pavlov Institute of Physiology, Russian Academy of Sciences, 6 Makarov Bank, 199034 St. Petersburg, Russia
|
26
|
Behrend O, Dickson B, Clarke E, Jin C, Carlile S. Neural responses to free field and virtual acoustic stimulation in the inferior colliculus of the guinea pig. J Neurophysiol 2004; 92:3014-29. [PMID: 15212432] [DOI: 10.1152/jn.00402.2004]
Abstract
Virtual auditory space (VAS) stimuli based on outer ear transfer functions have become increasingly important in spatial hearing research. However, few studies have investigated the match between responses of auditory neurons to VAS and free-field (FF) stimulation. This study validates acoustic spatial receptive fields (SRFs) of 183 individual midbrain units using both VAS and FF stimuli. The first-spike latency, which varied systematically across SRFs, was 14.9 +/- 8.3 (SD) ms in FF and 15.1 +/- 8.3 ms in VAS. Spike-count-based SRFs measured 0-20 dB above the neural threshold covered on average 44.5 +/- 18.0% of the recorded sphere in FF and 45.5 +/- 18.7% in VAS. The average deviation of the centroid position of SRFs using FF and VAS stimuli was 7.4 degrees azimuth and 3.3 degrees elevation. The average spike rate remained unchanged. The SRF overlap recorded using FF and VAS stimuli (mean: 71.3 +/- 12.6%) or repeated FF stimuli (70.2 +/- 14.2%) was high and strongly correlated (r = 0.96; P < 0.05). The SRF match observed with FF and VAS stimuli was not significantly altered over a range of stimulus levels (paired t-test P = 0.51; n = 6). Randomized VAS barely affected SRF sizes, centroids, or maximum spike count but decreased the average minimum response to 59% compared with sequential stimulation (paired t-test; P = 0.05; n = 26). SRF recordings in VAS excluding the acoustic distortions of the recording equipment differed from those in VAS incorporating the equipment (paired t-test P = 0.01; n = 5). In conclusion, neurophysiological recordings demonstrate that individualized VAS stimuli provided a good simulation of a FF environment.
Affiliation(s)
- Oliver Behrend
- Auditory Neuroscience Laboratory, Department of Physiology, University of Sydney, New South Wales 2006, Australia.
|
27
|
Neelon MF, Jenison RL. The temporal growth and decay of the auditory motion aftereffect. J Acoust Soc Am 2004; 115:3112-23. [PMID: 15237836] [DOI: 10.1121/1.1687834]
Abstract
The present work investigated the temporal tuning of the auditory motion aftereffect (aMAE) by measuring the time course of adaptation and recovery to auditory motion exposure. On every trial, listeners were first exposed to a broadband, horizontally moving sound source for either 1 or 5 seconds, then were presented with moving test stimuli after delays of 0, 2/3, or 1 2/3 seconds. All stimuli were synthesized from head-related transfer functions recorded for each participant. One second of motion exposure (i.e., a single pass of the moving source) produced clearly measurable aMAEs, which generally decayed monotonically after adaptation ended, while five seconds of exposure produced stronger aftereffects that remained largely unattenuated across test delays. These differences may imply two components to the aMAE: a short time-constant motion illusion and a longer time-constant response bias. Finally, aftereffects were produced only by adaptor movement toward, but not away from, the listener's midline. This aftereffect asymmetry may also be a consequence of brief adaptation times and may reflect the initial neural response to auditory motion in primate auditory cortex.
Affiliation(s)
- Michael F Neelon
- Department of Psychology, 1202 W. Johnson St., University of Wisconsin, Madison, Wisconsin 53706, USA.
|
28
|
Groh JM, Kelly KA, Underhill AM. A monotonic code for sound azimuth in primate inferior colliculus. J Cogn Neurosci 2004; 15:1217-31. [PMID: 14709238] [DOI: 10.1162/089892903322598166]
Abstract
We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in the frontal hemisphere. Such neurons nearly always responded monotonically as a function of sound azimuth, with stronger responses for more contralateral sound locations. Few, if any, neurons had circumscribed receptive fields. Spatial sensitivity was broad: the proportion of the total sample of neurons responding to a sound at a given location ranged from 30% for ipsilateral locations to 80% for contralateral locations. These findings suggest that sound azimuth is represented via a population rate code of very broadly responsive neurons in primate inferior colliculi. This representation differs in format from the place code used for encoding the locations of visual and tactile stimuli and poses problems for the eventual convergence of auditory and visual or somatosensory signals. Accordingly, models for converting this representation into a place code are discussed.
Affiliation(s)
- Jennifer M Groh
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH 03755, USA.
|
29
|
Soto-Faraco S, Spence C, Kingstone A. Cross-modal dynamic capture: congruency effects in the perception of motion across sensory modalities. J Exp Psychol Hum Percept Perform 2004; 30:330-45. [PMID: 15053692] [DOI: 10.1037/0096-1523.30.2.330]
Abstract
This study investigated multisensory interactions in the perception of auditory and visual motion. When auditory and visual apparent motion streams are presented concurrently in opposite directions, participants often fail to discriminate the direction of motion of the auditory stream, whereas perception of the visual stream is unaffected by the direction of auditory motion (Experiment 1). This asymmetry persists even when the perceived quality of apparent motion is equated for the 2 modalities (Experiment 2). Subsequently, it was found that this visual modulation of auditory motion is caused by an illusory reversal in the perceived direction of sounds (Experiment 3). This "dynamic capture" effect occurs over and above ventriloquism among static events (Experiments 4 and 5), and it generalizes to continuous motion displays (Experiment 6). These data are discussed in light of related multisensory phenomena and their support for a "modality appropriateness" interpretation of multisensory integration in motion perception.
|
30
|
Abstract
The ability to process motion is crucial for coherent perception and action. While the majority of studies have focused on the unimodal factors that influence motion perception (see, for example, the other chapters in this Special Issue), some researchers have also investigated the extent to which information presented in one sensory modality can affect the perception of motion for stimuli presented in another modality. Although early studies often gave rise to mixed results, the development of increasingly sophisticated psychophysical paradigms is now enabling researchers to determine the spatiotemporal constraints on multisensory interactions in the perception of motion. Recent findings indicate that these interactions stand over and above the multisensory interactions documented previously for static stimuli, such as the oft-cited 'ventriloquism' effect. Neuroimaging and neuropsychological studies are also beginning to elucidate the network of neural structures responsible for the processing of motion information in the different sensory modalities, an important first step that will ultimately lead to the determination of the neural substrates underlying these multisensory contributions to motion perception.
Affiliation(s)
- Salvador Soto-Faraco
- Departament de Psicologia Bàsica, Universitat de Barcelona, Pg. Vall d'Hebrón, 171, 08035 Barcelona, Spain.
|
31
|
Lengyel M, Szatmáry Z, Erdi P. Dynamically detuned oscillations account for the coupled rate and temporal code of place cell firing. Hippocampus 2003; 13:700-14. [PMID: 12962315] [DOI: 10.1002/hipo.10116]
Abstract
Firing of place cells in the exploring rat conveys doubly coded spatial information: both the rate of spikes and their timing relative to the phase of the ongoing field theta oscillation are correlated with the location of the animal. Specifically, the firing rate of a place cell waxes and wanes, while the timing of spikes precesses monotonically as the animal traverses the portion of the environment preferred by the cell. We propose a mechanism for the generation of this firing pattern that can be applied for place cells in all three hippocampal subfields and that encodes spatial information in the output of the cell without relying on topographical connections or topographical input. A single pyramidal cell was modeled so that the cell received rhythmic inhibition in phase with theta field potential oscillation on the soma and was excited on the dendrite with input depending on the speed of the rat. The dendrite sustained an intrinsic membrane potential oscillation, frequency modulated by its input. Firing probability of the cell was determined jointly by somatic and dendritic oscillations. Results were obtained on different levels of abstraction: a purely analytical derivation was arrived at, corroborated by numerical simulations of rate neurons, and an extension of these simulations to spiking neurons was also performed. Realistic patterns of rate and temporal coding emerged and were found to be inseparable. These results may have implications on the robustness of information coding in place cell firing and on the ways information is processed in structures downstream to the hippocampus.
Affiliation(s)
- Máté Lengyel
- Department of Biophysics, KFKI Research Institute for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest, Hungary.
|
32
|
Abstract
We investigated spike-frequency adaptation of neurons sensitive to interaural phase disparities (IPDs) in the inferior colliculus (IC) of urethane-anesthetized guinea pigs using a stimulus paradigm designed to exclude the influence of adaptation below the level of binaural integration. The IPD-step stimulus consists of a binaural 3,000-ms tone, in which the first 1,000 ms is held at a neuron's least favorable ("worst") IPD, adapting out monaural components, before being stepped rapidly to a neuron's most favorable ("best") IPD for 300 ms. After some variable interval (1-1,000 ms), IPD is again stepped to the best IPD for 300 ms, before being returned to a neuron's worst IPD for the remainder of the stimulus. Exponential decay functions fitted to the response to best-IPD steps revealed an average adaptation time constant of 52.9 +/- 26.4 ms. Recovery from adaptation to best IPD steps showed an average time constant of 225.5 +/- 210.2 ms. Recovery time constants were not correlated with adaptation time constants. During the recovery period, adaptation to a 2nd best-IPD step followed similar kinetics to adaptation during the 1st best-IPD step. The mean adaptation time constant at stimulus onset (at worst IPD) was 34.8 +/- 19.7 ms, similar to the 38.4 +/- 22.1 ms recorded to contralateral stimulation alone. Individual time constants after stimulus onset were correlated with each other but not with time constants during the best-IPD step. We conclude that such binaurally derived measures of adaptation reflect processes that occur above the level of exclusively monaural pathways, and subsequent to the site of primary binaural interaction.
Affiliation(s)
- Neil J Ingham
- Department of Physiology, University College London, Gower Street, London, WC1E 6BT, United Kingdom.
|
33
|
Abstract
In order to study how, and whether, single brainstem units respond to moving compared with stationary sounds, radially moving sound sources were presented to the bat Rhinolophus ferrumequinum. This time-variant binaural stimulation was simulated dichotically through earphones (closed acoustic field, for a virtual azimuth range of +/-40 degrees from the midline). Neurophysiologically recorded responses were primarily a function of interaural intensity difference (IID), which is considered a direct correlate of the sound source's azimuth angle. However, this is only true for the stationary case: responses did not remain unaffected by the dynamic stimulus cues of sound source movement (velocity and direction). Maximal discharge rate, as well as the slopes of the response profiles, became a function of motion velocity. Hence, coding of IID became ambiguous because, depending on the unit, the response profiles, and therefore a unit's receptive field, became spatially shifted with respect to one another when the direction of sound source movement was reversed. Shifts within the movement direction (hysteresis) as well as against it (termed here 'advance') were observed: hysteresis is typical for units with non-monotonic, stationary rate/intensity functions, whereas units with monotonic functions predominantly show advances. Further dynamic response features, in the form of transient peaks and troughs superimposed on the response profiles, were registered. It appears that the ongoing firing rate no longer represents azimuth position alone, but vigorously reproduces the dynamic cues (velocity and movement direction), too. With respect to the neural mechanisms leading to dynamic response features, it is proposed that, as long as excitation and inhibition act with similar short time constants, neural activity can rapidly and faithfully follow changing IIDs. Different time constants for excitation, inhibition, facilitation, and depression may be responsible for dynamic features such as transient responses and hysteresis/advance. They may provide biologically relevant information for nocturnally hunting bats to efficiently guide their flight maneuvers.
Affiliation(s)
- Peter A Schlegel
- Zoologisches Institut, Ludwig-Maximilians-Universität, Luisenstr. 14, D-80333 München, Germany.
|
34
|
Context-dependent adaptive coding of interaural phase disparity in the auditory cortex of awake macaques. J Neurosci 2002. [PMID: 12040069] [DOI: 10.1523/jneurosci.22-11-04625.2002]
Abstract
In the ascending auditory pathway, the context in which a particular stimulus occurs can influence the character of the responses that encode it. Here we demonstrate that the cortical representation of a binaural cue to sound source location is profoundly context-dependent: spike rates elicited by a 0 degrees interaural phase disparity (IPD) were very different when preceded by 90 degrees versus -90 degrees IPD. The changes in firing rate associated with equivalent stimuli occurring in different contexts are comparable to changes in discharge rate that establish cortical tuning to the cue itself. Single-unit responses to trapezoidally modulated IPD stimuli were recorded in the auditory cortices of awake rhesus monkeys. Each trapezoidal stimulus consisted of linear modulations of IPD between two steady-state IPDs differing by 90 degrees. The stimulus set was constructed so that identical IPDs and sweeps through identical IPD ranges recurred as elements of disparate sequences. We routinely observed orderly context-induced shifts in IPD tuning. These shifts reflected an underlying enhancement of the contrast in the discharge rate representation of different IPDs. This process is subserved by sensitivity to stimulus events in the recent past, involving multiple adaptive mechanisms operating on timescales ranging from tens of milliseconds to seconds. These findings suggest that the cortical processing of dynamic acoustic signals is dominated by an adaptive coding strategy that prioritizes the representation of stimulus changes over actual stimulus values. We show how cortical selectivity for motion direction in real space could emerge as a consequence of this general coding principle.
|
35
|
The Inferior Colliculus: A Hub for the Central Auditory System. In: Integrative Functions in the Mammalian Auditory Pathway. 2002. [DOI: 10.1007/978-1-4757-3654-0_7]
|
36
|
Firzlaff U, Schuller G. Motion processing in the auditory cortex of the rufous horseshoe bat: role of GABAergic inhibition. Eur J Neurosci 2001; 14:1687-701. [PMID: 11860463] [DOI: 10.1046/j.0953-816x.2001.01797.x]
Abstract
This study examined the influence of inhibition on motion-direction-sensitive responses of neurons in the dorsal fields of the auditory cortex of the rufous horseshoe bat. Responses to auditory apparent motion stimuli were recorded extracellularly from neurons while microiontophoretically applying gamma-aminobutyric acid (GABA) and the GABAA receptor antagonist bicuculline methiodide (BMI). Neurons could respond with a directional preference, exhibiting stronger responses to one direction of motion, or with a shift of receptive field (RF) borders depending on the direction of motion. BMI influenced the motion-direction sensitivity of 53% of neurons. In 21% of neurons, motion-direction sensitivity was decreased by BMI through a decrease in either directional preference or RF shift. In neurons with a directional preference, BMI increased the spike number for the preferred direction by a similar amount as for the nonpreferred direction; thus, inhibition was not direction specific. BMI increased motion-direction sensitivity, by either increasing directional preference or the magnitude of RF shifts, in 22% of neurons. Ten percent of neurons changed their response from an RF shift to a directional preference under BMI. In these neurons, the observed effects could often be better explained by adaptation of excitation rather than by inhibition. The results suggest that adaptation of excitation, as well as cortex-specific GABAergic inhibition, contributes to motion-direction sensitivity in the auditory cortex of the rufous horseshoe bat.
Affiliation(s)
- U Firzlaff
- Department Biologie II, Ludwig-Maximilians-Universität München, Luisenstr. 14, D-80333 München, Germany.
|
37
|
Malone BJ, Semple MN. Effects of auditory stimulus context on the representation of frequency in the gerbil inferior colliculus. J Neurophysiol 2001; 86:1113-30. [PMID: 11535662] [DOI: 10.1152/jn.2001.86.3.1113]
Abstract
Prior studies of dynamic conditioning have focused on modulation of binaural localization cues, revealing that the responses of inferior colliculus (IC) neurons to particular values of interaural phase and level disparities depend critically on the context in which they occur. Here we show that monaural frequency transitions, which do not simulate azimuthal motion, also condition the responses of IC neurons. We characterized single-unit responses to two frequency transition stimuli: a glide stimulus comprising two tones linked by a linear frequency sweep (origin-sweep-target) and a step stimulus consisting of one tone followed immediately by another (origin-target). Using sets of glide and step stimuli converging on a common target, we constructed conditioned response functions (RFs) depicting the variability in the response to an identical stimulus as a function of the preceding origin frequency. For nearly all cells, the response to the target depended on the origin frequency, even for origins outside the excitatory frequency response area of the cell. Results from conditioned RFs based on long (2-4 s) and short (200 ms) duration step stimuli indicate that conditioning effects can be induced in the absence of the dynamic sweep, and by stimuli of relatively short duration. Because IC neurons are tuned to frequency, changes in the origin frequency often change the "effective" stimulus duty cycle. In many cases, the enhancement of the target response appeared related to the decrease in the "effective" stimulus duty cycle rather than to the prior presentation of a particular origin frequency. Although this implies that nonselective adaptive mechanisms are responsible for conditioned responses, slightly more than half of IC neurons in each paradigm responded significantly differently to targets following origins that elicited statistically indistinguishable responses. The prevailing influence of stimulus context when discharge history is controlled demonstrates that not all the mechanisms governing conditioning depend on the discharge history of the recorded neuron. Selective adaptation among the neuron's variously tuned afferents may help engender stimulus-specific conditioning. The demonstration that conditioning effects reflect sensitivity to spectral as well as spatial stimulus contrast has broad implications for the processing of a wide range of dynamic acoustic signals and sound sequences.
Affiliation(s)
- B J Malone
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA.
|