1. Broersen R, Thompson G, Thomas F, Stuart GJ. Binocular processing facilitates escape behavior through multiple pathways to the superior colliculus. Curr Biol 2025; 35:1242-1257.e9. PMID: 39983730. DOI: 10.1016/j.cub.2025.01.066.
Abstract
The superior colliculus (SC) is the main brain region regulating defensive behaviors to visual threats. Yet, how the SC integrates binocular visual information and to what extent binocular vision drives defensive behaviors remain unknown. Here, we show that SC neurons respond to binocular visual input with diverse synaptic and spiking responses, summating visual inputs largely sublinearly. Using pathway-specific optogenetic silencing, we find that contralateral and ipsilateral visual information is carried to binocular SC neurons through retinal, interhemispheric, and corticotectal pathways. These pathways carry binocular visual input to the SC in a layer-specific manner: superficial layers receive visual information through retinal input, whereas intermediate and deep layers rely on interhemispheric and corticotectal pathways. We further show that binocular vision facilitates visually evoked escape behavior. Together, our data shed light on the cellular and circuit mechanisms underlying binocular visual processing in the SC and its role in defensive behaviors to visual threats.
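The sublinear summation reported here can be summarized with a simple linearity index. The sketch below is an illustrative Python example with made-up numbers, not the paper's analysis: it compares a neuron's binocular response to the sum of its two monocular responses, with values below 1 indicating sublinear integration.

```python
import numpy as np

# Hypothetical illustration: comparing binocular responses to the linear
# sum of the two monocular responses. The values and the "linearity
# index" definition are assumptions for illustration, not the paper's
# actual metric.
rng = np.random.default_rng(0)

r_contra = rng.uniform(2, 10, size=50)   # response to contralateral eye alone
r_ipsi = rng.uniform(1, 6, size=50)      # response to ipsilateral eye alone
# Sublinear summation: binocular response falls short of the linear sum.
r_binoc = 0.7 * (r_contra + r_ipsi) + rng.normal(0, 0.3, size=50)

linearity_index = r_binoc / (r_contra + r_ipsi)
print(f"mean linearity index: {linearity_index.mean():.2f}")  # < 1 -> sublinear
```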
Affiliation(s)
- Robin Broersen
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, Australian National University, 131 Garran Rd, Acton, ACT 2601, Australia; Department of Neuroscience, Erasmus MC, Wytemaweg 80, 3015 CN Rotterdam, the Netherlands.
- Genevieve Thompson
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, Australian National University, 131 Garran Rd, Acton, ACT 2601, Australia
- Felix Thomas
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, Australian National University, 131 Garran Rd, Acton, ACT 2601, Australia
- Greg J Stuart
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, Australian National University, 131 Garran Rd, Acton, ACT 2601, Australia; Department of Physiology, Monash University, Wellington Rd, Clayton, VIC 3800, Australia.

2. Vega-Zuniga T, Sumser A, Symonova O, Koppensteiner P, Schmidt FH, Joesch M. A thalamic hub-and-spoke network enables visual perception during action by coordinating visuomotor dynamics. Nat Neurosci 2025; 28:627-639. PMID: 39930095. PMCID: PMC11893466. DOI: 10.1038/s41593-025-01874-w.
Abstract
For accurate perception and motor control, an animal must distinguish between sensory experiences elicited by external stimuli and those elicited by its own actions. The diversity of behaviors and their complex influences on the senses make this distinction challenging. Here, we uncover an action-cue hub that coordinates motor commands with visual processing in the brain's first visual relay. We show that the ventral lateral geniculate nucleus (vLGN) acts as a corollary discharge center, integrating visual translational optic flow signals with motor copies from saccades, locomotion and pupil dynamics. The vLGN relays these signals to correct action-specific visual distortions and to refine perception, as shown for the superior colliculus and in a depth-estimation task. Simultaneously, brain-wide vLGN projections drive corrective actions necessary for accurate visuomotor control. Our results reveal an extended corollary discharge architecture that refines early visual transformations and coordinates actions via a distributed hub-and-spoke network to enable visual perception during action.
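The corollary-discharge logic described here can be illustrated with a toy computation. The Python sketch below assumes a simple linear coupling between a motor command and its visual consequence (an assumption for illustration, not the authors' model): subtracting the motor-copy prediction from the measured optic flow recovers the externally generated component.

```python
import numpy as np

# Minimal corollary-discharge sketch (assumed linear formulation): a motor
# copy predicts the self-generated component of optic flow, which is then
# subtracted to recover externally generated motion.
rng = np.random.default_rng(1)

external_flow = np.sin(np.linspace(0, 4 * np.pi, 200))  # world motion
motor_command = rng.normal(0, 1, size=200)              # e.g., eye velocity
gain = 0.8                                              # assumed coupling
self_generated = gain * motor_command                   # visual consequence
measured_flow = external_flow + self_generated

# Corollary discharge: predict and subtract the action-induced component.
predicted = gain * motor_command
corrected = measured_flow - predicted

print(np.allclose(corrected, external_flow))  # True in this idealized case
```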
Affiliation(s)
- Tomas Vega-Zuniga
- Institute of Science and Technology Austria, Klosterneuburg, Austria.
- Anton Sumser
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- Division of Neuroscience, Faculty of Biology, LMU Munich, Martinsried, Germany
- Olga Symonova
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- Florian H Schmidt
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- Maximilian Joesch
- Institute of Science and Technology Austria, Klosterneuburg, Austria.

3. Martins DM, Manda JM, Goard MJ, Parker PRL. Building egocentric models of local space from retinal input. Curr Biol 2024; 34:R1185-R1202. PMID: 39626632. PMCID: PMC11620475. DOI: 10.1016/j.cub.2024.10.057.
Abstract
Determining the location of objects relative to ourselves is essential for interacting with the world. Neural activity in the retina is used to form a vision-independent model of the local spatial environment relative to the body. For example, when an animal navigates through a forest, it rapidly shifts its gaze to identify the position of important objects, such as a tree obstructing its path. This seemingly trivial behavior belies a sophisticated neural computation. Visual information entering the brain in a retinocentric reference frame must be transformed into an egocentric reference frame to guide motor planning and action. This, in turn, allows the animal to extract the location of the tree and plan a path around it. In this review, we explore the anatomical, physiological, and computational implementation of retinocentric-to-egocentric reference frame transformations - a research area undergoing rapid progress stimulated by an ever-expanding molecular, physiological, and computational toolbox for probing neural circuits. We begin by summarizing evidence for retinocentric and egocentric reference frames in the brains of diverse organisms, from insects to primates. Next, we cover how distance estimation contributes to creating a three-dimensional representation of local space. We then review proposed implementations of reference frame transformations across different biological and artificial neural networks. Finally, we discuss how an internal egocentric model of the environment is maintained independently of the sensory inputs from which it is derived. By comparing findings across a variety of nervous systems and behaviors, we aim to inspire new avenues for investigating the neural basis of reference frame transformation, a canonical computation critical for modeling the external environment and guiding goal-directed behavior.
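The core computation reviewed here, mapping a retinal location to a body-centered one, can be sketched as a rotation by the current gaze angle. The Python toy below assumes a 2D world and a single gaze angle combining eye-in-head and head-on-body rotation; real circuits must also handle 3D geometry and translation.

```python
import numpy as np

# Toy retinocentric-to-egocentric transform (illustrative assumptions:
# 2D world, one combined gaze angle).
def retino_to_egocentric(target_retinal_xy, gaze_angle_rad):
    """Rotate a retinocentric target vector by the current gaze angle
    to express it in body-centered (egocentric) coordinates."""
    c, s = np.cos(gaze_angle_rad), np.sin(gaze_angle_rad)
    rotation = np.array([[c, -s],
                         [s,  c]])
    return rotation @ np.asarray(target_retinal_xy)

# A tree 30 degrees off the fovea while gaze is rotated 45 degrees:
retinal = np.array([np.cos(np.radians(30)), np.sin(np.radians(30))])
egocentric = retino_to_egocentric(retinal, np.radians(45))
print(egocentric)  # direction of the tree relative to the body axis
```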
Affiliation(s)
- Dylan M Martins
- Graduate Program in Dynamical Neuroscience, University of California, Santa Barbara, Santa Barbara, CA 93106, USA
- Joy M Manda
- Behavioral and Systems Neuroscience, Department of Psychology, Rutgers University, New Brunswick, NJ 08854, USA
- Michael J Goard
- Department of Psychological and Brain Sciences and Department of Molecular, Cellular, and Developmental Biology, University of California, Santa Barbara, Santa Barbara, CA 93106, USA.
- Philip R L Parker
- Behavioral and Systems Neuroscience, Department of Psychology, Rutgers University, New Brunswick, NJ 08854, USA.

4. Mai J, Gargiullo R, Zheng M, Esho V, Hussein OE, Pollay E, Bowe C, Williamson LM, McElroy AF, Saunders JL, Goolsby WN, Brooks KA, Rodgers CC. Sound-seeking before and after hearing loss in mice. Sci Rep 2024; 14:19181. PMID: 39160202. PMCID: PMC11333604. DOI: 10.1038/s41598-024-67577-7.
Abstract
How we move our bodies affects how we perceive sound. For instance, head movements help us to better localize the source of a sound and to compensate for asymmetric hearing loss. However, many auditory experiments are designed to restrict head and body movements. To study the role of movement in hearing, we developed a behavioral task called sound-seeking that rewarded freely moving mice for tracking down an ongoing sound source. Over the course of learning, mice more efficiently navigated to the sound. Next, we asked how sound-seeking was affected by hearing loss induced by surgical removal of the malleus from the middle ear. After bilateral hearing loss, sound-seeking performance drastically declined and did not recover. In striking contrast, after unilateral hearing loss mice were only transiently impaired and then recovered their sound-seeking ability over about a week. Throughout recovery, unilateral mice increasingly relied on a movement strategy of sequentially checking potential locations for the sound source. In contrast, the startle reflex (an innate auditory behavior) was preserved after unilateral hearing loss and abolished by bilateral hearing loss, without recovery over time. In sum, mice compensate with body movement for permanent unilateral damage to the peripheral auditory system. Looking forward, this paradigm provides an opportunity to examine how movement enhances perception and enables resilient adaptation to sensory disorders.
Affiliation(s)
- Jessica Mai
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Rowan Gargiullo
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Megan Zheng
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Valentina Esho
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Osama E Hussein
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Eliana Pollay
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Cedric Bowe
- Neuroscience Graduate Program, Emory University, Atlanta, GA, 30322, USA
- Lucas M Williamson
- Neuroscience Graduate Program, Emory University, Atlanta, GA, 30322, USA
- Abigail F McElroy
- Neuroscience Graduate Program, Emory University, Atlanta, GA, 30322, USA
- Jonny L Saunders
- Department of Neurology, University of California, Los Angeles, Los Angeles, CA, 90095, USA
- William N Goolsby
- Department of Cell Biology, Emory University School of Medicine, Atlanta, GA, 30322, USA
- Kaitlyn A Brooks
- Department of Otolaryngology-Head and Neck Surgery, Emory University School of Medicine, Atlanta, GA, 30308, USA
- Chris C Rodgers
- Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, 30322, USA.
- Department of Cell Biology, Emory University School of Medicine, Atlanta, GA, 30322, USA.
- Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta, GA, 30322, USA.
- Department of Biology, Emory College of Arts and Sciences, Atlanta, GA, 30322, USA.

5. Stednitz SJ, Lesak A, Fecker AL, Painter P, Washbourne P, Mazzucato L, Scott EK. Probabilistic modeling reveals coordinated social interaction states and their multisensory bases. arXiv [preprint] 2024; arXiv:2408.01683v1. PMID: 39130202. PMCID: PMC11312628.
Abstract
Social behavior across animal species ranges from simple pairwise interactions to thousands of individuals coordinating goal-directed movements. Regardless of the scale, these interactions are governed by the interplay between multimodal sensory information and the internal state of each animal. Here, we investigate how animals use multiple sensory modalities to guide social behavior in the highly social zebrafish (Danio rerio) and uncover the complex features of pairwise interactions early in development. To identify distinct behaviors and understand how they vary over time, we developed a new hidden Markov model with constrained linear-model emissions to automatically classify states of coordinated interaction, using the movements of one animal to predict those of another. We discovered that social behaviors alternate between two interaction states within a single experimental session, distinguished by unique movements and timescales. Long-range interactions, akin to shoaling, rely on vision, while mechanosensation underlies rapid synchronized movements and parallel swimming, precursors of schooling. Altogether, we observe spontaneous interactions in pairs of fish, develop novel hidden Markov modeling to reveal two fundamental interaction modes, and identify the sensory systems involved in each. Our modeling approach to pairwise social interactions has broad applicability to a wide variety of naturalistic behaviors and species and solves the challenge of detecting transient couplings between quasi-periodic time series.
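To make the modeling idea concrete, here is a minimal Python sketch of a two-state HMM with linear emissions, in the spirit of (but much simpler than) the constrained linear-model emissions described above: each state posits a different linear coupling by which one fish's velocity predicts the other's, and Viterbi decoding recovers the interaction state. All parameter values are assumptions for illustration.

```python
import numpy as np

# Two-state HMM sketch: state-specific linear models predict fish B's
# velocity from fish A's velocity; sticky transitions + Viterbi decoding.
rng = np.random.default_rng(2)
T = 400
states_true = (np.sin(np.linspace(0, 6 * np.pi, T)) > 0).astype(int)

v_a = rng.normal(0, 1, T)                        # fish A velocity (driver)
coupling = np.where(states_true == 1, 0.9, 0.1)  # strong vs. weak coupling
v_b = coupling * v_a + rng.normal(0, 0.3, T)     # fish B velocity (follower)

# Emission log-likelihoods under each state's assumed linear model.
betas, sigma = np.array([0.1, 0.9]), 0.3
resid = v_b[None, :] - betas[:, None] * v_a[None, :]            # (2, T)
log_lik = -0.5 * (resid / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Viterbi decoding with sticky transitions.
log_A = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))
delta = np.full((2, T), -np.inf)
back = np.zeros((2, T), dtype=int)
delta[:, 0] = np.log(0.5) + log_lik[:, 0]
for t in range(1, T):
    scores = delta[:, t - 1][:, None] + log_A    # (from-state, to-state)
    back[:, t] = np.argmax(scores, axis=0)
    delta[:, t] = scores[back[:, t], np.arange(2)] + log_lik[:, t]

path = np.zeros(T, dtype=int)
path[-1] = np.argmax(delta[:, -1])
for t in range(T - 2, -1, -1):
    path[t] = back[path[t + 1], t + 1]

print("state agreement:", (path == states_true).mean())
```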
Affiliation(s)
- Andrew Lesak
- Institute of Neuroscience, University of Oregon, Eugene, OR, USA
- Adeline L Fecker
- Institute of Neuroscience, University of Oregon, Eugene, OR, USA
- Phil Washbourne
- Institute of Neuroscience, University of Oregon, Eugene, OR, USA
- Luca Mazzucato
- Institute of Neuroscience, University of Oregon, Eugene, OR, USA
- Ethan K Scott
- Department of Anatomy & Physiology, University of Melbourne, Parkville, VIC, Australia
- Queensland Brain Institute, University of Queensland, St Lucia, QLD, Australia

6. Herrera E, Chédotal A, Mason C. Development of the Binocular Circuit. Annu Rev Neurosci 2024; 47:303-322. PMID: 38635868. DOI: 10.1146/annurev-neuro-111020-093230.
Abstract
Seeing in three dimensions is a major property of the visual system in mammals. The circuit underlying this property begins in the retina, from which retinal ganglion cells (RGCs) extend to the same or opposite side of the brain. RGC axons decussate to form the optic chiasm, then grow to targets in the thalamus and midbrain, where they synapse with neurons that project to the visual cortex. Here we review the cellular and molecular mechanisms of RGC axonal growth cone guidance across or away from the midline via receptors to cues in the midline environment. We present new views on the specification of ipsi- and contralateral RGC subpopulations and factors implementing their organization in the optic tract and termination in subregions of their targets. Lastly, we describe the functional and behavioral aspects of binocular vision, focusing on the mouse, and discuss recent discoveries in the evolution of the binocular circuit.
Affiliation(s)
- Eloísa Herrera
- Instituto de Neurociencias (CSIC-UMH), Consejo Superior de Investigaciones Científicas and Universidad Miguel Hernández, Alicante, Spain.
- Alain Chédotal
- Université Claude Bernard Lyon 1, MeLiS, CNRS UMR5284, INSERM U1314, Lyon, France
- Institut de Pathologie, Groupe Hospitalier Est, Hospices Civils de Lyon, Lyon, France
- Institut de la Vision, INSERM, Sorbonne Université, Paris, France.
- Carol Mason
- Departments of Pathology and Cell Biology, Neuroscience, and Ophthalmology, Zuckerman Institute, Columbia University, New York, NY, USA.

7. Fitzpatrick MJ, Krizan J, Hsiang JC, Shen N, Kerschensteiner D. A pupillary contrast response in mice and humans: Neural mechanisms and visual functions. Neuron 2024; 112:2404-2422.e9. PMID: 38697114. PMCID: PMC11257825. DOI: 10.1016/j.neuron.2024.04.012.
Abstract
In the pupillary light response (PLR), increases in ambient light constrict the pupil to dampen increases in retinal illuminance. Here, we report that the pupillary reflex arc implements a second input-output transformation; it senses temporal contrast to enhance spatial contrast in the retinal image and increase visual acuity. The pupillary contrast response (PCoR) is driven by rod photoreceptors via type 6 bipolar cells and M1 ganglion cells. Temporal contrast is transformed into sustained pupil constriction by the M1's conversion of excitatory input into spike output. Computational modeling explains how the PCoR shapes retinal images. Pupil constriction improves acuity in gaze stabilization and predation in mice. Humans exhibit a PCoR with similar tuning properties to mice, which interacts with eye movements to optimize the statistics of the visual input for retinal encoding. Thus, we uncover a conserved component of active vision, its cell-type-specific pathway, computational mechanisms, and optical and behavioral significance.
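A toy model can illustrate how temporal contrast might be turned into sustained constriction. The Python sketch below uses assumed dynamics (rectified luminance transients integrated with a slow decay), not the authors' model: a flickering stimulus with constant mean luminance still drives lasting pupil constriction.

```python
import numpy as np

# Toy sketch (assumed dynamics): temporal-contrast transients are rectified
# and integrated into a slowly decaying drive, yielding sustained pupil
# constriction to flicker even when mean luminance is unchanged.
dt, T = 0.01, 20.0
t = np.arange(0, T, dt)
# 2 Hz square-wave flicker between t = 5 s and t = 15 s, mean luminance 1.0.
luminance = 1.0 + 0.5 * np.sign(np.sin(2 * np.pi * 2 * t)) * (t > 5) * (t < 15)

temporal_contrast = np.abs(np.gradient(luminance, dt)) / luminance
drive = np.zeros_like(t)
tau = 3.0                                    # assumed slow decay (s)
for i in range(1, len(t)):
    drive[i] = drive[i - 1] + dt * (-drive[i - 1] / tau + temporal_contrast[i])

pupil_area = 1.0 / (1.0 + 0.05 * drive)      # constriction as drive rises
print(f"baseline area: {pupil_area[:500].mean():.2f}, "
      f"during flicker: {pupil_area[700:1400].mean():.2f}")
```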
Affiliation(s)
- Michael J Fitzpatrick
- Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA; Graduate Program in Neuroscience, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA; Medical Scientist Training Program, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA
- Jenna Krizan
- Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA; Graduate Program in Neuroscience, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA
- Jen-Chun Hsiang
- Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA
- Ning Shen
- Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA
- Daniel Kerschensteiner
- Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA; Department of Neuroscience, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA; Department of Biomedical Engineering, Washington University School of Medicine in St. Louis, St. Louis, MO 63110, USA.

8. Skyberg RJ, Niell CM. Natural visual behavior and active sensing in the mouse. Curr Opin Neurobiol 2024; 86:102882. PMID: 38704868. PMCID: PMC11254345. DOI: 10.1016/j.conb.2024.102882.
Abstract
In the natural world, animals use vision for a wide variety of behaviors not reflected in most laboratory paradigms. Although mice have low-acuity vision, they use their vision for many natural behaviors, including predator avoidance, prey capture, and navigation. They also perform active sensing, moving their head and eyes to achieve behavioral goals and acquire visual information. These aspects of natural vision result in visual inputs and corresponding behavioral outputs that are outside the range of conventional vision studies but are essential aspects of visual function. Here, we review recent studies in mice that have tapped into natural behavior and active sensing to reveal the computational logic of neural circuits for vision.
Affiliation(s)
- Rolf J Skyberg
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene OR 97403, USA. https://twitter.com/SkybergRolf
- Cristopher M Niell
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene OR 97403, USA.

9. Oesch LT, Ryan MB, Churchland AK. From innate to instructed: A new look at perceptual decision-making. Curr Opin Neurobiol 2024; 86:102871. PMID: 38569230. PMCID: PMC11162954. DOI: 10.1016/j.conb.2024.102871.
Abstract
Understanding how subjects perceive sensory stimuli in their environment and use this information to guide appropriate actions is a major challenge in neuroscience. To study perceptual decision-making in animals, researchers use tasks that either probe spontaneous responses to stimuli (often described as "naturalistic") or train animals to associate stimuli with experimenter-defined responses. Spontaneous decisions rely on animals' pre-existing knowledge, while trained tasks offer greater versatility, albeit often at the cost of extensive training. Here, we review emerging approaches to investigate perceptual decision-making using both spontaneous and trained behaviors, highlighting their strengths and limitations. Additionally, we propose how trained decision-making tasks could be improved to achieve faster learning and a more generalizable understanding of task rules.
Affiliation(s)
- Lukas T Oesch
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Michael B Ryan
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States. https://twitter.com/NeuroMikeRyan
- Anne K Churchland
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States.

10. Ambrad Giovannetti E, Rancz E. Behind mouse eyes: The function and control of eye movements in mice. Neurosci Biobehav Rev 2024; 161:105671. PMID: 38604571. DOI: 10.1016/j.neubiorev.2024.105671.
Abstract
The mouse visual system has become the most popular model to study the cellular and circuit mechanisms of sensory processing. However, the importance of eye movements only started to be appreciated recently. Eye movements provide a basis for predictive sensing and deliver insights into various brain functions and dysfunctions. A plethora of knowledge on the central control of eye movements and their role in perception and behaviour arose from work on primates. However, an overview of various eye movements in mice and a comparison to primates is missing. Here, we review the eye movement types described to date in mice and compare them to those observed in primates. We discuss the central neuronal mechanisms for their generation and control. Furthermore, we review the mounting literature on eye movements in mice during head-fixed and freely moving behaviours. Finally, we highlight gaps in our understanding and suggest future directions for research.
Affiliation(s)
- Ede Rancz
- INMED, INSERM, Aix-Marseille University, Marseille, France.

11. Mai J, Gargiullo R, Zheng M, Esho V, Hussein OE, Pollay E, Bowe C, Williamson LM, McElroy AF, Goolsby WN, Brooks KA, Rodgers CC. Sound-seeking before and after hearing loss in mice. bioRxiv [preprint] 2024; 2024.01.08.574475. PMID: 38260458. PMCID: PMC10802496. DOI: 10.1101/2024.01.08.574475.
Abstract
How we move our bodies affects how we perceive sound. For instance, we can explore an environment to seek out the source of a sound and we can use head movements to compensate for hearing loss. How we do this is not well understood because many auditory experiments are designed to limit head and body movements. To study the role of movement in hearing, we developed a behavioral task called sound-seeking that rewarded mice for tracking down an ongoing sound source. Over the course of learning, mice more efficiently navigated to the sound. We then asked how auditory behavior was affected by hearing loss induced by surgical removal of the malleus from the middle ear. An innate behavior, the auditory startle response, was abolished by bilateral hearing loss and unaffected by unilateral hearing loss. Similarly, performance on the sound-seeking task drastically declined after bilateral hearing loss and did not recover. In striking contrast, mice with unilateral hearing loss were only transiently impaired on sound-seeking; over a recovery period of about a week, they regained high levels of performance, increasingly reliant on a different spatial sampling strategy. Thus, even in the face of permanent unilateral damage to the peripheral auditory system, mice recover their ability to perform a naturalistic sound-seeking task. This paradigm provides an opportunity to examine how body movement enables better hearing and resilient adaptation to sensory deprivation.
Affiliation(s)
- Jessica Mai
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Rowan Gargiullo
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Megan Zheng
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Valentina Esho
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Osama E Hussein
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Eliana Pollay
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Cedric Bowe
- Neuroscience Graduate Program, Emory University, Atlanta GA 30322
- William N Goolsby
- Department of Cell Biology, Emory University School of Medicine, Atlanta GA 30322
- Kaitlyn A Brooks
- Department of Otolaryngology - Head and Neck Surgery, Emory University School of Medicine, Atlanta GA 30308
- Chris C Rodgers
- Department of Neurosurgery, Emory University School of Medicine, Atlanta GA 30322
- Department of Cell Biology, Emory University School of Medicine, Atlanta GA 30322
- Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta GA 30322
- Department of Biology, Emory College of Arts and Sciences, Atlanta GA 30322

12. Kim H, Koike Y, Choi W, Lee J. The effect of different depth planes during a manual tracking task in three-dimensional virtual reality space. Sci Rep 2023; 13:21499. PMID: 38057361. PMCID: PMC10700492. DOI: 10.1038/s41598-023-48869-w.
Abstract
Unlike for ballistic arm movements such as reaching, the contribution of depth information to the performance of manual tracking movements is unclear. Thus, to understand how the brain handles depth information, we investigated how required movement along the depth axis affects behavioral tracking performance, postulating that performance would depend on the amount of depth movement. We designed a visually guided planar tracking task that requires movement on three planes with different depths: a fronto-parallel plane called ROT (0), a sagittal plane called ROT (90), and a plane rotated by 45° with respect to the sagittal plane called ROT (45). Fifteen participants performed a circular manual tracking task under binocular and monocular vision in a three-dimensional (3D) virtual reality space. Under binocular vision, ROT (90), which required the largest depth movement among the tasks, showed the greatest 3D error. Similarly, errors (deviations from the target path) on the depth axis differed significantly among the tasks. Under monocular vision, significant differences in errors were observed only on the lateral axis. Moreover, under binocular vision, errors on the lateral and depth axes were proportional to the required movement on those axes, and the required depth movement determined depth error independently of the other axes. This finding implies that the brain may process binocular vision information independently on each axis. Meanwhile, the required depth movement under monocular vision was independent of performance along the depth axis. Our findings highlight the importance of handling depth movement, especially in virtual reality settings that involve tracking tasks.
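The per-axis error analysis described here can be illustrated with a short simulation. The Python sketch below uses assumed definitions (per-axis RMSE of hand deviation from the target path) and made-up noise levels purely for illustration: a circular target on a 45°-rotated plane splits the required movement between the lateral and depth axes, and noisier depth control shows up only in the depth-axis error.

```python
import numpy as np

# Illustrative sketch: decompose manual-tracking error into lateral,
# vertical, and depth components by comparing hand position to the
# target path on each axis.
rng = np.random.default_rng(3)
t = np.linspace(0, 2 * np.pi, 500)

# Circular target on a plane rotated 45 degrees about the vertical axis,
# so the required movement is split between lateral (x) and depth (z).
theta = np.radians(45)
target = np.stack([np.cos(t) * np.cos(theta),     # lateral (x)
                   np.sin(t),                      # vertical (y)
                   np.cos(t) * np.sin(theta)], 1)  # depth (z)

# Simulated hand trajectory: noisier along the depth axis (assumption).
noise = rng.normal(0, [0.01, 0.01, 0.04], size=target.shape)
hand = target + noise

rmse_per_axis = np.sqrt(np.mean((hand - target) ** 2, axis=0))
rmse_3d = np.sqrt(np.mean(np.sum((hand - target) ** 2, axis=1)))
print(dict(zip(["lateral", "vertical", "depth"], rmse_per_axis.round(3))))
print("3D RMSE:", rmse_3d.round(3))
```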
Affiliation(s)
- Hyeonseok Kim
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, La Jolla, CA, 92093, USA
- Yasuharu Koike
- Institute of Innovative Research, Tokyo Institute of Technology, Yokohama, 226-8503, Japan
- Woong Choi
- College of ICT Construction & Welfare Convergence, Kangnam University, Yongin, 16979, Republic of Korea.
- Jongho Lee
- Department of Clinical Engineering, Komatsu University, Komatsu, 923-0961, Japan.

13. Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023; 24:487-501. PMID: 37380885. DOI: 10.1038/s41583-023-00716-7.
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
Affiliation(s)
- Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK.
- Laura Busse
- Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany.
- Bernstein Centre for Computational Neuroscience Munich, Munich, Germany.

14. Skeels S, von der Emde G, Burt de Perera T. Mormyrid fish as models for investigating sensory-motor integration: A behavioural perspective. J Zool (1987) 2023; 319:243-253. PMID: 38515784. PMCID: PMC10953462. DOI: 10.1111/jzo.13046.
Abstract
Animals possess senses which gather information from their environment. They can tune into important aspects of this information and decide on the most appropriate response, requiring coordination of their sensory and motor systems. This interaction is bidirectional. Animals can actively shape their perception with self-driven motion, altering sensory flow to maximise the environmental information they are able to extract. Mormyrid fish are excellent candidates for studying sensory-motor interactions, because they possess a unique sensory system (the active electric sense) and exhibit notable behaviours that seem to be associated with electrosensing. This review will take a behavioural approach to unpicking this relationship, using active electrolocation as an example where body movements and sensing capabilities are highly related and can be assessed in tandem. Active electrolocation is the process where individuals will generate and detect low-voltage electric fields to locate and recognise nearby objects. We will focus on research in the mormyrid Gnathonemus petersii (G. petersii), given the extensive study of this species, particularly its object recognition abilities. By studying object detection and recognition, we can assess the potential benefits of self-driven movements to enhance selection of biologically relevant information. Finally, these findings are highly relevant to understanding the involvement of movement in shaping the sensory experience of animals that use other sensory modalities. Understanding the overlap between sensory and motor systems will give insight into how different species have become adapted to their environments.
Affiliation(s)
- S. Skeels
- Department of Biology, University of Oxford, Oxford, UK