1. Vélez-Fort M, Cossell L, Porta L, Clopath C, Margrie TW. Motor and vestibular signals in the visual cortex permit the separation of self versus externally generated visual motion. Cell 2025; 188:2175-2189.e15. PMID: 39978344. DOI: 10.1016/j.cell.2025.01.032.
Abstract
Knowing whether we are moving or whether something in the world is moving around us is possibly the most critical sensory discrimination we need to perform. How the brain and, in particular, the visual system solves this motion-source separation problem is not known. Here, we find that motor, vestibular, and visual motion signals are used by the mouse primary visual cortex (VISp) to differentially represent the same visual flow information according to whether the head is stationary or experiencing passive versus active translation. During locomotion, we find that running suppresses running-congruent translation input and that translation signals dominate VISp activity when running and translation speed become incongruent. This cross-modal interaction between the motor and vestibular systems was found throughout the cortex, indicating that running and translation signals provide a brain-wide egocentric reference frame for computing the internally generated and actual speed of self when moving through and sensing the external world.
Affiliation(s)
- Mateo Vélez-Fort: Sainsbury Wellcome Centre, University College London, London, UK
- Lee Cossell: Sainsbury Wellcome Centre, University College London, London, UK
- Laura Porta: Sainsbury Wellcome Centre, University College London, London, UK
- Claudia Clopath: Sainsbury Wellcome Centre, University College London, London, UK; Bioengineering Department, Imperial College London, London, UK
- Troy W Margrie: Sainsbury Wellcome Centre, University College London, London, UK
2. Meier AM, D'Souza RD, Ji W, Han EB, Burkhalter A. Interdigitating Modules for Visual Processing During Locomotion and Rest in Mouse V1. bioRxiv [Preprint] 2025:2025.02.21.639505. PMID: 40060542. PMCID: PMC11888233. DOI: 10.1101/2025.02.21.639505.
Abstract
Layer 1 of V1 has been shown to receive locomotion-related signals from the dorsal lateral geniculate (dLGN) and lateral posterior (LP) thalamic nuclei (Roth et al., 2016). Inputs from the dLGN terminate in M2+ patches while inputs from LP target M2- interpatches (D'Souza et al., 2019), suggesting that motion-related signals are processed in distinct networks. Here, we investigated by calcium imaging in head-fixed awake mice whether L2/3 neurons underneath L1 M2+ and M2- modules are differentially activated by locomotion, and whether distinct networks of feedback connections from higher cortical areas to L1 may contribute to these differences. We found that strongly locomotion-modulated cell clusters during visual stimulation were aligned with M2- interpatches, while weakly modulated cells clustered under M2+ patches. Unlike M2+ patch cells, pairs of M2- interpatch cells showed increased correlated variability of calcium transients when the sites in the visuotopic map were far apart, suggesting that activity is integrated across large parts of the visual field. Pathway tracing further suggests that strong locomotion modulation in L2/3 M2- interpatch cells of V1 relies on looped, like-to-like networks between apical dendrites of MOs-, PM- and RSP-projecting neurons and feedback input from these areas to L1. M2- interpatches receive strong inputs from SST neurons, suggesting that during locomotion these interneurons influence the firing of specific subnetworks by controlling the excitability of apical dendrites in M2- interpatches.
Affiliation(s)
- A M Meier: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- R D D'Souza: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- W Ji: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- E B Han: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- A Burkhalter: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
3. Bouvier G, Sanzeni A, Hamada E, Brunel N, Scanziani M. Inter- and Intrahemispheric Sources of Vestibular Signals to V1. bioRxiv [Preprint] 2024:2024.11.18.624137. PMID: 39605728. PMCID: PMC11601413. DOI: 10.1101/2024.11.18.624137.
Abstract
Head movements are sensed by the vestibular organs. Unlike classical senses, signals from vestibular organs are not conveyed to a dedicated cortical area but are broadcast throughout the cortex. Surprisingly, the routes taken by vestibular signals to reach the cortex are still largely uncharted. Here we show that the primary visual cortex (V1) receives real-time head movement signals (direction, velocity, and acceleration) from the ipsilateral pulvinar and contralateral visual cortex. The ipsilateral pulvinar provides the main head movement signal, with a bias toward contraversive movements (e.g. clockwise movements in left V1). Conversely, the contralateral visual cortex provides head movement signals during ipsiversive movements. Crucially, head movement variables encoded in V1 are already encoded in the pulvinar, suggesting that those variables are computed subcortically. Thus, the convergence of inter- and intrahemispheric signals endows V1 with a rich representation of the animal's head movements.
Affiliation(s)
- Guy Bouvier: Department of Physiology, University of California, San Francisco, San Francisco, CA, USA; Howard Hughes Medical Institute, University of California, San Francisco, San Francisco, CA, USA; Université Paris-Saclay, CNRS, Institut des Neurosciences Paris-Saclay, 91400 Saclay, France
- Alessandro Sanzeni: Department of Computing Sciences, Bocconi University, 20100 Milan, Italy; Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA
- Elizabeth Hamada: Department of Neurology, University of California, San Francisco, San Francisco, CA, USA
- Nicolas Brunel: Department of Computing Sciences, Bocconi University, 20100 Milan, Italy; Department of Neurobiology, Duke University, Durham, NC 27710, USA
- Massimo Scanziani: Department of Physiology, University of California, San Francisco, San Francisco, CA, USA; Howard Hughes Medical Institute, University of California, San Francisco, San Francisco, CA, USA
4. Tariq MF, Sterrett SC, Moore S, Lane L, Perkel DJ, Gire DH. Dynamics of odor-source localization: Insights from real-time odor plume recordings and head-motion tracking in freely moving mice. PLoS One 2024; 19:e0310254. PMID: 39325742. PMCID: PMC11426488. DOI: 10.1371/journal.pone.0310254.
Abstract
Animals navigating turbulent odor plumes exhibit a rich variety of behaviors, and employ efficient strategies to locate odor sources. A growing body of literature has started to probe this complex task of localizing airborne odor sources in walking mammals to further our understanding of neural encoding and decoding of naturalistic sensory stimuli. However, correlating the intermittent olfactory information with behavior has remained a long-standing challenge due to the stochastic nature of the odor stimulus. We recently reported a method to record real-time olfactory information available to freely moving mice during odor-guided navigation, hence overcoming that challenge. Here we combine our odor-recording method with head-motion tracking to establish correlations between plume encounters and head movements. We show that mice exhibit robust head-pitch motions in the 5-14Hz range during an odor-guided navigation task, and that these head motions are modulated by plume encounters. Furthermore, mice reduce their angles with respect to the source upon plume contact. Head motions may thus be an important part of the sensorimotor behavioral repertoire during naturalistic odor-source localization.
Affiliation(s)
- Mohammad F. Tariq: Graduate Program in Neuroscience, University of Washington, Seattle, Washington, United States of America; Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Scott C. Sterrett: Graduate Program in Neuroscience, University of Washington, Seattle, Washington, United States of America; Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Sidney Moore: Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Lane Lane: Department of Psychology, University of Washington, Seattle, Washington, United States of America; Department of Psychology, Seattle University, Seattle, Washington, United States of America
- David J. Perkel: Departments of Biology & Otolaryngology, University of Washington, Seattle, Washington, United States of America
- David H. Gire: Department of Psychology, University of Washington, Seattle, Washington, United States of America
5. Wang B, Audette NJ, Schneider DM, Aljadeff J. Desegregation of neuronal predictive processing. bioRxiv [Preprint] 2024:2024.08.05.606684. PMID: 39149380. PMCID: PMC11326200. DOI: 10.1101/2024.08.05.606684.
Abstract
Neural circuits construct internal 'world-models' to guide behavior. The predictive processing framework posits that neural activity signaling sensory predictions and concurrently computing prediction-errors is a signature of those internal models. Here, to understand how the brain generates predictions for complex sensorimotor signals, we investigate the emergence of high-dimensional, multi-modal predictive representations in recurrent networks. We find that robust predictive processing arises in a network with loose excitatory/inhibitory balance. Contrary to previous proposals of functionally specialized cell-types, the network exhibits desegregation of stimulus and prediction-error representations. We confirmed these model predictions by experimentally probing predictive-coding circuits using a rich stimulus-set to violate learned expectations. When constrained by data, our model further reveals and makes concrete testable experimental predictions for the distinct functional roles of excitatory and inhibitory neurons, and of neurons in different layers along a laminar hierarchy, in computing multi-modal predictions. These results together imply that in natural conditions, neural representations of internal models are highly distributed, yet structured to allow flexible readout of behaviorally-relevant information. The generality of our model advances the understanding of computation of internal models across species, by incorporating different types of predictive computations into a unified framework.
Affiliation(s)
- Bin Wang: Department of Physics, University of California San Diego, La Jolla, CA 92093, USA
- David M Schneider: Center for Neural Science, New York University, New York, NY 10003, USA
- Johnatan Aljadeff: Department of Neurobiology, University of California San Diego, La Jolla, CA 92093, USA
6. Tariq MF, Sterrett SC, Moore S, Lane L, Perkel DJ, Gire DH. Dynamics of odor-source localization: Insights from real-time odor plume recordings and head-motion tracking in freely moving mice. bioRxiv [Preprint] 2024:2023.11.10.566539. PMID: 38014041. PMCID: PMC10680624. DOI: 10.1101/2023.11.10.566539.
Abstract
Animals navigating turbulent odor plumes exhibit a rich variety of behaviors, and employ efficient strategies to locate odor sources. A growing body of literature has started to probe this complex task of localizing airborne odor sources in walking mammals to further our understanding of neural encoding and decoding of naturalistic sensory stimuli. However, correlating the intermittent olfactory information with behavior has remained a long-standing challenge due to the stochastic nature of the odor stimulus. We recently reported a method to record real-time olfactory information available to freely moving mice during odor-guided navigation, hence overcoming that challenge. Here we combine our odor-recording method with head-motion tracking to establish correlations between plume encounters and head movements. We show that mice exhibit robust head-pitch motions in the 5-14Hz range during an odor-guided navigation task, and that these head motions are modulated by plume encounters. Furthermore, mice reduce their angles with respect to the source upon plume contact. Head motions may thus be an important part of the sensorimotor behavioral repertoire during naturalistic odor-source localization.
Affiliation(s)
- Mohammad F. Tariq: Graduate Program in Neuroscience, University of Washington, Seattle, Washington, USA; Department of Psychology, University of Washington, Seattle, Washington, USA
- Scott C. Sterrett: Graduate Program in Neuroscience, University of Washington, Seattle, Washington, USA; Department of Psychology, University of Washington, Seattle, Washington, USA
- Sidney Moore: Department of Psychology, University of Washington, Seattle, Washington, USA
- Lane Lane: Department of Psychology, University of Washington, Seattle, Washington, USA; Department of Psychology, Seattle University, Seattle, Washington, USA
- David J. Perkel: Departments of Biology & Otolaryngology, University of Washington, Seattle, Washington, USA
- David H. Gire: Department of Psychology, University of Washington, Seattle, Washington, USA
7. Skyberg RJ, Niell CM. Natural visual behavior and active sensing in the mouse. Curr Opin Neurobiol 2024; 86:102882. PMID: 38704868. PMCID: PMC11254345. DOI: 10.1016/j.conb.2024.102882.
Abstract
In the natural world, animals use vision for a wide variety of behaviors not reflected in most laboratory paradigms. Although mice have low-acuity vision, they use their vision for many natural behaviors, including predator avoidance, prey capture, and navigation. They also perform active sensing, moving their head and eyes to achieve behavioral goals and acquire visual information. These aspects of natural vision result in visual inputs and corresponding behavioral outputs that are outside the range of conventional vision studies but are essential aspects of visual function. Here, we review recent studies in mice that have tapped into natural behavior and active sensing to reveal the computational logic of neural circuits for vision.
Affiliation(s)
- Rolf J Skyberg: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Cristopher M Niell: Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
8. Ambrad Giovannetti E, Rancz E. Behind mouse eyes: The function and control of eye movements in mice. Neurosci Biobehav Rev 2024; 161:105671. PMID: 38604571. DOI: 10.1016/j.neubiorev.2024.105671.
Abstract
The mouse visual system has become the most popular model to study the cellular and circuit mechanisms of sensory processing. However, the importance of eye movements only started to be appreciated recently. Eye movements provide a basis for predictive sensing and deliver insights into various brain functions and dysfunctions. A plethora of knowledge on the central control of eye movements and their role in perception and behaviour arose from work on primates. However, an overview of various eye movements in mice and a comparison to primates is missing. Here, we review the eye movement types described to date in mice and compare them to those observed in primates. We discuss the central neuronal mechanisms for their generation and control. Furthermore, we review the mounting literature on eye movements in mice during head-fixed and freely moving behaviours. Finally, we highlight gaps in our understanding and suggest future directions for research.
Affiliation(s)
- Ede Rancz: INMED, INSERM, Aix-Marseille University, Marseille, France
9. Wang S, He Y, Hu J, Xia J, Fang K, Yu J, Wang Y. Eye movement intervention facilitates concurrent perception and memory processing. Cereb Cortex 2024; 34:bhae190. PMID: 38725291. DOI: 10.1093/cercor/bhae190.
Abstract
A widely used psychotherapeutic treatment for post-traumatic stress disorder (PTSD) involves performing bilateral eye movement (EM) during trauma memory retrieval. However, how this treatment, known as eye movement desensitization and reprocessing (EMDR), alleviates trauma-related symptoms is unclear. While conventional theories suggest that bilateral EM interferes with concurrently retrieved trauma memories by taxing the limited working memory resources, here we propose that bilateral EM actually facilitates information processing. In two EEG experiments, we replicated the bilateral EM procedure of EMDR, having participants engage in continuous bilateral EM or receive bilateral sensory stimulation (BS) as a control while retrieving short- or long-term memory. During EM or BS, we presented bystander images or memory cues to probe neural representations of perceptual and memory information. Multivariate pattern analysis of the EEG signals revealed that bilateral EM enhanced neural representations of simultaneously processed perceptual and memory information. This enhancement was accompanied by heightened visual responses and increased neural excitability in the occipital region. Furthermore, bilateral EM increased information transmission from the occipital to the frontoparietal region, indicating facilitated information transition from low-level perceptual representation to high-level memory representation. These findings argue for theories that emphasize information facilitation rather than disruption in the EMDR treatment.
Affiliation(s)
- Sinuo Wang: Department of Psychology and Behavioral Sciences, Zhejiang University, No. 388 Yuhangtang Road, Hangzhou 310058, Zhejiang, China
- Yang He: School of Psychology, Jiangxi Normal University, No. 99 Ziyang Road, Nanchang 330022, Jiangxi, China
- Jie Hu: Department of Psychology and Behavioral Sciences, Zhejiang University, No. 388 Yuhangtang Road, Hangzhou 310058, Zhejiang, China
- Jianan Xia: Department of Psychology and Behavioral Sciences, Zhejiang University, No. 388 Yuhangtang Road, Hangzhou 310058, Zhejiang, China
- Ke Fang: Department of Psychology and Behavioral Sciences, Zhejiang University, No. 388 Yuhangtang Road, Hangzhou 310058, Zhejiang, China
- Junna Yu: Pharmacy Department, Yantai Hospital of Traditional Chinese Medicine, No. 39 Xingfu Road, Yantai 2264013, Shandong, China
- Yingying Wang: Department of Psychology and Behavioral Sciences, Zhejiang University, No. 388 Yuhangtang Road, Hangzhou 310058, Zhejiang, China
10. Crombie D, Spacek MA, Leibold C, Busse L. Spiking activity in the visual thalamus is coupled to pupil dynamics across temporal scales. PLoS Biol 2024; 22:e3002614. PMID: 38743775. PMCID: PMC11093384. DOI: 10.1371/journal.pbio.3002614.
Abstract
The processing of sensory information, even at early stages, is influenced by the internal state of the animal. Internal states, such as arousal, are often characterized by relating neural activity to a single "level" of arousal, defined by a behavioral indicator such as pupil size. In this study, we expand the understanding of arousal-related modulations in sensory systems by uncovering multiple timescales of pupil dynamics and their relationship to neural activity. Specifically, we observed a robust coupling between spiking activity in the mouse dorsolateral geniculate nucleus (dLGN) of the thalamus and pupil dynamics across timescales spanning a few seconds to several minutes. Throughout all these timescales, 2 distinct spiking modes (individual tonic spikes and tightly clustered bursts of spikes) preferred opposite phases of pupil dynamics. This multi-scale coupling reveals modulations distinct from those captured by pupil size per se, locomotion, and eye movements. Furthermore, coupling persisted even during viewing of a naturalistic movie, where it contributed to differences in the encoding of visual information. We conclude that dLGN spiking activity is under the simultaneous influence of multiple arousal-related processes associated with pupil dynamics occurring over a broad range of timescales.
Affiliation(s)
- Davide Crombie: Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany; Graduate School of Systemic Neurosciences, LMU Munich, Munich, Germany
- Martin A. Spacek: Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany
- Christian Leibold: Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany; Fakultät für Biologie & Bernstein Center Freiburg, Albert-Ludwigs-Universität Freiburg, Freiburg im Breisgau, Germany
- Laura Busse: Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany; Bernstein Center for Computational Neuroscience, Munich, Germany
11. Wang C, Derderian KD, Hamada E, Zhou X, Nelson AD, Kyoung H, Ahituv N, Bouvier G, Bender KJ. Impaired cerebellar plasticity hypersensitizes sensory reflexes in SCN2A-associated ASD. Neuron 2024; 112:1444-1455.e5. PMID: 38412857. PMCID: PMC11065582. DOI: 10.1016/j.neuron.2024.01.029.
Abstract
Children diagnosed with autism spectrum disorder (ASD) commonly present with sensory hypersensitivity or abnormally strong reactions to sensory stimuli. Such hypersensitivity can be overwhelming, causing high levels of distress that contribute markedly to the negative aspects of the disorder. Here, we identify a mechanism that underlies hypersensitivity in a sensorimotor reflex found to be altered in humans and in mice with loss of function in the ASD risk-factor gene SCN2A. The cerebellum-dependent vestibulo-ocular reflex (VOR), which helps maintain one's gaze during movement, was hypersensitized due to deficits in cerebellar synaptic plasticity. Heterozygous loss of SCN2A-encoded NaV1.2 sodium channels in granule cells impaired high-frequency transmission to Purkinje cells and long-term potentiation, a form of synaptic plasticity important for modulating VOR gain. VOR plasticity could be rescued in mice via a CRISPR-activator approach that increases Scn2a expression, demonstrating that evaluation of a simple reflex can be used to assess and quantify successful therapeutic intervention.
Affiliation(s)
- Chenyu Wang: Neuroscience Graduate Program, University of California, San Francisco, San Francisco, CA, USA; Department of Neurology, University of California, San Francisco, San Francisco, CA, USA; Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, USA
- Kimberly D Derderian: Department of Neurology, University of California, San Francisco, San Francisco, CA, USA
- Elizabeth Hamada: Department of Neurology, University of California, San Francisco, San Francisco, CA, USA
- Xujia Zhou: Department of Bioengineering and Therapeutic Sciences, University of California, San Francisco, San Francisco, CA, USA
- Andrew D Nelson: Department of Neurology, University of California, San Francisco, San Francisco, CA, USA; Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, USA
- Henry Kyoung: Department of Neurology, University of California, San Francisco, San Francisco, CA, USA
- Nadav Ahituv: Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, USA; Department of Bioengineering and Therapeutic Sciences, University of California, San Francisco, San Francisco, CA, USA; Institute for Human Genetics, University of California, San Francisco, San Francisco, CA, USA
- Guy Bouvier: Department of Physiology, University of California, San Francisco, San Francisco, CA, USA; Université Paris-Saclay, CNRS, Institut des Neurosciences Paris-Saclay, 91400 Saclay, France
- Kevin J Bender: Department of Neurology, University of California, San Francisco, San Francisco, CA, USA; Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, USA
12. A Dehaqani A, Michelon F, Patella P, Petrucco L, Piasini E, Iurilli G. A mechanosensory feedback that uncouples external and self-generated sensory responses in the olfactory cortex. Cell Rep 2024; 43:114013. PMID: 38551962. DOI: 10.1016/j.celrep.2024.114013.
Abstract
Sampling behaviors have sensory consequences that can hinder perceptual stability. In olfaction, sniffing affects early odor encoding, mimicking a sudden change in odor concentration. We examined how the inhalation speed affects the representation of odor concentration in the main olfactory cortex. Neurons combine the odor input with a global top-down signal preceding the sniff and a mechanosensory feedback generated by the air passage through the nose during inhalation. Still, the population representation of concentration is remarkably sniff invariant. This is because the mechanosensory and olfactory responses are uncorrelated within and across neurons. Thus, faster odor inhalation and an increase in concentration change the cortical activity pattern in distinct ways. This encoding strategy affords tolerance to potential concentration fluctuations caused by varying inhalation speeds. Since mechanosensory reafferences are widespread across sensory systems, the coding scheme described here may be a canonical strategy to mitigate the sensory ambiguities caused by movements.
Affiliation(s)
- Alireza A Dehaqani: Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy; CIMeC, University of Trento, 38068 Rovereto, Italy
- Filippo Michelon: Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy; CIMeC, University of Trento, 38068 Rovereto, Italy
- Paola Patella: Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy
- Luigi Petrucco: Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy
- Eugenio Piasini: International School for Advanced Studies (SISSA), Trieste, Italy
- Giuliano Iurilli: Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy
13. Oude Lohuis MN, Marchesi P, Olcese U, Pennartz CMA. Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex. Nat Neurosci 2024; 27:758-771. PMID: 38307971. DOI: 10.1038/s41593-023-01564-5.
Abstract
Primary sensory cortices respond to crossmodal stimuli; for example, auditory responses are found in primary visual cortex (V1). However, it remains unclear whether these responses reflect sensory inputs or behavioral modulation through sound-evoked body movement. We address this controversy by showing that sound-evoked activity in V1 of awake mice can be dissociated into auditory and behavioral components with distinct spatiotemporal profiles. The auditory component began at approximately 27 ms, was found in superficial and deep layers and originated from auditory cortex. Sound-evoked orofacial movements correlated with V1 neural activity starting at approximately 80-100 ms and explained auditory frequency tuning. Visual, auditory and motor activity were expressed by different laminar profiles and largely segregated subsets of neuronal populations. During simultaneous audiovisual stimulation, visual representations remained dissociable from auditory-related and motor-related activity. This three-fold dissociability of auditory, motor and visual processing is central to understanding how distinct inputs to visual cortex interact to support vision.
Collapse
Affiliation(s)
- Matthijs N Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Pietro Marchesi
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Cyriel M A Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
14
Syeda A, Zhong L, Tung R, Long W, Pachitariu M, Stringer C. Facemap: a framework for modeling neural activity based on orofacial tracking. Nat Neurosci 2024; 27:187-195. [PMID: 37985801 PMCID: PMC10774130 DOI: 10.1038/s41593-023-01490-6] [Citation(s) in RCA: 32] [Impact Index Per Article: 32.0] [Received: 11/06/2022] [Accepted: 10/10/2023] [Indexed: 11/22/2023]
Abstract
Recent studies in mice have shown that orofacial behaviors drive a large fraction of neural activity across the brain. To understand the nature and function of these signals, we need better computational models to characterize the behaviors and relate them to neural activity. Here we developed Facemap, a framework consisting of a keypoint tracker and a deep neural network encoder for predicting neural activity. Our algorithm for tracking mouse orofacial behaviors was more accurate than existing pose estimation tools, while the processing speed was several times faster, making it a powerful tool for real-time experimental interventions. The Facemap tracker was easy to adapt to data from new labs, requiring as few as 10 annotated frames for near-optimal performance. We used the keypoints as inputs to a deep neural network that predicts the activity of ~50,000 simultaneously recorded neurons and, in visual cortex, we doubled the amount of explained variance compared to previous methods. Using this model, we found that the neuronal activity clusters that were well predicted from behavior were more spatially spread out across cortex. We also found that the deep behavioral features from the model had stereotypical, sequential dynamics that were not reversible in time. In summary, Facemap provides a stepping stone toward understanding the function of the brain-wide neural signals and their relation to behavior.
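The core modeling idea described in this abstract, predicting many neurons' activity from tracked orofacial keypoints and scoring the fit by explained variance, can be sketched in miniature. The sketch below is a conceptual illustration on synthetic data using a plain least-squares linear encoder; it is not the Facemap API or its deep network, and all names and sizes are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 1,000 time points, 30 keypoint coordinates
# (e.g. 15 orofacial keypoints, x and y each), and 100 "neurons"
# whose activity is partly driven by the keypoints plus noise.
T, K, N = 1000, 30, 100
keypoints = rng.standard_normal((T, K))
true_weights = rng.standard_normal((K, N))
neural = keypoints @ true_weights + 0.5 * rng.standard_normal((T, N))

# Fit a linear encoder on the first half of the recording,
# then evaluate the fraction of variance explained on the held-out half.
train, test = slice(0, T // 2), slice(T // 2, T)
W, *_ = np.linalg.lstsq(keypoints[train], neural[train], rcond=None)
pred = keypoints[test] @ W

resid = neural[test] - pred
explained = 1 - resid.var() / neural[test].var()
print(f"fraction of variance explained: {explained:.2f}")
```

A deep network encoder, as in the paper, replaces the single linear map with a learned nonlinear function of the keypoints, but the train/held-out explained-variance logic is the same.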
Affiliation(s)
- Atika Syeda
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Lin Zhong
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Renee Tung
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Will Long
- HHMI Janelia Research Campus, Ashburn, VA, USA
15
Baffour-Awuah KA, Bridge H, Engward H, MacKinnon RC, Ip IB, Jolly JK. The missing pieces: an investigation into the parallels between Charles Bonnet, phantom limb and tinnitus syndromes. Ther Adv Ophthalmol 2024; 16:25158414241302065. [PMID: 39649951 PMCID: PMC11624543 DOI: 10.1177/25158414241302065] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 04/29/2024] [Accepted: 11/04/2024] [Indexed: 12/11/2024]
Abstract
Charles Bonnet syndrome (CBS) is a condition characterised by visual hallucinations of varying complexity on a background of vision loss. CBS research has gained popularity only in recent decades, despite evidence dating back to 1760. Knowledge of CBS among both the patient and professional populations unfortunately remains poor, and little is known of its underlying pathophysiology. CBS parallels two other better-known conditions that occur as a result of sensory loss: phantom limb syndrome (PLS) (aberrant sensation of the presence of a missing limb) and tinnitus (aberrant sensation of sound). As 'phantom' conditions, CBS, PLS and tinnitus share sensory loss as a precipitating factor, and, as subjective perceptual phenomena, face similar challenges to investigations. Thus far, these conditions have been studied separately from each other. This review aims to bridge the conceptual gap between CBS, PLS and tinnitus and seek common lessons between them. It considers the current knowledge base of CBS and explores the extent to which an understanding of PLS and tinnitus could provide valuable insights into the pathology of CBS (including the roles of cortical reorganisation, emotional and cognitive factors), and towards identifying effective potential management for CBS.
Affiliation(s)
- Kwame A. Baffour-Awuah
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- East and North Hertfordshire NHS Trust, Hertfordshire, UK
- Holly Bridge
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
- Hilary Engward
- Veterans and Families Institute, Anglia Ruskin University, Cambridge, UK
- Robert C. MacKinnon
- School of Psychology, Sports and Sensory Sciences, Anglia Ruskin University, Cambridge, UK
- I. Betina Ip
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
- Jasleen K. Jolly
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
- Vision and Eye Research Institute, School of Medicine, Anglia Ruskin University, Young Street, Cambridge, CB1 2LZ, UK
16
Cano-Ferrer X, Tran-Van-Minh A, Rancz E. RPM: An open-source Rotation Platform for open- and closed-loop vestibular stimulation in head-fixed Mice. J Neurosci Methods 2024; 401:110002. [PMID: 37925080 DOI: 10.1016/j.jneumeth.2023.110002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 07/23/2023] [Revised: 10/07/2023] [Accepted: 10/27/2023] [Indexed: 11/06/2023]
Abstract
Head fixation allows the recording and presentation of controlled stimuli and is used to study neural processes underlying spatial navigation. However, it disrupts the head direction system because of the lack of vestibular stimulation. To overcome this limitation, we developed a novel rotation platform that can be driven by the experimenter (open-loop) or by animal movement (closed-loop). The platform is modular, affordable, easy to build and open source. Additional modules presented here include cameras for monitoring eye movements, visual virtual reality, and a micro-manipulator for positioning various probes for recording or optical interference. We demonstrate the utility of the platform by recording eye movements and showing the robust activation of head-direction cells. This novel experimental apparatus combines the advantages of head fixation and intact vestibular activity in the horizontal plane. The open-loop mode can be used to study, for example, vestibular sensory representation and processing, while the closed-loop mode allows animals to navigate in rotational space, providing a better substrate for 2-D navigation in virtual environments. The full build documentation is maintained at https://ranczlab.github.io/RPM/.
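The open- versus closed-loop distinction described in this abstract reduces to one question: whose rotational velocity drives the platform? A minimal, hypothetical sketch (plain Python, not the RPM project's actual firmware or control code; function and variable names are invented):

```python
# Conceptual sketch of open- vs closed-loop platform control:
# in open-loop the experimenter's command drives the platform,
# in closed-loop the animal's own measured rotation drives it.

def run_trial(mode, velocity_pairs, dt=0.01):
    """Integrate platform angle (degrees) over one trial.

    velocity_pairs is a sequence of (experimenter_vel, animal_vel)
    samples in deg/s, one pair per control tick of length dt seconds.
    """
    angle = 0.0
    for experimenter_vel, animal_vel in velocity_pairs:
        vel = experimenter_vel if mode == "open" else animal_vel
        angle += vel * dt  # simple Euler integration of angular velocity
    return angle

# The animal turns at a steady 90 deg/s while the experimenter commands nothing:
samples = [(0.0, 90.0)] * 100  # 1 s of ticks at dt = 0.01 s

print(run_trial("open", samples))    # platform ignores the animal: stays at 0.0
print(run_trial("closed", samples))  # platform follows the animal's turn (~90 degrees)
```

In the real apparatus the closed-loop velocity comes from a sensor reading the animal's movement each tick; the control logic is otherwise the same selection shown here.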
Affiliation(s)
- Xavier Cano-Ferrer
- The Francis Crick Institute, Cortical Circuits Laboratory, London NW1 1AT, UK; The Francis Crick Institute, Making Science and Technology Platform, London NW1 1AT, UK
- Ede Rancz
- The Francis Crick Institute, Cortical Circuits Laboratory, London NW1 1AT, UK; INMED, INSERM, Aix-Marseille Université, France
17
Sanzeni A, Palmigiano A, Nguyen TH, Luo J, Nassi JJ, Reynolds JH, Histed MH, Miller KD, Brunel N. Mechanisms underlying reshuffling of visual responses by optogenetic stimulation in mice and monkeys. Neuron 2023; 111:4102-4115.e9. [PMID: 37865082 PMCID: PMC10841937 DOI: 10.1016/j.neuron.2023.09.018] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 11/03/2022] [Revised: 05/05/2023] [Accepted: 09/15/2023] [Indexed: 10/23/2023]
Abstract
The ability to optogenetically perturb neural circuits opens an unprecedented window into mechanisms governing circuit function. We analyzed and theoretically modeled neuronal responses to visual and optogenetic inputs in mouse and monkey V1. In both species, optogenetic stimulation of excitatory neurons strongly modulated the activity of single neurons yet had weak or no effects on the distribution of firing rates across the population. Thus, the optogenetic inputs reshuffled firing rates across the network. Key statistics of mouse and monkey responses lay on a continuum, with mice/monkeys occupying the low-/high-rate regions, respectively. We show that neuronal reshuffling emerges generically in randomly connected excitatory/inhibitory networks, provided the coupling strength (a combination of recurrent coupling and external input) is strong enough that powerful inhibitory feedback cancels the mean optogenetic input. A more realistic model, distinguishing tuned visual vs. untuned optogenetic input in a structured network, reduces the coupling strength needed to explain reshuffling.
Affiliation(s)
- Alessandro Sanzeni
- Department of Computing Sciences, Bocconi University, 20100 Milan, Italy; Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA
- Agostina Palmigiano
- Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA
- Tuan H Nguyen
- Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Physics, Columbia University, New York, NY 10027, USA
- Junxiang Luo
- Systems Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
- Jonathan J Nassi
- Systems Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
- John H Reynolds
- Systems Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
- Mark H Histed
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD 20814, USA
- Kenneth D Miller
- Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Neuroscience, Swartz Program in Theoretical Neuroscience, Kavli Institute for Brain Science, College of Physicians and Surgeons and Mortimer B. Zuckerman Mind Brain Behavior Institute, Columbia University, New York City, NY 10027, USA
- Nicolas Brunel
- Department of Neurobiology, Duke University, Durham, NC 27710, USA; Department of Physics, Duke University, Durham, NC 27710, USA
18
Zhang A, Zador AM. Neurons in the primary visual cortex of freely moving rats encode both sensory and non-sensory task variables. PLoS Biol 2023; 21:e3002384. [PMID: 38048367 PMCID: PMC10721203 DOI: 10.1371/journal.pbio.3002384] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Received: 10/28/2022] [Revised: 12/14/2023] [Accepted: 10/17/2023] [Indexed: 12/06/2023]
Abstract
Neurons in primary visual cortex (area V1) are strongly driven by both sensory stimuli and non-sensory events. However, although the representation of sensory stimuli has been well characterized, much less is known about the representation of non-sensory events. Here, we characterize the specificity and organization of non-sensory representations in rat V1 during a freely moving visual decision task. We find that single neurons encode diverse combinations of task features simultaneously and across task epochs. Despite heterogeneity at the level of single neuron response patterns, both visual and nonvisual task variables could be reliably decoded from small neural populations (5 to 40 units) throughout a trial. Interestingly, in animals trained to make an auditory decision following passive observation of a visual stimulus, some but not all task features could also be decoded from V1 activity. Our results support the view that even in V1, the earliest stage of the cortical hierarchy, bottom-up sensory information may be combined with top-down non-sensory information in a task-dependent manner.
Affiliation(s)
- Anqi Zhang
- Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, United States of America
- Cold Spring Harbor Laboratory School of Biological Sciences, Cold Spring Harbor, New York, United States of America
- Anthony M. Zador
- Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, United States of America
19
Parker PRL, Martins DM, Leonard ESP, Casey NM, Sharp SL, Abe ETT, Smear MC, Yates JL, Mitchell JF, Niell CM. A dynamic sequence of visual processing initiated by gaze shifts. Nat Neurosci 2023; 26:2192-2202. [PMID: 37996524 PMCID: PMC11270614 DOI: 10.1038/s41593-023-01481-7] [Citation(s) in RCA: 12] [Impact Index Per Article: 6.0] [Received: 09/02/2022] [Accepted: 10/04/2023] [Indexed: 11/25/2023]
Abstract
Animals move their head and eyes as they explore the visual scene. Neural correlates of these movements have been found in rodent primary visual cortex (V1), but their sources and computational roles are unclear. We addressed this by combining head and eye movement measurements with neural recordings in freely moving mice. V1 neurons responded primarily to gaze shifts, where head movements are accompanied by saccadic eye movements, rather than to head movements where compensatory eye movements stabilize gaze. A variety of activity patterns followed gaze shifts and together these formed a temporal sequence that was absent in darkness. Gaze-shift responses resembled those evoked by sequentially flashed stimuli, suggesting a large component corresponds to onset of new visual input. Notably, neurons responded in a sequence that matches their spatial frequency bias, consistent with coarse-to-fine processing. Recordings in freely gazing marmosets revealed a similar sequence following saccades, also aligned to spatial frequency preference. Our results demonstrate that active vision in both mice and marmosets consists of a dynamic temporal sequence of neural activity associated with visual sampling.
Affiliation(s)
- Philip R L Parker
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Behavioral and Systems Neuroscience, Department of Psychology, Rutgers University, New Brunswick, NJ, USA
- Dylan M Martins
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Emmalyn S P Leonard
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Nathan M Casey
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Shelby L Sharp
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Elliott T T Abe
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Matthew C Smear
- Institute of Neuroscience and Department of Psychology, University of Oregon, Eugene, OR, USA
- Jacob L Yates
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Jude F Mitchell
- Department of Brain and Cognitive Sciences and Center for Visual Sciences, University of Rochester, Rochester, NY, USA
- Cristopher M Niell
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
20
Pennartz CMA, Oude Lohuis MN, Olcese U. How 'visual' is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220336. [PMID: 37545313 PMCID: PMC10404929 DOI: 10.1098/rstb.2022.0336] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Received: 01/31/2023] [Accepted: 06/13/2023] [Indexed: 08/08/2023]
Abstract
The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude that the visual cortex may process more information than of retinal origin alone, or that other brain structures contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this non-visual information is poorly understood. Here we review recent evidence, coming primarily from studies in rodents, arguing that non-visual and motor effects in visual cortex play a role in visual processing itself, for instance by disentangling direct auditory effects on visual cortex from the effects of sound-evoked orofacial movement. These findings are placed in a broader framework casting vision in terms of predictive processing under control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is exclusively constructed by the visual cortical system, we propose that visual percepts are generated by a larger network, the extended visual system, spanning other sensory cortices, supramodal areas and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Cyriel M. A. Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Matthijs N. Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Champalimaud Research, Champalimaud Foundation, 1400-038 Lisbon, Portugal
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
21
Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023; 24:487-501. [PMID: 37380885 DOI: 10.1038/s41583-023-00716-7] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Accepted: 05/31/2023] [Indexed: 06/30/2023]
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
Affiliation(s)
- Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK
- Laura Busse
- Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany
- Bernstein Centre for Computational Neuroscience Munich, Munich, Germany
22
Keshavarzi S, Velez-Fort M, Margrie TW. Cortical Integration of Vestibular and Visual Cues for Navigation, Visual Processing, and Perception. Annu Rev Neurosci 2023; 46:301-320. [PMID: 37428601 PMCID: PMC7616138 DOI: 10.1146/annurev-neuro-120722-100503] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 07/12/2023]
Abstract
Despite increasing evidence of its involvement in several key functions of the cerebral cortex, the vestibular sense rarely enters our consciousness. Indeed, the extent to which these internal signals are incorporated within cortical sensory representation and how they might be relied upon for sensory-driven decision-making, during, for example, spatial navigation, is yet to be understood. Recent novel experimental approaches in rodents have probed both the physiological and behavioral significance of vestibular signals and indicate that their widespread integration with vision improves both the cortical representation and perceptual accuracy of self-motion and orientation. Here, we summarize these recent findings with a focus on cortical circuits involved in visual perception and spatial navigation and highlight the major remaining knowledge gaps. We suggest that vestibulo-visual integration reflects a process of constant updating regarding the status of self-motion, and access to such information by the cortex is used for sensory perception and predictions that may be implemented for rapid, navigation-related decision-making.
Affiliation(s)
- Sepiedeh Keshavarzi
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Mateo Velez-Fort
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Troy W Margrie
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
23
Mimica B, Tombaz T, Battistin C, Fuglstad JG, Dunn BA, Whitlock JR. Behavioral decomposition reveals rich encoding structure employed across neocortex in rats. Nat Commun 2023; 14:3947. [PMID: 37402724 PMCID: PMC10319800 DOI: 10.1038/s41467-023-39520-3] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Received: 03/18/2022] [Accepted: 06/16/2023] [Indexed: 07/06/2023]
Abstract
The cortical population code is pervaded by activity patterns evoked by movement, but it remains largely unknown how such signals relate to natural behavior or how they might support processing in sensory cortices where they have been observed. To address this we compared high-density neural recordings across four cortical regions (visual, auditory, somatosensory, motor) in relation to sensory modulation, posture, movement, and ethograms of freely foraging male rats. Momentary actions, such as rearing or turning, were represented ubiquitously and could be decoded from all sampled structures. However, more elementary and continuous features, such as pose and movement, followed region-specific organization, with neurons in visual and auditory cortices preferentially encoding mutually distinct head-orienting features in world-referenced coordinates, and somatosensory and motor cortices principally encoding the trunk and head in egocentric coordinates. The tuning properties of synaptically coupled cells also exhibited connection patterns suggestive of area-specific uses of pose and movement signals, particularly in visual and auditory regions. Together, our results indicate that ongoing behavior is encoded at multiple levels throughout the dorsal cortex, and that low-level features are differentially utilized by different regions to serve locally relevant computations.
Affiliation(s)
- Bartul Mimica
- Princeton Neuroscience Institute, Princeton University, Washington Road, Princeton, 100190, NJ, USA
- Tuçe Tombaz
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Claudia Battistin
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Department of Mathematical Sciences, Norwegian University of Science and Technology, 7491, Trondheim, Norway
- Jingyi Guo Fuglstad
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Benjamin A Dunn
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
- Department of Mathematical Sciences, Norwegian University of Science and Technology, 7491, Trondheim, Norway
- Jonathan R Whitlock
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Olav Kyrres Gate 9, 7030, Trondheim, Norway
24
Disse GD, Nandakumar B, Pauzin FP, Blumenthal GH, Kong Z, Ditterich J, Moxon KA. Neural ensemble dynamics in trunk and hindlimb sensorimotor cortex encode for the control of postural stability. Cell Rep 2023; 42:112347. [PMID: 37027302 DOI: 10.1016/j.celrep.2023.112347] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 10/22/2022] [Revised: 02/09/2023] [Accepted: 03/21/2023] [Indexed: 04/08/2023]
Abstract
The cortex has a disputed role in monitoring postural equilibrium and intervening in cases of major postural disturbances. Here, we investigate the patterns of neural activity in the cortex that underlie neural dynamics during unexpected perturbations. In both the primary sensory (S1) and motor (M1) cortices of the rat, unique neuronal classes differentially covary their responses to distinguish different characteristics of applied postural perturbations; however, there is substantial information gain in M1, demonstrating a role for higher-order computations in motor control. A dynamical systems model of M1 activity and forces generated by the limbs reveals that these neuronal classes contribute to a low-dimensional manifold comprised of separate subspaces enabled by congruent and incongruent neural firing patterns that define different computations depending on the postural responses. These results inform how the cortex engages in postural control, directing work aiming to understand postural instability after neurological disease.
Affiliation(s)
- Gregory D Disse
- Neuroscience Graduate Group, University of California, Davis, Davis, CA 95616, USA; Biomedical Engineering, University of California, Davis, Davis, CA 95616, USA
- Francois P Pauzin
- Biomedical Engineering, University of California, Davis, Davis, CA 95616, USA
- Gary H Blumenthal
- School of Biomedical Engineering Science and Health Systems, Drexel University, Philadelphia, PA 19104, USA
- Zhaodan Kong
- Mechanical and Aerospace Engineering, University of California, Davis, Davis, CA 95616, USA
- Jochen Ditterich
- Neuroscience Graduate Group, University of California, Davis, Davis, CA 95616, USA; Neurobiology, Physiology and Behavior, University of California, Davis, Davis, CA 95616, USA
- Karen A Moxon
- Neuroscience Graduate Group, University of California, Davis, Davis, CA 95616, USA; Biomedical Engineering, University of California, Davis, Davis, CA 95616, USA
25
Xu Z, Hu J, Wang Y. Bilateral eye movements disrupt the involuntary perceptual representation of trauma-related memories. Behav Res Ther 2023; 165:104311. [PMID: 37037182 DOI: 10.1016/j.brat.2023.104311] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 10/13/2022] [Revised: 03/29/2023] [Accepted: 04/07/2023] [Indexed: 04/12/2023]
Abstract
Bilateral eye movement (EM) is a critical component in eye movement desensitization and reprocessing (EMDR), an effective treatment for post-traumatic stress disorder. However, the role of bilateral EM in alleviating trauma-related symptoms is unclear. Here we hypothesize that bilateral EM selectively disrupts the perceptual representation of traumatic memories. We used the trauma film paradigm as an analog for trauma experience. Nonclinical participants viewed trauma films followed by a bilateral EM intervention or a static Fixation period as a control. Perceptual and semantic memories for the film were assessed with different measures. Results showed a significant decrease in perceptual memory recognition shortly after the EM intervention and subsequently in the frequency and vividness of film-related memory intrusions across one week, relative to the Fixation condition. The EM intervention did not affect the explicit recognition of semantic memories, suggesting a dissociation between perceptual and semantic memory disruption. Furthermore, the EM intervention effectively reduced psychophysiological affective responses, including the skin conductance response and pupil size, to film scenes and subjective affective ratings of film-related intrusions. Together, bilateral EMs effectively reduce the perceptual representation and affective response of trauma-related memories. Further theoretical developments are needed to elucidate the mechanism of bilateral EMs in trauma treatment.
Affiliation(s)
- Zhenjie Xu
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, 310028, Zhejiang, China
- Jie Hu
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, 310028, Zhejiang, China
- Yingying Wang
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, 310028, Zhejiang, China
26
Klioutchnikov A, Wallace DJ, Sawinski J, Voit KM, Groemping Y, Kerr JND. A three-photon head-mounted microscope for imaging all layers of visual cortex in freely moving mice. Nat Methods 2023; 20:610-616. [PMID: 36443485 PMCID: PMC10089923 DOI: 10.1038/s41592-022-01688-9] [Citation(s) in RCA: 36] [Impact Index Per Article: 18.0] [Received: 12/23/2021] [Accepted: 10/19/2022] [Indexed: 11/30/2022]
Abstract
Advances in head-mounted microscopes have enabled imaging of neuronal activity using genetic tools in freely moving mice, but these microscopes are restricted to recording in minimally lit arenas and to imaging upper cortical layers. Here we built a 2-g, three-photon excitation-based microscope, containing a z-drive that enabled access to all cortical layers while mice freely behaved in a fully lit environment. The microscope had on-board photon detectors, robust to environmental light, and the arena lighting was timed to the end of each line-scan, enabling functional imaging of activity from cortical layer 4 and layer 6 neurons expressing jGCaMP7f in mice roaming a fully lit or dark arena. By comparing the neuronal activity measured from populations in these layers, we show that activity in cortical layer 4 and layer 6 is differentially modulated by lit and dark conditions during free exploration.
Affiliation(s)
- Alexandr Klioutchnikov: Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Damian J Wallace: Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Juergen Sawinski: Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Kay-Michael Voit: Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Yvonne Groemping: Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
- Jason N D Kerr: Department of Behavior and Brain Organization, Max Planck Institute for Neurobiology of Behavior-caesar, Bonn, Germany
27. Horrocks EAB, Mareschal I, Saleem AB. Walking humans and running mice: perception and neural encoding of optic flow during self-motion. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210450. [PMID: 36511417; PMCID: PMC9745880; DOI: 10.1098/rstb.2021.0450]
Abstract
Locomotion produces full-field optic flow that often dominates the visual motion inputs to an observer. The perception of optic flow is in turn important for animals to guide their heading and interact with moving objects. Understanding how locomotion influences optic flow processing and perception is therefore essential to understand how animals successfully interact with their environment. Here, we review research investigating how perception and neural encoding of optic flow are altered during self-motion, focusing on locomotion. Self-motion has been found to influence estimation and sensitivity for optic flow speed and direction. Nonvisual self-motion signals also increase compensation for self-driven optic flow when parsing the visual motion of moving objects. The integration of visual and nonvisual self-motion signals largely follows principles of Bayesian inference and can improve the precision and accuracy of self-motion perception. The calibration of visual and nonvisual self-motion signals is dynamic, reflecting the changing visuomotor contingencies across different environmental contexts. Throughout this review, we consider experimental research using humans, non-human primates and mice. We highlight experimental challenges and opportunities afforded by each of these species and draw parallels between experimental findings. These findings reveal a profound influence of locomotion on optic flow processing and perception across species. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Edward A. B. Horrocks: Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London WC1H 0AP, UK
- Isabelle Mareschal: School of Biological and Behavioural Sciences, Queen Mary, University of London, London E1 4NS, UK
- Aman B. Saleem: Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London WC1H 0AP, UK
28. Accanto N, Blot FGC, Lorca-Cámara A, Zampini V, Bui F, Tourain C, Badt N, Katz O, Emiliani V. A flexible two-photon fiberscope for fast activity imaging and precise optogenetic photostimulation of neurons in freely moving mice. Neuron 2023; 111:176-189.e6. [PMID: 36395773; DOI: 10.1016/j.neuron.2022.10.030]
Abstract
We developed a flexible two-photon microendoscope (2P-FENDO) capable of all-optical brain investigation at near cellular resolution in freely moving mice. The system performs fast two-photon (2P) functional imaging and 2P holographic photostimulation of single and multiple cells using axially confined extended spots. Proof-of-principle experiments were performed in freely moving mice co-expressing jGCaMP7s and the opsin ChRmine in the visual or barrel cortex. On a field of view of 250 μm in diameter, we demonstrated functional imaging at a frame rate of up to 50 Hz and precise photostimulation of selected groups of cells. With the capability to simultaneously image and control defined neuronal networks in freely moving animals, 2P-FENDO will enable a precise investigation of neuronal functions in the brain during naturalistic behaviors.
Affiliation(s)
- Nicolò Accanto: Sorbonne Université, INSERM, CNRS, Institut de la Vision, F-75012 Paris, France
- François G C Blot: Sorbonne Université, INSERM, CNRS, Institut de la Vision, F-75012 Paris, France
- Valeria Zampini: Sorbonne Université, INSERM, CNRS, Institut de la Vision, F-75012 Paris, France
- Florence Bui: Sorbonne Université, INSERM, CNRS, Institut de la Vision, F-75012 Paris, France
- Christophe Tourain: Sorbonne Université, INSERM, CNRS, Institut de la Vision, F-75012 Paris, France
- Noam Badt: Department of Applied Physics, Hebrew University of Jerusalem, Jerusalem 9190401, Israel
- Ori Katz: Department of Applied Physics, Hebrew University of Jerusalem, Jerusalem 9190401, Israel
- Valentina Emiliani: Sorbonne Université, INSERM, CNRS, Institut de la Vision, F-75012 Paris, France
29. Ebrahimi AS, Orlowska-Feuer P, Huang Q, Zippo AG, Martial FP, Petersen RS, Storchi R. Three-dimensional unsupervised probabilistic pose reconstruction (3D-UPPER) for freely moving animals. Sci Rep 2023; 13:155. [PMID: 36599877; PMCID: PMC9813182; DOI: 10.1038/s41598-022-25087-4]
Abstract
A key step in understanding animal behaviour lies in the ability to quantify poses and movements. Methods to track body landmarks in 2D have made great progress over the last few years, but accurate 3D reconstruction of freely moving animals still represents a challenge. To address this challenge, here we develop the 3D-UPPER algorithm, which is fully automated, requires no a priori knowledge of the properties of the body and can also be applied to 2D data. We find that 3D-UPPER reduces by [Formula: see text] fold the error in 3D reconstruction of mouse body during freely moving behaviour compared with the traditional triangulation of 2D data. To achieve that, 3D-UPPER performs an unsupervised estimation of a Statistical Shape Model (SSM) and uses this model to constrain the viable 3D coordinates. We show, by using simulated data, that our SSM estimator is robust even in datasets containing up to 50% of poses with outliers and/or missing data. In simulated and real data, SSM estimation converges rapidly, capturing behaviourally relevant changes in body shape associated with exploratory behaviours (e.g. with rearing and changes in body orientation). Altogether, 3D-UPPER represents a simple tool to minimise errors in 3D reconstruction while capturing meaningful behavioural parameters.
Affiliation(s)
- Aghileh S Ebrahimi: Division of Neuroscience, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Patrycja Orlowska-Feuer: Division of Neuroscience, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Qian Huang: Division of Neuroscience, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Antonio G Zippo: Institute of Neuroscience, Consiglio Nazionale delle Ricerche, Milan, Italy
- Franck P Martial: Division of Neuroscience, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Rasmus S Petersen: Division of Neuroscience, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Riccardo Storchi: Division of Neuroscience, School of Biological Science, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
30. Parker PRL, Abe ETT, Leonard ESP, Martins DM, Niell CM. Joint coding of visual input and eye/head position in V1 of freely moving mice. Neuron 2022; 110:3897-3906.e5. [PMID: 36137549; PMCID: PMC9742335; DOI: 10.1016/j.neuron.2022.08.029]
Abstract
Visual input during natural behavior is highly dependent on movements of the eyes and head, but how information about eye and head position is integrated with visual processing during free movement is unknown, as visual physiology is generally performed under head fixation. To address this, we performed single-unit electrophysiology in V1 of freely moving mice while simultaneously measuring the mouse's eye position, head orientation, and the visual scene from the mouse's perspective. From these measures, we mapped spatiotemporal receptive fields during free movement based on the gaze-corrected visual input. Furthermore, we found a significant fraction of neurons tuned for eye and head position, and these signals were integrated with visual responses through a multiplicative mechanism in the majority of modulated neurons. These results provide new insight into coding in mouse V1 and, more generally, a paradigm for investigating visual physiology under natural conditions, including active sensing and ethological behavior.
Affiliation(s)
- Philip R L Parker: Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Elliott T T Abe: Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Emmalyn S P Leonard: Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Dylan M Martins: Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Cristopher M Niell: Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
31. Orlowska-Feuer P, Ebrahimi AS, Zippo AG, Petersen RS, Lucas RJ, Storchi R. Look-up and look-down neurons in the mouse visual thalamus during freely moving exploration. Curr Biol 2022; 32:3987-3999.e4. [PMID: 35973431; PMCID: PMC9616738; DOI: 10.1016/j.cub.2022.07.049]
Abstract
Visual information reaches cortex via the thalamic dorsal lateral geniculate nucleus (dLGN). dLGN activity is modulated by global sleep/wake states and arousal, indicating that it is not simply a passive relay station. However, its potential for more specific visuomotor integration is largely unexplored. We addressed this question by developing robust 3D video reconstruction of mouse head and body during spontaneous exploration paired with simultaneous neuronal recordings from dLGN. Unbiased evaluation of a wide range of postures and movements revealed a widespread coupling between neuronal activity and a few behavioral parameters. In particular, postures associated with the animal looking up/down correlated with activity in >50% of neurons, and the extent of this effect was comparable with that induced by full-body movements (typically locomotion). By contrast, thalamic activity was minimally correlated with other postures or movements (e.g., left/right head and body torsions). Importantly, up/down postures and full-body movements were largely independent and jointly coupled to neuronal activity. Thus, although most units were excited during full-body movements, some expressed highest firing when the animal was looking up ("look-up" neurons), whereas others expressed highest firing when the animal was looking down ("look-down" neurons). These results were observed in the dark, thus representing a genuine behavioral modulation, and were amplified in a lit arena. Our results demonstrate that the primary visual thalamus, beyond global modulations by sleep/awake states, is potentially involved in specific visuomotor integration and reveal two distinct couplings between up/down postures and neuronal activity.
Affiliation(s)
- Patrycja Orlowska-Feuer: University of Manchester, Faculty of Biology, Medicine and Health, School of Biological Science, Division of Neuroscience and Experimental Psychology, Oxford Road, M139PL Manchester, UK
- Aghileh S Ebrahimi: University of Manchester, Faculty of Biology, Medicine and Health, School of Biological Science, Division of Neuroscience and Experimental Psychology, Oxford Road, M139PL Manchester, UK
- Antonio G Zippo: Institute of Neuroscience, Consiglio Nazionale delle Ricerche, Via Raoul Follereau, 3, 20854 Vedano al Lambro, Italy
- Rasmus S Petersen: University of Manchester, Faculty of Biology, Medicine and Health, School of Biological Science, Division of Neuroscience and Experimental Psychology, Oxford Road, M139PL Manchester, UK
- Robert J Lucas: University of Manchester, Faculty of Biology, Medicine and Health, School of Biological Science, Division of Neuroscience and Experimental Psychology, Oxford Road, M139PL Manchester, UK
- Riccardo Storchi: University of Manchester, Faculty of Biology, Medicine and Health, School of Biological Science, Division of Neuroscience and Experimental Psychology, Oxford Road, M139PL Manchester, UK
32. Distinguishing externally from saccade-induced motion in visual cortex. Nature 2022; 610:135-142. [PMID: 36104560; PMCID: PMC9534749; DOI: 10.1038/s41586-022-05196-w]
Abstract
Distinguishing sensory stimuli caused by changes in the environment from those caused by an animal’s own actions is a hallmark of sensory processing. Saccades are rapid eye movements that shift the image on the retina. How visual systems differentiate motion of the image induced by saccades from actual motion in the environment is not fully understood. Here we discovered that in mouse primary visual cortex (V1) the two types of motion evoke distinct activity patterns. This is because, during saccades, V1 combines the visual input with a strong non-visual input arriving from the thalamic pulvinar nucleus. The non-visual input triggers responses that are specific to the direction of the saccade and the visual input triggers responses that are specific to the direction of the shift of the stimulus on the retina, yet the preferred directions of these two responses are uncorrelated. Thus, the pulvinar input ensures differential V1 responses to external and self-generated motion. Integration of external sensory information with information about body movement may be a general mechanism for sensory cortices to distinguish between self-generated and external stimuli. Distinct activity patterns in the primary visual cortex distinguish movement in the environment from motion caused by eye movements.
33. Avitan L, Stringer C. Not so spontaneous: Multi-dimensional representations of behaviors and context in sensory areas. Neuron 2022; 110:3064-3075. [PMID: 35863344; DOI: 10.1016/j.neuron.2022.06.019]
Abstract
Sensory areas are spontaneously active in the absence of sensory stimuli. This spontaneous activity has long been studied; however, its functional role remains largely unknown. Recent advances in technology, allowing large-scale neural recordings in the awake and behaving animal, have transformed our understanding of spontaneous activity. Studies using these recordings have discovered high-dimensional spontaneous activity patterns, correlation between spontaneous activity and behavior, and dissimilarity between spontaneous and sensory-driven activity patterns. These findings are supported by evidence from developing animals, where a transition toward these characteristics is observed as the circuit matures, as well as by evidence from mature animals across species. These newly revealed characteristics call for the formulation of a new role for spontaneous activity in neural sensory computation.
Affiliation(s)
- Lilach Avitan: Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
34. Alexander AS, Tung JC, Chapman GW, Conner AM, Shelley LE, Hasselmo ME, Nitz DA. Adaptive integration of self-motion and goals in posterior parietal cortex. Cell Rep 2022; 38:110504. [PMID: 35263604; PMCID: PMC9026715; DOI: 10.1016/j.celrep.2022.110504]
Abstract
Rats readily switch between foraging and more complex navigational behaviors such as pursuit of other rats or prey. These tasks require vastly different tracking of multiple behaviorally significant variables including self-motion state. To explore whether navigational context modulates self-motion tracking, we examined self-motion tuning in posterior parietal cortex neurons during foraging versus visual target pursuit. Animals performing the pursuit task demonstrate predictive processing of target trajectories by anticipating and intercepting them. Relative to foraging, pursuit yields multiplicative gain modulation of self-motion tuning and enhances self-motion state decoding. Self-motion sensitivity in parietal cortex neurons is, on average, history dependent regardless of behavioral context, but the temporal window of self-motion integration extends during target pursuit. Finally, many self-motion-sensitive neurons conjunctively track the visual target position relative to the animal. Thus, posterior parietal cortex functions to integrate the location of navigationally relevant target stimuli into an ongoing representation of past, present, and future locomotor trajectories.
Affiliation(s)
- Andrew S Alexander: Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA; Center for Systems Neuroscience, Department of Psychological and Brain Sciences, Boston University, 610 Commonwealth Avenue, Boston, MA 02215, USA
- Janet C Tung: Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
- G William Chapman: Center for Systems Neuroscience, Department of Psychological and Brain Sciences, Boston University, 610 Commonwealth Avenue, Boston, MA 02215, USA
- Allison M Conner: Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
- Laura E Shelley: Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
- Michael E Hasselmo: Center for Systems Neuroscience, Department of Psychological and Brain Sciences, Boston University, 610 Commonwealth Avenue, Boston, MA 02215, USA
- Douglas A Nitz: Department of Cognitive Science, University of California, San Diego, La Jolla, CA 92093, USA
35. Hierarchical and nonhierarchical features of the mouse visual cortical network. Nat Commun 2022; 13:503. [PMID: 35082302; PMCID: PMC8791996; DOI: 10.1038/s41467-022-28035-y]
Abstract
Neocortical computations underlying vision are performed by a distributed network of functionally specialized areas. Mouse visual cortex, a dense interareal network that exhibits hierarchical properties, comprises subnetworks interconnecting distinct processing streams. To determine the layout of the mouse visual hierarchy, we have evaluated the laminar patterns formed by interareal axonal projections originating in each of ten areas. Reciprocally connected pairs of areas exhibit feedforward/feedback relationships consistent with a hierarchical organization. Beta regression analyses, which estimate a continuous hierarchical distance measure, indicate that the network comprises multiple nonhierarchical circuits embedded in a hierarchical organization of overlapping levels. Single-unit recordings in anaesthetized mice show that receptive field sizes are generally consistent with the hierarchy, with the ventral stream exhibiting a stricter hierarchy than the dorsal stream. Together, the results provide an anatomical metric for hierarchical distance, and reveal both hierarchical and nonhierarchical motifs in mouse visual cortex. Mouse visual cortex is a dense, interconnected network of distinct areas. D’Souza et al. identify an anatomical index to quantify the hierarchical nature of pathways, and highlight the hierarchical and nonhierarchical features of the network.
36. Nestvogel DB, McCormick DA. Visual thalamocortical mechanisms of waking state-dependent activity and alpha oscillations. Neuron 2022; 110:120-138.e4. [PMID: 34687663; PMCID: PMC8815448; DOI: 10.1016/j.neuron.2021.10.005]
Abstract
The brain exhibits distinct patterns of recurrent activity closely related to behavioral state. The neural mechanisms that underlie state-dependent activity in the awake animal are incompletely understood. Here, we demonstrate that two types of state-dependent activity, rapid arousal/movement-related signals and a 3-5 Hz alpha-like rhythm, in the primary visual cortex (V1) of mice strongly correlate with activity in the visual thalamus. Inactivation of V1 does not interrupt arousal/movement signals in most visual thalamic neurons, but it abolishes the 3-5 Hz oscillation. Silencing of the visual thalamus similarly eradicates the alpha-like rhythm and perturbs arousal/movement-related activation in V1. Intracellular recordings in thalamic neurons reveal the 3-5 Hz oscillation to be associated with rhythmic low-threshold Ca2+ spikes. Our results indicate that thalamocortical interactions through ionotropic signaling, together with cell-intrinsic properties of thalamocortical cells, play a crucial role in shaping state-dependent activity in V1 of the awake animal.
Affiliation(s)
- David A McCormick: Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
37. Hennestad E, Witoelar A, Chambers AR, Vervaeke K. Mapping vestibular and visual contributions to angular head velocity tuning in the cortex. Cell Rep 2021; 37:110134. [PMID: 34936869; PMCID: PMC8721284; DOI: 10.1016/j.celrep.2021.110134]
Abstract
Neurons that signal the angular velocity of head movements (AHV cells) are important for processing visual and spatial information. However, it has been challenging to isolate the sensory modality that drives them and to map their cortical distribution. To address this, we develop a method that enables rotating awake, head-fixed mice under a two-photon microscope in a visual environment. Starting in layer 2/3 of the retrosplenial cortex, a key area for vision and navigation, we find that 10% of neurons report angular head velocity (AHV). Their tuning properties depend on vestibular input with a smaller contribution of vision at lower speeds. Mapping the spatial extent, we find AHV cells in all cortical areas that we explored, including motor, somatosensory, visual, and posterior parietal cortex. Notably, the vestibular and visual contributions to AHV are area dependent. Thus, many cortical circuits have access to AHV, enabling a diverse integration with sensorimotor and cognitive information.
Affiliation(s)
- Eivind Hennestad: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
- Aree Witoelar: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
- Anna R Chambers: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
- Koen Vervaeke: Institute of Basic Medical Sciences, Section of Physiology, University of Oslo, Oslo, Norway
38. Keshavarzi S, Bracey EF, Faville RA, Campagner D, Tyson AL, Lenzi SC, Branco T, Margrie TW. Multisensory coding of angular head velocity in the retrosplenial cortex. Neuron 2021; 110:532-543.e9. [PMID: 34788632; PMCID: PMC8823706; DOI: 10.1016/j.neuron.2021.10.031]
Abstract
To successfully navigate the environment, animals depend on their ability to continuously track their heading direction and speed. Neurons that encode angular head velocity (AHV) are fundamental to this process, yet the contribution of various motion signals to AHV coding in the cortex remains elusive. By performing chronic single-unit recordings in the retrosplenial cortex (RSP) of the mouse and tracking the activity of individual AHV cells between freely moving and head-restrained conditions, we find that vestibular inputs dominate AHV signaling. Moreover, the addition of visual inputs onto these neurons increases the gain and signal-to-noise ratio of their tuning during active exploration. Psychophysical experiments and neural decoding further reveal that vestibular-visual integration increases the perceptual accuracy of angular self-motion and the fidelity of its representation by RSP ensembles. We conclude that while cortical AHV coding requires vestibular input, where possible, it also uses vision to optimize heading estimation during navigation.
Highlights:
- Angular head velocity (AHV) coding is widespread in the retrosplenial cortex (RSP)
- AHV cells maintain their tuning during passive motion and require vestibular input
- The perception of angular self-motion is improved when visual cues are present
- AHV coding is similarly improved when both vestibular and visual stimuli are used
Affiliation(s)
- Sepiedeh Keshavarzi: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Edward F Bracey: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Richard A Faville: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Dario Campagner: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom; Gatsby Computational Neuroscience Unit, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Adam L Tyson: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Stephen C Lenzi: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Tiago Branco: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
- Troy W Margrie: Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), 25 Howland Street, London W1T 4JG, United Kingdom
39. Fayat R, Delgado Betancourt V, Goyallon T, Petremann M, Liaudet P, Descossy V, Reveret L, Dugué GP. Inertial Measurement of Head Tilt in Rodents: Principles and Applications to Vestibular Research. Sensors (Basel) 2021; 21:6318. [PMID: 34577524; PMCID: PMC8472891; DOI: 10.3390/s21186318]
Abstract
Inertial sensors are increasingly used in rodent research, in particular for estimating head orientation relative to gravity, or head tilt. Despite this growing interest, the accuracy of tilt estimates computed from rodent head inertial data has never been assessed. Using readily available inertial measurement units mounted onto the head of freely moving rats, we benchmarked a set of tilt estimation methods against concurrent 3D optical motion capture. We show that, while low-pass filtered head acceleration signals only provided reliable tilt estimates in static conditions, sensor calibration combined with an appropriate choice of orientation filter and parameters could yield average tilt estimation errors below 1.5° during movement. We then illustrate an application of inertial head tilt measurements in a preclinical rat model of unilateral vestibular lesion and propose a set of metrics describing the severity of associated postural and motor symptoms and the time course of recovery. We conclude that headborne inertial sensors are an attractive tool for quantitative rodent behavioral analysis in general and for the study of vestibulo-postural functions in particular.
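As a rough illustration of the static-condition method this abstract describes (tilt from low-pass filtered acceleration), the sketch below recovers pitch and roll from a gravity-dominated accelerometer sample. It is a minimal sketch, not the paper's benchmarked pipeline: the function names, the filter constant, the axis convention, and units in g are all illustrative assumptions.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Pitch and roll (degrees) from one gravity-dominated accelerometer
    sample (in g). Only valid when the head is near-static, as the
    abstract notes."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def low_pass(samples, alpha=0.1):
    """First-order IIR low-pass filter isolating the slow (gravity)
    component of a stream of 3-axis accelerometer tuples. The constant
    alpha is an illustrative choice, not taken from the paper."""
    state = samples[0]
    out = []
    for s in samples:
        state = tuple(alpha * si + (1 - alpha) * st
                      for si, st in zip(s, state))
        out.append(state)
    return out

# Under this axis convention, a nose-up pitch of 30 degrees places
# -sin(30°) g on x and cos(30°) g on z:
pitch, roll = tilt_from_gravity(-0.5, 0.0, math.sqrt(3) / 2)  # pitch ≈ 30.0
```

During movement, linear acceleration contaminates the gravity estimate, which is why the paper combines calibrated gyroscope data through an orientation filter rather than relying on filtered acceleration alone.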
Affiliation(s)
- Romain Fayat
- Neurophysiologie des Circuits Cérébraux, Institut de Biologie de l’ENS (IBENS), Ecole Normale Supérieure, UMR CNRS 8197, INSERM U1024, Université PSL, 75005 Paris, France
- Laboratoire MAP5, UMR CNRS 8145, Université Paris Descartes, 75006 Paris, France
- V. Delgado Betancourt
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Thibault Goyallon
- Laboratoire Jean Kuntzmann, Université Grenoble Alpes, UMR CNRS 5224, INRIA, 38330 Montbonnot-Saint-Martin, France
- Mathieu Petremann
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Pauline Liaudet
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Vincent Descossy
- Preclinical Development, Sensorion SA, 34080 Montpellier, France
- Lionel Reveret
- Laboratoire Jean Kuntzmann, Université Grenoble Alpes, UMR CNRS 5224, INRIA, 38330 Montbonnot-Saint-Martin, France
- Guillaume P. Dugué
- Neurophysiologie des Circuits Cérébraux, Institut de Biologie de l’ENS (IBENS), Ecole Normale Supérieure, UMR CNRS 8197, INSERM U1024, Université PSL, 75005 Paris, France
40
Niell CM, Scanziani M. How Cortical Circuits Implement Cortical Computations: Mouse Visual Cortex as a Model. Annu Rev Neurosci 2021; 44:517-546. [PMID: 33914591] [PMCID: PMC9925090] [DOI: 10.1146/annurev-neuro-102320-085825]
Abstract
The mouse, as a model organism to study the brain, gives us unprecedented experimental access to the mammalian cerebral cortex. By determining the cortex's cellular composition, revealing the interaction between its different components, and systematically perturbing these components, we are obtaining mechanistic insight into some of the most basic properties of cortical function. In this review, we describe recent advances in our understanding of how circuits of cortical neurons implement computations, as revealed by the study of mouse primary visual cortex. Further, we discuss how studying the mouse has broadened our understanding of the range of computations performed by visual cortex. Finally, we address how future approaches will fulfill the promise of the mouse in elucidating fundamental operations of cortex.
Affiliation(s)
- Cristopher M. Niell
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, Oregon 97403, USA
- Massimo Scanziani
- Department of Physiology and Howard Hughes Medical Institute, University of California San Francisco, San Francisco, California 94158, USA
41
Flossmann T, Rochefort NL. Spatial navigation signals in rodent visual cortex. Curr Opin Neurobiol 2020; 67:163-173. [PMID: 33360769] [DOI: 10.1016/j.conb.2020.11.004]
Abstract
During navigation, animals integrate sensory information with body movements to guide actions. The impact of both navigational and movement-related signals on cortical visual information processing remains largely unknown. We review recent studies in awake rodents that have revealed navigation-related signals in the primary visual cortex (V1) including speed, distance travelled and head-orienting movements. Both cortical and subcortical inputs convey self-motion related information to V1 neurons: for example, top-down inputs from secondary motor and retrosplenial cortices convey information about head movements and spatial expectations. Within V1, subtypes of inhibitory neurons are critical for the integration of navigation-related and visual signals. We conclude with potential functional roles of navigation-related signals in V1 including gain control, motor error signals and predictive coding.
Affiliation(s)
- Tom Flossmann
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh, EH8 9XD, United Kingdom
- Nathalie L Rochefort
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh, EH8 9XD, United Kingdom; Simons Initiative for the Developing Brain, University of Edinburgh, Edinburgh, EH8 9XD, United Kingdom.
42
Pattadkal JJ, Priebe NJ. On the Rotations of the Cranial Spheres. Neuron 2020; 108:399-400. [PMID: 33181072] [DOI: 10.1016/j.neuron.2020.10.030]
Abstract
We integrate information from multiple sensory modalities and from ongoing plans to construct a perception of the world. In this issue of Neuron, Bouvier et al. (2020) and Guitchounts et al. (2020) examine the detailed circuitry that supports a flexible integration of head and visual signals in rodent primary visual cortex.
Affiliation(s)
- Jagruti J Pattadkal
- Center for Learning and Memory, Department of Neuroscience, The University of Texas at Austin, 2415 Speedway, Austin, TX 78712, USA
- Nicholas J Priebe
- Center for Learning and Memory, Department of Neuroscience, The University of Texas at Austin, 2415 Speedway, Austin, TX 78712, USA.
43
Moving around a visual scene. Nat Rev Neurosci 2020; 21:523. [DOI: 10.1038/s41583-020-0374-3]