1
Skyberg RJ, Niell CM. Natural visual behavior and active sensing in the mouse. Curr Opin Neurobiol 2024;86:102882. PMID: 38704868. DOI: 10.1016/j.conb.2024.102882.
Abstract
In the natural world, animals use vision for a wide variety of behaviors not reflected in most laboratory paradigms. Although mice have low-acuity vision, they use their vision for many natural behaviors, including predator avoidance, prey capture, and navigation. They also perform active sensing, moving their head and eyes to achieve behavioral goals and acquire visual information. These aspects of natural vision result in visual inputs and corresponding behavioral outputs that are outside the range of conventional vision studies but are essential aspects of visual function. Here, we review recent studies in mice that have tapped into natural behavior and active sensing to reveal the computational logic of neural circuits for vision.
Affiliation(s)
- Rolf J Skyberg
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA.
- Cristopher M Niell
- Department of Biology and Institute of Neuroscience, University of Oregon, Eugene OR 97403, USA.
2
Oesch LT, Ryan MB, Churchland AK. From innate to instructed: A new look at perceptual decision-making. Curr Opin Neurobiol 2024;86:102871. PMID: 38569230. PMCID: PMC11162954. DOI: 10.1016/j.conb.2024.102871.
Abstract
Understanding how subjects perceive sensory stimuli in their environment and use this information to guide appropriate actions is a major challenge in neuroscience. To study perceptual decision-making in animals, researchers use tasks that either probe spontaneous responses to stimuli (often described as "naturalistic") or train animals to associate stimuli with experimenter-defined responses. Spontaneous decisions rely on animals' pre-existing knowledge, while trained tasks offer greater versatility, albeit often at the cost of extensive training. Here, we review emerging approaches to investigate perceptual decision-making using both spontaneous and trained behaviors, highlighting their strengths and limitations. Additionally, we propose how trained decision-making tasks could be improved to achieve faster learning and a more generalizable understanding of task rules.
Affiliation(s)
- Lukas T Oesch
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Michael B Ryan
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States.
- Anne K Churchland
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States.
3
Ambrad Giovannetti E, Rancz E. Behind mouse eyes: The function and control of eye movements in mice. Neurosci Biobehav Rev 2024;161:105671. PMID: 38604571. DOI: 10.1016/j.neubiorev.2024.105671.
Abstract
The mouse visual system has become the most popular model to study the cellular and circuit mechanisms of sensory processing. However, the importance of eye movements only started to be appreciated recently. Eye movements provide a basis for predictive sensing and deliver insights into various brain functions and dysfunctions. A plethora of knowledge on the central control of eye movements and their role in perception and behaviour arose from work on primates. However, an overview of various eye movements in mice and a comparison to primates is missing. Here, we review the eye movement types described to date in mice and compare them to those observed in primates. We discuss the central neuronal mechanisms for their generation and control. Furthermore, we review the mounting literature on eye movements in mice during head-fixed and freely moving behaviours. Finally, we highlight gaps in our understanding and suggest future directions for research.
Affiliation(s)
- Ede Rancz
- INMED, INSERM, Aix-Marseille University, Marseille, France.
4
Singh VP, Li J, Mitchell J, Miller C. Active vision in freely moving marmosets using head-mounted eye tracking. bioRxiv 2024:2024.05.11.593707. PMID: 38766147. PMCID: PMC11100783. DOI: 10.1101/2024.05.11.593707.
Abstract
Our understanding of how vision functions as primates actively navigate the real world is remarkably sparse. As most data have been limited to chaired and typically head-restrained animals, the synergistic interactions of the different motor actions and plans inherent to active sensing (e.g. eyes, head, posture, locomotion) on visual perception are largely unknown. To address this considerable gap in knowledge, we developed an innovative wireless head-mounted eye-tracking system called CEREBRO for small mammals, such as marmoset monkeys. Our system performs Chair-free Eye-Recording using Backpack mounted micROcontrollers. Because eye illumination and environmental lighting change continuously in natural contexts, we developed a segmentation artificial neural network to perform robust pupil tracking in these conditions. Leveraging this innovative system to investigate active vision, we demonstrate that although freely moving marmosets exhibit frequent compensatory eye movements equivalent to those of other primates, including humans, the predictability of the visual system is enhanced when animals are freely moving relative to when they are head-fixed. Moreover, despite increases in eye and head motion during locomotion, gaze stabilization actually improved relative to periods when the monkeys were stationary. Rather than impairing vision, the dynamics of gaze stabilization in freely moving primates have been optimized over evolution to enable active sensing during natural exploration.
Affiliation(s)
- Vikram Pal Singh
- Cortical Systems & Behavior Lab, University of California San Diego
- Jingwen Li
- Cortical Systems & Behavior Lab, University of California San Diego
- Jude Mitchell
- Department of Brain and Cognitive Science, University of Rochester
- Cory Miller
- Cortical Systems & Behavior Lab, University of California San Diego
- Neurosciences Graduate Program, University of California San Diego
5
Clayton KK, Stecyk KS, Guo AA, Chambers AR, Chen K, Hancock KE, Polley DB. Sound elicits stereotyped facial movements that provide a sensitive index of hearing abilities in mice. Curr Biol 2024;34:1605-1620.e5. PMID: 38492568. PMCID: PMC11043000. DOI: 10.1016/j.cub.2024.02.057.
Abstract
Sound elicits rapid movements of muscles in the face, ears, and eyes that protect the body from injury and trigger brain-wide internal state changes. Here, we performed quantitative facial videography from mice resting atop a piezoelectric force plate and observed that broadband sounds elicited rapid and stereotyped facial twitches. Facial motion energy (FME) adjacent to the whisker array was 30 dB more sensitive than the acoustic startle reflex and offered greater inter-trial and inter-animal reliability than sound-evoked pupil dilations or movement of other facial and body regions. FME tracked the low-frequency envelope of broadband sounds, providing a means to study behavioral discrimination of complex auditory stimuli, such as speech phonemes in noise. Approximately 25% of layer 5-6 units in the auditory cortex (ACtx) exhibited firing rate changes during facial movements. However, FME facilitation during ACtx photoinhibition indicated that sound-evoked facial movements were mediated by a midbrain pathway and modulated by descending corticofugal input. FME and auditory brainstem response (ABR) thresholds were closely aligned after noise-induced sensorineural hearing loss, yet FME growth slopes were disproportionately steep at spared frequencies, reflecting a central plasticity that matched commensurate changes in ABR wave 4. Sound-evoked facial movements were also hypersensitive in Ptchd1 knockout mice, highlighting the use of FME for identifying sensory hyper-reactivity phenotypes after adult-onset hyperacusis and inherited deficiencies in autism risk genes. These findings present a sensitive and integrative measure of hearing while also highlighting that even low-intensity broadband sounds can elicit a complex mixture of auditory, motor, and reafferent somatosensory neural activity.
Affiliation(s)
- Kameron K Clayton
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA.
- Kamryn S Stecyk
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Anna A Guo
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA
- Anna R Chambers
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Ke Chen
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Kenneth E Hancock
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
- Daniel B Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA 02114, USA; Department of Otolaryngology-Head and Neck Surgery, Harvard Medical School, Boston, MA 02114, USA
6
Li J, Aoi MC, Miller CT. Representing the dynamics of natural marmoset vocal behaviors in frontal cortex. bioRxiv 2024:2024.03.17.585423. PMID: 38559173. PMCID: PMC10979968. DOI: 10.1101/2024.03.17.585423.
Abstract
Here we tested the respective contributions of primate premotor and prefrontal cortex to supporting vocal behavior. We applied a model-based GLM analysis, which better accounts for the inherent variance in natural, continuous behaviors, to characterize the activity of neurons throughout frontal cortex as freely moving marmosets engaged in conversational exchanges. While the analyses revealed functional clusters of neural activity related to the different processes involved in vocal behavior, these clusters did not map onto subfields of prefrontal or premotor cortex, as has been observed in more conventional task-based paradigms. Our results suggest a distributed functional organization for the myriad neural mechanisms underlying natural social interactions and have implications for our concepts of the role that frontal cortex plays in governing ethological behaviors in primates.
7
Megemont M, Tortorelli LS, McBurney-Lin J, Cohen JY, O'Connor DH, Yang H. Simultaneous recordings of pupil size variation and locus coeruleus activity in mice. STAR Protoc 2024;5:102785. PMID: 38127625. PMCID: PMC10772391. DOI: 10.1016/j.xpro.2023.102785.
Abstract
An extensive literature describes how pupil size reflects neuromodulatory activity, including the noradrenergic system. Here, we present a protocol for the simultaneous recording of optogenetically identified locus coeruleus (LC) units and pupil diameter in mice under different conditions. We describe steps for building an optrode, performing surgery to implant the optrode and headpost, searching for opto-tagged LC units, and performing dual LC-pupil recording. We then detail procedures for data processing and analysis. For complete details on the use and execution of this protocol, please refer to Megemont et al.1.
Affiliation(s)
- Marine Megemont
- Department of Molecular, Cell and Systems Biology, University of California, Riverside, Riverside, CA 92521, USA.
- Lucas S Tortorelli
- Department of Molecular, Cell and Systems Biology, University of California, Riverside, Riverside, CA 92521, USA
- Jim McBurney-Lin
- Department of Molecular, Cell and Systems Biology, University of California, Riverside, Riverside, CA 92521, USA; Neuroscience Graduate Program, University of California, Riverside, Riverside, CA 92521, USA
- Jeremiah Y Cohen
- Solomon H. Snyder Department of Neuroscience & Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA
- Daniel H O'Connor
- Solomon H. Snyder Department of Neuroscience & Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA
- Hongdian Yang
- Department of Molecular, Cell and Systems Biology, University of California, Riverside, Riverside, CA 92521, USA; Neuroscience Graduate Program, University of California, Riverside, Riverside, CA 92521, USA.
8
Parker PRL, Martins DM, Leonard ESP, Casey NM, Sharp SL, Abe ETT, Smear MC, Yates JL, Mitchell JF, Niell CM. A dynamic sequence of visual processing initiated by gaze shifts. Nat Neurosci 2023;26:2192-2202. PMID: 37996524. DOI: 10.1038/s41593-023-01481-7.
Abstract
Animals move their head and eyes as they explore the visual scene. Neural correlates of these movements have been found in rodent primary visual cortex (V1), but their sources and computational roles are unclear. We addressed this by combining head and eye movement measurements with neural recordings in freely moving mice. V1 neurons responded primarily to gaze shifts, where head movements are accompanied by saccadic eye movements, rather than to head movements where compensatory eye movements stabilize gaze. A variety of activity patterns followed gaze shifts and together these formed a temporal sequence that was absent in darkness. Gaze-shift responses resembled those evoked by sequentially flashed stimuli, suggesting a large component corresponds to onset of new visual input. Notably, neurons responded in a sequence that matches their spatial frequency bias, consistent with coarse-to-fine processing. Recordings in freely gazing marmosets revealed a similar sequence following saccades, also aligned to spatial frequency preference. Our results demonstrate that active vision in both mice and marmosets consists of a dynamic temporal sequence of neural activity associated with visual sampling.
Affiliation(s)
- Philip R L Parker
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Behavioral and Systems Neuroscience, Department of Psychology, Rutgers University, New Brunswick, NJ, USA
- Dylan M Martins
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Emmalyn S P Leonard
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Nathan M Casey
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Shelby L Sharp
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Elliott T T Abe
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA
- Matthew C Smear
- Institute of Neuroscience and Department of Psychology, University of Oregon, Eugene, OR, USA
- Jacob L Yates
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Jude F Mitchell
- Department of Brain and Cognitive Sciences and Center for Visual Sciences, University of Rochester, Rochester, NY, USA.
- Cristopher M Niell
- Institute of Neuroscience and Department of Biology, University of Oregon, Eugene, OR, USA.
9
Solbach MD, Tsotsos JK. The psychophysics of human three-dimensional active visuospatial problem-solving. Sci Rep 2023;13:19967. PMID: 37968501. PMCID: PMC10651907. DOI: 10.1038/s41598-023-47188-4.
Abstract
Our understanding of how visual systems detect, analyze and interpret visual stimuli has advanced greatly. However, the visual systems of all animals do much more; they enable visual behaviours. How well the visual system performs while interacting with the visual environment and how vision is used in the real world is far from fully understood, especially in humans. It has been suggested that comparison is the most primitive of psychophysical tasks. Thus, as a probe into these active visual behaviours, we use a same-different task: Are two physical 3D objects visually the same? This task is a fundamental cognitive ability. We pose this question to human subjects who are free to move about and examine two real objects in a physical 3D space. The experimental design is such that all behaviours are directed to viewpoint change. Without any training, our participants achieved a mean accuracy of 93.82%. No learning effect was observed on accuracy after many trials, but some effect was seen for response time, number of fixations and extent of head movement. Our probe task, even though easily executed at high-performance levels, uncovered a surprising variety of complex strategies for viewpoint control, suggesting that solutions were developed dynamically and deployed in a seemingly directed hypothesize-and-test manner tailored to the specific task. Subjects need not acquire task-specific knowledge; instead, they formulate effective solutions right from the outset, and as they engage in a series of attempts, those solutions progressively refine, becoming more efficient without compromising accuracy.
Affiliation(s)
- Markus D Solbach
- Department of Electrical Engineering and Computer Science, York University, Toronto, ON, M3J 1P3, Canada.
- John K Tsotsos
- Department of Electrical Engineering and Computer Science, York University, Toronto, ON, M3J 1P3, Canada
10
Li JY, Glickfeld LL. Input-specific synaptic depression shapes temporal integration in mouse visual cortex. Neuron 2023;111:3255-3269.e6. PMID: 37543037. PMCID: PMC10592405. DOI: 10.1016/j.neuron.2023.07.003.
Abstract
Efficient sensory processing requires the nervous system to adjust to ongoing features of the environment. In primary visual cortex (V1), neuronal activity strongly depends on recent stimulus history. Existing models can explain effects of prolonged stimulus presentation but remain insufficient for explaining effects observed after shorter durations commonly encountered under natural conditions. We investigated the mechanisms driving adaptation in response to brief (100 ms) stimuli in L2/3 V1 neurons by performing in vivo whole-cell recordings to measure membrane potential and synaptic inputs. We find that rapid adaptation is generated by stimulus-specific suppression of excitatory and inhibitory synaptic inputs. Targeted optogenetic experiments reveal that these synaptic effects are due to input-specific short-term depression of transmission between layers 4 and 2/3. Thus, brief stimulus presentation engages a distinct adaptation mechanism from that previously reported in response to prolonged stimuli, enabling flexible control of sensory encoding across a wide range of timescales.
Affiliation(s)
- Jennifer Y Li
- Department of Neurobiology, Duke University Medical Center, Durham, NC 27701, USA
- Lindsey L Glickfeld
- Department of Neurobiology, Duke University Medical Center, Durham, NC 27701, USA.
11
Pennartz CMA, Oude Lohuis MN, Olcese U. How 'visual' is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception. Philos Trans R Soc Lond B Biol Sci 2023;378:20220336. PMID: 37545313. PMCID: PMC10404929. DOI: 10.1098/rstb.2022.0336.
Abstract
The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude the possibility that the visual cortex processes more than retinal information alone, or that other brain structures contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this non-visual information is poorly understood. Here we review recent evidence, coming primarily from studies in rodents, arguing that non-visual and motor effects in the visual cortex play a role in visual processing itself, for instance by disentangling direct auditory effects on the visual cortex from the effects of sound-evoked orofacial movements. These findings are placed in a broader framework casting vision in terms of predictive processing under the control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is constructed exclusively by the visual cortical system, we propose that visual percepts are generated by a larger network, the extended visual system, spanning other sensory cortices, supramodal areas and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Cyriel M. A. Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Matthijs N. Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Champalimaud Research, Champalimaud Foundation, 1400-038 Lisbon, Portugal
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
12
Keshavarzi S, Velez-Fort M, Margrie TW. Cortical Integration of Vestibular and Visual Cues for Navigation, Visual Processing, and Perception. Annu Rev Neurosci 2023;46:301-320. PMID: 37428601. DOI: 10.1146/annurev-neuro-120722-100503.
Abstract
Despite increasing evidence of its involvement in several key functions of the cerebral cortex, the vestibular sense rarely enters our consciousness. Indeed, the extent to which these internal signals are incorporated within cortical sensory representation and how they might be relied upon for sensory-driven decision-making, during, for example, spatial navigation, is yet to be understood. Recent novel experimental approaches in rodents have probed both the physiological and behavioral significance of vestibular signals and indicate that their widespread integration with vision improves both the cortical representation and perceptual accuracy of self-motion and orientation. Here, we summarize these recent findings with a focus on cortical circuits involved in visual perception and spatial navigation and highlight the major remaining knowledge gaps. We suggest that vestibulo-visual integration reflects a process of constant updating regarding the status of self-motion, and access to such information by the cortex is used for sensory perception and predictions that may be implemented for rapid, navigation-related decision-making.
Affiliation(s)
- Sepiedeh Keshavarzi
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Mateo Velez-Fort
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
- Troy W Margrie
- The Sainsbury Wellcome Centre for Neural Circuits and Behavior, University College London, London, United Kingdom
13
Saleem AB, Busse L. Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023. PMID: 37380885. DOI: 10.1038/s41583-023-00716-7.
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
Affiliation(s)
- Aman B Saleem
- UCL Institute of Behavioural Neuroscience, Department of Experimental Psychology, University College London, London, UK.
- Laura Busse
- Division of Neuroscience, Faculty of Biology, LMU Munich, Munich, Germany.
- Bernstein Centre for Computational Neuroscience Munich, Munich, Germany.
14
Yates JL, Coop SH, Sarch GH, Wu RJ, Butts DA, Rucci M, Mitchell JF. Detailed characterization of neural selectivity in free viewing primates. Nat Commun 2023;14:3656. PMID: 37339973. PMCID: PMC10282080. DOI: 10.1038/s41467-023-38564-9.
Abstract
Fixation constraints in visual tasks are ubiquitous in visual and cognitive neuroscience. Despite its widespread use, fixation requires trained subjects, is limited by the accuracy of fixational eye movements, and ignores the role of eye movements in shaping visual input. To overcome these limitations, we developed a suite of hardware and software tools to study vision during natural behavior in untrained subjects. We measured visual receptive fields and tuning properties from multiple cortical areas of marmoset monkeys that freely viewed full-field noise stimuli. The resulting receptive fields and tuning curves from primary visual cortex (V1) and area MT match the selectivity reported in the literature, which was measured using conventional approaches. We then combined free viewing with high-resolution eye tracking to make the first detailed 2D spatiotemporal measurements of foveal receptive fields in V1. These findings demonstrate the power of free viewing for characterizing neural responses in untrained animals while simultaneously studying the dynamics of natural behavior.
Affiliation(s)
- Jacob L Yates
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA.
- Center for Visual Science, University of Rochester, Rochester, NY, USA.
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA.
- Herbert Wertheim School of Optometry and Vision Science, UC Berkeley, Berkeley, CA, USA.
| | - Shanna H Coop
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Neurobiology, Stanford University, Stanford, CA, USA
| | - Gabriel H Sarch
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Neuroscience Institute, Carnegie Mellon University, Pittsburgh, PA, USA
| | - Ruei-Jr Wu
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Institute of Optics, University of Rochester, Rochester, NY, USA
| | - Daniel A Butts
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
| | - Michele Rucci
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
| | - Jude F Mitchell
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
| |
|
15
|
Shaw L, Wang KH, Mitchell J. Fast prediction in marmoset reach-to-grasp movements for dynamic prey. Curr Biol 2023:S0960-9822(23)00662-0. [PMID: 37279754 DOI: 10.1016/j.cub.2023.05.032] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2022] [Revised: 03/31/2023] [Accepted: 05/15/2023] [Indexed: 06/08/2023]
Abstract
Primates have evolved sophisticated, visually guided reaching behaviors for interacting with dynamic objects, such as insects, during foraging.1,2,3,4,5 Reaching control in dynamic natural conditions requires active prediction of the target's future position to compensate for visuo-motor processing delays and to enhance online movement adjustments.6,7,8,9,10,11,12 Past reaching research in non-human primates has mainly focused on seated subjects making repeated ballistic arm movements to targets that are either stationary or change position instantaneously during the movement.13,14,15,16,17 Those approaches, however, impose task constraints that limit the natural dynamics of reaching. A recent field study highlighted predictive aspects of visually guided reaching during insect prey capture among wild marmosets.5 To examine the complementary dynamics of similar natural behavior in a laboratory context, we developed an ecologically motivated, unrestrained reach-to-grasp task involving live crickets. We used multiple high-speed video cameras to capture the movements of common marmosets (Callithrix jacchus) and crickets stereoscopically, and applied machine-vision algorithms for marker-free tracking of the hand and the prey. Contrary to estimates from traditional constrained reaching paradigms, we find that reaching for dynamic targets can operate at remarkably short visuo-motor delays of around 80 ms, rivaling speeds typical of the oculomotor system during closed-loop visual pursuit.18 Multivariate linear regression modeling of the kinematic relationship between hand and cricket velocities revealed that predictions of the target's expected future location can compensate for visuo-motor delays during fast reaching. These results suggest a critical role for visual prediction in facilitating online movement adjustments for dynamic prey.
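A visuo-motor delay of the kind reported above can in principle be estimated by lagged correlation between target and hand kinematics: the positive lag at which hand velocity best tracks target velocity. A minimal sketch under that assumption (our names, synthetic data; not the authors' regression model):

```python
import numpy as np

def visuomotor_delay(target_vel, hand_vel, dt):
    """Estimate the lag (in seconds) at which hand velocity best tracks
    target velocity, i.e., how far the hand trails the target.

    target_vel, hand_vel : (T,) velocity traces sampled every dt seconds
    """
    t = target_vel - target_vel.mean()
    h = hand_vel - hand_vel.mean()
    lags = np.arange(1, len(t) // 2)
    # Normalized correlation at each positive lag (hand lagging target).
    corr = [np.dot(t[:-lag], h[lag:]) /
            (np.linalg.norm(t[:-lag]) * np.linalg.norm(h[lag:]) + 1e-12)
            for lag in lags]
    return lags[int(np.argmax(corr))] * dt
```

At a 500 Hz sampling rate, a peak at a 40-sample lag would correspond to the ~80 ms delay regime the study describes.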
Affiliation(s)
- Luke Shaw
- Department of Neuroscience, University of Rochester Medical Center, Rochester, NY 14642, USA
| | - Kuan Hong Wang
- Department of Neuroscience, University of Rochester Medical Center, Rochester, NY 14642, USA.
| | - Jude Mitchell
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14611, USA.
| |
|
16
|
Ding Z, Fahey PG, Papadopoulos S, Wang EY, Celii B, Papadopoulos C, Kunin AB, Chang A, Fu J, Ding Z, Patel S, Ponder K, Muhammad T, Bae JA, Bodor AL, Brittain D, Buchanan J, Bumbarger DJ, Castro MA, Cobos E, Dorkenwald S, Elabbady L, Halageri A, Jia Z, Jordan C, Kapner D, Kemnitz N, Kinn S, Lee K, Li K, Lu R, Macrina T, Mahalingam G, Mitchell E, Mondal SS, Mu S, Nehoran B, Popovych S, Schneider-Mizell CM, Silversmith W, Takeno M, Torres R, Turner NL, Wong W, Wu J, Yin W, Yu SC, Froudarakis E, Sinz F, Seung HS, Collman F, da Costa NM, Reid RC, Walker EY, Pitkow X, Reimer J, Tolias AS. Functional connectomics reveals general wiring rule in mouse visual cortex. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.03.13.531369. [PMID: 36993398 PMCID: PMC10054929 DOI: 10.1101/2023.03.13.531369] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/20/2023]
Abstract
To understand how the brain computes, it is important to unravel the relationship between circuit connectivity and function. Previous research has shown that excitatory neurons in layer 2/3 of mouse primary visual cortex with similar response properties are more likely to form connections. However, the technical challenges of combining synaptic connectivity and functional measurements have limited these studies to a few, highly local connections. Utilizing the millimeter scale and nanometer resolution of the MICrONS dataset, we studied the connectivity-function relationship in excitatory neurons of the mouse visual cortex across interlaminar and interarea projections, assessing connection selectivity both at the coarse level of axon trajectories and at the fine level of synapse formation. A digital twin model of this mouse, which accurately predicted responses to arbitrary video stimuli, enabled a comprehensive characterization of each neuron's function. We found that neurons with highly correlated responses to natural videos tended to be connected with each other, not only within the same cortical area but also across multiple layers and visual areas, including feedforward and feedback connections; orientation preference, by contrast, did not predict connectivity. The digital twin model separated each neuron's tuning into a feature component (what the neuron responds to) and a spatial component (where the neuron's receptive field is located). We show that the feature component, but not the spatial component, predicted which neurons were connected at the fine synaptic scale. Together, our results demonstrate that the "like-to-like" connectivity rule generalizes to multiple connection types, and that the rich MICrONS dataset is suitable for further refining a mechanistic understanding of circuit structure and function.
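At its core, a "like-to-like" analysis relates a functional similarity measure for each candidate pair of neurons to whether a synapse was actually found between them. A schematic version with synthetic data (all names are ours; this is not the MICrONS analysis pipeline):

```python
import numpy as np

def connection_prob_by_similarity(resp_corr, connected, bins):
    """Empirical P(connected) as a function of pairwise response correlation.

    resp_corr : (n_pairs,) signal correlation for each candidate pair
    connected : (n_pairs,) bool, whether a synapse was found for the pair
    bins      : bin edges over correlation values
    Returns bin centers and the fraction of connected pairs in each bin.
    """
    idx = np.digitize(resp_corr, bins) - 1
    centers = 0.5 * (bins[:-1] + bins[1:])
    prob = np.array([connected[idx == b].mean() if np.any(idx == b) else np.nan
                     for b in range(len(bins) - 1)])
    return centers, prob
```

A rising curve of connection probability with response correlation is the signature of like-to-like wiring; a flat curve for some other tuning property (e.g., orientation preference) would indicate that that property does not predict connectivity.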
Affiliation(s)
- Zhuokun Ding
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Paul G Fahey
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Stelios Papadopoulos
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Eric Y Wang
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Brendan Celii
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Christos Papadopoulos
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Alexander B Kunin
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
- Department of Mathematics, Creighton University, Omaha, USA
| | - Andersen Chang
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Jiakun Fu
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Zhiwei Ding
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Saumil Patel
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Kayla Ponder
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Taliah Muhammad
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - J Alexander Bae
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Electrical and Computer Engineering Department, Princeton University, Princeton, USA
| | | | | | | | | | - Manuel A Castro
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Erick Cobos
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Sven Dorkenwald
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Computer Science Department, Princeton University, Princeton, USA
| | | | - Akhilesh Halageri
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Zhen Jia
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Computer Science Department, Princeton University, Princeton, USA
| | - Chris Jordan
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Dan Kapner
- Allen Institute for Brain Science, Seattle, USA
| | - Nico Kemnitz
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Sam Kinn
- Allen Institute for Brain Science, Seattle, USA
| | - Kisuk Lee
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Brain & Cognitive Sciences Department, Massachusetts Institute of Technology, Cambridge, USA
| | - Kai Li
- Computer Science Department, Princeton University, Princeton, USA
| | - Ran Lu
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Thomas Macrina
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Computer Science Department, Princeton University, Princeton, USA
| | | | - Eric Mitchell
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Shanka Subhra Mondal
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Electrical and Computer Engineering Department, Princeton University, Princeton, USA
| | - Shang Mu
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Barak Nehoran
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Computer Science Department, Princeton University, Princeton, USA
| | - Sergiy Popovych
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Computer Science Department, Princeton University, Princeton, USA
| | | | | | - Marc Takeno
- Allen Institute for Brain Science, Seattle, USA
| | | | - Nicholas L Turner
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Computer Science Department, Princeton University, Princeton, USA
| | - William Wong
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Jingpeng Wu
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Wenjing Yin
- Allen Institute for Brain Science, Seattle, USA
| | - Szi-Chieh Yu
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | - Emmanouil Froudarakis
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
- Department of Basic Sciences, Faculty of Medicine, University of Crete, Heraklion, Greece
| | - Fabian Sinz
- Institute for Bioinformatics and Medical Informatics, University Tübingen, Tübingen, Germany
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology Hellas, Heraklion, Greece
| | - H Sebastian Seung
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
| | | | | | - R Clay Reid
- Allen Institute for Brain Science, Seattle, USA
| | - Edgar Y Walker
- Department of Physiology and Biophysics, University of Washington, Seattle, USA
- Computational Neuroscience Center, University of Washington, Seattle, USA
| | - Xaq Pitkow
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
- Department of Electrical and Computer Engineering, Rice University, Houston, USA
| | - Jacob Reimer
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
| | - Andreas S Tolias
- Center for Neuroscience and Artificial Intelligence, Baylor College of Medicine, Houston, USA
- Department of Neuroscience, Baylor College of Medicine, Houston, USA
- Department of Electrical and Computer Engineering, Rice University, Houston, USA
| |
|
17
|
St-Amand D, Baker CL. Model-Based Approach Shows ON Pathway Afferents Elicit a Transient Decrease of V1 Responses. J Neurosci 2023; 43:1920-1932. [PMID: 36759194 PMCID: PMC10027028 DOI: 10.1523/jneurosci.1220-22.2023] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2022] [Revised: 01/29/2023] [Accepted: 01/30/2023] [Indexed: 02/11/2023] Open
Abstract
Neurons in the primary visual cortex (V1) receive excitation and inhibition from distinct parallel pathways that process lightness (ON) and darkness (OFF). V1 neurons overall respond more strongly to dark than to light stimuli, consistent with the preponderance of dark regions in natural images, as well as with human psychophysics. However, it has been unclear whether this "dark dominance" reflects more excitation from the OFF pathway or more inhibition from the ON pathway. To understand the mechanisms behind dark dominance, we record electrophysiological responses of individual simple-type V1 neurons to natural image stimuli and then train biologically inspired convolutional neural networks to predict the neurons' responses. Analysis of a sample of 71 neurons (in anesthetized, paralyzed cats of either sex) revealed their responses to be driven more by dark than by light stimuli, consistent with previous investigations. We show that this asymmetry is predominantly due to slower inhibition to dark stimuli rather than to stronger excitation from the thalamocortical OFF pathway. Consistent with dark-dominant neurons having faster responses than light-dominant neurons, we find that dark dominance occurs solely at the early latencies of neurons' responses. Neurons that are strongly dark-dominated also tend to be less orientation-selective. This approach gives new insight into the dark-dominance phenomenon and provides an avenue for addressing new questions about excitatory and inhibitory integration in cortical neurons. SIGNIFICANCE STATEMENT Neurons in the early visual cortex respond on average more strongly to dark than to light stimuli, but the mechanisms behind this bias have been unclear. Here we address this issue by combining single-unit electrophysiology with a novel machine-learning model to analyze neurons' responses to natural image stimuli in primary visual cortex. Using these techniques, we find slower inhibition to light than to dark stimuli to be the leading mechanism behind stronger dark responses. This slower inhibition to light might help explain other empirical findings, such as why orientation selectivity is weaker at earlier response latencies. These results demonstrate how imbalances between excitation and inhibition can give rise to response asymmetries in cortical neurons.
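The model-based approach, fitting a predictive model to single-neuron responses to natural images and then interrogating the fitted model, can be illustrated with a deliberately simplified linear stand-in for the paper's convolutional network (the ridge formulation and all names are ours, chosen only to make the idea concrete):

```python
import numpy as np

def fit_ln_model(images, rates, l2=10.0):
    """Ridge-regression estimate of a linear receptive field, with a
    rectifying output nonlinearity -- a toy stand-in for a fitted CNN.

    images : (N, P) flattened natural-image stimuli
    rates  : (N,) observed responses
    """
    mu = images.mean(axis=0)
    X = images - mu
    # Closed-form ridge solution: (X'X + l2*I) w = X'y.
    w = np.linalg.solve(X.T @ X + l2 * np.eye(X.shape[1]), X.T @ rates)

    def predict(imgs):
        # Rectified linear readout of the fitted filter.
        return np.maximum(0.0, (imgs - mu) @ w)

    return w, predict
```

Once such a model predicts held-out responses well, its internal structure (here, just the filter `w`; in the paper, separate excitatory and inhibitory model components) can be probed to ask mechanistic questions the raw data cannot answer directly.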
Affiliation(s)
- David St-Amand
- McGill Vision Research Unit, Department of Ophthalmology & Visual Sciences, McGill University, Montreal, Quebec H3G 1A4, Canada
| | - Curtis L Baker
- McGill Vision Research Unit, Department of Ophthalmology & Visual Sciences, McGill University, Montreal, Quebec H3G 1A4, Canada
| |
|
18
|
Li JY, Glickfeld LL. Input-specific synaptic depression shapes temporal integration in mouse visual cortex. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.01.30.526211. [PMID: 36778279 PMCID: PMC9915496 DOI: 10.1101/2023.01.30.526211] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 02/04/2023]
Abstract
Efficient sensory processing requires the nervous system to adjust to ongoing features of the environment. In primary visual cortex (V1), neuronal activity strongly depends on recent stimulus history. Existing models can explain effects of prolonged stimulus presentation, but remain insufficient for explaining effects observed after shorter durations commonly encountered under natural conditions. We investigated the mechanisms driving adaptation in response to brief (100 ms) stimuli in L2/3 V1 neurons by performing in vivo whole-cell recordings to measure membrane potential and synaptic inputs. We find that rapid adaptation is generated by stimulus-specific suppression of excitatory and inhibitory synaptic inputs. Targeted optogenetic experiments reveal that these synaptic effects are due to input-specific short-term depression of transmission between layers 4 and 2/3. Thus, distinct mechanisms are engaged following brief and prolonged stimulus presentation and together enable flexible control of sensory encoding across a wide range of time scales.
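The proposed mechanism, input-specific short-term depression of layer 4 to layer 2/3 transmission, is conventionally described by resource-depletion dynamics in the style of the Tsodyks-Markram model. A generic sketch in that spirit (parameters illustrative, not the authors' fits):

```python
import numpy as np

def depressing_synapse(spike_times, U=0.5, tau_rec=0.5):
    """Relative synaptic efficacy at each presynaptic spike under
    short-term depression.

    A fraction U of the available resources R is used per spike; resources
    recover exponentially toward 1 with time constant tau_rec (seconds).
    Returns the relative amplitude of each response.
    """
    R, last_t, amps = 1.0, None, []
    for t in spike_times:
        if last_t is not None:
            # Exponential recovery since the previous spike.
            R = 1.0 - (1.0 - R) * np.exp(-(t - last_t) / tau_rec)
        amps.append(U * R)
        R -= U * R  # deplete the resources just used
        last_t = t
    return np.array(amps)
```

Closely spaced spikes (e.g., a 100 ms test stimulus shortly after an adapter) arrive before resources recover and so evoke depressed responses, while long intervals restore the full amplitude, matching the rapid, stimulus-specific adaptation described above in broad outline.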
Affiliation(s)
- Jennifer Y Li
- Department of Neurobiology, Duke University Medical Center, Durham, NC 27701, USA
| | - Lindsey L Glickfeld
- Department of Neurobiology, Duke University Medical Center, Durham, NC 27701, USA
| |
|
19
|
Noel JP, Balzani E, Avila E, Lakshminarasimhan KJ, Bruni S, Alefantis P, Savin C, Angelaki DE. Coding of latent variables in sensory, parietal, and frontal cortices during closed-loop virtual navigation. eLife 2022; 11:80280. [PMID: 36282071 PMCID: PMC9668339 DOI: 10.7554/elife.80280] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/14/2022] [Accepted: 10/24/2022] [Indexed: 11/13/2022] Open
Abstract
We do not understand how neural nodes operate and coordinate within the recurrent action-perception loops that characterize naturalistic self-environment interactions. Here, we record single-unit spiking activity and local field potentials (LFPs) simultaneously from the dorsomedial superior temporal area (MSTd), parietal area 7a, and the dorsolateral prefrontal cortex (dlPFC) as monkeys navigate in virtual reality to 'catch fireflies'. This task requires animals to actively sample from a closed-loop virtual environment while concurrently computing continuous latent variables: (i) the distance and angle travelled (i.e., path integration) and (ii) the distance and angle to a memorized firefly location (i.e., a hidden spatial goal). We observed patterned mixed selectivity, with the prefrontal cortex most prominently coding for latent variables, the parietal cortex coding for sensorimotor variables, and MSTd most often coding for eye movements. However, even the area traditionally considered sensory (MSTd) tracked latent variables, demonstrating path integration and vector coding of hidden spatial goals. Further, global encoding profiles and unit-to-unit coupling (i.e., noise correlations) suggested a functional subnetwork composed of MSTd and dlPFC, rather than between either of these areas and 7a, as anatomy would suggest. We show that the greater the unit-to-unit coupling between MSTd and dlPFC, the more the animals' gaze position was indicative of the ongoing location of the hidden spatial goal. We suggest that this MSTd-dlPFC subnetwork reflects the monkeys' natural and adaptive task strategy, wherein they continuously gaze toward the location of the (invisible) target. Together, these results highlight the distributed nature of neural coding during closed action-perception loops and suggest that fine-grained functional subnetworks may be dynamically established to subserve (embodied) task strategies.
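The two latent variables in the firefly task, cumulative path and the egocentric vector to a memorized goal, follow directly from integrating self-motion. A schematic dead-reckoning sketch of that computation (our names; assumes planar motion from a known start pose, not the authors' analysis code):

```python
import numpy as np

def path_integrate(lin_vel, ang_vel, dt, goal_xy):
    """Dead-reckon position from self-motion and track a hidden goal.

    lin_vel, ang_vel : (T,) linear (m/s) and angular (rad/s) velocity
    dt               : sample interval in seconds
    goal_xy          : (x, y) of the memorized target in start coordinates
    Returns position plus the two latent variables: distance and egocentric
    angle to the goal at every time step.
    """
    heading = np.cumsum(ang_vel) * dt
    x = np.cumsum(lin_vel * np.cos(heading)) * dt
    y = np.cumsum(lin_vel * np.sin(heading)) * dt
    dx, dy = goal_xy[0] - x, goal_xy[1] - y
    dist_to_goal = np.hypot(dx, dy)
    angle_to_goal = np.arctan2(dy, dx) - heading
    return x, y, dist_to_goal, angle_to_goal
```

Because the firefly is invisible after its brief flash, `dist_to_goal` and `angle_to_goal` exist only as internally computed quantities, which is what makes their appearance in MSTd, 7a, and dlPFC activity evidence for latent-variable coding.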
Affiliation(s)
- Jean-Paul Noel
- Center for Neural Science, New York University, New York City, United States
| | - Edoardo Balzani
- Center for Neural Science, New York University, New York City, United States
| | - Eric Avila
- Center for Neural Science, New York University, New York City, United States
| | - Kaushik J Lakshminarasimhan
- Center for Neural Science, New York University, New York City, United States.,Center for Theoretical Neuroscience, Columbia University, New York, United States
| | - Stefania Bruni
- Center for Neural Science, New York University, New York City, United States
| | - Panos Alefantis
- Center for Neural Science, New York University, New York City, United States
| | - Cristina Savin
- Center for Neural Science, New York University, New York City, United States
| | - Dora E Angelaki
- Center for Neural Science, New York University, New York City, United States
| |
|