1
Bayer M, Zimmermann E. Serial dependencies in visual stability during self-motion. J Neurophysiol 2023;130:447-457. PMID: 37465870. DOI: 10.1152/jn.00157.2023.
Abstract
Every time we move our head, the brain must decide whether the displacement of the visual scene is the result of external or self-produced motion. Gaze shifts generate the biggest and most frequent disturbances of vision. Visual stability during gaze shifts is necessary both for dissociating self-produced from external motion and for retaining bodily balance. Here, we asked participants to perform an eye-head gaze shift to a target that was briefly presented in a head-mounted display. We manipulated the velocity of the scene displacement across trials such that the background moved either too fast or too slow in relation to the head movement speed. Participants were required to report whether they perceived the gaze-contingent visual motion as faster or slower than what they would expect from their head movement velocity. We found that the point of visual stability was attracted toward the velocity presented in the previous trial. Our data reveal that serial dependencies in visual stability calibrate the mapping between motor-related signals coding head movement velocity and visual motion velocity. Because the accuracy of this mapping is crucial for maintaining visual stability during self-motion, this calibration process likely supports it.
NEW & NOTEWORTHY We report that visual stability during self-motion is maintained by serial dependencies between the current and the previous gaze-contingent visual velocity experienced during a head movement. The gaze-contingent scene displacement velocity that appears normal to us thus depends on what we have registered in the recent history of gaze shifts. Serial dependencies provide an efficient means to maintain visual stability during self-motion.
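The attractive serial dependence reported in this abstract can be sketched as a one-back updating rule. The function name, the `attraction` gain, and the example velocities below are illustrative assumptions, not parameters from the paper:

```python
def predict_stability_point(baseline, prev_velocity, attraction=0.2):
    """Point of visual stability on the current trial, pulled toward the
    gaze-contingent scene velocity seen on the previous trial.
    `attraction` in [0, 1] is a hypothetical serial-dependence gain."""
    return baseline + attraction * (prev_velocity - baseline)

# Simulate a short trial sequence: the velocity that "looks stable"
# drifts toward whatever was shown on the preceding trial.
velocities = [1.4, 0.7, 1.0]   # scene velocity relative to head velocity
pse = 1.0                      # veridical point of stability
trace = []
for v in velocities:
    trace.append(pse)
    pse = predict_stability_point(1.0, v)
```

With a positive gain, a faster-than-normal previous trial shifts the current point of stability upward, and a slower one shifts it downward, which is the attraction pattern the study reports.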
Affiliation(s)
- Manuel Bayer
- Institute for Experimental Psychology, Heinrich-Heine-University Düsseldorf, Germany
- Eckart Zimmermann
- Institute for Experimental Psychology, Heinrich-Heine-University Düsseldorf, Germany
2
DiRisio GF, Ra Y, Qiu Y, Anzai A, DeAngelis GC. Neurons in Primate Area MSTd Signal Eye Movement Direction Inferred from Dynamic Perspective Cues in Optic Flow. J Neurosci 2023;43:1888-1904. PMID: 36725323. PMCID: PMC10027048. DOI: 10.1523/jneurosci.1885-22.2023.
Abstract
Smooth eye movements are common during natural viewing; we frequently rotate our eyes to track moving objects or to maintain fixation on an object during self-movement. Reliable information about smooth eye movements is crucial to various neural computations, such as estimating heading from optic flow or judging depth from motion parallax. While it is well established that extraretinal signals (e.g., efference copies of motor commands) carry critical information about eye velocity, the rotational optic flow field produced by eye rotations also carries valuable information. Although previous work has shown that dynamic perspective cues in optic flow can be used in computations that require estimates of eye velocity, it has remained unclear where and how the brain processes these visual cues and how they are integrated with extraretinal signals regarding eye rotation. We examined how neurons in the dorsal region of the medial superior temporal area (MSTd) of two male rhesus monkeys represent the direction of smooth pursuit eye movements based on both visual cues (dynamic perspective) and extraretinal signals. We find that most MSTd neurons have matched preferences for the direction of eye rotation based on visual and extraretinal signals. Moreover, neural responses to combinations of these signals are well predicted by a weighted linear summation model. These findings demonstrate a neural substrate for representing the velocity of smooth eye movements based on rotational optic flow and establish area MSTd as a key node for integrating visual and extraretinal signals into a more generalized representation of smooth eye movements.
SIGNIFICANCE STATEMENT We frequently rotate our eyes to smoothly track objects of interest during self-motion. Information about eye velocity is crucial for a variety of computations performed by the brain, including depth perception and heading perception. Traditionally, information about eye rotation has been thought to arise mainly from extraretinal signals, such as efference copies of motor commands. Previous work shows that eye velocity can also be inferred from rotational optic flow that accompanies smooth eye movements, but the neural origins of these visual signals about eye rotation have remained unknown. We demonstrate that macaque neurons signal the direction of smooth eye rotation based on visual signals, and that they integrate both visual and extraretinal signals regarding eye rotation in a congruent fashion.
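The weighted linear summation model mentioned in the abstract can be illustrated generically: the combined-cue response is predicted as a baseline plus a weighted sum of the single-cue responses. The weights, baseline, and tuning curves below are illustrative assumptions, not fitted values from the study:

```python
import numpy as np

def combined_response(r_visual, r_extraretinal, w_v=0.6, w_e=0.5, baseline=2.0):
    """Predicted firing rate when visual (dynamic perspective) and
    extraretinal pursuit signals are combined: a weighted linear sum
    of the single-cue responses plus a baseline rate."""
    return (baseline
            + w_v * np.asarray(r_visual, float)
            + w_e * np.asarray(r_extraretinal, float))

# Example: single-cue direction tuning for one hypothetical model neuron.
directions = np.linspace(0, 2 * np.pi, 8, endpoint=False)
r_vis = 10 * np.exp(np.cos(directions))        # visual (dynamic perspective) tuning
r_ext = 8 * np.exp(np.cos(directions - 0.2))   # extraretinal tuning, slightly offset
r_both = combined_response(r_vis, r_ext)
```

Because the model is linear, matched single-cue direction preferences (as found for most MSTd neurons) yield a combined tuning curve with the same preferred direction.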
Affiliation(s)
- Grace F DiRisio
- Department of Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, New York 14627
- Department of Neurobiology, University of Chicago, Chicago, Illinois 60637
- Yongsoo Ra
- Department of Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, New York 14627
- Department of Neurobiology, Harvard Medical School, Boston, Massachusetts 02115
- Yinghui Qiu
- Department of Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, New York 14627
- College of Veterinary Medicine, Cornell University, Ithaca, New York 14853-6401
- Akiyuki Anzai
- Department of Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, New York 14627
- Gregory C DeAngelis
- Department of Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, New York 14627
3
Chow HM, Spering M. Eye movements during optic flow perception. Vision Res 2023;204:108164. PMID: 36566560. DOI: 10.1016/j.visres.2022.108164.
Abstract
Optic flow is an important visual cue for human perception and locomotion and naturally triggers eye movements. Here we investigate whether the perception of optic flow direction is limited or enhanced by eye movements. In Exp. 1, 23 human observers localized the focus of expansion (FOE) of an optic flow pattern; in Exp. 2, 18 observers had to detect brief visual changes at the FOE. Both tasks were completed during free viewing and fixation conditions while eye movements were recorded. Task difficulty was varied by manipulating the coherence of radial motion from the FOE (4%-90%). During free viewing, observers tracked the optic flow pattern with a combination of saccades and smooth eye movements. During fixation, observers nevertheless made small-scale eye movements. Despite differences in spatial scale, eye movements during free viewing and fixation were similarly directed toward the FOE (saccades) and away from the FOE (smooth tracking). Whereas FOE localization sensitivity was not affected by eye movement instructions (Exp. 1), observers' sensitivity to detect brief changes at the FOE was 27% higher (p < .001) during free viewing compared to fixation (Exp. 2). This performance benefit was linked to reduced saccade endpoint errors, indicating the direct beneficial impact of foveating eye movements on performance in a fine-grain perceptual task, but not during coarse perceptual localization.
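FOE localization of the kind tested here can be illustrated with a least-squares fit to a pure expansion model, v = k * (p - FOE), which is linear in k and c = k * FOE. This is a generic sketch under that assumed model, not the stimulus-generation or analysis code used in the study:

```python
import numpy as np

def estimate_foe(points, flows):
    """Least-squares focus-of-expansion estimate for a pure radial
    flow field v = k * (p - foe). Solving for k and c = k * foe makes
    the model linear: v = k * p - c."""
    p = np.asarray(points, float)   # (N, 2) dot positions
    v = np.asarray(flows, float)    # (N, 2) flow vectors
    n = len(p)
    A = np.zeros((2 * n, 3))        # unknowns: [k, c_x, c_y]
    b = np.empty(2 * n)
    A[0::2, 0] = p[:, 0]            # v_x = k * p_x - c_x
    A[0::2, 1] = -1.0
    A[1::2, 0] = p[:, 1]            # v_y = k * p_y - c_y
    A[1::2, 2] = -1.0
    b[0::2] = v[:, 0]
    b[1::2] = v[:, 1]
    k, cx, cy = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([cx / k, cy / k])
```

With noiseless radial flow the fit recovers the FOE exactly; lowering motion coherence (as in the experiment) adds random vectors and degrades the estimate gracefully.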
Affiliation(s)
- Hiu Mei Chow
- Dept. of Psychology, St. Thomas University, Fredericton, Canada; Dept. of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada
- Miriam Spering
- Dept. of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada; Djavad Mowafaghian Center for Brain Health, University of British Columbia, Vancouver, Canada; Institute for Computing, Information and Cognitive Systems, University of British Columbia, Vancouver, Canada
4
Chen K, Beyeler M, Krichmar JL. Cortical Motion Perception Emerges from Dimensionality Reduction with Evolved Spike-Timing-Dependent Plasticity Rules. J Neurosci 2022;42:5882-5898. PMID: 35732492. PMCID: PMC9337611. DOI: 10.1523/jneurosci.0384-22.2022.
Abstract
The nervous system is under tight energy constraints and must represent information efficiently. This is particularly relevant in the dorsal part of the medial superior temporal area (MSTd) in primates, where neurons encode complex motion patterns to support a variety of behaviors. A sparse decomposition model based on a dimensionality reduction principle known as non-negative matrix factorization (NMF) was previously shown to account for a wide range of monkey MSTd visual response properties. This model resulted in sparse, parts-based representations that could be regarded as basis flow fields, a linear superposition of which accurately reconstructed the input stimuli. This model provided evidence that the seemingly complex response properties of MSTd may be a by-product of MSTd neurons performing dimensionality reduction on their input. However, an open question is how a neural circuit could carry out this function. In the current study, we propose a spiking neural network (SNN) model of MSTd based on evolved spike-timing-dependent plasticity and homeostatic synaptic scaling (STDP-H) learning rules. We demonstrate that the SNN model learns compressed and efficient representations of the input patterns similar to the patterns that emerge from NMF, resulting in MSTd-like receptive fields observed in monkeys. This SNN model suggests that STDP-H observed in the nervous system may be performing a similar function as NMF with sparsity constraints, which provides a test bed for mechanistic theories of how MSTd may efficiently encode complex patterns of visual motion to support robust self-motion perception.
SIGNIFICANCE STATEMENT The brain may use dimensionality reduction and sparse coding to efficiently represent stimuli under metabolic constraints. Neurons in monkey area MSTd respond to complex optic flow patterns resulting from self-motion. We developed a spiking neural network model showing that MSTd-like response properties can emerge from evolving the STDP-H parameters of the connections between the middle temporal (MT) area and MSTd. Simulated MSTd neurons formed a sparse, reduced population code capable of encoding perceptual variables important for self-motion perception. This model demonstrates that complex neuronal responses observed in MSTd may emerge from efficient coding and suggests that neurobiological plasticity, like STDP-H, may contribute to reducing the dimensions of input stimuli and allowing spiking neurons to learn sparse representations.
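The NMF decomposition that the SNN model is compared against can be sketched with the classic multiplicative updates of Lee and Seung. This is a generic implementation of NMF, not the authors' model code, and the matrix sizes below are arbitrary:

```python
import numpy as np

def nmf(V, rank, n_iter=1000, seed=0):
    """Factor a non-negative matrix V (stimuli x pixels) as V ~= W @ H,
    where rows of H act as non-negative "basis flow fields" and W holds
    the activation weights. Multiplicative updates keep both factors
    non-negative and monotonically reduce the Frobenius error."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 1e-3
    H = rng.random((rank, m)) + 1e-3
    eps = 1e-9  # guards against division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

On exactly low-rank non-negative data, the reconstruction `W @ H` approaches `V`; the non-negativity constraint is what produces the sparse, parts-based components the abstract describes.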
Affiliation(s)
- Michael Beyeler
- Departments of Computer Science and Psychological & Brain Sciences, University of California, Santa Barbara, California 93106
- Jeffrey L Krichmar
- Departments of Cognitive Sciences and Computer Science, University of California, Irvine, California 92697
5
Matthis JS, Muller KS, Bonnen KL, Hayhoe MM. Retinal optic flow during natural locomotion. PLoS Comput Biol 2022;18:e1009575. PMID: 35192614. PMCID: PMC8896712. DOI: 10.1371/journal.pcbi.1009575.
Abstract
We examine the structure of the visual motion projected on the retina during natural locomotion in real world environments. Bipedal gait generates a complex, rhythmic pattern of head translation and rotation in space, so without gaze stabilization mechanisms such as the vestibulo-ocular reflex (VOR) a walker's visually specified heading would vary dramatically throughout the gait cycle. The act of fixation on stable points in the environment nulls image motion at the fovea, resulting in stable patterns of outflow on the retinae centered on the point of fixation. These outflowing patterns retain a higher order structure that is informative about the stabilized trajectory of the eye through space. We measured this structure by applying curl and divergence operations to the retinal flow velocity vector fields and found features that may be valuable for the control of locomotion. In particular, the sign and magnitude of foveal curl in retinal flow specifies the body's trajectory relative to the gaze point, while the point of maximum divergence in the retinal flow field specifies the walker's instantaneous overground velocity/momentum vector in retinotopic coordinates. Assuming that walkers can determine the body position relative to gaze direction, these time-varying retinotopic cues for the body's momentum could provide a visual control signal for locomotion over complex terrain. In contrast, the temporal variation of the eye-movement-free, head-centered flow fields is large enough to be problematic for use in steering towards a goal. Consideration of optic flow in the context of real-world locomotion therefore suggests a re-evaluation of the role of optic flow in the control of action during natural behavior.
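The curl and divergence features described here can be computed with finite differences on a sampled flow field. A minimal sketch, in which the `[row, col] = [y, x]` array layout and unit grid spacing are assumptions:

```python
import numpy as np

def curl_divergence(vx, vy, spacing=1.0):
    """Scalar curl and divergence of a 2-D flow field sampled on a
    regular grid. vx, vy are arrays indexed [row, col] = [y, x]."""
    dvx_dy, dvx_dx = np.gradient(vx, spacing)  # gradients along (y, x)
    dvy_dy, dvy_dx = np.gradient(vy, spacing)
    curl = dvy_dx - dvx_dy   # rotational component of the flow
    div = dvx_dx + dvy_dy    # expansion component (outflow from a point)
    return curl, div
```

A pure expansion field has uniform positive divergence and zero curl; a pure rotation field has uniform curl and zero divergence, which is the separation the analysis exploits.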
Affiliation(s)
- Jonathan Samir Matthis
- Department of Biology, Northeastern University, Boston, Massachusetts, United States of America
- Karl S. Muller
- Center for Perceptual Systems, University of Texas at Austin, Austin, Texas, United States of America
- Kathryn L. Bonnen
- School of Optometry, Indiana University Bloomington, Bloomington, Indiana, United States of America
- Mary M. Hayhoe
- Center for Perceptual Systems, University of Texas at Austin, Austin, Texas, United States of America
6
Wild B, Treue S. Primate extrastriate cortical area MST: a gateway between sensation and cognition. J Neurophysiol 2021;125:1851-1882. PMID: 33656951. DOI: 10.1152/jn.00384.2020.
Abstract
Primate visual cortex consists of dozens of distinct brain areas, each providing a highly specialized component to the sophisticated task of encoding the incoming sensory information and creating a representation of our visual environment that underlies our perception and action. One such area is the medial superior temporal cortex (MST), a motion-sensitive, direction-selective part of the primate visual cortex. It receives most of its input from the middle temporal (MT) area, but MST cells have larger receptive fields and respond to more complex motion patterns. The finding that MST cells are tuned for optic flow patterns has led to the suggestion that the area plays an important role in the perception of self-motion. This hypothesis has received further support from studies showing that some MST cells also respond selectively to vestibular cues. Furthermore, the area is part of a network that controls the planning and execution of smooth pursuit eye movements, and its activity is modulated by cognitive factors, such as attention and working memory. This review of more than 90 studies focuses on providing clarity on the heterogeneous findings on MST in the macaque cortex and its putative homolog in the human cortex. From this analysis of its unique anatomical and functional position in the hierarchy of areas and processing steps in primate visual cortex, MST emerges as a gateway between perception, cognition, and action planning. Given this pivotal role, the area represents an ideal model system for studying the transition from sensation to cognition.
Affiliation(s)
- Benedict Wild
- Cognitive Neuroscience Laboratory, German Primate Center, Leibniz Institute for Primate Research, Goettingen, Germany; Goettingen Graduate Center for Neurosciences, Biophysics, and Molecular Biosciences (GGNB), University of Goettingen, Goettingen, Germany
- Stefan Treue
- Cognitive Neuroscience Laboratory, German Primate Center, Leibniz Institute for Primate Research, Goettingen, Germany; Faculty of Biology and Psychology, University of Goettingen, Goettingen, Germany; Leibniz-ScienceCampus Primate Cognition, Goettingen, Germany; Bernstein Center for Computational Neuroscience, Goettingen, Germany
7
Burlingham CS, Heeger DJ. Heading perception depends on time-varying evolution of optic flow. Proc Natl Acad Sci U S A 2020;117:33161-33169. PMID: 33328275. PMCID: PMC7776640. DOI: 10.1073/pnas.2022984117.
Abstract
There is considerable support for the hypothesis that perception of heading in the presence of rotation is mediated by instantaneous optic flow. This hypothesis, however, has never been tested. We introduce a method, termed "nonvarying phase motion," for generating a stimulus that conveys a single instantaneous optic flow field, even though the stimulus is presented for an extended period of time. In this experiment, observers viewed stimulus videos and performed a forced-choice heading discrimination task. For nonvarying phase motion, observers made large errors in heading judgments. This suggests that instantaneous optic flow is insufficient for heading perception in the presence of rotation. These errors were mostly eliminated when the velocity of phase motion was varied over time to convey the evolving sequence of optic flow fields corresponding to a particular heading. This demonstrates that heading perception in the presence of rotation relies on the time-varying evolution of optic flow. We hypothesize that the visual system accurately computes heading, despite rotation, based on optic acceleration, the temporal derivative of optic flow.
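The proposed cue, optic acceleration, is the temporal derivative of the optic flow sequence. A minimal finite-difference sketch, in which the `(T, H, W, 2)` array layout and the sampling interval `dt` are assumptions:

```python
import numpy as np

def optic_acceleration(flow, dt=1.0):
    """Temporal derivative of a sequence of optic flow fields.
    flow: array of shape (T, H, W, 2) holding (vx, vy) per pixel and frame."""
    return np.gradient(np.asarray(flow, float), dt, axis=0)

# "Nonvarying phase motion" conveys the same instantaneous flow field on
# every frame, so its optic acceleration is zero; flow that evolves over
# time (as during real self-motion with rotation) has nonzero acceleration.
base = np.ones((4, 4, 2))
constant = np.stack([base] * 5)                     # same field each frame
evolving = np.stack([t * base for t in range(5)])   # field grows over time
```

Under the paper's hypothesis, a heading estimator that reads out optic acceleration would receive no usable signal from the nonvarying stimulus, consistent with the large heading errors observed.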
Affiliation(s)
- David J Heeger
- Department of Psychology, New York University, New York, NY 10003
- Center for Neural Science, New York University, New York, NY 10003
8
The Effects of Depth Cues and Vestibular Translation Signals on the Rotation Tolerance of Heading Tuning in Macaque Area MSTd. eNeuro 2020;7:ENEURO.0259-20.2020. PMID: 33127626. PMCID: PMC7688306. DOI: 10.1523/eneuro.0259-20.2020.
Abstract
When the eyes rotate during translational self-motion, the focus of expansion (FOE) in optic flow no longer indicates heading, yet heading judgements are largely unbiased. Much emphasis has been placed on the role of extraretinal signals in compensating for the visual consequences of eye rotation. However, recent studies also support a purely visual mechanism of rotation compensation in heading-selective neurons. Computational theories support a visual compensatory strategy but require different visual depth cues. We examined the rotation tolerance of heading tuning in macaque area MSTd using two different virtual environments, a frontoparallel (2D) wall and a 3D cloud of random dots. Both environments contained rotational optic flow cues (i.e., dynamic perspective), but only the 3D cloud stimulus contained local motion parallax cues, which are required by some models. The 3D cloud environment did not enhance the rotation tolerance of heading tuning for individual MSTd neurons, nor the accuracy of heading estimates decoded from population activity, suggesting a key role for dynamic perspective cues. We also added vestibular translation signals to optic flow, to test whether rotation tolerance is enhanced by non-visual cues to heading. We found no benefit of vestibular signals overall, but a modest effect for some neurons with significant vestibular heading tuning. We also find that neurons with more rotation-tolerant heading tuning are typically less selective to pure visual rotation cues. Together, our findings help to clarify the types of information that are used to construct heading representations that are tolerant to eye rotations.
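Decoding heading from population activity, as done in this study, can be illustrated with a textbook population-vector readout: a rate-weighted circular mean of the neurons' preferred headings. This generic sketch (including the von Mises tuning in the example) is not the decoder used in the paper:

```python
import numpy as np

def population_vector_heading(preferred_deg, rates):
    """Decode heading azimuth (degrees) as the rate-weighted circular
    mean of each neuron's preferred heading."""
    theta = np.deg2rad(np.asarray(preferred_deg, float))
    r = np.asarray(rates, float)
    return np.rad2deg(np.arctan2(np.sum(r * np.sin(theta)),
                                 np.sum(r * np.cos(theta)))) % 360.0

# Example population: preferred headings spanning the circle, with
# von Mises-like tuning to a true heading of 45 degrees.
prefs = np.arange(0, 360, 10.0)
rates = np.exp(2.0 * np.cos(np.deg2rad(prefs - 45.0)))
```

With rotation-tolerant tuning, the same readout returns the correct heading whether or not simulated eye rotation is added to the flow; rotation-dependent tuning shifts would bias it.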
9
Kuang S, Deng H, Zhang T. Adaptive heading performance during self-motion perception. Psych J 2019;9:295-305. PMID: 31814320. DOI: 10.1002/pchj.330.
Abstract
Previous studies have documented that the perception of self-motion direction can be extracted from the patterns of image motion on the retina (also termed optic flow). Self-motion perception remains stable even when the optic-flow information is distorted by concurrent gaze shifts from body/eye rotations. This has been interpreted as evidence that extraretinal signals (efference copies of eye/body movements) are involved in compensating for retinal distortions. Here, we tested an alternative hypothesis to the extraretinal interpretation. We hypothesized that accurate self-motion perception can be achieved through a purely optic-flow-based visual strategy acquired through experience, independent of extraretinal mechanisms. To test this, we asked human subjects to perform a self-motion direction discrimination task under normal optic flow (fixation condition) or distorted optic flow resulting from either realistic (pursuit condition) or simulated (simulated condition) eye movements. The task was performed either without (pre- and posttraining) or with (during training) feedback about the correct answer. We first replicated the previous observation that before training, direction perception was greatly impaired in the simulated condition, where the optic flow was distorted and extraretinal eye movement signals were absent. We further showed that after a few training sessions, the initial impairment in direction perception gradually improved. These results reveal that behavioral training can drive the exploitation of retinal cues to compensate for the distortion, without any contribution from extraretinal signals. Our results suggest that self-motion perception is a flexible and adaptive process that might depend on neural plasticity in relevant cortical areas.
Affiliation(s)
- Shenbing Kuang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Hu Deng
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Tao Zhang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China