1
Gallagher M, Haynes JD, Culling JF, Freeman TCA. A model of audio-visual motion integration during active self-movement. J Vis 2025; 25:8. [PMID: 39969485] [PMCID: PMC11841688] [DOI: 10.1167/jov.25.2.8]
Abstract
Despite good evidence for optimal audio-visual integration in stationary observers, few studies have considered the impact of self-movement on this process. When the head and/or eyes move, the integration of vision and hearing is complicated, as the sensory measurements begin in different coordinate frames. To successfully integrate these signals, they must first be transformed into the same coordinate frame. We propose that audio and visual motion cues are separately transformed using self-movement signals, before being integrated as body-centered cues to audio-visual motion. We tested this hypothesis using a psychophysical audio-visual integration task in which participants made left/right judgments of audio, visual, or audio-visual targets during self-generated yaw head rotations. Estimates of precision and bias from the audio and visual conditions were used to predict performance in the audio-visual conditions. We found that audio-visual performance was predicted well by models that suggested the transformation of cues into common coordinates but could not be explained by a model that did not rely on coordinate transformation before integration. We also found that precision specifically was better predicted by a model that accounted for shared noise arising from signals encoding head movement. Taken together, our findings suggest that motion perception in active observers is based on the integration of partially correlated body-centered signals.
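A minimal sketch of the kind of precision-weighted integration the abstract describes, assuming Gaussian audio and visual estimates that have each been transformed into body-centred coordinates with a noisy head-movement signal; the variances, the shared head-signal noise, and the toy estimates are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

# Illustrative standard deviations (deg); not values from the paper.
sd_audio, sd_vis, sd_head = 6.0, 2.0, 3.0

# Transforming each cue into body-centred coordinates adds the same noisy
# head-movement signal, so the two body-centred estimates share that noise.
var_a = sd_audio**2 + sd_head**2
var_v = sd_vis**2 + sd_head**2
cov_av = sd_head**2                      # covariance from the shared head signal
Sigma = np.array([[var_a, cov_av],
                  [cov_av, var_v]])

# Minimum-variance combination of correlated estimates:
# w = Sigma^-1 1 / (1' Sigma^-1 1)
ones = np.ones(2)
w = np.linalg.solve(Sigma, ones)
w /= ones @ w
var_av = 1.0 / (ones @ np.linalg.solve(Sigma, ones))

x_audio, x_vis = 4.0, 1.0                # toy body-centred position estimates (deg)
x_av = w @ np.array([x_audio, x_vis])
print(w, x_av, np.sqrt(var_av))          # weights, combined estimate, combined SD
```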
Affiliation(s)
- Joshua D Haynes
- School of Psychology, Cardiff University, Cardiff, UK
- School of Health Sciences, University of Manchester, Manchester, UK
2
Masselink J, Lappe M. Visuomotor learning from postdictive motor error. eLife 2021; 10:64278. [PMID: 33687328] [PMCID: PMC8057815] [DOI: 10.7554/elife.64278]
Abstract
Sensorimotor learning adapts motor output to maintain movement accuracy. For saccadic eye movements, learning also alters space perception, suggesting a dissociation between the performed saccade and its internal representation derived from corollary discharge (CD). This is critical since learning is commonly believed to be driven by CD-based visual prediction error. We estimate the internal saccade representation through pre- and trans-saccadic target localization, showing that it decouples from the actual saccade during learning. We present a model that explains motor and perceptual changes by collective plasticity of spatial target percept, motor command, and a forward dynamics model that transforms CD from motor into visuospatial coordinates. We show that learning does not follow visual prediction error but instead a postdictive update of space after saccade landing. We conclude that trans-saccadic space perception guides motor learning via CD-based postdiction of motor error under the assumption of a stable world.
Affiliation(s)
- Jana Masselink
- Institute for Psychology and Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
- Markus Lappe
- Institute for Psychology and Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Münster, Germany
3
Garzorz IT, Freeman TCA, Ernst MO, MacNeilage PR. Insufficient compensation for self-motion during perception of object speed: The vestibular Aubert-Fleischl phenomenon. J Vis 2018; 18:9. [DOI: 10.1167/18.13.9]
Affiliation(s)
- Isabelle T. Garzorz
- German Center for Vertigo and Balance Disorders (DSGZ), University Hospital of Munich, Ludwig Maximilian University, Munich, Germany
- Graduate School of Systemic Neurosciences (GSN), Ludwig Maximilian University, Planegg-Martinsried, Germany
- Marc O. Ernst
- Applied Cognitive Psychology, Faculty for Computer Science, Engineering, and Psychology, Ulm University, Ulm, Germany
- Paul R. MacNeilage
- German Center for Vertigo and Balance Disorders (DSGZ), University Hospital of Munich, Ludwig Maximilian University, Munich, Germany
- Present address: Department of Psychology, Cognitive and Brain Sciences, University of Nevada, Reno, NV, USA
4
On the Mechanics of Immediate Corrections and Aftereffects in Prism Adaptation. Vision (Basel) 2017; 1:vision1040027. [PMID: 31740652] [PMCID: PMC6836038] [DOI: 10.3390/vision1040027]
Abstract
Prisms laterally shifting the perceived visual world cause arm movements to deviate from intended targets. The resulting error (the direct effect), for both pointing and throwing movements, usually corresponds to only around half of the prism's optical power, owing to an "immediate correction effect". We investigated the mechanisms of this immediate correction effect. In three experiments with 73 healthy subjects, we find that the immediate correction effect is associated with a head and/or eye rotation. Since these rotations are subconscious, they are not taken into account by the participants. These subconscious rotations compensate for a large portion of the prism's optical effect and change the subjective straight ahead. These movements seem to be induced only in a rich visual environment and hence do not take place in the dark. They correspond to the difference between the direct effect and the optical power of the prisms and seem to cause the immediate correction effect. Eye-hand adaptation therefore only adapts to the prism's optical power minus the unconscious head rotation, and so is much smaller than the optical power of the prisms.
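A toy arithmetic sketch of the account given above, with an assumed 10-degree prism and an assumed 5-degree subconscious head/eye rotation (purely illustrative values):

```python
# Illustrative values only; not the study's measurements.
prism_power = 10.0           # lateral optical displacement of the prisms (deg)
subconscious_rotation = 5.0  # head/eye rotation toward the shifted scene (deg)

# Direct effect: the pointing/throwing error seen when the prisms are first
# worn; the subconscious rotation already cancels part of the optical shift.
direct_effect = prism_power - subconscious_rotation

# Eye-hand adaptation only has to absorb what the rotation did not,
# so the adapted amount is smaller than the full optical power.
adapted_amount = prism_power - subconscious_rotation
print(direct_effect, adapted_amount)   # both ~ half the prism power here
```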
5
Furman M, Gur M. And yet it moves: Perceptual illusions and neural mechanisms of pursuit compensation during smooth pursuit eye movements. Neurosci Biobehav Rev 2012; 36:143-51. [DOI: 10.1016/j.neubiorev.2011.05.005]
6
Davies JR, Freeman TCA. Simultaneous adaptation to non-collinear retinal motion and smooth pursuit eye movement. Vision Res 2011; 51:1637-47. [PMID: 21605588] [DOI: 10.1016/j.visres.2011.05.004]
Abstract
Simultaneously adapting to retinal motion and non-collinear pursuit eye movement produces a motion aftereffect (MAE) that moves in a different direction to either of the individual adapting motions. Mack, Hill and Kahn (1989, Perception, 18, 649-655) suggested that the MAE was determined by the perceived motion experienced during adaptation. We tested the perceived-motion hypothesis by having observers report perceived direction during simultaneous adaptation. For both central and peripheral retinal motion adaptation, perceived direction did not predict the direction of subsequent MAE. To explain the findings we propose that the MAE is based on the vector sum of two components, one corresponding to a retinal MAE opposite to the adapting retinal motion and the other corresponding to an extra-retinal MAE opposite to the eye movement. A vector model of this component hypothesis showed that the MAE directions reported in our experiments were the result of an extra-retinal component that was substantially larger in magnitude than the retinal component when the adapting retinal motion was positioned centrally. However, when retinal adaptation was peripheral, the model suggested the magnitude of the components should be about the same. These predictions were tested in a final experiment that used a magnitude estimation technique. Contrary to the predictions, the results showed no interaction between type of adaptation (retinal or pursuit) and the location of adapting retinal motion. Possible reasons for the failure of the component hypothesis to fully explain the data are discussed.
Affiliation(s)
- J Rhys Davies
- School of Psychology, Tower Building, Park Place, Cardiff University, CF10 3AT, UK
7
Spering M, Montagnini A. Do we track what we see? Common versus independent processing for motion perception and smooth pursuit eye movements: A review. Vision Res 2011; 51:836-52. [DOI: 10.1016/j.visres.2010.10.017]
8
Tactile suppression of displacement. Exp Brain Res 2010; 206:299-310. [DOI: 10.1007/s00221-010-2407-z]
9
Haarmeier T, Kammer T. Effect of TMS on oculomotor behavior but not perceptual stability during smooth pursuit eye movements. Cereb Cortex 2010; 20:2234-43. [PMID: 20064941] [DOI: 10.1093/cercor/bhp285]
Abstract
During smooth pursuit eye movements, we do not mistake the shift of the retinal image induced by the visual background for motion of the world around us but instead perceive a stable world. The goal of this study was to search for the neuronal substrates providing perceptual stability. To this end, pursuit eye movements across a background stimulus and perceptual stability were measured with and without transcranial magnetic stimulation (TMS) applied to six different brain regions: primary visual cortex (V1), area MT+/V5, the left and right temporoparietal junctions (TPJs), medial parieto-occipital cortex (POC), and the lateral cerebellum (LC). Stimulation of MT+/V5 and the cerebellum induced significant decreases in pursuit gain independent of background presentation, whereas stimulation of TPJ impaired the suppression of the optokinetic reflex induced by background stimulation. In contrast to the changes in pursuit, only nonsignificant modifications in perceptual stability were observed. We conclude that MT+/V5, TPJ, and the LC contribute to pursuit eye movements and that TPJ supports the suppression of optokinesis. The lack of significant influences of TMS on perception suggests that motion perception invariance is not based on a localized network but rather on a highly distributed network featuring parallel processing.
Affiliation(s)
- Thomas Haarmeier
- Department of Cognitive Neurology and Department of General Neurology, Hertie-Institute for Clinical Brain Research, University of Tübingen, 72076 Tübingen, Germany.
10
Crapse TB, Sommer MA. Corollary discharge circuits in the primate brain. Curr Opin Neurobiol 2008; 18:552-7. [PMID: 18848626] [DOI: 10.1016/j.conb.2008.09.017]
Abstract
Movements are necessary to engage the world, but every movement results in sensorimotor ambiguity. Self-movements cause changes to sensory inflow as well as changes in the positions of objects relative to motor effectors (eyes and limbs). Hence the brain needs to monitor self-movements, and one way this is accomplished is by routing copies of movement commands to appropriate structures. These signals, known as corollary discharge (CD), enable compensation for sensory consequences of movement and preemptive updating of spatial representations. Such operations occur with a speed and accuracy that implies a reliance on prediction. Here we review recent CD studies and find that they arrive at a shared conclusion: CD contributes to prediction for the sake of sensorimotor harmony.
Affiliation(s)
- Trinity B Crapse
- Department of Neuroscience, A210 Langley Hall, Center for the Neural Basis of Cognition, and Center for Neuroscience at the University of Pittsburgh, University of Pittsburgh, Pittsburgh, PA 15260, USA.
11
Dicke PW, Chakraborty S, Thier P. Neuronal correlates of perceptual stability during eye movements. Eur J Neurosci 2008; 27:991-1002. [PMID: 18333969] [DOI: 10.1111/j.1460-9568.2008.06054.x]
Abstract
We are usually unaware of retinal image motion resulting from our own movement. For instance, during slow-tracking eye movements the world around us remains perceptually stable despite the retinal image slip induced by the eye movement. It is commonly held that this example of perceptual invariance is achieved by subtracting an internal reference signal, reflecting the eye movement, from the retinal motion signal. If the two cancel each other, visual objects that do not move will also be perceived as non-moving. If, however, the reference signal is too small or too large, a false eye movement-induced motion of the external world, the Filehne illusion, will be perceived. We have exploited our ability to manipulate the size of the reference signal in an attempt to identify neurons in the visual cortex of monkeys influenced by the percept of self-induced visual motion or the reference signal rather than the retinal motion signal. We report here that such 'percept-related' neurons can already be found in the primary visual cortex, although few in number. They become more frequent in areas middle temporal and medial superior temporal in the superior temporal sulcus, and comprise almost 50% of all neurons in area visual posterior sylvian (VPS) in the posterior part of the lateral sulcus. In summary, our findings suggest that our ability to perceive a visual world that is stable despite self-motion is based on a neuronal network that culminates in the VPS, located in the lateral sulcus below the classical dorsal stream of visual processing.
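A minimal sketch of the reference-signal account summarised above; the sign convention (retinal slip = world velocity minus eye velocity) and the gain values are illustrative assumptions. With a unity-gain reference signal the slip and the reference cancel; with a too-small reference the stationary background appears to drift against the pursuit (the Filehne illusion).

```python
def perceived_world_motion(eye_velocity, world_velocity, ref_gain=1.0):
    """Perceived external motion once the internal reference signal has
    compensated the eye movement's contribution to retinal image slip."""
    retinal_slip = world_velocity - eye_velocity   # image motion on the retina
    reference = ref_gain * eye_velocity            # internal copy of the eye movement
    return retinal_slip + reference                # cancels exactly when ref_gain == 1

# Stationary background during 10 deg/s pursuit:
print(perceived_world_motion(10.0, 0.0, ref_gain=1.0))   # 0.0 -> perceived as stable
print(perceived_world_motion(10.0, 0.0, ref_gain=0.8))   # -2.0 -> Filehne-like drift
```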
Affiliation(s)
- Peter W Dicke
- Center for Neurology, Hertie Institute for Clinical Brain Research, Department of Cognitive Neurology, University of Tuebingen, Otfried-Mueller-Str. 27, 72076 Tuebingen, Germany.
12
Trenner MU, Fahle M, Fasold O, Heekeren HR, Villringer A, Wenzel R. Human cortical areas involved in sustaining perceptual stability during smooth pursuit eye movements. Hum Brain Mapp 2008; 29:300-11. [PMID: 17415782] [PMCID: PMC6870627] [DOI: 10.1002/hbm.20387]
Abstract
Because both eye movements and object movements induce image motion on the retina, eye movements must be compensated for to allow a coherent and stable perception of our surroundings. The inferential theory of perception postulates that retinal image motion is compared with an internal reference signal related to eye movements. This mechanism makes it possible to distinguish between the potential sources producing retinal image motion. Referring to this theory, we investigated referential calculation during smooth pursuit eye movements (SPEM) in humans using event-related functional magnetic resonance imaging (fMRI). The blood oxygenation level dependent (BOLD) response related to SPEM in front of a stable background was measured for different parametric steps of preceding motion stimuli, and hence for different assumed states of the referential system. To achieve optimally accurate anatomy and more detectable fMRI signal changes in group analysis, we applied cortex-based statistics both to all brain volumes and to defined regions of interest. Our analysis revealed that the activity in a temporal region as well as the posterior parietal cortex (PPC) depended on the velocity of the preceding stimuli. Previous single-cell recordings in monkeys demonstrated that the visual posterior sylvian area (VPS) is relevant for perceptual stability. The activation apparent in our study thus may represent a human analogue of this area. The PPC is known to be strongly related to goal-directed eye movements. In conclusion, temporal and parietal cortical areas may be involved in referential calculation and thereby in sustaining visual perceptual stability during eye movements.
Affiliation(s)
- Maja U Trenner
- Berlin NeuroImaging Center, Neurologische Klinik und Poliklinik, Charité Universitätsmedizin Berlin, Berlin, Germany.
13
Simultaneous adaptation of retinal and extra-retinal motion signals. Vision Res 2007; 47:3373-84. [PMID: 18006036] [DOI: 10.1016/j.visres.2007.10.002]
Abstract
A number of models of motion perception include estimates of eye velocity to help compensate for the incidental retinal motion produced by smooth pursuit. The 'classical' model uses extra-retinal motor command signals to obtain the estimate. More recent 'reference-signal' models use retinal motion information to enhance the extra-retinal signal. The consequence of simultaneously adapting to pursuit and retinal motion is thought to favour the reference-signal model, largely because the perception of motion during pursuit ('perceived stability') changes despite the absence of a standard motion aftereffect. The current experiments investigated whether the classical model could also account for these findings. Experiment 1 replicated the changes to perceived stability and then showed how simultaneous motion adaptation changes perceived retinal speed (a velocity aftereffect). Contrary to claims made by proponents of the reference-signal model, adapting simultaneously to pursuit and retinal motion therefore alters the retinal motion inputs to the stability computation. Experiment 2 tested the idea that simultaneous motion adaptation sets up a competitive interaction between two types of velocity aftereffect, one retinal and one extra-retinal. The results showed that pursuit adaptation by itself drove perceived stability in one direction and that adding adapting retinal motion drove perceived stability in the other. Moreover, perceived stability changed in conditions that contained no mismatch between adapting pursuit and adapting retinal motion, contrary to the reference-signal account. Experiment 3 investigated whether the effects of simultaneous motion adaptation were directionally tuned. Surprisingly no tuning was found, but this was true for both perceived stability and retinal velocity aftereffect. The three experiments suggest that simultaneous motion adaptation alters perceived stability based on separable changes to retinal and extra-retinal inputs. Possible mechanisms underlying the extra-retinal velocity aftereffect are discussed.
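A minimal sketch of the 'classical' combination referred to above, in which head-centred motion is the sum of an estimated retinal velocity and an estimated eye velocity; the aftereffect shifts are illustrative assumptions used only to show how separable changes to the two inputs can move perceived stability in opposite directions.

```python
# Illustrative sketch; shift sizes are assumptions, not fitted aftereffects.
def head_centred(retinal, eye, retinal_shift=0.0, extra_retinal_shift=0.0):
    est_retinal = retinal + retinal_shift      # retinal velocity aftereffect
    est_eye = eye + extra_retinal_shift        # extra-retinal velocity aftereffect
    return est_retinal + est_eye               # classical summation

eye = 8.0                # pursuit velocity (deg/s)
retinal = -8.0           # slip produced by a stationary background

print(head_centred(retinal, eye))                             # 0.0: perceived stable
print(head_centred(retinal, eye, extra_retinal_shift=-1.5))   # pursuit adaptation alone
print(head_centred(retinal, eye, retinal_shift=2.0,
                   extra_retinal_shift=-1.5))                 # added retinal adaptation pulls the other way
```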
14
Tikhonov A, Haarmeier T, Thier P, Braun C, Lutzenberger W. Neuromagnetic activity in medial parietooccipital cortex reflects the perception of visual motion during eye movements. Neuroimage 2004; 21:593-600. [PMID: 14980561] [DOI: 10.1016/j.neuroimage.2003.09.045]
Abstract
We usually perceive a stationary, stable world despite coherent visual motion induced by eye movements. This astonishing example of perceptual invariance results from a comparison of visual information with internal reference signals (nonretinal signals) predicting the visual consequences of an eye movement. The important consequence of this concept is that our subjective percept of visual motion reflects the outcome of this comparison rather than retinal image slip. To localize the cortical networks underlying this comparison, we compared magnetoencephalography (MEG) responses under two conditions of pursuit-induced retinal image motion that were physically identical but, owing to the different calibrational states of the nonretinal signal prompted under our experimental conditions, gave rise to different percepts of visual motion. This approach allowed us to demonstrate that our perception of self-induced visual motion resides in comparatively "late" parts of the cortical hierarchy of motion processing, sparing the early stages up to cortical area MT/V5 but including cortex in and around the medial aspect of the parietooccipital cortex as one of its core elements.
Affiliation(s)
- Alexander Tikhonov
- Department of Cognitive Neurology, University of Tübingen, D-72076 Tuebingen, Germany
15
Abstract
Extra-retinal information about eye velocity is thought to play an important role in compensating for the retinal motion experienced during an eye movement. Evidently this compensation process is prone to error, since stimulus properties such as contrast and spatial frequency have a marked effect on perceived motion with respect to the head. Here we investigate the suggestion that 'optokinetic potential' [Perception 14 (1985) 631] may contribute to an explanation of these errors. First, we measured the optokinetic nystagmus induced by each stimulus so as to determine the optokinetic potential. Second, we determined the speed match between two patches of Gaussian blobs presented sequentially. Observers pursued the first pattern and kept their eyes stationary when viewing the second. For stimuli with identical contrast or spatial frequency, the pursued pattern was perceived to move slower than the non-pursued pattern (the Aubert-Fleischl phenomenon). Lowering the contrast or the spatial frequency of the non-pursued pattern resulted in a systematic decrease of its perceived speed. A further condition, in which the contrast or spatial frequency of the pursued pattern was varied, resulted in no change to its perceived speed. Pursuit eye movements were recorded and found to be independent of stimulus properties. The results cast doubt on the idea that changing contrast or spatial frequency affects perceived head-centred speed by altering optokinetic potential.
Affiliation(s)
- Jane H Sumnall
- School of Psychology, Cardiff University, P.O. Box 901, Cardiff, Wales CF10 3YG, UK
16
Abstract
The eyes are always moving, even during fixation, making the retinal image move concomitantly. While these motions activate early visual stages, they are excluded from one's perception. A striking illusion reported here renders them visible: a static pattern surrounded by a synchronously flickering pattern appears to move coherently in random directions. There was a positive correlation between the illusion and fixational eye movements. A simulation revealed that motion computation artificially creates a motion difference between center and surround, which is usually a cue to object motion but here becomes a misleading cue that makes one's own eye movements visible on-line. Therefore, this novel illusion indicates that the visual system normally counteracts shaky visual inputs due to small eye movements by using retinal, as opposed to extraretinal, motion signals. As long as they comprise common image motions over space, they are interpreted as coming from a static outer world viewed through moving eyes. Such visual stability fails in the condition of artificial flicker, because common image motions due to eye movements are registered differently between flickering and non-flickering regions.
Affiliation(s)
- Ikuya Murakami
- Human and Information Science Laboratory, NTT Communication Science Laboratories, NTT Corporation, 3-1 Morinosato Wakamiya, Atsugi, 243-0198, Kanagawa, Japan.
17
Ilg UJ. Commentary: smooth pursuit eye movements: from low-level to high-level vision. Prog Brain Res 2003; 140:279-98. [PMID: 12508597] [DOI: 10.1016/s0079-6123(02)40057-x]
Abstract
If an object of great interest moves in our environment, we are able to elicit smooth pursuit eye movements that keep the image of the moving object stationary on our fovea. The processing of visual motion underlying the execution of smooth pursuit eye movements is very similar to the processing underlying the perception of visual motion. During initiation of smooth pursuit, an averaging across all available motion information occurs. Cognitive factors including attention, prediction and learning are able to influence the execution of smooth pursuit. The pursuit target trajectory in space is represented in the discharge rates of neurons in the posterior parietal cortex of rhesus monkeys.
Affiliation(s)
- Uwe J Ilg
- Neurologische Universitätsklinik, Hoppe-Seyler-Strasse 3, D-72076 Tübingen, Germany.
18
Li HCO, Brenner E, Cornelissen FW, Kim ES. Systematic distortion of perceived 2D shape during smooth pursuit eye movements. Vision Res 2002; 42:2569-75. [PMID: 12446031] [DOI: 10.1016/s0042-6989(02)00295-x]
Abstract
Even when the retinal image of a static scene is constantly shifting, as occurs when the viewer pursues a small moving object with his or her eyes, the scene is usually correctly perceived to be static. Following early suggestions by von Helmholtz, it is commonly believed that this spatial stability is achieved by combining retinal and extra-retinal signals. Here, we report a perceptually salient 2D shape distortion that can arise during pursuit. We provide evidence that the perceived 2D shape reflects retinal image contents alone, implying that the extra-retinal signal is ignored when judging 2D shape.
Affiliation(s)
- Hyung-Chul O Li
- Department of Industrial Psychology, Kwangwoon University, Nowon-Gu, Wolgae-Dong, 447-1, Seoul, South Korea.
19
Abstract
Stimulus motion is a prominent feature that is used by the visual system to segment figure from ground and perceptually bind widely separated objects. Pursuit eye movements can be influenced by such perceptual grouping processes. We have examined the subjects' ability to detect small amounts of coherent motion in random dot kinematograms during pursuit. We compared performance on tests of coherent motion perception while subjects fixated a stationary spot or while they tracked a moving target. The results indicate that smooth pursuit can improve subjects' ability to detect the presence of coherent motion. We tentatively propose that an efference copy of the eye movement signal can enhance the ability of the visual system to detect correlations between sparsely placed targets among noisy distractors.
Affiliation(s)
- Mark W Greenlee
- Institute of Cognitive Science, University of Oldenburg, Ammerländer Heerstrasse 114, 26111 Oldenburg, Germany
20
Freeman TCA, Sumnall JH. Motion versus position in the perception of head-centred movement. Perception 2002; 31:603-15. [PMID: 12044100] [DOI: 10.1068/p3256]
Abstract
Observers can recover motion with respect to the head during an eye movement by comparing signals encoding retinal motion and the velocity of pursuit. Evidently there is a mismatch between these signals because perceived head-centred motion is not always veridical. One example is the Filehne illusion, in which a stationary object appears to move in the opposite direction to pursuit. Like the motion aftereffect, the phenomenal experience of the Filehne illusion is one in which the stimulus moves but does not seem to go anywhere. This raises problems when measuring the illusion by motion nulling because the more traditional technique confounds perceived motion with changes in perceived position. We devised a new nulling technique using global-motion stimuli that degraded familiar position cues but preserved cues to motion. Stimuli consisted of random-dot patterns comprising signal and noise dots that moved at the same retinal 'base' speed. Noise moved in random directions. In an eye-stationary speed-matching experiment we found that noise slowed perceived retinal speed as 'coherence strength' (i.e. percentage of signal) was reduced. The effect occurred over the two-octave range of base speeds studied and well above direction threshold. When the same stimuli were combined with pursuit, observers were able to null the Filehne illusion by adjusting coherence. A power law relating coherence to retinal base speed fit the data well with a negative exponent. Eye-movement recordings showed that pursuit was quite accurate. We then tested the hypothesis that the stimuli found at the null-points appeared to move at the same retinal speed. Two observers supported the hypothesis, a third partially, and a fourth showed a small linear trend. In addition, the retinal speed found by the traditional Filehne technique was similar to the matches obtained with the global-motion stimuli. The results provide support for the idea that speed is the critical cue in head-centred motion perception.
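Fitting the power law mentioned above is straightforward in log-log coordinates; the null-point data in this sketch are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Hypothetical null points: retinal base speed (deg/s) vs nulling coherence (%).
speed = np.array([2.0, 4.0, 8.0])
coherence = np.array([60.0, 42.0, 30.0])

# Power law c = a * v**b becomes a straight line after taking logs.
b, log_a = np.polyfit(np.log(speed), np.log(coherence), 1)
a = np.exp(log_a)
print(a, b)   # b is negative: less coherence is needed at higher base speeds
```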
21
Abstract
The pattern of motion in the retinal image during self-motion contains information about the person's movement. Pursuit eye movements perturb the pattern of retinal-image motion, complicating the problem of self-motion perception. A question of considerable current interest is the relative importance of retinal and extra-retinal signals in compensating for these effects of pursuit on the retinal image. We addressed this question by examining the effect of prior motion stimuli on self-motion judgments during pursuit. Observers viewed 300 ms random-dot displays simulating forward self-motion during pursuit to the right or to the left; at the end of each display a probe appeared and observers judged whether they would pass left or right of it. The display was preceded by a 300 ms dot pattern that was either stationary or moved in the same direction as, or opposite to, the eye movement. This prior motion stimulus had a large effect on self-motion judgments when the simulated scene was a frontoparallel wall (experiment 1), but not when it was a three-dimensional (3-D) scene (experiment 2). Corresponding simulated-pursuit conditions controlled for purely retinal motion aftereffects, implying that the effect in experiment 1 is mediated by an interaction between retinal and extra-retinal signals. In experiment 3, we examined self-motion judgments with respect to a 3-D scene with mixtures of real and simulated pursuit. When real and simulated pursuits were in opposite directions, performance was determined by the total amount of pursuit-related retinal motion, consistent with an extra-retinal 'trigger' signal that facilitates the action of a retinally based pursuit-compensation mechanism. However, results of experiment 1 without a prior motion stimulus imply that extra-retinal signals are more informative when retinal information is lacking. We conclude that the relative importance of retinal and extra-retinal signals for pursuit compensation varies with the informativeness of the retinal motion pattern, at least for short durations. Our results provide partial explanations for a number of findings in the literature on perception of self-motion and motion in the frontal plane.
Affiliation(s)
- J A Crowell
- Department of Psychology, Ohio State University, Columbus 43210, USA.
22
Haarmeier T, Bunjes F, Lindner A, Berret E, Thier P. Optimizing visual motion perception during eye movements. Neuron 2001; 32:527-35. [PMID: 11709162] [DOI: 10.1016/s0896-6273(01)00486-x]
Abstract
We usually perceive a stationary, stable world and we are able to correctly estimate the direction of heading from optic flow despite coherent visual motion induced by eye movements. This astonishing example of perceptual invariance results from a comparison of visual information with internal reference signals predicting the visual consequences of an eye movement. Here we demonstrate that the reference signal predicting the consequences of smooth-pursuit eye movements is continuously calibrated on the basis of direction-selective interactions between the pursuit motor command and the rotational flow induced by the eye movement, thereby minimizing imperfections of the reference signal and guaranteeing an ecologically optimal interpretation of visual motion.
Affiliation(s)
- T Haarmeier
- Department of Cognitive Neurology, University of Tübingen, 72076, Tübingen, Germany
23
Abstract
By adding retinal and pursuit eye-movement velocity one can determine the motion of an object with respect to the head. It would seem likely that the visual system carries out a similar computation by summing extra-retinal, eye-velocity signals with retinal motion signals. Perceived head-centred motion may therefore be determined by differences in the way these signals encode speed. For example, if extra-retinal signals provide the lower estimate of speed then moving objects will appear slower when pursued (Aubert-Fleischl phenomenon) and stationary objects will move opposite to an eye movement (Filehne illusion). Most previous work proposes that these illusions exist because retinal signals encode retinal motion accurately while extra-retinal signals under-estimate eye speed. A more general model is presented in which both signals could be in error. Two types of input/output speed relationship are examined. The first uses linear speed transducers and the second non-linear speed transducers, the latter based on power laws. It is shown that studies of the Aubert-Fleischl phenomenon and Filehne illusion reveal the gain ratio or power ratio alone. We also consider general velocity-matching and show that in theory matching functions are limited by gain ratio in the linear case. However, in the non-linear case individual transducer shapes are revealed albeit up to an unknown scaling factor. The experiments show that the Aubert-Fleischl phenomenon and Filehne illusion are adequately described by linear speed transducers with a gain ratio less than one. For some observers, this is also the case in general velocity-matching experiments. For other observers, however, behaviour is non-linear and, according to the transducer model, indicates the existence of expansive non-linearities in speed encoding. This surprising result is discussed in relation to other theories of head-centred motion perception and the possible strategies some observers might adopt when judging stimulus motion during an eye movement.
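For the linear-transducer case described above, a short sketch shows how the Aubert-Fleischl match depends only on the ratio of the two gains and how the same gains produce a Filehne-like drift; the gain values are illustrative assumptions.

```python
# Illustrative linear transducer gains (extra-retinal/retinal ratio < 1).
g_retinal, g_extra = 1.0, 0.7

def perceived_head_speed(retinal, eye):
    return g_retinal * retinal + g_extra * eye

v = 10.0  # deg/s
# Aubert-Fleischl: pursued target (retinal ~ 0) vs fixated target (eye = 0);
# the match is set by the gain ratio g_extra / g_retinal alone.
af_ratio = perceived_head_speed(0.0, v) / perceived_head_speed(v, 0.0)
# Filehne: a stationary background during pursuit drifts by (g_extra - g_retinal) * v.
filehne_drift = perceived_head_speed(-v, v)
print(af_ratio, filehne_drift)
```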
Affiliation(s)
- T C Freeman
- School of Psychology, Cardiff University, PO Box 901, CF10 3YG, Cardiff, UK.
24
Thier P, Haarmeier T, Chakraborty S, Lindner A, Tikhonov A. Cortical Substrates of Perceptual Stability during Eye Movements. Neuroimage 2001; 14:S33-9. [PMID: 11373130] [DOI: 10.1006/nimg.2001.0840]
Abstract
We are usually unaware of retinal image motion resulting from our own movement. For instance, during slow-tracking eye movements, the world around us remains perceptually stable despite the retinal image slip induced by the eye movement. This example of perceptual invariance is achieved by subtracting an internal reference signal, reflecting the eye movement, from the retinal motion signal. If the two cancel each other, visual structures that do not move will also be perceived as nonmoving. If, however, the reference signal is too small or too large, a false eye-movement-induced motion of the external world will be perceived. We have exploited our ability to manipulate the size of the reference signal in an attempt to reveal the structures in visual cortex encoding the perception of self-induced visual motion rather than the retinal motion signal. Using EEG and lately also MEG recordings in human subjects and single-unit recordings in monkeys, we have been able to show that our ability to perceive the world as stationary despite eye-movement-induced retinal image slip is based on "late" parts of the cortical hierarchy of motion processing, sparing the early stages up to cortical area MT and, among others, involving cortex at the junction between the parietal and temporal lobes close to the parieto-insular vestibular cortex. Lesions of this network in humans render the visual system unable to compensate for the visual consequences of eye movements, giving rise to severe dizziness whenever the eyes move smoothly.
Affiliation(s)
- P Thier
- Department of Cognitive Neurology, University of Tübingen, Germany.
25
Abstract
The aim of this study was to test the hypothesis that an extra-retinal signal combines with retinal velocity in a linear manner, as described by existing models, to determine perceived velocity. To do so, we utilized a method that allowed the determination of the relative contributions of the retinal-velocity and extra-retinal signals to the perception of stimulus velocity. We determined the velocity (speed and direction) of a stimulus viewed with stationary eyes that was perceptually the same as the velocity of the stimulus viewed with moving eyes. Eye movements were governed by the tracking (or pursuit) of a separate pursuit target. The velocity-matching data could not be fit by a model that linearly combined a retinal-velocity signal and an extra-retinal signal. A model that was successful in explaining the data was one that takes the difference between two simple saturating non-linear functions, g and f, each symmetric about the origin, with one containing an interaction term. That is, the function g has two arguments: retinal velocity, R, and eye velocity, E. The only argument to f is retinal velocity, R. Each argument has a scaling parameter. A comparison of the goodness of fit between models demonstrated that the success of the model lies in the interaction term, i.e. the modification of the compensating eye velocity signal by the retinal velocity prior to combination.
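A hedged sketch of the general form described above: the difference of two saturating, origin-symmetric functions, one of which combines eye velocity with retinal velocity. The tanh shape, the way the interaction term enters, and all parameter values are assumptions for illustration, not the authors' fitted model.

```python
import numpy as np

def perceived_velocity(R, E, a=0.2, b=0.15, c=0.2, k=0.02):
    # g takes both retinal velocity R and eye velocity E, with the eye-velocity
    # term modified by R (the assumed interaction); f takes R alone.
    g = np.tanh(a * R + b * E + k * R * E)
    f = np.tanh(c * R)
    return g - f

print(perceived_velocity(R=0.0, E=10.0))    # pursuit with the image stabilised
print(perceived_velocity(R=-10.0, E=10.0))  # pursuit across a stationary background
```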
Affiliation(s)
- K A Turano
- Johns Hopkins University School of Medicine, Wilmer Eye Institute, Baltimore, MD, USA.
26
Abstract
Previous studies have found large misperceptions when subjects report the perceived angle between two directions of motion moving transparently at an acute angle, the so-called motion repulsion. While these errors have been assumed to be caused by interactions between the two directions present, we reassessed these earlier measurements taking into account recent findings about directional misperceptions affecting the perception of single motion (reference repulsion). While our measurements confirm that errors in directional judgments of transparent motions can indeed be as big as 22 degrees, we find that motion repulsion, i.e. the interaction between two directions, contributes at most about 7 degrees to these errors. This value is comparable to similar repulsion effects in orientation perception and stereoscopic depth perception, suggesting that they share a common neural basis. Our data further suggest that fast time-scale adaptation and/or more general interactions between neurons contribute to motion repulsion, while tracking eye movements play little or no role. These findings should serve as important constraints for models of motion perception.
Affiliation(s)
- H J Rauber
- Department of Neurology, University of Tübingen, Germany.
27
Abstract
It is usually held that perceptual spatial stability, despite smooth pursuit eye movements, is accomplished by comparing a signal reflecting retinal image slip with an internal reference signal, encoding the eye movement. The important consequence of this concept is that our subjective percept of visual motion reflects the outcome of this comparison rather than retinal image slip. In an attempt to localize the cortical networks underlying this comparison and therefore our subjective percept of visual motion, we exploited an imperfection inherent in it, which results in a movement illusion. If smooth pursuit is carried out across a stationary background, we perceive a tiny degree of illusionary background motion (Filehne illusion, or FI), rather than experiencing the ecologically optimal percept of stationarity. We have recently shown that this illusion can be modified substantially and predictably under laboratory conditions by visual motion unrelated to the eye movement. By making use of this finding, we were able to compare cortical potentials evoked by pursuit-induced retinal image slip under two conditions, which differed perceptually, while being identical physically. This approach allowed us to discern a pair of potentials, a parieto-occipital negativity (N300) followed by a frontal positivity (P300), whose amplitudes were solely determined by the subjective perception of visual motion irrespective of the physical attributes of the situation. This finding strongly suggests that subjective awareness of visual motion depends on neuronal activity in a parieto-occipito-frontal network, which excludes the early stages of visual processing.
28
Lappe M. A model of the combination of optic flow and extraretinal eye movement signals in primate extrastriate visual cortex. Neural model of self-motion from optic flow and extraretinal cues. Neural Netw 1998; 11:397-414. [PMID: 12662818] [DOI: 10.1016/s0893-6080(98)00013-6]
Abstract
The determination of the direction of heading from optic flow is a complicated task. To solve it, the visual system complements the optic flow with non-visual information about the occurrence of eye movements. Psychophysical studies have shown that the need for this combination depends on the structure of the visual scene. In a depth-rich visual environment motion parallax can be exploited to differentiate self-translation from eye rotation. In the absence of motion parallax, i.e. in the case of movement towards a frontoparallel plane, extraretinal signals are necessary for correct heading perception (Warren and Hannon, 1990). Lappe and Rauschecker (1993b) have proposed a model of visual heading detection that reproduces many of the psychophysical findings in the absence of extraretinal input and links them to properties of single neurons in the primate visual cortex. The present work proposes a neural network model that integrates extraretinal signals into this network. The model is compared with psychophysical and neurophysiological data from experiments in human and non-human primates. The combined visual/extraretinal model reproduces human behavior in the case of movement towards a frontoparallel plane. Single model neurons exhibit several similarities to neurons from the medial superior temporal (MST) area of the macaque monkey. Similar to MST cells (Erickson and Thier, 1991), they differentiate between self-induced visual motion that results from eye movements in a stationary environment, and real motion in the environment. The model predicts that this differentiation can also be achieved visually, i.e. without extraretinal input. Other simulations followed experiments by Bradley et al. (1996), in which flow fields were presented that simulated observer translation towards a frontoparallel plane plus an eye rotation. Similar to MST cells, model neurons shift their preference for the focus of expansion along the direction of the eye movement when extraretinal input is not available. They respond to the retinal location of the focus of expansion, which is shifted by the eye movement. In the presence of extraretinal input the preference for the focus of expansion is largely invariant to eye movements and tied to the location of the focus of expansion with regard to the visual scene. The model proposes that extraretinal compensation for eye movements need not be perfect in single neurons to achieve accurate heading detection. It thereby shows that the incomplete compensation found in most MST neurons is sufficient to explain the psychophysical data.
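The compensation problem the model addresses can be illustrated with a toy flow field for translation towards a frontoparallel plane plus a yaw eye rotation; the pinhole flow equations are standard, and all parameter values (and the perfect extraretinal estimate) are illustrative assumptions.

```python
import numpy as np

x, y = np.meshgrid(np.linspace(-1, 1, 41), np.linspace(-1, 1, 41))
Tz, Z, omega = 1.0, 10.0, 0.05      # forward speed, plane distance, yaw rate

u_trans, v_trans = x * Tz / Z, y * Tz / Z            # expansion about the true heading (0, 0)
u_rot, v_rot = -omega * (1 + x**2), -omega * x * y   # flow added by the eye rotation
u, v = u_trans + u_rot, v_trans + v_rot

def focus_of_expansion(u, v):
    i = np.argmin(np.hypot(u, v))    # slowest point of the field
    return x.flat[i], y.flat[i]

print(focus_of_expansion(u, v))      # shifted away from (0, 0) by the rotation

omega_hat = omega                    # extraretinal estimate of the eye rotation
u_comp = u + omega_hat * (1 + x**2)  # remove the predicted rotational flow
v_comp = v + omega_hat * x * y
print(focus_of_expansion(u_comp, v_comp))   # back at the true heading
```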
Affiliation(s)
- Markus Lappe
- Department of Zoology and Neurobiology, Ruhr University Bochum, Bochum, Germany
29
Haarmeier T, Thier P, Repnow M, Petersen D. False perception of motion in a patient who cannot compensate for eye movements. Nature 1997; 389:849-52. [PMID: 9349816] [DOI: 10.1038/39872]
Abstract
We are usually unaware of the motion of an image across our retina that results from our own movement. For instance, during slow-tracking eye movements we do not mistake the shift of the image projected onto the retina for motion of the world around us, but instead perceive a stable world. Following early suggestions by von Helmholtz, it is commonly believed that this spatial stability is achieved by subtracting the retinal motion signal from an internal reference signal, such as a copy of the movement command (efference copy). Object motion is perceived only if the two differ. Although this concept is widely accepted, its anatomical underpinning remains unknown. Here we describe the case of a patient with bilateral extrastriate cortex lesions, suffering from false perception of motion due to an inability to take eye movements into account when faced with self-induced retinal image slip. This is indicated by the fact that during smooth-pursuit eye movements, he perceives motion of the stationary world at a velocity that corresponds to the velocity of his eye movement; that is, he perceives the raw retinal image slip uncorrected for his own eye movements. We suspect that this deficiency reflects damage of a distinct parieto-occipital region that disentangles self-induced and externally induced visual motion by comparing retinal signals with a reference signal encoding eye movements and possibly ego-motion in general.
Affiliation(s)
- T Haarmeier
- Section on Sensorimotor Research, Department of Neurology, University of Tübingen, Germany