1
Interactions among endogenous, exogenous, and agency-driven attentional selection mechanisms in interactive displays. Atten Percept Psychophys 2022;84:1477-1488. PMID: 35610415; DOI: 10.3758/s13414-022-02507-1.
Abstract
Attentional selection is driven, in part, by a complex interplay between endogenous and exogenous cues. Recently, one's interactions with the physical world have also been shown to bias attention. Specifically, the sense of agency that arises when our actions cause predictable outcomes biases our attention toward those things which we control. We investigated how this agency-driven attentional bias interacts with simultaneously presented endogenous (words) and exogenous (color singletons) environmental cues. Participants controlled the movement of one object while others moved independently. In a subsequent search task, targets were either the previously controlled objects or not. Targets were also validly or invalidly cued. Both cue types influenced attention allocation. Endogenous cues and agency-driven attentional selection were independent and additive, indicating they are separable mechanisms of selection. In contrast, exogenous cues eliminated the effects of agency, indicating that perceptually salient environmental cues can override internally derived effects of agency. This is the first demonstration of a boundary condition on agency-driven selection.
2
Awareness of voluntary action, rather than body ownership, improves motor control. Sci Rep 2021;11:418. PMID: 33432104; PMCID: PMC7801649; DOI: 10.1038/s41598-020-79910-x.
Abstract
Awareness of the body is essential for accurate motor control. However, how this awareness influences motor control is poorly understood. The awareness of the body includes awareness of visible body parts as one’s own (sense of body ownership) and awareness of voluntary actions over that visible body part (sense of agency). Here, I show that sense of agency over a visible hand improves the initiation of movement, regardless of sense of body ownership. The present study combined the moving rubber hand illusion, which allows experimental manipulation of agency and body ownership, and the finger-tracking paradigm, which allows behavioral quantification of motor control by the ability to coordinate eye with hand movements. This eye–hand coordination requires awareness of the hand to track the hand with the eye. I found that eye–hand coordination is improved when participants experience a sense of agency over a tracked artificial hand, regardless of their sense of body ownership. This improvement was selective for the initiation, but not maintenance, of eye–hand coordination. These results reveal that the prospective experience of explicit sense of agency improves motor control, suggesting that artificial manipulation of prospective agency may be beneficial to rehabilitation and sports training techniques.
3
Handedness Matters for Motor Control But Not for Prediction. eNeuro 2019;6:ENEURO.0136-19.2019. PMID: 31138661; PMCID: PMC6557034; DOI: 10.1523/eneuro.0136-19.2019.
Abstract
Skilled motor behavior relies on the ability to control the body and to predict the sensory consequences of this control. Although there is ample evidence that manual dexterity depends on handedness, it remains unclear whether control and prediction are similarly impacted. To address this issue, right-handed human participants performed two tasks with either the right or the left hand. In the first task, participants had to move a cursor with their hand so as to track a target that followed a quasi-random trajectory. This hand-tracking task allowed testing the ability to control the hand along an imposed trajectory. In the second task, participants had to track with their eyes a target that was self-moved through voluntary hand motion. This eye-tracking task allowed testing the ability to predict the visual consequences of hand movements. As expected, results showed that hand tracking was more accurate with the right hand than with the left hand. In contrast, eye tracking was similar in terms of spatial and temporal gaze attributes whether the target was moved by the right or the left hand. Although these results extend previous evidence for different levels of control by the two hands, they show that the ability to predict the visual consequences of self-generated actions does not depend on handedness. We propose that the greater dexterity exhibited by the dominant hand in many motor tasks stems from advantages in control, not in prediction. Finally, these findings support the notion that prediction and control are distinct processes.
4
Wijesinghe LP, Triesch J, Shi BE. Robot End Effector Tracking Using Predictive Multisensory Integration. Front Neurorobot 2018;12:66. PMID: 30386227; PMCID: PMC6198278; DOI: 10.3389/fnbot.2018.00066.
Abstract
We propose a biologically inspired model that enables a humanoid robot to learn how to track its end effector by integrating visual and proprioceptive cues as it interacts with the environment. A key novel feature of this model is the incorporation of sensorimotor prediction, where the robot predicts the sensory consequences of its current body motion as measured by proprioceptive feedback. The robot develops the ability to perform smooth pursuit-like eye movements to track its hand, both in the presence and absence of visual input, and to track exteroceptive visual motions. Our framework makes a number of advances over past work. First, our model does not require a fiducial marker to indicate the robot hand explicitly. Second, it does not require the forward kinematics of the robot arm to be known. Third, it does not depend upon pre-defined visual feature descriptors. These are learned during interaction with the environment. We demonstrate that the use of prediction in multisensory integration enables the agent to incorporate the information from proprioceptive and visual cues better. The proposed model has properties that are qualitatively similar to the characteristics of human eye-hand coordination.
Affiliation(s)
- Lakshitha P Wijesinghe: Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Kowloon, Hong Kong
- Jochen Triesch: Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- Bertram E Shi: Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Kowloon, Hong Kong
5
Maiello G, Kwon M, Bex PJ. Three-dimensional binocular eye-hand coordination in normal vision and with simulated visual impairment. Exp Brain Res 2018;236:691-709. PMID: 29299642; PMCID: PMC6693328; DOI: 10.1007/s00221-017-5160-8.
Abstract
Sensorimotor coupling in healthy humans is demonstrated by the higher accuracy of visually tracking intrinsically rather than extrinsically generated hand movements in the fronto-parallel plane. It is unknown whether this coupling also facilitates vergence eye movements for tracking objects in depth, or can overcome symmetric or asymmetric binocular visual impairments. Human observers were therefore asked to track with their gaze a target moving horizontally or in depth. The movement of the target was either directly controlled by the observer's hand or followed hand movements executed by the observer in a previous trial. Visual impairments were simulated by blurring stimuli independently in each eye. Accuracy was higher for self-generated movements in all conditions, demonstrating that motor signals are employed by the oculomotor system to improve the accuracy of vergence as well as horizontal eye movements. Asymmetric monocular blur affected horizontal tracking less than symmetric binocular blur, but impaired tracking in depth as much as binocular blur. There was a critical blur level up to which pursuit and vergence eye movements maintained tracking accuracy independent of blur level. Hand-eye coordination may therefore help compensate for functional deficits associated with eye disease and may be employed to augment visual impairment rehabilitation.
Affiliation(s)
- Guido Maiello: UCL Institute of Ophthalmology, University College London, 11-43 Bath Street, London EC1V 9EL, UK; Department of Experimental Psychology, Justus-Liebig University Giessen, Otto-Behaghel-Str. 10F, 35394 Giessen, Germany
- MiYoung Kwon: Department of Ophthalmology, University of Alabama at Birmingham, 700 S. 18th Street, Birmingham, AL 35294-0009, USA
- Peter J Bex: Department of Psychology, Northeastern University, 360 Huntington Ave, Boston, MA 02115, USA
6
Effects of wrist tendon vibration and eye movements on manual aiming. Exp Brain Res 2018;236:847-857. PMID: 29353311; DOI: 10.1007/s00221-018-5180-z.
Abstract
In the present study, we investigated whether visual information mediates a proprioceptive illusion effect induced by muscle tendon vibration in manual aiming. Visual information was gradually degraded from a situation in which the targets were present and participants (n = 20; 22.3 ± 2.7 years) were permitted to make saccadic eye movements to designated target positions, to a condition in which the targets were not visible and participants were required to perform cyclical aiming while fixating a point between the two target positions. Local tendon vibration applied to the right wrist extensor muscles induced an illusory reduction of 15% in hand movement amplitude. This effect was greater in the fixation than in the saccade condition. Both anticipatory control and proprioceptive feedback are proposed to contribute to the observed effects. The primary saccade amplitude was also reduced by almost 4% when muscle tendon vibration was locally applied to the wrist. These results confirm a tight link between eye movements and manual perception and action. Moreover, the impact of the proprioceptive illusion on the ocular system indicates that the interaction between systems is bidirectional.
7
Limited Contribution of Primary Motor Cortex in Eye-Hand Coordination: A TMS Study. J Neurosci 2017;37:9730-9740. PMID: 28893926; DOI: 10.1523/jneurosci.0564-17.2017.
Abstract
The ability to track a moving target with the eye is substantially improved when the target is self-moved compared with when it is moved by an external agent. To account for this observation, it has been postulated that the oculomotor system has access to a hand efference copy, allowing it to predict the motion of the visual target. Along this scheme, we tested the effect of transcranial magnetic stimulation (TMS) over the hand area of the primary motor cortex (M1) while human participants (50% females) tracked with their eyes a visual target whose horizontal motion was driven by their grip force. We reasoned that, if the output of M1 is used by the oculomotor system to keep track of the target, on top of inducing a short-latency disturbance of grip force, single-pulse TMS should also quickly disrupt ongoing eye motion. For comparison purposes, the effect of TMS over M1 was monitored when subjects tracked an externally moved target (while keeping their hand at rest or not). In both cases, results showed no alteration in smooth pursuit: its velocity was unaffected within the 25-125 ms epoch that followed TMS. Overall, our results imply that the output of M1 makes a limited contribution to driving eye motion during our eye-hand coordination task. This study suggests that, if hand motor signals are accessed by the oculomotor system, this occurs upstream of M1.

SIGNIFICANCE STATEMENT: The ability to coordinate eye and hand actions is central in everyday activity. However, the neural mechanisms underlying this coordination remain to be clarified. A leading hypothesis is that the oculomotor system has access to hand motor signals. Here we explored this possibility by means of transcranial magnetic stimulation (TMS) over the hand area of the primary motor cortex (M1) while humans tracked with their eyes a visual target that was moved by the hand. As expected, ongoing hand action was perturbed 25-30 ms after TMS, but our results failed to show any disruption of eye motion, smooth pursuit velocity being unaffected. This work suggests that, if hand motor signals are accessed by the oculomotor system, this occurs upstream of M1.
8
Abstract
When we knock on a door, we perceive the impact as a collection of simultaneous events, combining sound, sight, and tactile sensation. In reality, information from different modalities but from a single source flows inside the brain along different pathways, reaching processing centers at different times. Therefore, interpreting different sensory modalities that seem to occur simultaneously requires information processing that accounts for these different delays. As in a computer-based robotic system, does the brain use some explicit estimate of the time delay to realign the sensory flows? Or does it compensate for temporal delays by representing them as changes in the body/environment mechanics? Using delayed-state or approximation-for-delay manipulations between visual and proprioceptive feedback during a tracking task, we show that tracking errors, grip forces, and learning curves are consistent with the predictions of a representation based on an approximation for delay, refuting an explicit delayed-state representation. Delayed-state representations are based on estimating the time elapsed between the movement commands and their observed consequences. In contrast, approximation-for-delay representations result from estimating the instantaneous relation between the expected and observed motion variables, without explicit reference to time.
9
Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics. eNeuro 2017;4:eN-NWR-0101-17. PMID: 28680964; PMCID: PMC5494895; DOI: 10.1523/eneuro.0101-17.2017.
Abstract
Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.
10
Chen J, Valsecchi M, Gegenfurtner KR. Role of motor execution in the ocular tracking of self-generated movements. J Neurophysiol 2016;116:2586-2593. PMID: 27628207; DOI: 10.1152/jn.00574.2016.
Abstract
When human observers track the movements of their own hand with their gaze, the eyes can start moving before the finger (i.e., anticipatory smooth pursuit). The signals driving anticipation could come from motor commands during finger motor execution or from motor intention and decision processes associated with self-initiated movements. For the present study, we built a mechanical device that could move a visual target either in the same direction as the participant's hand or in the opposite direction. Gaze pursuit of the target showed stronger anticipation if it moved in the same direction as the hand compared with the opposite direction, as evidenced by decreased pursuit latency, increased positional lead of the eye relative to target, increased pursuit gain, decreased saccade rate, and decreased delay at the movement reversal. Some degree of anticipation occurred for incongruent pursuit, indicating that there is a role for higher-level movement prediction in pursuit anticipation. The fact that anticipation was larger when target and finger moved in the same direction provides evidence for a direct coupling between finger and eye motor commands.
Affiliation(s)
- Jing Chen: Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
- Matteo Valsecchi: Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
- Karl R Gegenfurtner: Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
11
Landelle C, Montagnini A, Madelain L, Danion F. Eye tracking a self-moved target with complex hand-target dynamics. J Neurophysiol 2016;116:1859-1870. PMID: 27466129; DOI: 10.1152/jn.00007.2016.
Abstract
Previous work has shown that the ability to track with the eye a moving target is substantially improved when the target is self-moved by the subject's hand compared with when being externally moved. Here, we explored a situation in which the mapping between hand movement and target motion was perturbed by simulating an elastic relationship between the hand and target. Our objective was to determine whether the predictive mechanisms driving eye-hand coordination could be updated to accommodate this complex hand-target dynamics. To fully appreciate the behavioral effects of this perturbation, we compared eye tracking performance when self-moving a target with a rigid mapping (simple) and a spring mapping as well as when the subject tracked target trajectories that he/she had previously generated when using the rigid or spring mapping. Concerning the rigid mapping, our results confirmed that smooth pursuit was more accurate when the target was self-moved than externally moved. In contrast, with the spring mapping, eye tracking had initially similar low spatial accuracy (though shorter temporal lag) in the self versus externally moved conditions. However, within ∼5 min of practice, smooth pursuit improved in the self-moved spring condition, up to a level similar to the self-moved rigid condition. Subsequently, when the mapping unexpectedly switched from spring to rigid, the eye initially followed the expected target trajectory and not the real one, thereby suggesting that subjects used an internal representation of the new hand-target dynamics. Overall, these results emphasize the stunning adaptability of smooth pursuit when self-maneuvering objects with complex dynamics.
Affiliation(s)
- Caroline Landelle: Institut de Neurosciences de la Timone UMR 7289, Aix Marseille Université, Centre National de la Recherche Scientifique (CNRS), Marseille, France
- Anna Montagnini: Institut de Neurosciences de la Timone UMR 7289, Aix Marseille Université, Centre National de la Recherche Scientifique (CNRS), Marseille, France
- Frederic Danion: Institut de Neurosciences de la Timone UMR 7289, Aix Marseille Université, Centre National de la Recherche Scientifique (CNRS), Marseille, France
12
Chujo Y, Jono Y, Tani K, Nomura Y, Hiraoka K. Corticospinal Excitability in the Hand Muscles is Decreased During Eye Movement with Visual Occlusion. Percept Mot Skills 2016;122:238-255. PMID: 27420319; DOI: 10.1177/0031512515625331.
Abstract
Corticospinal excitability in the hand muscles decreases during smooth pursuit eye movement. The present study tested the hypothesis that the decrease in corticospinal excitability in the hand muscles at rest during eye movement is caused not by visual feedback but by motor commands to the eye muscles. Healthy men (M age = 28.4 yr., SD = 5.2) moved their eyes to the right with visual occlusion (dark goggles) while their arms and hands remained at rest. The motor-evoked potential in the hand muscles was suppressed by 19% in the third quarter of the eye-movement period, supporting the view that motor commands to the eye muscles cause the decrease in corticospinal excitability in the hand muscles. The amount of the suppression was not significantly different among the muscles, indicating that eye-movement-induced modulation of corticospinal excitability in a given muscle does not depend on whether the eye movement direction matches the direction of finger movement produced when that muscle contracts. Thus, the finding failed to support the hypothetical view that motor commands to the eye muscles concomitantly produce motor commands to the hand muscles. Moreover, the amount of the suppression was not significantly different between the forearm positions, indicating that the suppression was not affected by proprioception of the forearm muscles when visual feedback is absent.
Affiliation(s)
- Yuta Chujo: Graduate School of Comprehensive Rehabilitation, Osaka Prefecture University, Japan
- Yasutomo Jono: Graduate School of Comprehensive Rehabilitation, Osaka Prefecture University, Japan
- Keisuke Tani: Graduate School of Comprehensive Rehabilitation, Osaka Prefecture University, Japan
- Yoshifumi Nomura: Graduate School of Comprehensive Rehabilitation, Osaka Prefecture University, Japan
- Koichi Hiraoka: College of Health and Human Sciences, Osaka Prefecture University, Japan
13
Chen J, Valsecchi M, Gegenfurtner KR. LRP predicts smooth pursuit eye movement onset during the ocular tracking of self-generated movements. J Neurophysiol 2016;116:18-29. PMID: 27009159; DOI: 10.1152/jn.00184.2016.
Abstract
Several studies have indicated that human observers are very efficient at tracking self-generated hand movements with their gaze, yet it is not clear whether this is simply a by-product of the predictability of self-generated actions or if it results from a deeper coupling of the somatomotor and oculomotor systems. In a first behavioral experiment we compared pursuit performance as observers either followed their own finger or tracked a dot whose motion was externally generated but mimicked their finger motion. We found that even when the dot motion was completely predictable in terms of both onset time and kinematics, pursuit was not identical to that produced as the observers tracked their finger, as evidenced by increased rate of catch-up saccades and by the fact that in the initial phase of the movement gaze was lagging behind the dot, whereas it was ahead of the finger. In a second experiment we recorded EEG in the attempt to find a direct link between the finger motor preparation, indexed by the lateralized readiness potential (LRP) and the latency of smooth pursuit. After taking into account finger movement onset variability, we observed larger LRP amplitudes associated with earlier smooth pursuit onset across trials. The same held across subjects, where average LRP onset correlated with average eye latency. The evidence from both experiments concurs to indicate that a strong coupling exists between the motor systems leading to eye and finger movements and that simple top-down predictive signals are unlikely to support optimal coordination.
Affiliation(s)
- Jing Chen: Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
- Matteo Valsecchi: Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
- Karl R Gegenfurtner: Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
14
Kinematic property of target motion conditions gaze behavior and eye-hand synergy during manual tracking. Hum Mov Sci 2013;32:1253-1269. PMID: 24054436; DOI: 10.1016/j.humov.2013.03.007.
Abstract
This study investigated how frequency demand and motion feedback influenced composite ocular movements and eye-hand synergy during manual tracking. Fourteen volunteers conducted slow and fast force-tracking in which targets were displayed in either line-mode or wave-mode to guide manual tracking with target movement of direct position or velocity nature. The results showed that eye-hand synergy was a selective response of spatiotemporal coupling conditional on target rate and feedback mode. Slow and line-mode tracking exhibited stronger eye-hand coupling than fast and wave-mode tracking. Both eye movement and manual action led the target signal during fast-tracking, while the latency of ocular navigation during slow-tracking depended on the feedback mode. Slow-tracking resulted in more saccadic responses and larger pursuit gains than fast-tracking. Line-mode tracking led to larger pursuit gains but fewer and shorter gaze fixations than wave-mode tracking. During slow-tracking, incidences of saccade and gaze fixation fluctuated across a target cycle, peaking at velocity maximum and the maximal curvature of target displacement, respectively. For line-mode tracking, the incidence of smooth pursuit was phase-dependent, peaking at velocity maximum as well. Manual behavior of slow or line-mode tracking was better predicted by composite eye movements than that of fast or wave-mode tracking. In conclusion, manual tracking relied on versatile visual strategies to perceive target movements of different kinematic properties, which suggested a flexible coordinative control for the ocular and manual sensorimotor systems.
15
Horino H, Mori N, Matsugi A, Kamata N, Hiraoka K. The effect of eye movement on the control of arm movement to a target. Somatosens Mot Res 2013;30:153-159. DOI: 10.3109/08990220.2013.790807.
16
Eye-hand synergy and intermittent behaviors during target-directed tracking with visual and non-visual information. PLoS One 2012;7:e51417. PMID: 23236498; PMCID: PMC3517518; DOI: 10.1371/journal.pone.0051417.
Abstract
Visual feedback and non-visual information play different roles in tracking of an external target. This study explored the respective roles of the visual and non-visual information in eleven healthy volunteers who coupled the manual cursor to a rhythmically moving target of 0.5 Hz under three sensorimotor conditions: eye-alone tracking (EA), eye-hand tracking with visual feedback of manual outputs (EH tracking), and the same tracking without such feedback (EHM tracking). Tracking error, kinematic variables, and movement intermittency (saccade and speed pulse) were contrasted among tracking conditions. The results showed that EHM tracking exhibited larger pursuit gain, less tracking error, and less movement intermittency for the ocular plant than EA tracking. With the vision of manual cursor, EH tracking achieved superior tracking congruency of the ocular and manual effectors with smaller movement intermittency than EHM tracking, except that the rate precision of manual action was similar for both types of tracking. The present study demonstrated that visibility of manual consequences altered mutual relationships between movement intermittency and tracking error. The speed pulse metrics of manual output were linked to ocular tracking error, and saccade events were time-locked to the positional error of manual tracking during EH tracking. In conclusion, peripheral non-visual information is critical to smooth pursuit characteristics and rate control of rhythmic manual tracking. Visual information adds to eye-hand synchrony, underlying improved amplitude control and elaborate error interpretation during oculo-manual tracking.
|
17
|
The brain uses efference copy information to optimise spatial memory. Exp Brain Res 2012; 224:189-97. [PMID: 23073714] [DOI: 10.1007/s00221-012-3298-y]
Abstract
Does a motor response to a target improve the subsequent recall of the target position, or can we simply use peripheral position information to guide an accurate response? We suggest that a motor plan of the hand can be enhanced with actual motor and efference copy feedback (GoGo trials), which is absent in the passive observation of a stimulus (NoGo trials). To investigate this effect during coordinated eye and hand movements, we presented stimuli in two formats (memory guided or visually guided) under three modality conditions (eyes only, hands only (with eyes fixated), or eyes and hand together). We found that during coordinated movements, both the eye and hand response times were facilitated when efference feedback of the movement was provided. Furthermore, both eye and hand movements to remembered locations were significantly more accurate in the GoGo than in the NoGo trial types. These results reveal that an efference copy of a motor plan enhances memory for a location, an effect that is not only observed in eye movements but also translated downstream into a hand movement. These results have significant implications for how we plan, code and guide behavioural responses, and how we can optimise accuracy and timing to a given target.
|
18
|
White O, Lefèvre P, Wing AM, Bracewell RM, Thonnard JL. Active collisions in altered gravity reveal eye-hand coordination strategies. PLoS One 2012; 7:e44291. [PMID: 22984488] [PMCID: PMC3440428] [DOI: 10.1371/journal.pone.0044291]
Abstract
Most object manipulation tasks involve a series of actions demarcated by mechanical contact events, and gaze is usually directed to the locations of these events as the task unfolds. Typically, gaze foveates the target 200 ms in advance of the contact. This strategy improves manual accuracy through visual feedback and the use of gaze-related signals to guide the hand/object. Many studies have investigated eye-hand coordination in experimental and natural tasks; most of them highlighted a strong link between eye movements and hand or object kinematics. In this experiment, we analyzed gaze strategies in a collision task but in a very challenging dynamical context. Participants performed collisions while they were exposed to alternating episodes of microgravity, hypergravity and normal gravity. First, by isolating the effects of inertia in microgravity, we found that peak hand acceleration marked the transition between two modes of grip force control. Participants exerted grip forces that paralleled load force profiles, and then increased grip up to a maximum shifted after the collision. Second, we found that the oculomotor strategy adapted visual feedback of the controlled object around the collision, as demonstrated by longer durations of fixation after collision in new gravitational environments. Finally, despite large variability of arm dynamics in altered gravity, we found that saccades were remarkably time-locked to the peak hand acceleration in all conditions. In conclusion, altered gravity allowed light to be shed on predictive mechanisms used by the central nervous system to coordinate gaze, hand and grip motor actions during a mixed task that involved transport of an object and high impact loads.
Affiliation(s)
- Olivier White
- Unité de Formation et de Recherche en Sciences et Techniques des Activités Physiques et Sportives, Université de Bourgogne, Dijon, France.
|
19
|
Hagan MA, Dean HL, Pesaran B. Spike-field activity in parietal area LIP during coordinated reach and saccade movements. J Neurophysiol 2011; 107:1275-90. [PMID: 22157119] [PMCID: PMC3311693] [DOI: 10.1152/jn.00867.2011]
Abstract
The posterior parietal cortex is situated between visual and motor areas and supports coordinated visually guided behavior. Area LIP in the intraparietal sulcus contains representations of visual space and has been extensively studied in the context of saccades. However, area LIP has not been studied during coordinated movements, so it is not known whether saccadic representations in area LIP are influenced by coordinated behavior. Here, we studied spiking and local field potential (LFP) activity in area LIP while subjects performed coordinated reaches and saccades or saccades alone to remembered target locations to test whether activity in area LIP is influenced by the presence of a coordinated reach. We find that coordination significantly changes the activity of individual neurons in area LIP, increasing or decreasing the firing rate when a reach is made with a saccade compared with when a saccade is made alone. Analyzing spike-field coherence demonstrates that area LIP neurons whose firing rate is suppressed during the coordinated task have activity temporally correlated with nearby LFP activity, which reflects the synaptic activity of populations of neurons. Area LIP neurons whose firing rate increases during the coordinated task do not show significant spike-field coherence. Furthermore, LFP power in area LIP is suppressed and does not increase when a coordinated reach is made with a saccade. These results demonstrate that area LIP neurons display different responses to coordinated reach and saccade movements, and that different spike rate responses are associated with different patterns of correlated activity. The population of neurons whose firing rate is suppressed is coherently active with local populations of LIP neurons. Overall, these results suggest that area LIP plays a role in coordinating visually guided actions through suppression of coherent patterns of saccade-related activity.
Affiliation(s)
- Maureen A Hagan
- Center for Neural Science, New York University, New York, NY, USA
|
20
|
Medina J, Jax SA, Brown MJ, Coslett HB. Contributions of efference copy to limb localization: evidence from deafferentation. Brain Res 2010; 1355:104-11. [PMID: 20659430] [DOI: 10.1016/j.brainres.2010.07.063]
Abstract
Previous research with deafferented subjects suggests that efference copy can be used to update limb position. However, the contributions of efference copy to limb localization are currently unclear. We examined the performance of JDY, a woman with severe, longstanding proprioceptive deficits from a sensory peripheral neuropathy, on a reaching task to explore the contribution of efference copy to trajectory control. JDY and eight healthy controls reached without visual feedback to a target that either remained stationary or jumped to a second location after movement initiation. JDY consistently made hypermetric movements to the final target, exhibiting significant problems with amplitude control. Despite this amplitude control deficit, JDY's performance on jump trials showed that the angle of movement correction (angle between pre- and post-correction movement segments) was significantly correlated with the distance (but not time) of movement from start to turn point. These data suggest that despite an absence of proprioceptive and visual information regarding hand location, JDY derived information about movement distance that informed her movement correction on jump trials. The same type of information that permitted her to correct movement direction on-line, however, was not available for control of final arm position. We propose that efference copy can provide a consistent estimate of limb position that becomes less informative over the course of the movement. We discuss the implications of these data for current models of motor control.
Affiliation(s)
- Jared Medina
- Department of Neurology, University of Pennsylvania, Philadelphia, PA 19104, USA.
|
21
|
Sarlegna FR, Baud-Bovy G, Danion F. Delayed visual feedback affects both manual tracking and grip force control when transporting a handheld object. J Neurophysiol 2010; 104:641-53. [PMID: 20538774] [DOI: 10.1152/jn.00174.2010]
Abstract
When we manipulate an object, grip force is adjusted in anticipation of the mechanical consequences of hand motion (i.e., load force) to prevent the object from slipping. This predictive behavior is assumed to rely on an internal representation of the object dynamic properties, which would be elaborated via visual information before the object is grasped and via somatosensory feedback once the object is grasped. Here we examined this view by investigating the effect of delayed visual feedback during dextrous object manipulation. Adult participants manually tracked a sinusoidal target by oscillating a handheld object whose current position was displayed as a cursor on a screen along with the visual target. A delay was introduced between actual object displacement and cursor motion. This delay was linearly increased (from 0 to 300 ms) and decreased within 2-min trials. As previously reported, delayed visual feedback altered performance in manual tracking. Importantly, although the physical properties of the object remained unchanged, delayed visual feedback altered the timing of grip force relative to load force by about 50 ms. Additional experiments showed that this effect was not due to task complexity nor to manual tracking. A model inspired by the behavior of mass-spring systems suggests that delayed visual feedback may have biased the representation of object dynamics. Overall, our findings support the idea that visual feedback of object motion can influence the predictive control of grip force even when the object is grasped.
Affiliation(s)
- Fabrice R Sarlegna
- Institute of Movement Sciences, Centre National de la Recherche Scientifique and Université de la Méditerranée, Marseille, France
|
22
|
Sarlegna FR, Malfait N, Bringoux L, Bourdin C, Vercher JL. Force-field adaptation without proprioception: Can vision be used to model limb dynamics? Neuropsychologia 2010; 48:60-7. [PMID: 19695273] [DOI: 10.1016/j.neuropsychologia.2009.08.011]
|
23
|
Thura D, Boussaoud D, Meunier M. Hand position affects saccadic reaction times in monkeys and humans. J Neurophysiol 2008; 99:2194-202. [PMID: 18337364] [DOI: 10.1152/jn.01271.2007]
Abstract
In daily life, activities requiring the hand and eye to work separately are as frequent as activities requiring tight eye-hand coordination, and we effortlessly switch from one type of activity to the other. Such flexibility is unlikely to be achieved without each effector "knowing" where the other one is at all times, even when it is static. Here, we provide behavioral evidence that the mere position of the static hand affects one eye movement parameter: saccadic reaction time. Two monkeys were trained and 11 humans instructed to perform nondelayed or delayed visually guided saccades to either a right or a left target while holding their hand at a location either near or far from the eye target. From trial to trial, target locations and hand positions varied pseudorandomly. Subjects were tested both when they could and when they could not see their hand. The main findings are 1) the presence of the static hand in the workspace did affect saccade initiation; 2) this interaction persisted when the hand was invisible; 3) it was strongly influenced by the delay duration: hand-target proximity retarded immediate saccades, whereas it could hasten delayed saccades; and 4) this held true both for humans and for each of the two monkeys. We propose that both visual and nonvisual hand position signals are used by the primates' oculomotor system for the planning and execution of saccades, and that this may result in a hand-eye competition for spatial attentional resources that explains the delay-dependent reversal observed.
Affiliation(s)
- David Thura
- Institut de Neurosciences Cognitives de la Méditerranée, UMR 6193, Centre National de la Recherche Scientifique, 31 Chemin Joseph Aiguier, Marseille Cedex 20, France
|
24
|
Malfait N, Henriques DY, Gribble PL. Shape distortion produced by isolated mismatch between vision and proprioception. J Neurophysiol 2007; 99:231-43. [PMID: 17977930] [DOI: 10.1152/jn.00507.2007]
Abstract
To investigate the nature of the visuomotor transformation, previous studies have used pointing tasks and examined how adaptation to a spatially localized mismatch between vision and proprioception generalizes across the workspace. Whereas some studies found extensive spatial generalization of single-point remapping, consistent with the hypothesis of a global realignment of visual and proprioceptive spaces, other studies reported limited transfer associated with variations in initial limb posture. Here, we investigated the effects of spatially localized remapping in the context of a visuomanual tracking task. Subjects tracked a visual target tracing a simple two-dimensional geometrical form without visual feedback except at a single point, where the visual display of the hand was shifted relative to its actual position. After adaptation, hand paths exhibited distortions relative to the visual templates that were inconsistent with the idea of a global realignment of visual and proprioceptive spaces. Results of a visuoproprioceptive matching task showed that these distortions were not limited to active movements but also affected perception of passive limb movements.
Affiliation(s)
- Nicole Malfait
- Department of Psychology, University of Western Ontario, London, Canada.
|
25
|
Sauser EL, Billard AG. Dynamic updating of distributed neural representations using forward models. Biol Cybern 2006; 95:567-88. [PMID: 17143650] [DOI: 10.1007/s00422-006-0131-3]
Abstract
In this paper, we present a continuous attractor network model that we hypothesize will give some suggestion of the mechanisms underlying several neural processes such as velocity tuning to visual stimulus, sensory discrimination, sensorimotor transformations, motor control, motor imagery, and imitation. All of these processes share the fundamental characteristic of having to deal with the dynamic integration of motor and sensory variables in order to achieve accurate sensory prediction and/or discrimination. Such principles have already been described in the literature by other high-level modeling studies (Decety and Sommerville in Trends Cogn Sci 7:527-533, 2003; Oztop et al. in Neural Netw 19(3):254-271, 2006; Wolpert et al. in Philos Trans R Soc 358:593-602, 2003). With respect to these studies, our work is more concerned with biologically plausible neural dynamics at a population level. Indeed, we show that a relatively simple extension of the classical neural field models can endow these networks with additional dynamic properties for updating their internal representation using external commands. Moreover, an analysis of the interactions between our model and external inputs also shows interesting properties, which we argue are relevant for a better understanding of the neural processes of the brain.
Affiliation(s)
- Eric L Sauser
- Learning Algorithms and Systems Laboratory (LASA), Ecole Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne, Switzerland.
|
26
|
Sarlegna FR, Gauthier GM, Bourdin C, Vercher JL, Blouin J. Internally driven control of reaching movements: A study on a proprioceptively deafferented subject. Brain Res Bull 2006; 69:404-15. [PMID: 16624672] [DOI: 10.1016/j.brainresbull.2006.02.005]
Abstract
We investigated the possibility of controlling reaching movements on the sole basis of central mechanisms, i.e., without peripheral feedback on hand and target positions. A deafferented subject (GL) and control subjects reached with the unseen hand for a straight-ahead target that could be displaced laterally at movement onset. The shifted target was continuously or briefly lit, or not visible. In this latter condition, a beep from either side of the subjects' head alone signaled the change in the movement goal, so that movements could only be controlled through an internal representation of the memorised target position. Compared to controls, GL showed quantitatively similar corrections (77% of the target displacement, on average) and similar reaction times to the target shift (mean = 516 ms), regardless of target visual information. These results highlight a remarkable capacity for controlling reaching movements on the sole basis of internally driven processes. On the other hand, trajectories in double-step trials differed drastically between GL and controls. Controls' trajectories were composed of two segments, the second of which brought the hand directly toward the displaced target. The patient produced three-segment, stair-like trajectories. The first and third segments were mainly in the sagittal plane, and the second segment was a vector-image of the lateral target shift. A control experiment showed that GL's trajectories were not the result of a voluntary strategy used to adjust movement trajectory in the absence of peripheral information on hand position. We suggest that GL's trajectories reflect a deficit in interjoint coordination in the absence of proprioception.
Affiliation(s)
- Fabrice R Sarlegna
- UMR Mouvement & Perception, CNRS and Université de la Méditerranée, 163 Avenue de Luminy, 13288 Marseille Cedex 9, France
|
27
|
Stenneken P, Prinz W, Cole J, Paillard J, Aschersleben G. The effect of sensory feedback on the timing of movements: evidence from deafferented patients. Brain Res 2006; 1084:123-31. [PMID: 16564509] [DOI: 10.1016/j.brainres.2006.02.057]
Abstract
The role of sensory feedback in the control of movements was investigated in two deafferented patients with complete loss of cutaneous touch and movement/position sense below the neck and two control groups of different ages. In a synchronized repetitive finger-tapping task in time with a regular auditory pacing signal, the deafferented participants showed a strong influence of extrinsic feedback. In contrast to controls who demonstrated a typical asynchrony between their taps and the pacing signal in all feedback conditions, the deafferented participants, with auditory feedback and visual monitoring, showed no asynchrony between finger taps and the pacing signal. These findings support the view that sensory information plays a crucial role in the anticipatory timing of movements.
Affiliation(s)
- Prisca Stenneken
- Max Planck Institute for Human Cognitive and Brain Sciences, Department of Psychology (formerly Max Planck Institute for Psychological Research, Cognition and Action), Munich, Germany.
|
28
|
Balslev D, Nielsen FA, Lund TE, Law I, Paulson OB. Similar brain networks for detecting visuo-motor and visuo-proprioceptive synchrony. Neuroimage 2006; 31:308-12. [PMID: 16406606] [DOI: 10.1016/j.neuroimage.2005.11.037]
Abstract
The ability to recognize feedback from own movement as opposed to the movement of someone else is important for motor control and social interaction. The neural processes involved in feedback recognition are incompletely understood. Two competing hypotheses have been proposed: the stimulus is compared with either (a) the proprioceptive feedback or with (b) the motor command and if they match, then the external stimulus is identified as feedback. Hypothesis (a) predicts that the neural mechanisms or brain areas involved in distinguishing self from other during passive and active movement are similar, whereas hypothesis (b) predicts that they are different. In this fMRI study, healthy subjects saw visual cursor movement that was either synchronous or asynchronous with their active or passive finger movements. The aim was to identify the brain areas where the neural activity depended on whether the visual stimulus was feedback from own movement and to contrast the functional activation maps for active and passive movement. We found activity increases in the right temporoparietal cortex in the condition with asynchronous relative to synchronous visual feedback from both active and passive movements. However, no statistically significant difference was found between these sets of activated areas when the active and passive movement conditions were compared. With a posterior probability of 0.95, no brain voxel had a contrast effect above 0.11% of the whole-brain mean signal. These results do not support the hypothesis that recognition of visual feedback during active and passive movement relies on different brain areas.
Affiliation(s)
- Daniela Balslev
- Neurobiology Research Unit, N9201, Copenhagen University Hospital, Rigshospitalet, Copenhagen, Denmark.
|
29
|
Guillaud E, Gauthier G, Vercher JL, Blouin J. Fusion of visuo-ocular and vestibular signals in arm motor control. J Neurophysiol 2005; 95:1134-46. [PMID: 16221749] [DOI: 10.1152/jn.00453.2005]
Abstract
Keeping the finger pointing at an Earth-fixed object during body displacements can be achieved if compensatory arm movements counteract the effect of the rotation on the hand's position in space. Here we investigated the fusion of signals that originated from systems having different neurophysiological properties (i.e., the visuo-oculomotor and vestibular systems) in the production of such compensatory arm movements. To this end, we analyzed the subjects' performance in three conditions that differed according to the information they provided about relative target-body motion. This information originated either from the vestibular or visuo-oculomotor system, or from a combination of the two. To highlight the integration of visuo-oculomotor and vestibular signals, we compared the arm response to motion frequencies presumed to allow or not to allow optimal vestibular and oculomotor responses. When they could be used in isolation, the ocular signals allowed long-latency but precise kinematic control of the arm movement, whereas vestibular signals allowed an accurate motor response early in the rotation but their contribution declined as body rotation developed. Optimal performance was obtained throughout the whole movement and for all rotation frequencies when the visuo-oculomotor and vestibular signals could be used together. This increase in hand-tracking performance could not be explained by a unimodal model or an additive model of vestibular and ocular cues, even when using weighted signals. Rather, the results supported a functional model in which vestibular and visuo-oculomotor signals have different influences on the temporal and spatial aspects of hand movement compensating for body motion.
Affiliation(s)
- Etienne Guillaud
- Unité Mixte de Recherche Mouvement et Perception, Centre National de la Recherche Scientifique et Université de la Méditerranée, Marseille, France
|
30
|
Sailer U, Eggert T, Straube A. Impaired temporal prediction and eye-hand coordination in patients with cerebellar lesions. Behav Brain Res 2005; 160:72-87. [PMID: 15836902] [DOI: 10.1016/j.bbr.2004.11.020]
Abstract
This study investigated the effect of cerebellar lesions on temporal prediction and coordination in eye and hand movements. Nine patients with cerebellar lesions were compared to controls while they made saccades with and without simultaneous pointing movements towards a target that was either temporally predictable or non-predictable. The direction and amplitude of the target step were always predictable. Patients made many more early and late saccades than controls, but an equal number of visually triggered saccades. This suggests that inappropriate saccades could be suppressed during the preparation of a goal-directed saccade. Hand movement frequency did not differ between the two groups. Thus, cerebellar lesions can induce inappropriate saccades more easily than inappropriate hand movements. Controls, but not patients, generated visually triggered saccades of shorter latencies when the target was temporally predictable. Thus, the patients could not use information about target timing to synchronise visually triggered saccades with the target. They could, however, use this information to improve the suppression of inappropriate saccades. Regarding coordination, patients showed impairments in synchronising saccades with hand movements. Nevertheless, hand movements led to an enhancement of anticipatory saccades in patients as in controls. Moreover, hand movements and temporal predictability affected saccadic accuracy similarly in both groups. These results suggest that cerebellar lesions do not generally prevent access to temporal information on the rhythm of a target sequence or the timing of a planned hand movement. More specifically, the cerebellum seems to be crucial for synchronizing saccades with such learned or planned temporal events.
Affiliation(s)
- Uta Sailer
- Section for Physiology, Department of Integrative Medical Biology, Umeå University, SE-90187 Umeå, Sweden.
|
31
|
van Donkelaar P, Siu KC, Walterschied J. Saccadic output is influenced by limb kinetics during eye-hand coordination. J Mot Behav 2004; 36:245-52. [PMID: 15262621] [DOI: 10.3200/jmbr.36.3.245-252]
Abstract
In several recent studies, saccadic eye movements were found to be influenced by concurrent reaching movements. The authors investigated whether that influence originates in limb kinematic or kinetic signals. To dissociate those 2 possibilities, the authors required participants (N = 6) to generate pointing movements with a mass that either resisted or assisted limb motion. With practice, participants were able to generate pointing responses with very similar kinematics but whose kinetics varied in a systematic manner. The results showed that saccadic output was altered by the amount of force required to move the arm, consistent with an influence from limb kinetic signals. Because the interaction occurred before the pointing response began, the authors conclude that a predictive signal related to limb kinetics modulates saccadic output during tasks requiring eye-hand coordination.
Affiliation(s)
- Paul van Donkelaar
- Department of Exercise and Movement Science, Institute of Neuroscience, University of Oregon, Eugene, OR, USA.
|
32
|
Vercher JL, Sarès F, Blouin J, Bourdin C, Gauthier G. Role of sensory information in updating internal models of the effector during arm tracking. Prog Brain Res 2003; 142:203-22. [PMID: 12693263] [DOI: 10.1016/s0079-6123(03)42015-3]
Abstract
This chapter is divided into three main parts. First, on the basis of the literature, we briefly discuss how the recent introduction of the concept of internal models by Daniel Wolpert and Mitsuo Kawato contributes to a better understanding of what motor learning and motor adaptation are. Then, we present a model of eye-hand co-ordination during self-moved target tracking, which we used as a way to specifically address these topics. Finally, we show some evidence about the use of proprioceptive information for updating the internal models, in the context of eye-hand co-ordination. Motor and afferent information appears to contribute to the parametric adjustment (adaptation) between the arm motor command and visual information about arm motion. The study reported here was aimed at assessing the contribution of arm proprioception in building (learning) and updating (adaptation) these representations. The subjects (including a deafferented subject) had to make back and forth movements with their forearm in the horizontal plane, over a learned amplitude and at a constant frequency, and to track an arm-driven target with their eyes. The dynamical conditions of arm movement were altered (unexpectedly or systematically) during the movement by changing the mechanical properties of the manipulandum. The results showed a significant change in the latency and gain of the smooth pursuit system before and after the perturbation for the control subjects, but not for the deafferented subject. Moreover, in control subjects, vibration of the arm muscles prevented adaptation to the mechanical perturbation. These results suggest that in a self-moved target tracking task, the arm motor system shares with the smooth pursuit system an internal representation of the arm's dynamical properties, and that arm proprioception is necessary to build this internal model. As suggested by Ghez et al. (1990) (Cold Spring Harbor Symp. Quant. Biol., 55: 837-847), proprioception would allow control subjects to learn the inertial properties of the limb.
Affiliation(s)
- Jean-Louis Vercher
- UMR 6152 'Mouvement et Perception', CNRS, Université de la Méditerranée, Campus scientifique de Luminy, F-13288 Marseille, France.
|
33
|
Bekkering H, Sailer U. Commentary: coordination of eye and hand in time and space. Prog Brain Res 2003; 140:365-73. [PMID: 12508603 DOI: 10.1016/s0079-6123(02)40063-5] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/28/2023]
Abstract
Every day of our lives starts with a succession of actions that require eye-hand coordination. From turning off the alarm clock and getting dressed to putting toothpaste on the brush and preparing coffee: all these goal-directed hand movements need to be coordinated with information from the eye. When performing such simultaneous goal-directed eye and hand movements, both the time and the location at which eye and hand land on the object need to be harmonized. To localize the alarm clock, we need to see it before we hit it. In order to use this visual information for an accurate hand movement, we need the eye to land on the same position, i.e. eye and hand both need to be on the alarm clock rather than the water glass. These two aspects, temporal and spatial coordination, have encouraged a great deal of research. On the following pages, we summarize a number of findings on how this coordination could be achieved.
Affiliation(s)
- Harold Bekkering
- Department of Experimental and Work Psychology, University of Groningen, Grote Kruisstraat 2/1, 9712 TS Groningen, The Netherlands
|
34
|
A real-time state predictor in motor control: study of saccadic eye movements during unseen reaching movements. J Neurosci 2002. [PMID: 12196595 DOI: 10.1523/jneurosci.22-17-07721.2002] [Citation(s) in RCA: 80] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Theoretical motor control predicts that because of delays in sensorimotor pathways, a neural system should exist in the brain that uses an efferent copy of commands to the arm, sensory feedback, and an internal model of the dynamics of the arm to predict the future state of the hand (i.e., a forward model). We tested this theory under the hypothesis that saccadic eye movements tracking an unseen reaching movement would reflect the output of this state predictor. We found that in unperturbed reaching movements, saccades occurring at any time t consistently provided an unbiased estimate of hand position at t + 196 msec. To investigate the behavior of this predictor during feedback error control, we applied 50 msec random-force perturbations to the moving hand. Saccades showed a sharp inhibition at 100 msec after perturbation. At approximately 170 msec, there was a sharp increase in saccade probability. These postperturbation saccades were an unbiased estimator of hand position at saccade time t + 150 msec. The ability of the brain to guide saccades to the future position of the hand failed when a force field unexpectedly changed the dynamics of the hand immediately after perturbation. The behavior of the eyes suggested that during reaching movements, the brain computes an estimate of future hand position based on an internal model that relies on real-time proprioceptive feedback. When an error occurs in reaching movements, the estimate of future hand position is recomputed. The saccade inhibition period that follows the hand perturbation may indicate the length of time this computation takes.
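The core computation described in this abstract — extrapolating hand position over the sensorimotor delay from a current state estimate — can be reduced to a toy kinematic sketch (our own illustration; the function name, units, and the constant-velocity simplification are assumptions, not the authors' model):

```python
def predict_hand_position(pos_m, vel_mps, lookahead_s):
    """Toy forward predictor: extrapolate hand position over the
    sensorimotor delay, assuming velocity stays constant."""
    return pos_m + vel_mps * lookahead_s

# Hand at 0.30 m moving at 0.5 m/s; predict ~196 ms ahead, the
# lead time the saccade data in the abstract suggest.
predicted = predict_hand_position(0.30, 0.5, 0.196)  # ~0.398 m
```

A real predictor would also fold in proprioceptive feedback and re-estimate after perturbations, as the abstract's force-pulse results indicate.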
|
35
|
Abstract
When observers pursue a moving target with their eyes, they use predictions of future target positions in order to keep the target within the fovea. It was suggested that these predictions of smooth pursuit (SP) eye movements are computed only from the visual feedback of the target characteristics. As a consequence, if the target vanishes unexpectedly, the eye movements do not stop immediately, but they overshoot the vanishing point. We compared the spatial and temporal features of such predictive eye movements in a task with or without intentional control over the target vanishing point. If the observers stopped the target with a button press, the overshoot of the eyes was reduced compared to a condition where the offset was computer generated. Accordingly, the eyes started to decelerate well before the target offset and lagged further behind the target when it disappeared. The involvement of intentionally-generated expectancies in eye movement control was also obvious in the spatial trajectories of the eyes, which showed a clear flexion in anticipation of the circular motion path we used. These findings are discussed together with neurophysiological mechanisms underlying the SP eye movements.
Affiliation(s)
- Sonja Stork
- Max Planck Institute for Psychological Research, Amalienstrasse 33, 80799 Munich, Germany.
|
36
|
Abstract
The cerebellum is known to have important functions in motor control, coordination, motor learning, and timing. It may have other "higher" functions as well, up to and including cognitive processing independent of motor behavior. In this article, we will review some of the evidence from functional imaging, lesion studies, electrophysiological recordings, and anatomy which support the theory that the cerebellum provides a "forward model" of the motor system. This forward model would be used for control of movement; it could also underlie a cerebellar role in coordination. In this role, the forward model would generate time-specific signals predicting the motion of each motor effector, essential for predictive control of, for example, eye and hand movements. Data are presented from human eye and hand tracking that support this. Tracking performance is better if eye and hand follow the same spatial trajectory, but better still if the eye leads the hand by about 75 to 100 ms. This suggests that information from the ocular control system feeds into the manual control system to assist its tracking.
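The 75-100 ms eye lead reported here is the kind of quantity one estimates by cross-correlating eye and hand traces and picking the lag that maximizes their agreement. A brute-force sketch (our own illustration, not the authors' analysis code; mean removal and normalization are omitted for brevity):

```python
def best_lag(eye, hand, max_lag):
    """Lag (in samples) at which the eye trace best matches the
    later hand trace, via brute-force cross-correlation."""
    def score(lag):
        pairs = list(zip(eye, hand[lag:]))
        return sum(e * h for e, h in pairs) / len(pairs)
    return max(range(max_lag + 1), key=score)

# Hand copies the eye trace three samples later (eye leads by 3).
eye = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 0.0, 0.0, 0.0]
hand = [0.0, 0.0, 0.0] + eye[:7]
lag = best_lag(eye, hand, max_lag=5)  # -> 3
```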
Affiliation(s)
- R C Miall
- University Laboratory of Physiology, Parks Road, Oxford, OX1 3PT, United Kingdom.
|
37
|
Laufer Y, Hocherman S, Dickstein R. Accuracy of reproducing hand position when using active compared with passive movement. Physiother Res Int 2001; 6:65-75. [PMID: 11436672 DOI: 10.1002/pri.215] [Citation(s) in RCA: 47] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
Abstract
BACKGROUND AND PURPOSE Evaluating proprioception is relevant to physical rehabilitation because of its significance in motor control. One method of proprioceptive testing involves having subjects either imitate or point at a joint position or movement that was presented via passive movement. However, as the muscle spindles are subject to central fusimotor control, the proprioceptive system may be better tuned to movements created by active muscular contraction than to passive movements. The objective of the present study was to determine whether accuracy of reproducing hand position depends on whether proprioceptive input is obtained via an active or a passive movement. METHOD Thirty-nine healthy volunteers (mean age (+/- SD) 24.6 (+/- 3.6) years) participated in the study. Subjects' right hands, obscured from view, were acoustically guided to five targets on a digitizer tablet with either an active or a passive upper extremity movement. Subjects were then asked to reproduce the targets' locations either by reaching to them with the unseen hand or by use of a laser beam. Distance from target and angular deviations were calculated in both absolute and relative terms. Repeated measures analysis of variance (ANOVA) was performed for each variable, followed by predetermined contrasts. RESULTS Comparison between the active and passive conditions when reconstruction of target location was guided kinaesthetically indicates significant differences in absolute distance, range and angular deviation. The comparison when reconstruction was guided visually indicates significant differences in absolute distance, absolute angle and angular deviation. CONCLUSIONS The ability to reproduce hand position accurately is enhanced when position is encoded by active rather than passive upper extremity movement. The results have implications for the design of strategies for evaluating as well as treating patients with impaired proprioception and limited movement.
Affiliation(s)
- Y Laufer
- Flieman Geriatric Rehabilitation Center, Haifa, Israel
|
38
|
Timmermann L, Gross J, Schmitz F, Freund HJ, Schnitzler A. Involvement of the motor cortex in pseudochoreoathetosis. Mov Disord 2001; 16:876-81. [PMID: 11746617 DOI: 10.1002/mds.1180] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
Abstract
The pathophysiological background of involuntary movements in pseudochoreoathetosis is unclear. We therefore recorded in four patients with pseudochoreoathetosis and in six age-matched controls cortical activity with a whole-head magnetoencephalography (MEG) system and surface EMGs from hand muscles. Subjects performed the following tasks: 1) rest, and 2) constant finger stretch during forearm elevation; controls additionally simulated pseudochoreoathetotic finger movements. During rest, the patients showed involuntary finger movements associated with excessive MEG-EMG coherence at frequencies between 6 and 20 Hz, whereas coherence in controls simulating pseudochoreoathetotic movements did not exceed noise level (P < 0.02). During finger stretch, MEG-EMG coherence in patients was similar to that of controls. Cortical sources of MEG-EMG coherence in patients were localized in the contralateral motor cortex. We conclude that pseudochoreoathetosis is associated with pathologically increased corticomuscular coherence and thus differs, neurophysiologically, from voluntarily simulated pseudochoreoathetotic movements. The enhanced MEG-EMG coherence in pseudochoreoathetosis probably reflects a pathologically strong motor cortical drive of spinal motorneurons after deafferentation.
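MEG-EMG coherence, the measure used in this study, is the squared cross-spectrum normalized by the two auto-spectra, averaged over data segments. A minimal numpy sketch (our own illustration; the paper's windowing, segmenting, and noise-level statistics are more involved):

```python
import numpy as np

def coherence(x_segs, y_segs):
    """Magnitude-squared coherence per frequency bin, averaged over
    segments: |<X Y*>|^2 / (<|X|^2> <|Y|^2>)."""
    X = np.fft.rfft(x_segs, axis=1)
    Y = np.fft.rfft(y_segs, axis=1)
    Sxy = (X * np.conj(Y)).mean(axis=0)
    Sxx = (np.abs(X) ** 2).mean(axis=0)
    Syy = (np.abs(Y) ** 2).mean(axis=0)
    return np.abs(Sxy) ** 2 / (Sxx * Syy)

# Sanity check: a signal is perfectly coherent with itself.
segs = np.array([[1.0, 2.0, 4.0, 8.0],
                 [2.0, 1.0, 8.0, 4.0]])
coh = coherence(segs, segs)  # ~1.0 at every frequency bin
```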
Affiliation(s)
- L Timmermann
- Department of Neurology, Heinrich-Heine University, Duesseldorf, Germany
|
39
|
Miall RC, Reckess GZ, Imamizu H. The cerebellum coordinates eye and hand tracking movements. Nat Neurosci 2001; 4:638-44. [PMID: 11369946 DOI: 10.1038/88465] [Citation(s) in RCA: 193] [Impact Index Per Article: 8.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
The cerebellum is thought to help coordinate movement. We tested this using functional magnetic resonance imaging (fMRI) of the human brain during visually guided tracking tasks requiring varying degrees of eye-hand coordination. The cerebellum was more active during independent rather than coordinated eye and hand tracking. However, in three further tasks, we also found parametric increases in cerebellar blood oxygenation signal (BOLD) as eye-hand coordination increased. Thus, the cerebellar BOLD signal has a non-monotonic relationship to tracking performance, with high activity during both coordinated and independent conditions. These data provide the most direct evidence from functional imaging that the cerebellum supports motor coordination. Its activity is consistent with roles in coordinating and learning to coordinate eye and hand movement.
Affiliation(s)
- R C Miall
- University Laboratory of Physiology, Parks Road, Oxford, OX1 3PT, UK.
|
40
|
Weeks RA, Gerloff C, Honda M, Dalakas MC, Hallett M. Movement-related cerebellar activation in the absence of sensory input. J Neurophysiol 1999; 82:484-8. [PMID: 10400975 DOI: 10.1152/jn.1999.82.1.484] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Movement-related cerebellar activation may be due to sensory or motor processing. Ordinarily, sensory and motor processing are obligatorily linked, but in patients who have severe pansensory neuropathies with normal muscle strength, motor activity occurs in isolation. In the present study, positron emission tomography and functional magnetic resonance imaging in such patients showed no cerebellar activation with passive movement, whereas there was prominent movement-related cerebellar activation despite absence of proprioceptive or visual input. The results indicate that motor processing occurs within the cerebellum and do not support the recently advanced view that the cerebellum is primarily a sensory organ.
Affiliation(s)
- R A Weeks
- Human Motor Control Section, Medical Neurology Branch, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892-1428, USA
|
41
|
Scarchilli K, Vercher JL, Gauthier GM, Cole J. Does the oculo-manual co-ordination control system use an internal model of the arm dynamics? Neurosci Lett 1999; 265:139-42. [PMID: 10327188 DOI: 10.1016/s0304-3940(99)00224-4] [Citation(s) in RCA: 20] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
The hypothesis that during self-moved target tracking, the eye-arm co-ordination control system uses an internal model of the arm dynamics was tested. The contribution of arm proprioception to this model was also assessed. Subjects (nine healthy adults and one deafferented subject) were requested to make forearm movements and visually track an arm-driven target. Unexpected changes in mechanical properties of the manipulandum were used to modify the dynamical conditions of arm movement. The smooth pursuit gain (SPG) was computed before and during the perturbation. Results showed a decrease of SPG during perturbation in control subjects only. We propose that an internal model of the arm dynamics may be used to co-ordinate eye and arm movements, and arm proprioception may contribute to this internal model.
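The smooth pursuit gain (SPG) computed before and during the perturbation is, operationally, the ratio of eye velocity to target velocity over matched samples. A least-squares version as a sketch (our own illustration; the paper's exact estimator may differ):

```python
def smooth_pursuit_gain(eye_vel, target_vel):
    """Least-squares gain relating eye velocity to target velocity:
    the scalar g minimizing sum((eye - g * target)^2)."""
    num = sum(e * t for e, t in zip(eye_vel, target_vel))
    den = sum(t * t for t in target_vel)
    return num / den

# Eye consistently moves at 90% of target velocity -> gain 0.9.
gain = smooth_pursuit_gain([9.0, 9.0, 9.0], [10.0, 10.0, 10.0])
```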
Affiliation(s)
- K Scarchilli
- UMR CNRS Mouvement et Perception, Université de la Méditerranée, Marseille, France
|
42
|
Collins CJ, Barnes GR. Independent control of head and gaze movements during head-free pursuit in humans. J Physiol 1999; 515 ( Pt 1):299-314. [PMID: 9925900 PMCID: PMC2269145 DOI: 10.1111/j.1469-7793.1999.299ad.x] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022] Open
Abstract
1. Head and gaze movements are usually highly co-ordinated. Here we demonstrate that under certain circumstances they can be controlled independently and we investigate the role of anticipatory activity in this process. 2. In experiment 1, subjects tracked, with head and eyes, a sinusoidally moving target. Overall, head and gaze trajectories were tightly coupled. From moment to moment, however, the trajectories could be very different and head movements were significantly more variable than gaze movements. 3. Predictive head and gaze responses can be elicited by repeated presentation of an intermittently illuminated, constant velocity target. In experiment 2 this protocol elicited a build-up of anticipatory head and gaze velocity, in opposing directions, when subjects made head movements in the opposite direction to target movement whilst maintaining gaze on target. 4. In experiment 3, head and gaze movements were completely uncoupled. Subjects followed, with head and gaze, respectively, two targets moving at different, harmonically unrelated frequencies. This was possible when both targets were visual, and also when gaze followed a visual target at one frequency whilst the head was oscillated in time with an auditory tone modulated at the second frequency. 5. We conclude that these results provide evidence of a visuomotor predictive mechanism that continuously samples visual feedback information and stores it such that it can be accessed by either the eye or the head to generate anticipatory movements. This overcomes time delays in visuomotor processing and facilitates time-sharing of motor activities, making possible the performance of two tasks simultaneously.
Affiliation(s)
- C J Collins
- Medical Research Council, Human Movement and Balance Unit, Institute of Neurology, Queen Square, London, WC1N 3BG, UK.
|
43
|
Miall RC. The cerebellum, predictive control and motor coordination. Novartis Found Symp 1999; 218:272-84; discussion 284-90. [PMID: 9949826 DOI: 10.1002/9780470515563.ch15] [Citation(s) in RCA: 39] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/29/2022]
Abstract
I argue that the cerebellum has at least two related roles, both subserved by its operation as a 'forward model' of the motor system. First, it provides an internal state estimate or sensory prediction that is used for online control of movements; second, these predictive state estimates are used to coordinate actions by different effectors in the normal coordination of eye and hand, reach and grasp, etc. Preliminary electrophysiological data from cerebellar cortical neurons in the monkey support the hypothesis that a proportion of cells code for the sensory consequences of movement. In a contrast between normal visually guided movement of a cursor and mirror-reversed movement, approximately half the sample of 47 directionally sensitive cells were found to code for the movement of the cursor controlled by the monkey's limb, and not the limb movement itself. Functional imaging of the human cerebellum further supports the hypothesis that the cerebellum is involved in motor coordination. Subjects were tested performing ocular tracking, manual tracking without eye movement, or combined eye and hand tracking. Activation of cerebellar areas related to movement of eyes or hand alone was significantly enhanced when the subjects performed coordinated eye and hand tracking of a visual target.
Affiliation(s)
- R C Miall
- University Laboratory of Physiology, Oxford, UK
|
44
|
Vercher JL, Gauthier GM, Cole J, Blouin J. Role of arm proprioception in calibrating the arm-eye temporal coordination. Neurosci Lett 1997; 237:109-12. [PMID: 9453227 DOI: 10.1016/s0304-3940(97)00816-1] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
When subjects track with the eyes an arm-attached target, eye latency is shorter than when tracking an external target. This improved synchrony could result from either a common command addressed to the two systems or from an influence of the arm command on eye motion initiation. According to the first hypothesis, the eyes should start moving long before the arm, because of the difference in dynamics. We recorded arm and eye motion together with biceps muscle activity in controls and a deafferented subject. Data support the second hypothesis. Moreover, the deafferented subject showed a lesser correlation between arm and eye motions than controls, suggesting a role for arm proprioception in the calibration of the temporal relationship between arm and eye movements.
Affiliation(s)
- J L Vercher
- UMR CNRS, Mouvement et Perception, Université de la Méditerranée CP 910, Marseille, France.
|
45
|
Abstract
Based on theoretical and computational studies, it has been suggested that the central nervous system (CNS) internally simulates the behaviour of the motor system in planning, control and learning. Such an internal "forward" model is a representation of the motor system that uses the current state of the motor system and the motor command to predict the next state. We outline the uses of such internal models for solving several fundamental computational problems in motor control and then review the evidence for their existence and use by the CNS. Finally, we speculate on how the location of an internal model within the CNS may be identified. Copyright 1996 Elsevier Science Ltd.
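The definition in this abstract — a forward model maps the current state plus an efference copy of the motor command onto a prediction of the next state — can be written down as a minimal linear sketch (our own toy point-mass example; the matrices and names are assumptions, not the authors' formulation):

```python
import numpy as np

def forward_model(state, command, A, B):
    """Predict the next state from the current state and an
    efference copy of the motor command: x' = A x + B u."""
    return A @ state + B @ command

# Point mass (1 kg) in 1-D: state = [position, velocity], dt = 10 ms.
dt, mass = 0.01, 1.0
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[0.0],
              [dt / mass]])

state = np.array([0.0, 0.0])
for _ in range(100):                     # 1 s of a constant 1 N push
    state = forward_model(state, np.array([1.0]), A, B)
# state[1] (velocity) is now ~1.0 m/s; state[0] (position) ~0.495 m
```

Chaining such one-step predictions, and correcting them with delayed sensory feedback, is what the abstract's "internal simulation" amounts to computationally.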
Affiliation(s)
- D. M. Wolpert
- University Laboratory of Physiology, Oxford; and Sobell Department, Institute of Neurology, London, UK
|