1. Luabeya GN, Yan X, Freud E, Crawford JD. Influence of gaze, vision, and memory on hand kinematics in a placement task. J Neurophysiol 2024; 132:147-161. PMID: 38836297. DOI: 10.1152/jn.00362.2023.
Abstract
People usually reach for objects to place them in some position and orientation, but the placement component of this sequence is often ignored. For example, reaches are influenced by gaze position, visual feedback, and memory delays, but their influence on object placement is unclear. Here, we tested these factors in a task where participants placed and oriented a trapezoidal block against two-dimensional (2-D) visual templates displayed on a frontally located computer screen. In experiment 1, participants matched the block to three possible orientations: 0° (horizontal), +45° and -45°, with gaze fixated 10° to the left/right. The hand and template either remained illuminated (closed-loop), or visual feedback was removed (open-loop). Here, hand location consistently overshot the template relative to gaze, especially in the open-loop task; likewise, orientation was influenced by gaze position (depending on template orientation and visual feedback). In experiment 2, a memory delay was added, and participants sometimes performed saccades (toward, away from, or across the template). In this task, the influence of gaze on orientation vanished, but location errors were influenced by both template orientation and final gaze position. Contrary to our expectations, the previous saccade metrics also impacted placement overshoot. Overall, hand orientation was influenced by template orientation in a nonlinear fashion. These results demonstrate interactions between gaze and orientation signals in the planning and execution of hand placement and suggest different neural mechanisms for closed-loop, open-loop, and memory delay placement.

NEW & NOTEWORTHY: Eye-hand coordination studies usually focus on object acquisition, but placement is equally important. We investigated how gaze position influences object placement toward a 2-D template with different levels of visual feedback. Like reach, placement overestimated goal location relative to gaze and was influenced by previous saccade metrics. Gaze also modulated hand orientation, depending on template orientation and level of visual feedback. Gaze influence was feedback-dependent, with location errors having no significant effect after a memory delay.
Affiliation(s)
- Gaelle N Luabeya
  - Centre for Vision Research and Vision: Science to Applications Program, York University, Toronto, Ontario, Canada
  - Department of Biology, York University, Toronto, Ontario, Canada
- Xiaogang Yan
  - Centre for Vision Research and Vision: Science to Applications Program, York University, Toronto, Ontario, Canada
- Erez Freud
  - Centre for Vision Research and Vision: Science to Applications Program, York University, Toronto, Ontario, Canada
  - Department of Biology, York University, Toronto, Ontario, Canada
  - Department of Psychology, York University, Toronto, Ontario, Canada
- J Douglas Crawford
  - Centre for Vision Research and Vision: Science to Applications Program, York University, Toronto, Ontario, Canada
  - Department of Biology, York University, Toronto, Ontario, Canada
  - Department of Psychology, York University, Toronto, Ontario, Canada
  - Department of Kinesiology & Health Sciences, York University, Toronto, Ontario, Canada
  - Centre for Integrative and Applied Neuroscience, York University, Toronto, Ontario, Canada
2. Maij F, Seegelke C, Medendorp WP, Heed T. External location of touch is constructed post-hoc based on limb choice. eLife 2020; 9:e57804. PMID: 32945257. PMCID: PMC7561349. DOI: 10.7554/eLife.57804.
Abstract
When humans indicate on which hand a tactile stimulus occurred, they often err when their hands are crossed. This finding seemingly supports the view that the automatically determined touch location in external space affects limb assignment: the crossed right hand is localized in left space, and this conflict presumably provokes hand assignment errors. Here, participants judged on which hand the first of two stimuli, presented during a bimanual movement, had occurred, and then indicated its external location by a reach-to-point movement. When participants incorrectly chose the hand stimulated second, they pointed to where that hand had been at the correct, first time point, though no stimulus had occurred at that location. This behavior suggests that stimulus localization depended on hand assignment, not vice versa. It is, thus, incompatible with the notion of automatic computation of external stimulus location upon occurrence. Instead, humans construct external touch location post-hoc and on demand.
Affiliation(s)
- Femke Maij
  - Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Christian Seegelke
  - Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany
  - Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp
  - Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Tobias Heed
  - Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany
  - Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
3. Goodman R, Manson GA, Tremblay L. Age-related Differences in Sensorimotor Transformations for Visual and/or Somatosensory Targets: Planning or Execution? Exp Aging Res 2020; 46:128-138. DOI: 10.1080/0361073x.2020.1716153.
Affiliation(s)
- Rachel Goodman
  - Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, Toronto, Ontario, Canada
- Gerome A. Manson
  - Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, Toronto, Ontario, Canada
- Luc Tremblay
  - Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, Toronto, Ontario, Canada
4. Dupin L, Haggard P. Dynamic Displacement Vector Interacts with Tactile Localization. Curr Biol 2019; 29:492-498.e3. PMID: 30686734. PMCID: PMC6370943. DOI: 10.1016/j.cub.2018.12.032.
Abstract
Locating a tactile stimulus on the body seems effortless and straightforward. However, the perceived location of a tactile stimulation can differ from its physical location [1, 2, 3]. Tactile mislocalizations can depend on the timing of successive stimulations [2, 4, 5], tactile motion mechanisms [6], or processes that “remap” stimuli from skin locations to external space coordinates [7, 8, 9, 10, 11]. We report six experiments demonstrating that the perception of tactile localization on a static body part is strongly affected by the displacement between the locations of two successive task-irrelevant actions. Participants moved their index finger between two keys. Each keypress triggered synchronous tactile stimulation at a randomized location on the immobilized wrist or forehead. Participants reported the location of the second tactile stimulation relative to the first. The direction of either active finger movements or passive finger displacements biased participants’ tactile orientation judgements (experiment 1). The effect generalized to tactile stimuli delivered to other body sites (experiment 2). Two successive keypresses, by different fingers at distinct locations, reproduced the effect (experiment 3). The effect remained even when the hand that moved was placed far from the tactile stimulation site (experiments 4 and 5). Temporal synchrony within 600 ms between the movement and tactile stimulations was necessary for the effect (experiment 6). Our results indicate that a dynamic displacement vector, defined as the location of one sensorimotor event relative to the one before, plays a strong role in structuring tactile spatial perception.

Highlights:
- Human tactile localization is biased by simultaneous finger displacement
- The shift between two successive events biases the relative localization of touches
- Both active and passive movements induce a bias, even if far from the touched site
- The bias effect is vectorially organized
Affiliation(s)
- Lucile Dupin
  - Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
- Patrick Haggard
  - Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
5. Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLoS One 2017; 12:e0189067. PMID: 29228023. PMCID: PMC5724835. DOI: 10.1371/journal.pone.0189067.
Abstract
Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically (“palm” or “back” of the hand), or externally (“up” or “down” in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is, thus, flexibly adapted by top-down information—here, task instruction—even in the absence of developmental vision.
Affiliation(s)
- Jonathan T. W. Schubert
  - Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Stephanie Badde
  - Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
  - Department of Psychology, New York University, New York, United States of America
- Brigitte Röder
  - Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Tobias Heed
  - Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
  - Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany
  - Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
6. Gaze-centered coding of proprioceptive reach targets after effector movement: Testing the impact of online information, time of movement, and target distance. PLoS One 2017; 12:e0180782. PMID: 28678886. PMCID: PMC5498052. DOI: 10.1371/journal.pone.0180782.
Abstract
In previous research, we demonstrated that spatial coding of proprioceptive reach targets depends on the presence of an effector movement (Mueller & Fiehler, Neuropsychologia, 2014, 2016). In these studies, participants were asked to reach in darkness with their right hand to a proprioceptive target (tactile stimulation on the fingertip) while their gaze direction was varied. They either moved their left, stimulated hand toward a target location or kept it stationary at this location; in both cases, they received a touch on the fingertip of that hand, to which they then reached with their right hand. When the stimulated hand was moved, reach errors varied as a function of gaze relative to target, whereas reach errors were independent of gaze when the hand was kept stationary. The present study further examines whether (a) the availability of proprioceptive online information, i.e., reaching to an online versus a remembered target, (b) the time of the effector movement, i.e., before or after target presentation, or (c) the target distance from the body influences gaze-centered coding of proprioceptive reach targets. We found gaze-dependent reach errors in the conditions that included a movement of the stimulated hand, irrespective of whether proprioceptive information was available online or remembered. This suggests that an effector movement leads to gaze-centered coding for both online and remembered proprioceptive reach targets. Moreover, moving the stimulated hand before or after target presentation did not affect gaze-dependent reach errors, thus indicating a continuous spatial update of positional signals of the stimulated hand rather than of the target location per se. However, reaching to a location close to the body rather than farther away (but still within reachable space) generally decreased the influence of a gaze-centered reference frame.
7. Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016; 11:e0158829. PMID: 27391805. PMCID: PMC4938545. DOI: 10.1371/journal.pone.0158829.
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJs) of two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand into the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli, rather than crossing manipulations have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJs while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or they both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization and TOJ, and corroborate the relevance of gaze-related coding for touch. Yet, gaze- and trunk-centered coding did not account for the total size of crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
8. Mixed body- and gaze-centered coding of proprioceptive reach targets after effector movement. Neuropsychologia 2016; 87:63-73. PMID: 27157885. DOI: 10.1016/j.neuropsychologia.2016.04.033.
Abstract
Previous studies demonstrated that an effector movement intervening between encoding and reaching to a proprioceptive target determines the underlying reference frame: proprioceptive reach targets are represented in a gaze-independent reference frame if no movement occurs but are represented with respect to gaze after an effector movement (Mueller and Fiehler, 2014a). The present experiment explores whether an effector movement leads to a switch from a gaze-independent, body-centered reference frame to a gaze-dependent reference frame, or whether a gaze-dependent reference frame is employed in addition to a gaze-independent, body-centered reference frame. Human participants were asked to reach in complete darkness to an unseen finger (proprioceptive target) of their left target hand indicated by a touch. They completed two conditions in which the target hand either remained stationary at the target location (stationary condition) or was actively moved to the target location, received a touch, and was moved back before reaching to the target (moved condition). We dissociated the location of the movement vector relative to the body midline and to the gaze direction. Using correlation and regression analyses, we estimated the contribution of each reference frame based on horizontal reach errors in the stationary and moved conditions. Gaze-centered coding was only found in the moved condition, replicating our previous results. Body-centered coding dominated in the stationary condition, while body- and gaze-centered coding contributed equally strongly in the moved condition. Our results indicate a shift from body-centered to combined body- and gaze-centered coding due to an effector movement before reaching towards proprioceptive targets.
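The regression approach mentioned in this abstract lends itself to a compact illustration: horizontal reach errors are regressed on the target's position relative to the body midline and relative to gaze, and the fitted coefficients index each reference frame's contribution. A minimal sketch with simulated data follows; all values and variable names are hypothetical, not the authors' fitted model.

```python
# Minimal sketch: estimate how strongly body- and gaze-centered reference
# frames contribute to horizontal reach errors, via multiple regression.
# Simulated data and illustrative weights; the paper's exact model may differ.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# Predictors: target position relative to body midline and to gaze (degrees)
target_re_body = rng.uniform(-15, 15, n_trials)
target_re_gaze = rng.uniform(-15, 15, n_trials)

# Simulated reach errors with mixed coding (0.3 body weight, 0.5 gaze weight)
reach_error = (0.3 * target_re_body + 0.5 * target_re_gaze
               + rng.normal(0.0, 1.0, n_trials))

# Ordinary least squares: error ~ b0 + b_body * body + b_gaze * gaze
X = np.column_stack([np.ones(n_trials), target_re_body, target_re_gaze])
coef, *_ = np.linalg.lstsq(X, reach_error, rcond=None)
b0, b_body, b_gaze = coef
print(f"intercept={b0:.2f}, body weight={b_body:.2f}, gaze weight={b_gaze:.2f}")
```

In this framing, a near-zero gaze coefficient would correspond to the stationary condition, and comparable body and gaze coefficients to the moved condition.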
9. Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. PMID: 27327353. PMCID: PMC4975087. DOI: 10.1080/02643294.2016.1168791.
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex just mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
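The weighted-integration account reviewed here can be illustrated with a toy computation: the integrated location estimate is a weighted average of the anatomical and external codes, with the external weight rising when the task focuses on the world. A minimal sketch under these assumptions; the weights are illustrative, not the authors' fitted values.

```python
# Minimal sketch of weighted integration of two tactile location codes.
# Weights are illustrative and context-dependent, not fitted values.
def integrate_location(x_anatomical: float, x_external: float,
                       w_external: float) -> float:
    """Weighted average of anatomical and external location codes.

    w_external is the weight of the external (posture-based) code;
    the anatomical (skin-based) code gets the complementary weight.
    """
    w_anatomical = 1.0 - w_external
    return w_anatomical * x_anatomical + w_external * x_external

# Example: crossed right hand. The anatomical code points right (+1),
# the external code points left (-1). A world-focused task raises w_external.
print(integrate_location(+1.0, -1.0, w_external=0.3))  # body-focused: +0.4
print(integrate_location(+1.0, -1.0, w_external=0.7))  # world-focused: -0.4
```

A large enough external weight flips the integrated estimate across the midline, which is the kind of code conflict thought to produce crossed-hands errors.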
Affiliation(s)
- Stephanie Badde
  - Department of Psychology, New York University, New York, NY, USA
- Tobias Heed
  - Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
10. Brandes J, Heed T. Reach Trajectories Characterize Tactile Localization for Sensorimotor Decision Making. J Neurosci 2015; 35:13648-13658. PMID: 26446218. PMCID: PMC6605379. DOI: 10.1523/jneurosci.1873-14.2015.
Abstract
Spatial target information for movement planning appears to be coded in a gaze-centered reference frame. In touch, however, location is initially coded with reference to the skin. Therefore, the tactile spatial location must be derived by integrating skin location and posture. It has been suggested that this recoding is impaired when the limb is placed in the opposite hemispace, for example, by limb crossing. Here, human participants reached toward visual and tactile targets located at uncrossed and crossed feet in a sensorimotor decision task. We characterized stimulus recoding by analyzing the timing and spatial profile of hand reaches. For tactile targets at crossed feet, skin-based information implicates the incorrect side, and only recoded information points to the correct location. Participants initiated straight reaches and redirected the hand toward a target presented in midflight. Trajectories to visual targets were unaffected by foot crossing. In contrast, trajectories to tactile targets were redirected later with crossed than uncrossed feet. Reaches to crossed feet usually continued straight until they were directed toward the correct tactile target and were not biased toward the skin-based target location. Occasional, far deflections toward the incorrect target were most likely when this target was implicated by trial history. These results are inconsistent with the suggestion that spatial transformations in touch are impaired by limb crossing, but are consistent with tactile location being recoded rapidly and efficiently, followed by integration of skin-based and external information to specify the reach target. This process may be implemented in a bounded integrator framework.

SIGNIFICANCE STATEMENT: How do you touch yourself, for instance, to scratch an itch? The place you need to reach is defined by a sensation on the skin, but our bodies are flexible, so this skin location could be anywhere in 3D space. The movement toward the tactile sensation must therefore be specified by merging skin location and body posture. By investigating human hand reach trajectories toward tactile stimuli on the feet, we provide experimental evidence that this transformation process is quick and efficient, and that its output is integrated with the original skin location in a fashion consistent with bounded integrator decision-making frameworks.
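The bounded integrator framework invoked above can be sketched as a noisy evidence accumulator that commits the movement to one target location once a decision bound is reached. A toy illustration follows; all parameters and function names are hypothetical, not the authors' model implementation.

```python
# Toy bounded-integrator sketch: evidence for the recoded (external) versus
# the skin-based target location accumulates to a bound that commits the reach.
# All parameters are hypothetical illustrations, not fitted values.
import random

def bounded_integrator(drift: float, noise_sd: float = 0.3,
                       bound: float = 1.0, max_steps: int = 1000):
    """Accumulate noisy evidence; return (choice, steps).

    choice = +1 (recoded, external location) if the upper bound is hit,
             -1 (skin-based location) if the lower bound is hit,
              0 if no bound is reached within max_steps.
    """
    x = 0.0
    for step in range(1, max_steps + 1):
        x += drift + random.gauss(0.0, noise_sd)
        if x >= bound:
            return +1, step
        if x <= -bound:
            return -1, step
    return 0, max_steps

random.seed(1)
# Positive drift: recoded external information gradually dominates, mimicking
# trajectories that turn toward the correct tactile target in midflight.
results = [bounded_integrator(drift=0.1) for _ in range(1000)]
p_correct = sum(1 for choice, _ in results if choice == +1) / len(results)
mean_steps = sum(steps for _, steps in results) / len(results)
print(f"P(correct target) = {p_correct:.2f}, mean steps to bound = {mean_steps:.1f}")
```

On this reading, occasional deflections toward the incorrect target correspond to the noise (or a trial-history bias on the starting point) briefly driving the accumulator toward the wrong bound.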
Affiliation(s)
- Janina Brandes
  - Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
- Tobias Heed
  - Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
11. Heed T, Buchholz VN, Engel AK, Röder B. Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends Cogn Sci 2015; 19:251-258. DOI: 10.1016/j.tics.2015.03.001.
12. Badde S, Röder B, Heed T. Flexibly weighted integration of tactile reference frames. Neuropsychologia 2014; 70:367-374. PMID: 25447059. DOI: 10.1016/j.neuropsychologia.2014.10.001.
Abstract
To estimate the location of a tactile stimulus, the brain seems to integrate different types of spatial information such as skin-based, anatomical coordinates and external, spatiotopic coordinates. The aim of the present study was to test whether the use of these coordinates is fixed, or whether they are weighted according to the task context. Participants made judgments about two tactile stimuli with different vibration characteristics, one applied to each hand. First, they always performed temporal order judgments (TOJ) of the tactile stimuli with respect to the stimulated hands, which were either crossed or uncrossed. The resulting crossing effect, that is, impaired performance in crossed compared to uncrossed conditions, was used as a measure of reference frame weighting and was compared across conditions. Second, in dual judgment conditions participants subsequently made judgments about the stimulus vibration characteristics, either with respect to spatial location or with respect to temporal order. Responses in the spatial secondary task either accented anatomical (Experiment 1) or external (Experiment 2) coding. A TOJ crossing effect emerged in all conditions, and secondary tasks did not affect primary task performance in the uncrossed posture. Yet, the spatial secondary task resulted in improved crossed-hands performance in the primary task, but only if the secondary judgment stressed the anatomical reference frame (Experiment 1), rather than the external reference frame (Experiment 2). Like the anatomically coded spatial secondary task, the temporal secondary task improved crossed-hands performance in the primary task. The differential influence of the varying secondary tasks implies that the integration weights assigned to the anatomical and external reference frames are not fixed. Rather, they are flexibly adjusted to the context, presumably through top-down modulation.
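The crossing effect used here as the weighting measure is simply the performance drop from the uncrossed to the crossed posture, compared across secondary-task conditions. A minimal sketch of such a computation, with hypothetical accuracy values chosen only to mirror the qualitative pattern reported above:

```python
# Minimal sketch: crossing effect = uncrossed minus crossed TOJ accuracy,
# compared across secondary-task conditions. All values are hypothetical.
accuracy = {  # proportion of correct TOJ responses
    "baseline":        {"uncrossed": 0.95, "crossed": 0.72},
    "anatomical_task": {"uncrossed": 0.94, "crossed": 0.83},
    "external_task":   {"uncrossed": 0.95, "crossed": 0.71},
}

for condition, acc in accuracy.items():
    crossing_effect = acc["uncrossed"] - acc["crossed"]
    print(f"{condition}: crossing effect = {crossing_effect:.2f}")
# A smaller crossing effect is read as reduced weighting of the external code.
```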
Affiliation(s)
- Stephanie Badde
  - Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Brigitte Röder
  - Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Tobias Heed
  - Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
13. Mueller S, Fiehler K. Effector movement triggers gaze-dependent spatial coding of tactile and proprioceptive-tactile reach targets. Neuropsychologia 2014; 62:184-193. DOI: 10.1016/j.neuropsychologia.2014.07.025.
14. Gherri E, Forster B. Attention to the body depends on eye-in-orbit position. Front Psychol 2014; 5:683. PMID: 25071653. PMCID: PMC4086396. DOI: 10.3389/fpsyg.2014.00683.
Abstract
Attentional selectivity in touch is modulated by the position of the body in external space. For instance, during endogenous attention tasks in which tactile stimuli are presented to the hands, the effect of attention is reduced when the hands are placed far apart compared to when they are close together, and when the hands are crossed compared to when they are placed in their anatomical position. This suggests that both somatotopic and external spatial reference frames coding the hands’ locations contribute to the spatial selection of the relevant hand. Here we investigate whether tactile selection of hands is also modulated by the position of other body parts not directly involved in tactile perception, such as the eye in the orbit (gaze direction). We asked participants to perform the same sustained tactile attention task while gazing laterally toward an eccentric fixation point (Eccentric gaze) or toward a central fixation point (Central gaze). Event-related potentials recorded in response to tactile non-target stimuli presented to the attended or unattended hand were compared as a function of gaze direction (Eccentric vs. Central conditions). Results revealed that attentional modulations were reduced in the Eccentric gaze condition as compared to the Central gaze condition in the time range of the Nd component (200–260 ms post-stimulus), demonstrating for the first time that the attentional selection of one of the hands is affected by the position of the eye in the orbit. Directing the eyes toward an eccentric position might be sufficient to create a misalignment between external and somatotopic frames of reference, reducing tactile attention. This suggests that the eye-in-orbit position contributes to the spatial selection of the task-relevant body part.
Affiliation(s)
- Elena Gherri
  - Department of Psychology, University of Edinburgh, Edinburgh, UK
15. Sutter C, Drewing K, Müsseler J. Multisensory integration in action control. Front Psychol 2014; 5:544. PMID: 24959154. PMCID: PMC4051139. DOI: 10.3389/fpsyg.2014.00544.
Affiliation(s)
- Christine Sutter
  - Department of Work and Cognitive Psychology, RWTH Aachen University, Aachen, Germany
- Knut Drewing
  - Department for Experimental Psychology, Institute for Psychology, Justus-Liebig University, Giessen, Germany
- Jochen Müsseler
  - Department of Work and Cognitive Psychology, RWTH Aachen University, Aachen, Germany