1. Gherri E, Garofalo G, O'Dowd A, Cudia A. The anticipatory effect of goal-directed action planning with a lower limb on peri-personal space. Cortex 2025; 185:170-183. [PMID: 40073715] [DOI: 10.1016/j.cortex.2025.02.004]
Abstract
Recent studies have demonstrated that the representation of peri-personal space (PPS) can be strongly modulated by the intention to execute a spatially directed hand movement. However, the question of whether analogous motor-induced PPS modulations can be observed during the planning and execution of goal-directed lower-limb movements has scarcely been investigated. Here we asked whether changes in the visuo-tactile PPS maps occur during the planning of a goal-directed foot movement. We asked participants to respond to the location of a tactile stimulus delivered to the index finger (top) or the thumb (bottom) of the right hand while ignoring a visual distractor presented at a congruent or incongruent elevation, either close to the foot or close to the goal of the foot movement. This version of the cross-modal congruency task was performed under two experimental conditions: as a baseline (static task, no movement involved) and embedded in a dual task in which participants also had to plan and execute a goal-directed foot movement (dynamic task). In the static task, comparable cross-modal congruency effects (CCEs) were present near the foot and near the movement goal. In the dynamic task, the CCE near the foot shrank considerably, whereas a sizable CCE was present near the movement goal. This anticipatory reweighting of the multisensory representation of near-space demonstrates that PPS is modulated by the intention to perform a goal-directed foot movement, with a weakened representation of the space around the currently occupied foot location when a movement is imminent.
Affiliation(s)
- Elena Gherri
- Department of Philosophy, University of Bologna, Italy.
- Alan O'Dowd
- Trinity Institute of Neurosciences, Trinity College Dublin, Ireland

2. Casella A, Di Bello B, Aydin M, Lucia S, Russo FD, Pitzalis S. Modulation of anticipatory brain activity as a function of action complexity. Biol Psychol 2024; 193:108959. [PMID: 39644962] [DOI: 10.1016/j.biopsycho.2024.108959]
Abstract
Stimulus-driven actions are preceded by preparatory brain activity that can be measured with event-related potentials (ERPs). The literature on this topic has focused on simple actions, such as finger keypresses, finding activity in frontal, parietal, and occipital areas detectable up to two seconds before stimulus onset. Little is known about preparatory brain activity when action complexity increases and specific brain areas devoted to movement integration intervene. This paper aims to identify the time course of preparatory brain activity associated with actions of increasing complexity, using ERP analysis and a visuomotor discrimination task. Motor complexity was manipulated by asking nineteen volunteers to respond by simply pressing a key, or by adding to the keypress an arm extension alone or in combination with a standing step (involving the whole body). Results showed that these actions of increasing complexity were associated with different patterns of preparatory brain activity, in which the identified components were differently modulated. The simple keypress was characterized by prominent motor excitatory preparation in premotor areas, paralleled by the largest prefrontal inhibitory/attentional control. Reaching showed a dominant parietal preparation, confirming the role of these integration areas in reaching actions toward a goal. Stepping was characterized by localized activity in the bilateral dorsomedial parieto-occipital areas, attributable to sensory readiness for the approaching stimulus. In conclusion, the brain can optimally anticipate any stimulus-driven action by modulating activity in the brain areas specialized in the preparation of that action type.
Affiliation(s)
- Andrea Casella
- Department of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome 00135, Italy.
- BiancaMaria Di Bello
- Department of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome 00135, Italy.
- Merve Aydin
- Department of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome 00135, Italy.
- Stefania Lucia
- Department of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome 00135, Italy.
- Francesco Di Russo
- Department of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome 00135, Italy; Santa Lucia Foundation IRCCS, Rome 00179, Italy.
- Sabrina Pitzalis
- Department of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome 00135, Italy; Santa Lucia Foundation IRCCS, Rome 00179, Italy.

3. Kida T, Kaneda T, Nishihira Y. ERP evidence of attentional somatosensory processing and stimulus-response coupling under different hand and arm postures. Front Hum Neurosci 2023; 17:1252686. [PMID: 38021238] [PMCID: PMC10676239] [DOI: 10.3389/fnhum.2023.1252686]
Abstract
We investigated (1) the effects of divided and focused attention on event-related brain potentials (ERPs) elicited by somatosensory stimulation under different response modes, (2) the effects of hand position (closely-placed vs. separated hands) and arm posture (crossed vs. uncrossed forearms) on the attentional modulation of somatosensory ERPs, and (3) changes in the coupling of stimulus- and response-related processes by somatosensory attention, using a single-trial analysis of P300 latency and reaction times (RTs). Electrocutaneous stimulation was presented randomly to the thumb or middle finger of the left or right hand at random interstimulus intervals (700-900 ms). Subjects attended unilaterally or bilaterally to stimuli in order to detect target stimuli by a motor response or by counting. The effects of unilaterally-focused attention were also tested under different hand and arm positions. The amplitude of N140 in the divided attention condition was intermediate between unilaterally attended and unattended stimuli in the unilaterally-focused attention condition, in both the mental counting and motor response tasks. Attended infrequent (target) stimuli elicited a greater P300 in the unilaterally-focused attention condition than in the divided attention condition. P300 latency was longer in the divided attention condition than in the unilaterally-focused attention condition in the motor response task, but remained unchanged in the counting task. Closely locating the hands had no impact, whereas crossing the forearms decreased the attentional enhancement of N140 amplitude. In contrast, these two manipulations uniformly decreased P300 amplitude and increased P300 latency. The correlation between single-trial P300 latency and RT was decreased by crossed forearms, but not by divided attention or closely-placed hands. Therefore, the present results indicate that focused and divided attention differently affected middle-latency and late processing, and that hand position and arm posture also differently affected attentional processes and stimulus-response coupling.
Affiliation(s)
- Tetsuo Kida
- Higher Brain Function Unit, Department of Functioning and Disability, Institute for Developmental Research, Aichi Developmental Disability Center, Kasugai, Japan
- Yoshiaki Nishihira
- Graduate School of Comprehensive Human Sciences, University of Tsukuba, Tsukuba, Japan

4. Gherri E, White F, Venables E. On the spread of spatial attention in touch: Evidence from event-related brain potentials. Biol Psychol 2023; 178:108544. [PMID: 36931591] [DOI: 10.1016/j.biopsycho.2023.108544]
Abstract
To investigate the distribution of tactile spatial attention near the current attentional focus, participants were cued to attend to one of four body locations (hand or shoulder on the left or right side) to respond to infrequent tactile targets. In this Narrow attention task, effects of spatial attention on the ERPs elicited by tactile stimuli delivered to the hands were compared as a function of the distance from the attentional focus (Focus on the hand vs. Focus on the shoulder). When participants focused on the hand, attentional modulations of the sensory-specific P100 and N140 components were followed by the longer latency Nd component. Notably, when participants focused on the shoulder, they were unable to restrict their attentional resources to the cued location, as revealed by the presence of reliable attentional modulations at the hands. This effect of attention outside the attentional focus was delayed and reduced compared to that observed within the attentional focus, revealing the presence of an attentional gradient. In addition, to investigate whether the size of the attentional focus modulated the effects of tactile spatial attention on somatosensory processing, participants also completed the Broad attention task, in which they were cued to attend to two locations (both the hand and the shoulder) on the left or right side. Attentional modulations at the hands emerged later and were reduced in the Broad compared to the Narrow attention task, suggesting reduced attentional resources for a wider attentional focus.
Affiliation(s)
- Elena Gherri
- Human Cognitive Neuroscience, University of Edinburgh, UK; Università di Bologna, Italy.
- Felicity White
- Human Cognitive Neuroscience, University of Edinburgh, UK

5. Stenzel H, Francombe J, Jackson PJB. Limits of perceived audio-visual spatial coherence as defined by reaction time measurements. Front Neurosci 2019; 13:451. [PMID: 31191211] [PMCID: PMC6538976] [DOI: 10.3389/fnins.2019.00451]
Abstract
The ventriloquism effect describes the phenomenon whereby audio and visual signals with common features, such as a voice and a talking face, merge perceptually into one percept even if they are spatially misaligned. The boundaries of the fusion of spatially misaligned stimuli are of interest for the design of multimedia products, to ensure a perceptually satisfactory result. They have mainly been studied using continuous judgment scales and forced-choice measurement methods, with results that vary greatly between studies. The current experiment evaluates audio-visual fusion using reaction time (RT) measurements as an indirect method, to overcome this variability. A two-alternative forced-choice (2AFC) word recognition test was designed and tested with noise and multi-talker speech background distractors. Visual signals were presented centrally and audio signals were presented at between 0° and 31° of audio-visual offset in azimuth. RT data were analyzed separately for the underlying Simon effect and for attentional effects. In the case of the attentional effects, three models were identified, but no single model could explain the observed RTs for all participants, so data were grouped and analyzed accordingly. The results show that significant differences in RTs are measured from 5° to 10° onwards for the Simon effect. The attentional effect varied at the same audio-visual offset for two out of the three defined participant groups. In contrast with prior research, these results suggest that, even for speech signals, small audio-visual offsets influence spatial integration subconsciously.
Affiliation(s)
- Hanne Stenzel
- Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford, United Kingdom
- Philip J. B. Jackson
- Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford, United Kingdom

6. Forsberg A, O'Dowd A, Gherri E. Tool use modulates early stages of visuo-tactile integration in far space: Evidence from event-related potentials. Biol Psychol 2019; 145:42-54. [PMID: 30970269] [DOI: 10.1016/j.biopsycho.2019.03.020]
Abstract
The neural representation of multisensory space near the body is modulated by the active use of long tools in non-human primates. Here, we investigated whether the electrophysiological correlates of visuo-tactile integration in near and far space are modulated by active tool use in healthy humans. Participants responded to a tactile target delivered to one hand while an irrelevant visual stimulus was presented ipsilaterally in near or far space. This crossmodal task was performed after the use of either short or long tools. Crucially, the P100 component elicited by visuo-tactile stimuli was enhanced on far as compared to near space trials after the use of long tools, while no such difference was present after short tool use. Thus, we found increased neural responses in brain areas encoding tactile stimuli to the body when visual stimuli were presented close to the tip of the tool after long tool use. This increased visuo-tactile integration on far space trials following the use of long tools might indicate a transient remapping of multisensory space. We speculate that performing voluntary actions with long tools strengthens the representation of sensory information arising within the portions of space (i.e., the hand and the tip of the tool) that are most functionally relevant to one's behavioural goals.
Affiliation(s)
- Alicia Forsberg
- Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
- Alan O'Dowd
- Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
- Elena Gherri
- Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK.

7. Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLoS One 2017; 12:e0189067. [PMID: 29228023] [PMCID: PMC5724835] [DOI: 10.1371/journal.pone.0189067]
Abstract
Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically ("palm" or "back" of the hand), or externally ("up" or "down" in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is, thus, flexibly adapted by top-down information (here, task instruction) even in the absence of developmental vision.
Affiliation(s)
- Jonathan T. W. Schubert
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Stephanie Badde
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Department of Psychology, New York University, New York, United States of America
- Brigitte Röder
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany

8. Gomez-Ramirez M, Hysaj K, Niebur E. Neural mechanisms of selective attention in the somatosensory system. J Neurophysiol 2016; 116:1218-31. [PMID: 27334956] [DOI: 10.1152/jn.00637.2015]
Abstract
Selective attention allows organisms to extract behaviorally relevant information while ignoring distracting stimuli that compete for the limited resources of their central nervous systems. Attention is highly flexible, and it can be harnessed to select information based on sensory modality, within-modality feature(s), spatial location, object identity, and/or temporal properties. In this review, we discuss the body of work devoted to understanding mechanisms of selective attention in the somatosensory system. In particular, we describe the effects of attention on tactile behavior and corresponding neural activity in somatosensory cortex. Our focus is on neural mechanisms that select tactile stimuli based on their location on the body (somatotopic-based attention) or their sensory feature (feature-based attention). We highlight parallels between selection mechanisms in touch and other sensory systems and discuss several putative neural coding schemes employed by cortical populations to signal the behavioral relevance of sensory inputs. Specifically, we contrast the advantages and disadvantages of using a gain vs. spike-spike correlation code for representing attended sensory stimuli. We favor a neural network model of tactile attention composed of frontal, parietal, and subcortical areas that controls the somatosensory cells encoding the relevant stimulus features, enabling preferential processing throughout the somatosensory hierarchy. Our review draws on noninvasive electrophysiological and imaging data in humans as well as single-unit recordings in nonhuman primates.
Affiliation(s)
- Manuel Gomez-Ramirez
- Department of Neuroscience, Brown University, Providence, Rhode Island; The Zanvyl Krieger Mind/Brain Institute, The Johns Hopkins University, Baltimore, Maryland; The Solomon H. Snyder Department of Neuroscience, The Johns Hopkins School of Medicine, Baltimore, Maryland
- Kristjana Hysaj
- The Zanvyl Krieger Mind/Brain Institute, The Johns Hopkins University, Baltimore, Maryland
- Ernst Niebur
- The Zanvyl Krieger Mind/Brain Institute, The Johns Hopkins University, Baltimore, Maryland; The Solomon H. Snyder Department of Neuroscience, The Johns Hopkins School of Medicine, Baltimore, Maryland

9. Juravle G, Heed T, Spence C, Röder B. Neural correlates of tactile perception during pre-, peri-, and post-movement. Exp Brain Res 2016; 234:1293-305. [DOI: 10.1007/s00221-016-4589-5]

10. Gherri E, Forster B. Independent effects of eye gaze and spatial attention on the processing of tactile events: Evidence from event-related potentials. Biol Psychol 2015; 109:239-47. [PMID: 26101088] [DOI: 10.1016/j.biopsycho.2015.05.008]
Abstract
Directing one's gaze at a body part reduces detection times and enhances the processing of tactile stimuli presented at the gazed location. Given the close links between spatial attention and the oculomotor system, it is possible that these gaze-dependent modulations of touch are mediated by attentional mechanisms. To investigate this possibility, gaze direction and sustained tactile attention were orthogonally manipulated in the present study. Participants covertly attended to one hand to perform a tactile target-nontarget discrimination while they gazed at the same or the opposite hand. Spatial attention resulted in enhancements of the somatosensory P100 and Nd components. In contrast, gaze resulted in modulations of the N140 component, with more positive ERPs for gazed than non-gazed stimuli. This dissociation in the pattern and timing of the effects of gaze and attention on somatosensory processing reveals that gaze and attention have independent effects on touch.
Affiliation(s)
- Elena Gherri
- Cognitive Neuroscience Research Unit, City University London, UK.
- Bettina Forster
- Cognitive Neuroscience Research Unit, City University London, UK

11. Harris LR, Carnevale MJ, D’Amour S, Fraser LE, Harrar V, Hoover AEN, Mander C, Pritchett LM. How our body influences our perception of the world. Front Psychol 2015; 6:819. [PMID: 26124739] [PMCID: PMC4464078] [DOI: 10.3389/fpsyg.2015.00819]
Abstract
Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to body representation. The various attributes of the body, such as shape, proportion, posture, and movement, can be derived from the various sensory systems and can affect perception of the world (including the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including but not limited to body perception). First, we show that body orientation affects visual distance perception and object orientation. Also, visual-auditory crossmodal correspondences depend on the orientation of the body: audio "high" frequencies correspond to a visual "up" defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to the coding of body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moved relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.
Affiliation(s)
- Laurence R. Harris
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Michael J. Carnevale
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Sarah D’Amour
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Lindsey E. Fraser
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Vanessa Harrar
- School of Optometry, University of Montreal, Montreal, QC, Canada
- Adria E. N. Hoover
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Charles Mander
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Lisa M. Pritchett
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada

12. Wehrspaun CC, Pfabigan DM, Sailer U. Early event-related potentials indicate context-specific target processing for eye and hand motor systems. Neurosci Res 2013; 77:50-7. [PMID: 23968690] [PMCID: PMC3867658] [DOI: 10.1016/j.neures.2013.08.002]
Abstract
Concurrent eye and hand movements toward a common visual target require different motor programs based on identical visual input. We used event-related brain potentials (ERPs) to determine if and when the processing of the visual target differs between the two motor systems. The N2, an index of target evaluation, was more negative for the target of a hand movement than of an eye movement in two experiments. A possible interpretation of this finding is different visual target processing: targets for hand movements require a different weighting of visual information, for example concerning features such as surface structure, which are important for hand but not for eye movements. In experiment 2, the early C1 component, which had an average maximum at 67 ms following target onset, was significantly more negative when subjects pointed at the stimuli. Traditionally, the C1 has been regarded as a sensory component, but recent studies have linked it to higher-order processing, such as attention and expectations. Thus, the present data indicate that target processing for eye or hand movements is already context-specific during early visual information processing. We suggest that differences in a target's relevance for upcoming movements modify target processing as well as sensory expectations.
Affiliation(s)
- Claudia C Wehrspaun
- Department of Physiology, Anatomy and Genetics, Oxford University, South Parks Road, Oxford, OX1 3QX England, United Kingdom; Social, Cognitive and Affective Neuroscience Unit, Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria.