1
Alouit A, Gavaret M, Ramdani C, Lindberg PG, Dupin L. Cortical activations associated with spatial remapping of finger touch using EEG. Cereb Cortex 2024; 34:bhae161. PMID: 38642106. DOI: 10.1093/cercor/bhae161.
Abstract
The spatial coding of tactile information is functionally essential for touch-based shape perception and motor control. However, the spatiotemporal dynamics of how tactile information is remapped from the somatotopic reference frame in the primary somatosensory cortex to the spatiotopic reference frame remain unclear. This study investigated how hand position in space or posture influences cortical somatosensory processing. Twenty-two healthy subjects received electrical stimulation to the right thumb (D1) or little finger (D5) in three position conditions: palm down on the right side of the body (baseline), hand crossing the body midline (effect of position), and palm up (effect of posture). Somatosensory-evoked potentials (SEPs) were recorded using electroencephalography. One early-, two mid-, and two late-latency neurophysiological components were identified for both fingers: P50, P1, N125, P200, and N250. D1 and D5 showed different cortical activation patterns: compared with baseline, the crossing condition showed significant clustering at P1 for D1, and at P50 and N125 for D5; the change in posture showed a significant cluster at N125 for D5. Clusters predominated at centro-parietal electrodes. These results suggest that tactile remapping of fingers after electrical stimulation occurs around 100-125 ms in the parietal cortex.
Affiliation(s)
- Anaëlle Alouit
  - Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Martine Gavaret
  - Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
  - GHU-Paris Psychiatrie et Neurosciences, Hôpital Sainte Anne, Service de neurophysiologie clinique, 1 Rue Cabanis, F-75014 Paris, France
- Céline Ramdani
  - Service de Santé des Armées, Institut de Recherche Biomédicale des Armées, 1 Place du Général Valérie André, 91220 Brétigny-sur-Orge, France
- Påvel G Lindberg
  - Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Lucile Dupin
  - Université Paris Cité, INCC UMR 8002, CNRS, 45 Rue des Saints-Pères, F-75006 Paris, France
2
Fabio C, Salemme R, Koun E, Farnè A, Miller LE. Alpha Oscillations Are Involved in Localizing Touch on Handheld Tools. J Cogn Neurosci 2022; 34:675-686. DOI: 10.1162/jocn_a_01820.
Abstract
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding it. The ability to perceive touch on a tool extends along its entire surface, allowing the user to accurately localize where it is touched much as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been investigated extensively, those that allow touch to be localized on a tool are still unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as both have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to a handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
Affiliation(s)
- Cécile Fabio
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
- Romeo Salemme
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
- Eric Koun
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
- Alessandro Farnè
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
  - University of Trento, Rovereto, Italy
- Luke E. Miller
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
  - Donders Institute for Brain, Nijmegen, The Netherlands
3
Chen S, Shi Z, Zang X, Zhu X, Assumpção L, Müller HJ, Geyer T. Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys 2020; 82:1682-1694. PMID: 31845105. PMCID: PMC7297845. DOI: 10.3758/s13414-019-01907-0.
Abstract
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participants responded to the orientation of a visual target singleton presented among seven homogeneous visual distractors. Four tactile stimuli, two to different fingers of each hand, were presented either simultaneously with or prior to the visual stimuli. The identity of the stimulated fingers provided the crossmodal context cue: in half of the trials, a given visual target location was consistently paired with a given tactile configuration. The visual stimuli were presented above the unseen fingers, ensuring spatial correspondence between vision and touch. We found no evidence of crossmodal contextual cueing when the two sets of items (tactile, visual) were presented simultaneously (Experiment 1). However, a reliable crossmodal effect emerged when the tactile distractors preceded the onset of the visual stimuli by 700 ms (Experiment 2). Crossmodal cueing disappeared again when, after an initial learning phase, participants flipped their hands, making the tactile distractors appear at different positions in external space while their somatotopic positions remained unchanged (Experiment 3). In all experiments, participants were unable to explicitly discriminate learned from novel multisensory arrays. These findings indicate that search-facilitating context memory can be established across vision and touch. However, in order to guide visual search, the (predictive) tactile configurations must be remapped from their initial somatotopic format into a common external representational format.
Affiliation(s)
- Siyi Chen
  - General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Zhuanghua Shi
  - General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Xuelian Zang
  - Center for Cognition and Brain Disorders, Institute of Psychological Sciences, Hangzhou Normal University, Hangzhou, China
- Xiuna Zhu
  - General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Leonardo Assumpção
  - General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Hermann J Müller
  - General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Thomas Geyer
  - General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
4
Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Curr Biol 2019; 29:1491-1497.e4. PMID: 30955931. DOI: 10.1016/j.cub.2019.02.060.
Abstract
Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb (hand or foot), and reported which of all four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.
Affiliation(s)
- Stephanie Badde
  - Department of Psychology and Center of Neural Sciences, New York University, 6 Washington Place, New York, NY 10003, USA
  - Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Brigitte Röder
  - Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Tobias Heed
  - Biopsychology & Cognitive Neuroscience, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany
  - Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany
5
Zheng W, Chen L. The Roles of Attentional Shifts and Attentional Reengagement in Resolving The Spatial Compatibility Effect in Tactile Simon-like Tasks. Sci Rep 2018; 8:8760. PMID: 29884800. PMCID: PMC5993732. DOI: 10.1038/s41598-018-27114-9.
Abstract
The Simon effect refers to the acceleration of choice responses when the target position and response location are consistent compared with scenarios in which they are inconsistent, even if the target position is not relevant to the response. Here, we provide the first demonstration that the tactile Simon-like effect operates in an attention-shifting manner. In unimodal scenarios (Experiments 1-4), for the tactile direction task, the spatial compatibility effect was absent in the focused-attention condition but maintained in the divided-attention condition. For the tactile localization task, this pattern was reversed: the spatial compatibility effect occurred for the focused-attention condition but was reduced/absent in the divided-attention condition. In the audiotactile interaction scenario (Experiment 5), the reaction times (RTs) for discriminating the tactile motion direction were prolonged; however, a spatial compatibility effect was not observed. We propose that the temporal course of resolving conflicts between spatial codes during attentional shifts, including attentional reengagement, may account for the tactile Simon-like effect.
Affiliation(s)
- Wanting Zheng
  - School of Ophthalmology & Optometry, School of Biomedical Engineering, Wenzhou Medical University, Wenzhou, 325035, China
- Lihan Chen
  - School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, 100871, China
  - Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
6
Medina S, Tamè L, Longo MR. Tactile localization biases are modulated by gaze direction. Exp Brain Res 2017; 236:31-42. PMID: 29018928. DOI: 10.1007/s00221-017-5105-2.
Abstract
Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations based on different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tip of the fingers (distal bias) and the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared with when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulates the internal configuration of point locations by elongating it along the radio-ulnar axis.
Affiliation(s)
- Sonia Medina
  - Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
- Luigi Tamè
  - Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
- Matthew R Longo
  - Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
7
Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach. Psychon Bull Rev 2016; 23:387-404. PMID: 26350763. DOI: 10.3758/s13423-015-0918-0.
Abstract
To act upon a tactile stimulus, its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one, or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
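The weighted-integration account described in this abstract can be illustrated with a minimal sketch. This is not the authors' fitted model: the voting scheme, the specific weights, and the logistic squashing are illustrative assumptions; only the idea of combining anatomical and external response codes with posture-independent weights comes from the abstract.

```python
import math

def p_choose_right(stimulated_hand, hands_crossed, w_anat=0.6, w_ext=0.4):
    """Probability of localizing a touch to the right side of space.

    Each location code votes +1 for 'right' or -1 for 'left'; the votes
    are combined with fixed weights (identical for crossed and uncrossed
    postures, as in the best-fitting model) and squashed into a
    probability. The weights and the gain of 4.0 are illustrative.
    """
    # Anatomical code follows the identity of the stimulated hand.
    anat_vote = 1.0 if stimulated_hand == "right" else -1.0
    # External code follows the hand's position in space: crossing the
    # hands places the right hand on the left side of space.
    ext_vote = -anat_vote if hands_crossed else anat_vote
    evidence = w_anat * anat_vote + w_ext * ext_vote
    return 1.0 / (1.0 + math.exp(-4.0 * evidence))

# With uncrossed hands the two codes agree and choices are near-perfect;
# with crossed hands they conflict and accuracy declines toward chance.
```

Note that in this sketch remapping itself is assumed accurate: the crossing effect falls out of the integration step alone, because the conflicting external vote pulls the combined evidence toward chance, which is the core claim of the modeling result.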
8
Tamè L, Wühle A, Petri CD, Pavani F, Braun C. Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain Cogn 2016; 111:25-33. PMID: 27816777. DOI: 10.1016/j.bandc.2016.10.005.
Abstract
Localizing tactile stimuli on our body requires sensory information to be represented in multiple frames of reference along the sensory pathways. These reference frames include the representation of sensory information in skin coordinates, in which the spatial relationship of skin regions is maintained. The organization of the primary somatosensory cortex matches such a somatotopic reference frame. In contrast, higher-order representations are based on external coordinates, in which body posture and gaze direction are taken into account in order to localise touch in other meaningful ways according to task demands. Dominance of one representation over the other, or the use of multiple representations with different weights, is thought to depend on contextual factors of cognitive and/or sensory origin. However, it is unclear under which situations one reference frame takes precedence over another, or when different reference frames are used jointly at the same time. The study of tactile mislocalizations at the fingers has shown a key role of the somatotopic frame of reference, both when touches are delivered unilaterally to a single hand and when they are delivered bilaterally to both hands. Here, we took advantage of a well-established tactile mislocalization paradigm to investigate whether the reference frame used to integrate bilateral tactile stimuli can change as a function of the spatial relationship between the two hands. Specifically, supra-threshold interference stimuli were applied to the index or little finger of the left hand 200 ms prior to the application of a test stimulus on a finger of the right hand. Crucially, different hand postures were adopted (uncrossed or crossed). Results show that introducing a change in hand posture triggered the concurrent use of somatotopic and external reference frames when processing bilateral touch at the fingers. This demonstrates that both somatotopic and external reference frames can be used concurrently to localise tactile stimuli on the fingers.
Affiliation(s)
- Luigi Tamè
  - Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Anja Wühle
  - MEG-Centre, University of Tübingen, Germany
- Francesco Pavani
  - Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy
  - Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy
  - INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Centre, Lyon, France
- Christoph Braun
  - MEG-Centre, University of Tübingen, Germany
  - Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy
  - Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy
  - Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
9
Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016; 11:e0158829. PMID: 27391805. PMCID: PMC4938545. DOI: 10.1371/journal.pone.0158829.
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJ) of two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand into the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli, rather than crossing manipulations have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJ while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or they both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization and TOJ, and corroborate the relevance of gaze-related coding for touch. Yet, gaze- and trunk-centered coding did not account for the total size of crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
10
Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. PMID: 27327353. PMCID: PMC4975087. DOI: 10.1080/02643294.2016.1168791.
Abstract
Touch is bound to the skin, that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex merely mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
Affiliation(s)
- Stephanie Badde
  - Department of Psychology, New York University, New York, NY, USA
- Tobias Heed
  - Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
11
Noel JP, Wallace M. Relative contributions of visual and auditory spatial representations to tactile localization. Neuropsychologia 2016; 82:84-90. PMID: 26768124. DOI: 10.1016/j.neuropsychologia.2016.01.005.
Abstract
Spatial localization of touch is critically dependent upon coordinate transformation between different reference frames, which must ultimately allow for alignment between somatotopic and external representations of space. Although prior work has shown an important role for cues such as body posture in influencing the spatial localization of touch, the relative contributions of the different sensory systems to this process are unknown. In the current study, we had participants perform a tactile temporal order judgment (TOJ) under different body postures and conditions of sensory deprivation. Specifically, participants performed non-speeded judgments about the order of two tactile stimuli presented in rapid succession on their ankles during conditions in which their legs were either uncrossed or crossed (thus bringing somatotopic and external reference frames into conflict). These judgments were made in the absence of 1) visual, 2) auditory, or 3) combined audio-visual spatial information by blindfolding and/or placing participants in an anechoic chamber. As expected, results revealed that tactile temporal acuity was poorer under crossed than uncrossed leg postures. Intriguingly, results also revealed that auditory and audio-visual deprivation exacerbated the difference in tactile temporal acuity between uncrossed and crossed leg postures, an effect not seen for visual-only deprivation. Furthermore, the effects under combined audio-visual deprivation were greater than those seen for auditory deprivation. Collectively, these results indicate that mechanisms governing the alignment between somatotopic and external reference frames extend beyond those imposed by body posture to include spatial features conveyed by the auditory and visual modalities, with a heavier weighting of auditory than visual spatial information. Thus, sensory modalities conveying exteroceptive spatial information contribute to judgments regarding the localization of touch.
Affiliation(s)
- Jean-Paul Noel
  - Neuroscience Graduate Program, Vanderbilt University Medical School, Vanderbilt University, Nashville, TN 37235, USA
  - Vanderbilt Brain Institute, Vanderbilt University Medical School, Vanderbilt University, Nashville, TN 37235, USA
- Mark Wallace
  - Vanderbilt Brain Institute, Vanderbilt University Medical School, Vanderbilt University, Nashville, TN 37235, USA
  - Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN 37235, USA
  - Department of Psychology, Vanderbilt University, Nashville, TN 37235, USA
  - Department of Psychiatry, Vanderbilt University, Nashville, TN 37235, USA
12
Brandes J, Heed T. Reach Trajectories Characterize Tactile Localization for Sensorimotor Decision Making. J Neurosci 2015; 35:13648-58. PMID: 26446218. PMCID: PMC6605379. DOI: 10.1523/jneurosci.1873-14.2015.
Abstract
Spatial target information for movement planning appears to be coded in a gaze-centered reference frame. In touch, however, location is initially coded with reference to the skin. Therefore, the tactile spatial location must be derived by integrating skin location and posture. It has been suggested that this recoding is impaired when the limb is placed in the opposite hemispace, for example, by limb crossing. Here, human participants reached toward visual and tactile targets located at uncrossed and crossed feet in a sensorimotor decision task. We characterized stimulus recoding by analyzing the timing and spatial profile of hand reaches. For tactile targets at crossed feet, skin-based information implicates the incorrect side, and only recoded information points to the correct location. Participants initiated straight reaches and redirected the hand toward a target presented in midflight. Trajectories to visual targets were unaffected by foot crossing. In contrast, trajectories to tactile targets were redirected later with crossed than uncrossed feet. Reaches to crossed feet usually continued straight until they were directed toward the correct tactile target and were not biased toward the skin-based target location. Occasional, far deflections toward the incorrect target were most likely when this target was implicated by trial history. These results are inconsistent with the suggestion that spatial transformations in touch are impaired by limb crossing, but are consistent with tactile location being recoded rapidly and efficiently, followed by integration of skin-based and external information to specify the reach target. This process may be implemented in a bounded integrator framework.
SIGNIFICANCE STATEMENT: How do you touch yourself, for instance, to scratch an itch? The place you need to reach is defined by a sensation on the skin, but our bodies are flexible, so this skin location could be anywhere in 3D space. The movement toward the tactile sensation must therefore be specified by merging skin location and body posture. By investigating human hand reach trajectories toward tactile stimuli on the feet, we provide experimental evidence that this transformation process is quick and efficient, and that its output is integrated with the original skin location in a fashion consistent with bounded-integrator decision-making frameworks.
Affiliation(s)
- Janina Brandes
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
- Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
13
Cicmil N, Meyer AP, Stein JF. Tactile Toe Agnosia and Percept of a "Missing Toe" in Healthy Humans. Perception 2015; 45:265-80. [PMID: 26562866] [DOI: 10.1177/0301006615607122]
Abstract
A disturbance of body representation is central to many neurological and psychiatric conditions, but the mechanisms by which body representations are constructed by the brain are not fully understood. We demonstrate a directional disturbance in tactile identification of the toes in healthy humans. Nineteen young adult participants underwent tactile stimulation of the digits with the eyes closed and verbally reported the identity of the stimulated digit. In the majority of individuals, responses to the second and third toes were significantly biased toward the laterally neighboring digit. The directional bias was greater for the nondominant foot and was affected by the identity of the immediately preceding stimulated toe. Unexpectedly, 9/19 participants reported the subjective experience of a "missing toe" or "missing space" during the protocol. These findings challenge current models of somatosensory localization, as they cannot be explained simply by a lack of distinct representations for toes compared with fingers, or by overt toe-finger correspondences. We present a novel theory of equal spatial representations of digit width combined with a "preceding neighbor" effect to explain the observed phenomena. The diagnostic implications for neurological disorders that involve "digit agnosia" are discussed.
Collapse
Affiliation(s)
- Nela Cicmil
- Department of Physiology, Anatomy & Genetics, University of Oxford, UK; The Medical School, University of Oxford, John Radcliffe Hospital, Oxford, UK
- Achim P Meyer
- Bernstein Center for Computational Neuroscience, Humboldt University, Berlin, Germany
- John F Stein
- Department of Physiology, Anatomy & Genetics, University of Oxford, UK
14
Ossandón JP, König P, Heed T. Irrelevant tactile stimulation biases visual exploration in external coordinates. Sci Rep 2015; 5:10664. [PMID: 26021612] [PMCID: PMC4448131] [DOI: 10.1038/srep10664]
Abstract
We evaluated the effect of irrelevant tactile stimulation on humans’ free-viewing behavior during the exploration of complex static scenes. Specifically, we address the questions of (1) whether task-irrelevant tactile stimulation presented to subjects’ hands can guide visual selection during free viewing; (2) whether tactile stimulation can modulate visual exploratory biases that are independent of image content and task goals; and (3) in which reference frame these effects occur. Tactile stimulation to uncrossed and crossed hands during the viewing of static images resulted in long-lasting modulation of visual orienting responses. Subjects showed a well-known leftward bias during the early exploration of images, and this bias was modulated by tactile stimulation presented at image onset. Tactile stimulation, both at image onset and later during the trials, biased visual orienting toward the space ipsilateral to the stimulated hand, both in uncrossed and crossed hand postures. The long-lasting temporal and global spatial profile of the modulation of free viewing exploration by touch indicates that cross-modal cues produce orienting responses, which are coded exclusively in an external reference frame.
Affiliation(s)
- José P Ossandón
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Peter König
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany; Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg, Germany
- Tobias Heed
- Biological Psychology & Neuropsychology, Faculty of Psychology & Movement Science, University of Hamburg, Hamburg, Germany
15
Dynamic Tuning of Tactile Localization to Body Posture. Curr Biol 2015; 25:512-7. [DOI: 10.1016/j.cub.2014.12.038]
16
Processing load impairs coordinate integration for the localization of touch. Atten Percept Psychophys 2015; 76:1136-50. [PMID: 24550040] [DOI: 10.3758/s13414-013-0590-2]
Abstract
To perform an action toward a touch, the tactile spatial representation must be transformed from a skin-based, anatomical reference frame into an external reference frame. Evidence suggests that, after transformation, both anatomical and external coordinates are integrated for the location estimate. The present study investigated whether the calculation and integration of external coordinates are automatic processes. Participants made temporal order judgments (TOJs) of two tactile stimuli, one applied to each hand, in crossed and uncrossed postures. The influence of the external coordinates of touch was indicated by the performance difference between crossed and uncrossed postures, referred to as the crossing effect. To assess automaticity, the TOJ task was combined with a working memory task that varied in difficulty (size of the working memory set) and quality (verbal vs. spatial). In two studies, the crossing effect was consistently reduced under processing load. When the load level was adaptively adjusted to individual performance (Study 2), the crossing effect additionally varied as a function of the difficulty of the secondary task. These modulatory effects of processing load on the crossing effect were independent of the type of working memory. The sensitivity of the crossing effect to processing load suggests that coordinate integration for touch localization is not fully automatic. To reconcile the present results with previous findings, we suggest that the genuine remapping process (that is, the transformation of anatomical into external coordinates) proceeds automatically, whereas their integration in service of a combined location estimate is subject to top-down control.
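The distinction drawn in this abstract, automatic remapping followed by a top-down-controlled integration stage, can be sketched as a weighted combination of the two coordinate estimates. This is an illustrative toy model only; the weights, values, and function name below are assumptions, not parameters from the study.

```python
def integrate_location(anatomical, external, w_external=0.7):
    """Combine skin-based (anatomical) and external location estimates.

    Processing load can be modeled as a reduced weight on the
    (already automatically computed) external coordinates, since
    less capacity is available for the integration stage itself.
    """
    return w_external * external + (1 - w_external) * anatomical

# Crossed hands: the two codes implicate opposite sides (-1 vs. +1).
no_load = integrate_location(anatomical=-1.0, external=+1.0, w_external=0.7)
under_load = integrate_location(anatomical=-1.0, external=+1.0, w_external=0.5)
```

In this sketch, a smaller external weight under load pulls the combined estimate back toward the anatomical code, which would shrink the crossed-uncrossed performance difference, consistent with the reduced crossing effect reported above.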
17
Heed T, Azañón E. Using time to investigate space: a review of tactile temporal order judgments as a window onto spatial processing in touch. Front Psychol 2014; 5:76. [PMID: 24596561] [PMCID: PMC3925972] [DOI: 10.3389/fpsyg.2014.00076]
Abstract
To respond to a touch, it is often necessary to localize it in space, and not just on the skin. The computation of this external spatial location involves the integration of somatosensation with visual and proprioceptive information about current body posture. In the past years, the study of touch localization has received substantial attention and has become a central topic in the research field of multisensory integration. In this review, we will explore important findings from this research, zooming in on one specific experimental paradigm, the temporal order judgment (TOJ) task, which has proven particularly fruitful for the investigation of tactile spatial processing. In a typical TOJ task, participants perform non-speeded judgments about the order of two tactile stimuli presented in rapid succession to different skin sites. This task could be solved without relying on external spatial coordinates. However, postural manipulations affect TOJ performance, indicating that external coordinates are in fact computed automatically. We show that this makes the TOJ task a reliable indicator of spatial remapping, and provide an overview of the versatile analysis options for TOJ. We introduce current theories of TOJ and touch localization, and then relate TOJ to behavioral and electrophysiological evidence from other paradigms, probing the benefit of TOJ for the study of spatial processing as well as related topics such as multisensory plasticity, body processing, and pain.
Affiliation(s)
- Tobias Heed
- Department of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Elena Azañón
- Action and Body Group, Institute of Cognitive Neuroscience, University College London, London, UK