1. Snir A, Cieśla K, Vekslar R, Amedi A. Highly compromised auditory spatial perception in aided congenitally hearing-impaired and rapid improvement with tactile technology. iScience 2024; 27:110808. PMID: 39290844; PMCID: PMC11407022; DOI: 10.1016/j.isci.2024.110808.
Abstract
Spatial understanding is a multisensory construct, yet hearing is the only natural sense that enables simultaneous perception of the entire 3D space. To test whether such spatial understanding depends on auditory experience, we studied congenitally hearing-impaired users of assistive devices. We applied an in-house technology which, inspired by the auditory system, performs intensity weighting to represent external spatial positions and motion on the fingertips. We found highly impaired auditory spatial capabilities for tracking moving sources, which, in line with the "critical periods" theory, emphasizes the role of nature in sensory development. Meanwhile, for tactile and audio-tactile spatial motion perception, the hearing-impaired showed performance similar to that of typically hearing individuals. The immediate availability of a 360° representation of external space through touch, despite the lack of any such experience during the lifetime, points to the significant role of nurture in the development of spatial perception, and to its amodal character. The findings show promise toward advancing multisensory solutions for rehabilitation.
Affiliation(s)
- Adi Snir
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8 Herzliya 461010, Israel
- Katarzyna Cieśla
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8 Herzliya 461010, Israel
- World Hearing Centre, Institute of Physiology and Pathology of Hearing, Mokra 17, 05-830 Kajetany, Nadarzyn, Poland
- Rotem Vekslar
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8 Herzliya 461010, Israel
- Amir Amedi
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8 Herzliya 461010, Israel
2. Otsuka S, Gao H, Hiraoka K. Contribution of external reference frame to tactile localization. Exp Brain Res 2024; 242:1957-1970. PMID: 38918211; DOI: 10.1007/s00221-024-06877-w.
Abstract
The purpose of the present study was to elucidate whether an external reference frame contributes to tactile localization in blindfolded healthy humans. In one session, the right forearm was passively moved until the elbow reached the target angle, and participants pointed with the left index finger to the right middle fingertip. The locus of the right middle fingertip indicated by the participants deviated in the direction of elbow extension when vibration was applied to the biceps brachii muscle during the passive movement. This finding indicates that proprioception contributes to identifying the spatial coordinate of a specific body part in an external reference frame. In another session, a tactile stimulus was delivered to the dorsum of the right hand during the passive movement, and the participants pointed with the left index finger to the spatial locus at which the tactile stimulus had been delivered. Vibration of the biceps brachii muscle did not change the perceived locus of the tactile stimulus indicated by the left index finger. This finding indicates that an external reference frame does not contribute to tactile localization during passive movement. Humans may instead estimate the spatial coordinate of the tactile stimulus from the time elapsed between movement onset and stimulus delivery.
Affiliation(s)
- Shunsuke Otsuka
- College of Health and Human Sciences, Osaka Prefecture University, Habikino city, Japan
- Han Gao
- Graduate School of Rehabilitation Science, Osaka Metropolitan University, Habikino city, Japan
- Koichi Hiraoka
- Department of Rehabilitation Science, School of Medicine, Osaka Metropolitan University, Habikino city, Japan.
3. Gabdreshov G, Magzymov D, Yensebayev N. Preliminary investigation of SEZUAL device for basic material identification and simple spatial navigation for blind and visually impaired people. Disabil Rehabil Assist Technol 2024; 19:1343-1350. PMID: 36756982; DOI: 10.1080/17483107.2023.2176555.
Abstract
PURPOSE: We present a preliminary set of experimental studies demonstrating that a device can enable echolocation in blind and visually impaired individuals. The proposed device emits a click-like sound into the surrounding space, and the returning sound is perceived by participants to infer the surrounding environment. MATERIALS AND METHODS: Two sets of experiments were set up to evaluate the echolocation abilities of nine blind participants. The first setup was designed to identify four material types (glass, metal, wood, and ceramics) based on their sound-reflection properties. The second setup was navigation through a basic maze with the device. RESULTS: Experimental data demonstrate that the proposed device enables active echolocation in blind participants, particularly for material identification and spatial mobility. CONCLUSION: The proposed device can potentially be used to rehabilitate blind and visually impaired individuals in terms of spatial mobility and orientation.
4. Henrich MC, Garenfeld MA, Malesevic J, Strbac M, Dosen S. Encoding contact size using static and dynamic electrotactile finger stimulation: natural decoding vs. trained cues. Exp Brain Res 2024; 242:1047-1060. PMID: 38467759; PMCID: PMC11078849; DOI: 10.1007/s00221-024-06794-y.
Abstract
Electrotactile stimulation through matrix electrodes is a promising technology to restore high-resolution tactile feedback in extended reality applications. One of the fundamental tactile effects that should be simulated is the change in the size of the contact between the finger and a virtual object. The present study investigated how participants perceive an increase in stimulation area when the index finger is stimulated with static or dynamic (moving) stimuli produced by activating 1 to 6 electrode pads. To assess the ability to interpret the stimulation from natural cues (natural decoding), without any prior training, the participants were instructed to draw the size of the stimulated area and to identify the size difference between two consecutive stimulations. To investigate whether other "non-natural" cues can improve size estimation, the participants were asked to enumerate the number of active pads following a training protocol. The results demonstrated that participants could perceive the change in size without prior training (e.g., the estimated area correlated with the stimulated area, p < 0.001; a difference of two or more pads was recognized with a > 80% success rate). However, natural decoding was also challenging, as the response area changed gradually and sometimes in complex patterns as the number of active pads increased (e.g., four extra pads were needed for a statistically significant difference). Nevertheless, training the participants to utilize additional cues compensated for the limitations of natural perception: after training, the mismatch between the activated and estimated number of pads was less than one pad regardless of stimulus size. Finally, introducing movement of the stimulus substantially improved discrimination (e.g., 100% median success rate in recognizing a difference of one pad or more). The present study therefore provides insights into the perception of stimulation size, as well as practical guidelines on how to modulate pad activation to change the perceived size in static and dynamic scenarios.
Affiliation(s)
- Mauricio Carlos Henrich
- Department of Health Science and Technology, Aalborg University, Selma Lagerløfs Vej 249, 9260, Gistrup, Denmark
- Martin A Garenfeld
- Department of Health Science and Technology, Aalborg University, Selma Lagerløfs Vej 249, 9260, Gistrup, Denmark
- Matija Strbac
- Tecnalia Serbia Ltd, Deligradska 9/39, 11000, Belgrade, Serbia
- Strahinja Dosen
- Department of Health Science and Technology, Aalborg University, Selma Lagerløfs Vej 249, 9260, Gistrup, Denmark.
5. Abbasi A, Lassagne H, Estebanez L, Goueytes D, Shulz DE, Ego-Stengel V. Brain-machine interface learning is facilitated by specific patterning of distributed cortical feedback. Sci Adv 2023; 9:eadh1328. PMID: 37738340; PMCID: PMC10516504; DOI: 10.1126/sciadv.adh1328.
Abstract
Neuroprosthetics offer great hope for motor-impaired patients. One obstacle is that fine motor control requires near-instantaneous, rich somatosensory feedback. Such distributed feedback may be recreated in a brain-machine interface using distributed artificial stimulation across the cortical surface. Here, we hypothesized that neuronal stimulation must be contiguous in its spatiotemporal dynamics to be efficiently integrated by sensorimotor circuits. Using a closed-loop brain-machine interface, we trained head-fixed mice to control a virtual cursor by modulating the activity of motor cortex neurons. We provided artificial feedback in real time with distributed optogenetic stimulation patterns in the primary somatosensory cortex. Mice developed a specific motor strategy and succeeded in learning the task only when the optogenetic feedback pattern was spatially and temporally contiguous as it moved across the topography of the somatosensory cortex. These results reveal spatiotemporal properties of sensorimotor cortical integration that set constraints on the design of neuroprosthetics.
Affiliation(s)
- Dorian Goueytes
- Université Paris-Saclay, CNRS, Institut des Neurosciences Paris-Saclay (NeuroPSI), 91400 Saclay, France
6. Klautke J, Foster C, Medendorp WP, Heed T. Dynamic spatial coding in parietal cortex mediates tactile-motor transformation. Nat Commun 2023; 14:4532. PMID: 37500625; PMCID: PMC10374589; DOI: 10.1038/s41467-023-39959-4.
Abstract
Movements towards a touch on the body require integrating tactile location and body posture information. Tactile processing and movement planning both rely on posterior parietal cortex (PPC), but their interplay is not understood. Here, human participants received tactile stimuli on their crossed and uncrossed feet, dissociating stimulus location relative to anatomy versus external space. Participants pointed to the touch or to the equivalent location on the other foot, which dissociates sensory and motor locations. Multi-voxel pattern analysis of concurrently recorded fMRI signals revealed that, during sensory processing, tactile location was coded anatomically in anterior PPC but spatially in posterior PPC. After movement instructions were specified, PPC exclusively represented the movement goal in space, in regions associated with visuo-motor planning and with regional overlap for sensory, rule-related, and movement coding. Thus, PPC flexibly updates its spatial codes to accommodate rule-based transformation of sensory input into movements directed at the environment and at one's own body alike.
Affiliation(s)
- Janina Klautke
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Celia Foster
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany.
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany.
- Cognitive Psychology, Department of Psychology, University of Salzburg, Salzburg, Austria.
- Centre for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria.
7. Moharramipour A, Takahashi T, Kitazawa S. Distinctive modes of cortical communications in tactile temporal order judgment. Cereb Cortex 2023; 33:2982-2996. PMID: 35811300; DOI: 10.1093/cercor/bhac255.
Abstract
Temporal order judgment of two successive tactile stimuli delivered to our hands is often inverted when we cross our hands. The present study aimed to identify time-frequency profiles of the interactions across the cortical network associated with the crossed-hand tactile temporal order judgment task using magnetoencephalography. We found that the interactions across the cortical network were channeled to a low-frequency band (5-10 Hz) when the hands were uncrossed. However, the interactions became activated in a higher band (12-18 Hz) when the hands were crossed. The participants with fewer inverted judgments relied mainly on the higher band, whereas those with more frequent inverted judgments (reversers) utilized both. Moreover, reversers showed greater cortical interactions in the higher band when their judgment was correct compared to when it was inverted. Overall, the results show that the cortical network communicates in two distinctive frequency modes during the crossed-hand tactile temporal order judgment task. A default mode of communications in the low-frequency band encourages inverted judgments, and correct judgment is robustly achieved by recruiting the high-frequency mode.
Affiliation(s)
- Ali Moharramipour
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
- Laboratory for Consciousness, Center for Brain Science (CBS), RIKEN, 2-1 Hirosawa, Wako, Saitama 351-0106, Japan
- Toshimitsu Takahashi
- Department of Physiology, Dokkyo Medical University, 880 Kitakobayashi, Mibu, Shimotsuga, Tochigi 321-0293, Japan
- Shigeru Kitazawa
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
- Department of Brain Physiology, Graduate School of Medicine, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
- Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology, 1-4 Yamadaoka, Suita, Osaka 565-0871, Japan
8. Chen BW, Yang SH, Kuo CH, Chen JW, Lo YC, Kuo YT, Lin YC, Chang HC, Lin SH, Yu X, Qu B, Ro SCV, Lai HY, Chen YY. Neuro-inspired reinforcement learning to improve trajectory prediction in reward-guided behavior. Int J Neural Syst 2022; 32:2250038. DOI: 10.1142/s0129065722500381.
9. Do motor plans affect sensorimotor state estimates during temporal decision-making with crossed vs. uncrossed hands? Failure to replicate the dynamic crossed-hand effect. Exp Brain Res 2022; 240:1529-1545. PMID: 35332358; DOI: 10.1007/s00221-022-06349-z.
Abstract
Hermosillo et al. (J Neurosci 31:10019-10022, 2011) suggested that action planning of hand movements impacts temporal order judgments regarding vibrotactile stimulation of the hands. Specifically, these authors reported that the crossed-hand effect, a confusion about which hand is which when the hands are held in a crossed posture, gradually reverses some 320 ms before the arms begin to move from an uncrossed to a crossed posture or vice versa, such that the effect is already reversed at movement onset in anticipation of the movement's end position. However, to date, no other study has attempted to replicate this dynamic crossed-hand effect. In the present study, we therefore conducted four experiments to revisit the question of whether preparing uncrossed-to-crossed or crossed-to-uncrossed movements affects the temporo-spatial perception of tactile stimulation of the hands. We used a temporal order judgement (TOJ) task at different stages of action planning to test whether TOJs are more difficult with crossed than uncrossed hands (the "static crossed-hand effect") and, crucially, whether planning to cross or uncross the hands shows the opposite pattern of difficulty (the "dynamic crossed-hand effect"). As expected, our results confirmed the static crossed-hand effect. However, the dynamic crossed-hand effect could not be replicated. In addition, we observed that participants delayed their movements when somatosensory stimulation from the TOJ task arrived late, even when the stimulation was meaningless, suggesting that the TOJ task caused cross-modal distraction. Whereas the current findings are not inconsistent with a contribution of motor signals to posture perception, they cast doubt on observations that motor signals impact state estimates well before movement onset.
10. Martel M, Fuchs X, Trojan J, Gockel V, Habets B, Heed T. Illusory tactile movement crosses arms and legs and is coded in external space. Cortex 2022; 149:202-225. DOI: 10.1016/j.cortex.2022.01.014.
11. Lorentz L, Unwalla K, Shore DI. Imagine your crossed hands as uncrossed: visual imagery impacts the crossed-hands deficit. Multisens Res 2021; 35:1-29. PMID: 34690111; DOI: 10.1163/22134808-bja10065.
Abstract
Successful interaction with our environment requires accurate tactile localization. Although we seem to localize tactile stimuli effortlessly, the processes underlying this ability are complex. This is evidenced by the crossed-hands deficit, in which tactile localization performance suffers when the hands are crossed. The deficit results from the conflict between an internal reference frame, based in somatotopic coordinates, and an external reference frame, based in external spatial coordinates. Previous evidence in favour of the integration model employed manipulations of the external reference frame (e.g., blindfolding participants), which reduced the deficit by reducing conflict between the two reference frames. The present study extends this finding by asking blindfolded participants to visually imagine their crossed arms as uncrossed. This imagery manipulation further decreased the magnitude of the crossed-hands deficit by bringing information in the two reference frames into alignment. The manipulation affected males and females differently, consistent with the previously observed sex difference in this effect: females tend to show a larger crossed-hands deficit than males, and females were more strongly affected by the imagery manipulation. Results are discussed in terms of the integration model of the crossed-hands deficit.
Affiliation(s)
- Lisa Lorentz
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Kaian Unwalla
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
12. Different mechanisms of magnitude and spatial representation for tactile and auditory modalities. Exp Brain Res 2021; 239:3123-3132. PMID: 34415367; PMCID: PMC8536643; DOI: 10.1007/s00221-021-06196-4.
Abstract
The human brain creates a representation of the external world based on magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representations are hypothesized to rely on common mechanisms shared by different sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems, hypothesizing that space and magnitude interact differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus-response compatibility (SRC) to investigate these processes, on the assumption that performance improves when stimulus and response share common features. We designed auditory and tactile SRC tasks with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. Magnitude dominated over spatial congruency in the tactile task, whereas magnitude and space carried similar weight in the auditory task, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks was elicited by the sensory inputs. In the tactile task, participants' performance reversed between uncrossed and crossed hand postures, suggesting an internal coordinate system; in the auditory task, by contrast, crossing the hands did not alter performance (i.e., an allocentric frame of reference was used). Overall, these results suggest that space and magnitude interact differently in the auditory and tactile modalities, supporting the idea that these sensory modalities use different mechanisms of magnitude and spatial representation.
13. Unwalla K, Goldreich D, Shore DI. Exploring reference frame integration using response demands in a tactile temporal-order judgement task. Multisens Res 2021; 34:1-32. PMID: 34375947; DOI: 10.1163/22134808-bja10057.
Abstract
Exploring the world through touch requires the integration of internal (e.g., anatomical) and external (e.g., spatial) reference frames - you only know what you touch when you know where your hands are in space. The deficit observed in tactile temporal-order judgements when the hands are crossed over the midline provides one tool to explore this integration. We used foot pedals and required participants to focus on either the hand that was stimulated first (an anatomical bias condition) or the location of the hand that was stimulated first (a spatiotopic bias condition). Spatiotopic-based responses produce a larger crossed-hands deficit, presumably by focusing observers on the external reference frame. In contrast, anatomical-based responses focus the observer on the internal reference frame and produce a smaller deficit. This manipulation thus provides evidence that observers can change the relative weight given to each reference frame. We quantify this effect using a probabilistic model that produces a population estimate of the relative weight given to each reference frame. We show that a spatiotopic bias can result in either a larger external weight (Experiment 1) or a smaller internal weight (Experiment 2) and provide an explanation of when each one would occur.
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Daniel Goldreich
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
14. Scotto CR, Moscatelli A, Pfeiffer T, Ernst MO. Visual pursuit biases tactile velocity perception. J Neurophysiol 2021; 126:540-549. PMID: 34259048; DOI: 10.1152/jn.00541.2020.
Abstract
During a smooth pursuit eye movement toward a target stimulus, a briefly flashed stationary background appears to move in the direction opposite to the eye's motion, an effect known as the Filehne illusion. Similar illusions occur in audition, in the vestibular system, and in touch. Recently, we found that the perceived movement of a surface sensed through tactile slip was biased if the surface was sensed with a moving hand. The analogy between these two illusions suggests similar mechanisms of motion processing in vision and touch. In the present study, we further assessed the interplay between these two sensory channels using a novel paradigm that paired eye pursuit of a visual target with tactile motion over the skin of the fingertip. We showed that smooth pursuit eye movements can bias the perceived direction of motion in touch. Analogous to the classical Filehne illusion in vision, a static tactile surface was perceived as moving rightward with leftward eye pursuit, and vice versa; this time, however, the direction of surface motion was perceived through touch. The biasing effects of eye pursuit on tactile motion were modulated by the reliability of the tactile and visual stimuli, consistent with a Bayesian model of motion perception. Overall, these results support a modality- and effector-independent process with common representations for motion perception. NEW & NOTEWORTHY: The study showed that smooth pursuit eye movement produces a bias in tactile motion perception. This phenomenon is modulated by the reliability of the tactile estimate and by the presence of a visual background, in line with the predictions of the Bayesian framework of motion perception. Overall, these results support the hypothesis of shared representations for motion perception.
Affiliation(s)
- Cécile R Scotto
- Centre de Recherches sur la Cognition et l'Apprentissage, Université de Poitiers, Université François Rabelais de Tours, Centre National de la Recherche Scientifique, Poitiers, France
- Alessandro Moscatelli
- Department of Systems Medicine and Centre of Space Bio-Medicine, University of Rome "Tor Vergata", Rome, Italy
- Laboratory of Neuromotor Physiology, Istituto di Ricovero e Cura a Carattere Scientifico Santa Lucia Foundation, Rome, Italy
- Thies Pfeiffer
- Faculty of Technology and Cognitive Interaction Technology-Center of Excellence, Bielefeld University, Bielefeld, Germany
- Marc O Ernst
- Applied Cognitive Systems, Ulm University, Ulm, Germany
15. FeelMusic: enriching our emotive experience of music through audio-tactile mappings. Multimodal Technologies and Interaction 2021. DOI: 10.3390/mti5060029.
Abstract
We present the concept of FeelMusic and evaluate an implementation of it: an augmentation of music through the haptic translation of core musical elements. Music and touch are intrinsic modes of affective communication that are physically sensed. By projecting musical features such as rhythm and melody into the haptic domain, we can explore and enrich this embodied sensation; hence, we investigated audio-tactile mappings that successfully render emotive qualities. We began by investigating the affective qualities of vibrotactile stimuli through a psychophysical study with 20 participants using the circumplex model of affect. We found positive correlations between vibration frequency and arousal across participants, but correlations with valence were specific to the individual. We then developed novel FeelMusic mappings by translating key features of music samples and implementing them with "Pump-and-Vibe", a wearable interface utilising fluidic actuation and vibration to generate dynamic haptic sensations. We conducted a preliminary investigation to evaluate the FeelMusic mappings by gathering 20 participants' responses to the musical, tactile, and combined stimuli, using valence ratings and descriptive words from Hevner's adjective circle to measure affect. These mappings, and new tactile compositions, demonstrated that FeelMusic interfaces have the potential to enrich musical experiences and to be a means of affective communication in their own right. FeelMusic is a tangible realisation of the expression "feel the music", enriching our musical experiences.
16. Applying a novel visual-to-touch sensory substitution for studying tactile reference frames. Sci Rep 2021; 11:10636. PMID: 34017027; PMCID: PMC8137949; DOI: 10.1038/s41598-021-90132-7.
Abstract
Perceiving the spatial location and physical dimensions of touched objects is crucial for goal-directed actions. To achieve this, our brain transforms skin-based coordinates into an external reference frame by integrating visual and postural information. In the current study, we examine the role of posture in mapping tactile sensations to a visual image. We developed a new visual-to-touch sensory substitution device that transforms images into a sequence of vibrations on the arm. Fifty-two blindfolded participants performed spatial recognition tasks in three different arm postures and had to switch postures between trial blocks. As participants were not told which side of the device was down and which was up, they could choose how to map its vertical axis in their responses. Contrary to previous findings, we show that new proprioceptive inputs can be overridden in mapping tactile sensations. We discuss the results within the context of the spatial task and the various sensory contributions to the process.
|
17
|
Moharramipour A, Kitazawa S. What Underlies a Greater Reversal in Tactile Temporal Order Judgment When the Hands Are Crossed? A Structural MRI Study. Cereb Cortex Commun 2021; 2:tgab025. [PMID: 34296170 PMCID: PMC8152922 DOI: 10.1093/texcom/tgab025] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2020] [Revised: 03/31/2021] [Accepted: 03/31/2021] [Indexed: 02/02/2023] Open
Abstract
Our subjective temporal order of two successive tactile stimuli, delivered one to each hand, is often inverted when our hands are crossed. However, there is great variability among individuals. We addressed the question of why some show almost complete reversal while others show little reversal. To this end, we obtained structural magnetic resonance imaging data from 42 participants who also participated in the tactile temporal order judgment (TOJ) task. We extracted the cortical thickness and the convoluted surface area as cortical characteristics in 68 regions. We found that participants with a thinner, larger, and more convoluted cerebral cortex in 10 regions, including the right pars-orbitalis, right and left postcentral gyri, left precuneus, left superior parietal lobule, right middle temporal gyrus, left superior temporal gyrus, right cuneus, left supramarginal gyrus, and right rostral middle frontal gyrus, showed a smaller degree of judgment reversal. In light of major theoretical accounts, we suggest that cortical elaboration in these regions improves crossed-hand TOJ performance through better integration of the tactile stimuli with the correct spatial representations in the left parietal regions, better representation of spatial information in the postcentral gyrus, or improved top-down inhibitory control by the right pars-orbitalis.
Affiliation(s)
- Ali Moharramipour
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, Osaka 565-0871, Japan
- Shigeru Kitazawa
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, Osaka 565-0871, Japan
- Department of Brain Physiology, Graduate School of Medicine, Osaka University, Osaka 565-0871, Japan
- Center for Information and Neural Networks, National Institute of Information and Communications Technology, Osaka University, Osaka 565-0871, Japan
|
18
|
Manfron L, Vanderclausen C, Legrain V. No Evidence for an Effect of the Distance Between the Hands on Tactile Temporal Order Judgments. Perception 2021; 50:294-307. [PMID: 33653176 DOI: 10.1177/0301006621998877] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Localizing somatosensory stimuli is an important process, as it allows us to spatially guide our actions toward the object entering into contact with the body. Accordingly, the positions of tactile inputs are coded according to both somatotopic and spatiotopic representations, the latter considering the position of the stimulated limbs in external space. The spatiotopic representation has often been evidenced by means of temporal order judgment (TOJ) tasks. Participants' judgments about the order of appearance of two successive somatosensory stimuli are less accurate when the hands are crossed over the body midline than when uncrossed, but also when the hands are placed close together compared with farther apart. Moreover, these postural effects might depend on vision of the stimulated limbs. The aim of this study was to test the influence of seeing the hands on the modulation of tactile TOJ by the spatial distance between the stimulated limbs. The results showed no influence of the distance between the stimulated hands on TOJ performance and prevented us from concluding whether vision of the hands affects TOJ performance, or whether these variables interact. The reliability of such a distance effect for investigating the spatial representations of tactile inputs is therefore questioned.
|
19
|
Abstract
To achieve visual space constancy, our brain remaps eye-centered projections of visual objects across saccades. Here, we measured saccade trajectory curvature following the presentation of visual, auditory, and audiovisual distractors in a double-step saccade task to investigate if this stability mechanism also accounts for localized sounds. We found that saccade trajectories systematically curved away from the position at which either a light or a sound was presented, suggesting that both modalities are represented in eye-centered oculomotor centers. Importantly, the same effect was observed when the distractor preceded the execution of the first saccade. These results suggest that oculomotor centers keep track of visual, auditory and audiovisual objects by remapping their eye-centered representations across saccades. Furthermore, they argue for the existence of a supra-modal map which keeps track of multi-sensory object locations across our movements to create an impression of space constancy.
|
20
|
Maij F, Seegelke C, Medendorp WP, Heed T. External location of touch is constructed post-hoc based on limb choice. eLife 2020; 9:57804. [PMID: 32945257 PMCID: PMC7561349 DOI: 10.7554/elife.57804] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2020] [Accepted: 09/18/2020] [Indexed: 11/13/2022] Open
Abstract
When humans indicate on which hand a tactile stimulus occurred, they often err when their hands are crossed. This finding seemingly supports the view that the automatically determined touch location in external space affects limb assignment: the crossed right hand is localized in left space, and this conflict presumably provokes hand assignment errors. Here, participants judged on which hand the first of two stimuli, presented during a bimanual movement, had occurred, and then indicated its external location by a reach-to-point movement. When participants incorrectly chose the hand stimulated second, they pointed to where that hand had been at the correct, first time point, though no stimulus had occurred at that location. This behavior suggests that stimulus localization depended on hand assignment, not vice versa. It is, thus, incompatible with the notion of automatic computation of external stimulus location upon occurrence. Instead, humans construct external touch location post-hoc and on demand.
Affiliation(s)
- Femke Maij
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Christian Seegelke
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Tobias Heed
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
|
21
|
Intact tactile detection yet biased tactile localization in a hand-centered frame of reference: Evidence from a dissociation. Neuropsychologia 2020; 147:107585. [PMID: 32841632 DOI: 10.1016/j.neuropsychologia.2020.107585] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/22/2019] [Revised: 04/20/2020] [Accepted: 08/10/2020] [Indexed: 11/21/2022]
Abstract
We examined the performance of an individual with subcortical damage, but an intact somatosensory thalamocortical pathway, to probe the functional architecture of tactile detection and tactile localization processes. Consistent with the intact somatosensory thalamocortical pathway, tactile detection on the contralesional hand was well within the normal range. Despite intact detection, the individual demonstrated substantial localization biases. Across all localization experiments, he consistently localized tactile stimuli to the left side in space relative to the long axis of his hand. This was observed when the contralesional hand was palm up, palm down, or rotated 90° relative to the trunk, and when he made verbal responses. Furthermore, control experiments demonstrated that this response pattern was unlikely to be a motor response error. These findings indicate that tactile localization on the body is influenced by proprioceptive information, specifically in a hand-centered frame of reference. They also provide evidence that aspects of tactile localization are mediated by pathways outside of the primary somatosensory thalamocortical pathway.
|
22
|
Wada M, Ikeda H, Kumagaya S. Atypical Effects of Visual Interference on Tactile Temporal Order Judgment in Individuals With Autism Spectrum Disorder. Multisens Res 2020; 34:129-151. [PMID: 33706272 DOI: 10.1163/22134808-bja10033] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2020] [Accepted: 07/17/2020] [Indexed: 11/19/2022]
Abstract
Visual distractors interfere with tactile temporal order judgment (TOJ) at moderately short stimulus onset asynchronies (SOAs) in typically developing participants. Presentation of a rubber hand in a forward direction relative to the participant's hand enhances this effect, while presentation in an inverted direction weakens it. Individuals with autism spectrum disorder (ASD) show atypical multisensory processing; however, the effects of such visual interference on multisensory processing in ASD remain unclear. In this study, we examined the effects of visual interference on tactile TOJ in individuals with ASD. Two successive tactile stimuli were delivered to the index and ring fingers of a participant's right hand in an opaque box. A rubber hand was placed on the box in a forward or inverted direction. Concurrently, visual stimuli provided by light-emitting diodes on the fingers of the rubber hand were delivered in a congruent or incongruent order. Participants were required to judge the temporal order of the tactile stimuli regardless of the visual distractors. In the absence of a visual stimulus, participants with ASD were more likely than typically developing (TD) controls to judge simultaneous stimuli as the ring finger having been stimulated first, and congruent visual stimuli eliminated this bias. When incongruent visual stimuli were delivered, judgment was notably reversed in participants with ASD, regardless of the direction of the rubber hand. The findings demonstrate considerable effects of visual interference on tactile TOJ in individuals with ASD.
Affiliation(s)
- Makoto Wada
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama, 359-8555, Japan; Faculty of Informatics, Shizuoka University, Hamamatsu, Shizuoka 432-8011, Japan
- Hanako Ikeda
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama, 359-8555, Japan
- Shinichiro Kumagaya
- Research Center for Advanced Science and Technology, The University of Tokyo, Meguro, Tokyo 153-8904, Japan
|
23
|
Chen S, Shi Z, Zang X, Zhu X, Assumpção L, Müller HJ, Geyer T. Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys 2020; 82:1682-1694. [PMID: 31845105 PMCID: PMC7297845 DOI: 10.3758/s13414-019-01907-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participants responded to the orientation of a visual target singleton presented among seven homogeneous visual distractors. Four tactile stimuli, two to different fingers of each hand, were presented either simultaneously with or prior to the visual stimuli. The identity of the stimulated fingers provided the crossmodal context cue: in half of the trials, a given visual target location was consistently paired with a given tactile configuration. The visual stimuli were presented above the unseen fingers, ensuring spatial correspondence between vision and touch. We found no evidence of crossmodal contextual cueing when the two sets of items (tactile, visual) were presented simultaneously (Experiment 1). However, a reliable crossmodal effect emerged when the tactile distractors preceded the onset of the visual stimuli by 700 ms (Experiment 2). Crossmodal cueing disappeared again when, after an initial learning phase, participants flipped their hands, making the tactile distractors appear at different positions in external space while their somatotopic positions remained unchanged (Experiment 3). In all experiments, participants were unable to explicitly discriminate learned from novel multisensory arrays. These findings indicate that search-facilitating context memory can be established across vision and touch. However, in order to guide visual search, the (predictive) tactile configurations must be remapped from their initial somatotopic representation into a common external representational format.
Affiliation(s)
- Siyi Chen
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany.
- Zhuanghua Shi
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Xuelian Zang
- Center for Cognition and Brain Disorders, Institute of Psychological Sciences, Hangzhou Normal University, Hangzhou, China
- Xiuna Zhu
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Leonardo Assumpção
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Hermann J Müller
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Thomas Geyer
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
|
24
|
The influence of visual experience and cognitive goals on the spatial representations of nociceptive stimuli. Pain 2019; 161:328-337. [DOI: 10.1097/j.pain.0000000000001721] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
|
25
|
Hense M, Badde S, Köhne S, Dziobek I, Röder B. Visual and Proprioceptive Influences on Tactile Spatial Processing in Adults with Autism Spectrum Disorders. Autism Res 2019; 12:1745-1757. [PMID: 31507084 DOI: 10.1002/aur.2202] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Revised: 06/25/2019] [Accepted: 08/14/2019] [Indexed: 12/19/2022]
Abstract
Children with autism spectrum disorders (ASDs) often exhibit altered representations of the external world. Consistently, when localizing touch, children with ASDs were less influenced than their peers by changes of the stimulated limb's location in external space [Wada et al., Scientific Reports 2015, 4(1), 5985]. However, given the protracted development of an external-spatial dominance in tactile processing in typically developing children, this difference might reflect a developmental delay rather than a set suppression of external space in ASDs. Here, adults with ASDs and matched control participants completed (a) the tactile temporal order judgment (TOJ) task previously used to test external-spatial representation of touch in children with ASDs and (b) a tactile-visual cross-modal congruency (CC) task which assesses benefits of task-irrelevant visual stimuli on tactile localization in external space. In both experiments, participants localized tactile stimuli to the fingers of each hand while holding their hands either crossed or uncrossed. Performance differences between hand postures reflect the influence of external-spatial codes. In both groups, tactile TOJ performance markedly decreased when participants crossed their hands, and CC effects were especially large if the visual stimulus was presented on the same side of external space as the task-relevant touch. The absence of group differences was statistically confirmed using Bayesian statistical modeling: adults with ASDs weighted external-spatial codes comparably to typically developed adults during tactile and visual-tactile spatio-temporal tasks. Thus, atypicalities in the spatial coding of touch in children with ASDs appear to reflect a developmental delay rather than a stable characteristic of ASD. Autism Res 2019, 12: 1745-1757. © 2019 International Society for Autism Research, Wiley Periodicals, Inc.
LAY SUMMARY: A touched limb's location can be described twofold, with respect to the body (right hand) or the external world (right side). Children and adolescents with autism spectrum disorder (ASD) reportedly rely less than their peers on the external world. Here, adults with and without ASDs completed two tactile localization tasks. Both groups relied to the same degree on external world locations. This opens the possibility that the tendency to relate touch to the external world is typical in individuals with ASDs but emerges with a delay.
Affiliation(s)
- Marlene Hense
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Stephanie Badde
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany; Department of Psychology, New York University, New York, New York
- Svenja Köhne
- Berlin School of Mind and Brain, Department of Psychology, Humboldt University Berlin, Berlin, Germany
- Isabel Dziobek
- Berlin School of Mind and Brain, Department of Psychology, Humboldt University Berlin, Berlin, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
|
26
|
Alpha-band oscillations reflect external spatial coding for tactile stimuli in sighted, but not in congenitally blind humans. Sci Rep 2019; 9:9215. [PMID: 31239467 PMCID: PMC6592921 DOI: 10.1038/s41598-019-45634-w] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2018] [Accepted: 06/11/2019] [Indexed: 12/02/2022] Open
Abstract
We investigated the function of oscillatory alpha-band activity in the neural coding of spatial information during tactile processing. Sighted humans concurrently encode tactile location in skin-based and, after integration with posture, external spatial reference frames, whereas congenitally blind humans preferably use skin-based coding. Accordingly, lateralization of alpha-band activity in parietal regions during attentional orienting in expectation of tactile stimulation reflected external spatial coding in sighted, but skin-based coding in blind humans. Here, we asked whether alpha-band activity plays a similar role in spatial coding for tactile processing, that is, after the stimulus has been received. Sighted and congenitally blind participants were cued to attend to one hand in order to detect rare tactile deviant stimuli at this hand while ignoring tactile deviants at the other hand and tactile standard stimuli at both hands. The reference frames encoded by oscillatory activity during tactile processing were probed by adopting either an uncrossed or crossed hand posture. In sighted participants, attended relative to unattended standard stimuli suppressed the power in the alpha-band over ipsilateral centro-parietal and occipital cortex. Hand crossing attenuated this attentional modulation predominantly over ipsilateral posterior-parietal cortex. In contrast, although contralateral alpha-activity was enhanced for attended versus unattended stimuli in blind participants, no crossing effects were evident in the oscillatory activity of this group. These findings suggest that oscillatory alpha-band activity plays a pivotal role in the neural coding of external spatial information for touch.
|
27
|
Leed JE, Chinn LK, Lockman JJ. Reaching to the Self: The Development of Infants' Ability to Localize Targets on the Body. Psychol Sci 2019; 30:1063-1073. [PMID: 31173538 DOI: 10.1177/0956797619850168] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/07/2023] Open
Abstract
This study focused on the development of infants' sensorimotor knowledge about the layout of their bodies. Little is known about the development of the body as a reaching space, despite the importance of this skill for many self-directed adaptive behaviors, such as removing foreign stimuli from the skin or scratching an itch. A new method was developed in which vibrating targets were placed on the heads and arms of 7- to 21-month-old infants (N = 78) to test reaching localization of targets. Manual localization improved with age, and visual localization was associated with successful reaching. Use of the ipsilateral or contralateral hand varied with body region: Infants primarily used the ipsilateral hand for head targets but the contralateral hand for arm targets, for which ipsilateral reaches were not biomechanically possible. The results of this research highlight a previously understudied form of self-knowledge involving a functional capacity to reach to tactile targets on the body surface.
|
28
|
Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Curr Biol 2019; 29:1491-1497.e4. [PMID: 30955931 DOI: 10.1016/j.cub.2019.02.060] [Citation(s) in RCA: 26] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2017] [Revised: 02/15/2019] [Accepted: 02/27/2019] [Indexed: 10/27/2022]
Abstract
Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb-hand or foot-and reported which of all four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.
Affiliation(s)
- Stephanie Badde
- Department of Psychology and Center of Neural Sciences, New York University, 6 Washington Place, New York, NY 10003, USA; Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany.
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany
|
29
|
Sadibolova R, Tamè L, Longo MR. More than skin-deep: Integration of skin-based and musculoskeletal reference frames in localization of touch. J Exp Psychol Hum Percept Perform 2018; 44:1672-1682. [PMID: 30160504 PMCID: PMC6205026 DOI: 10.1037/xhp0000562] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2018] [Revised: 04/25/2018] [Accepted: 04/26/2018] [Indexed: 11/08/2022]
Abstract
The skin of the forearm is, in one sense, a flat 2-dimensional (2D) sheet, but in another sense approximately cylindrical, mirroring the 3-dimensional (3D) volumetric shape of the arm. The role of reference frames based on the skin as a 2D sheet versus the musculoskeletal structure of the arm remains unclear. When we rotate the forearm from a pronated to a supinated posture, the skin on its surface is displaced. Thus, a marked location will slide with the skin across the underlying flesh, and a touch perceived at this location should follow this displacement if it is localized within a skin-based reference frame. We investigated, however, whether perceived tactile locations were also affected by the rearrangement of the underlying musculoskeletal structure, that is, displaced medially and laterally on a pronated and supinated forearm, respectively. Participants pointed to perceived touches (Experiment 1), or marked them on a size-matched 3D forearm on a computer screen (Experiment 2). The perceived locations were indeed displaced medially after forearm pronation in both response modalities. This misperception was reduced (Experiment 1), or absent altogether (Experiment 2), in the supinated posture, when the actual stimulus grid moved laterally with the displaced skin. The grid was perceptually stretched along the medial-lateral axis and displaced distally, which suggests the influence of skin-based factors. Our study extends the tactile localization literature, focused on the skin-based reference frame and on the effects of spatial positions of body parts, by implicating musculoskeletal factors in the localization of touch on the body.
|
30
|
Herweg NA, Kahana MJ. Spatial Representations in the Human Brain. Front Hum Neurosci 2018; 12:297. [PMID: 30104966 PMCID: PMC6078001 DOI: 10.3389/fnhum.2018.00297] [Citation(s) in RCA: 30] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2018] [Accepted: 07/06/2018] [Indexed: 11/13/2022] Open
Abstract
While extensive research on the neurophysiology of spatial memory has been carried out in rodents, memory research in humans has traditionally focused on more abstract, language-based tasks. Recent studies have begun to address this gap using virtual navigation tasks in combination with electrophysiological recordings in humans. These studies suggest that the human medial temporal lobe (MTL) is equipped with a population of place and grid cells similar to that previously observed in the rodent brain. Furthermore, theta oscillations have been linked to spatial navigation and, more specifically, to the encoding and retrieval of spatial information. While some studies suggest a single navigational theta rhythm which is of lower frequency in humans than rodents, other studies advocate for the existence of two functionally distinct delta-theta frequency bands involved in both spatial and episodic memory. Despite the general consensus between rodent and human electrophysiology, behavioral work in humans does not unequivocally support the use of a metric Euclidean map for navigation. Formal models of navigational behavior, which specifically consider the spatial scale of the environment and complementary learning mechanisms, may help to better understand different navigational strategies and their neurophysiological mechanisms. Finally, the functional overlap of spatial and declarative memory in the MTL calls for a unified theory of MTL function. Such a theory will critically rely upon linking task-related phenomena at multiple temporal and spatial scales. Understanding how single cell responses relate to ongoing theta oscillations during both the encoding and retrieval of spatial and non-spatial associations appears to be key toward developing a more mechanistic understanding of memory processes in the MTL.
Affiliation(s)
- Nora A. Herweg
- Computational Memory Lab, Department of Psychology, University of Pennsylvania, Philadelphia, PA, United States
- Michael J. Kahana
- Computational Memory Lab, Department of Psychology, University of Pennsylvania, Philadelphia, PA, United States
|
31
|
Murphy S, Dalton P. Inattentional numbness and the influence of task difficulty. Cognition 2018; 178:1-6. [PMID: 29753983 DOI: 10.1016/j.cognition.2018.05.001] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2017] [Revised: 04/30/2018] [Accepted: 05/02/2018] [Indexed: 10/16/2022]
Abstract
Research suggests that clearly detectable stimuli can be missed when attention is focused elsewhere, particularly when the observer is engaged in a complex task. Although this phenomenon has been demonstrated in vision and audition, much less is known about the possibility of a similar phenomenon within touch. Across two experiments, we investigated reported awareness of an unexpected tactile event as a function of the difficulty of a concurrent tactile task. Participants were presented with sequences of tactile stimuli to one hand and performed either an easy or a difficult counting task. On the final trial, an additional tactile stimulus was concurrently presented to the unattended hand. Retrospective reports revealed that more participants in the difficult (vs. easy) condition remained unaware of this unexpected stimulus, even though it was clearly detectable under full attention conditions. These experiments are the first to demonstrate inattentional numbness modulated by concurrent tactile task difficulty.
Affiliation(s)
- Sandra Murphy
- Department of Psychology, Royal Holloway, University of London, United Kingdom
- Polly Dalton
- Department of Psychology, Royal Holloway, University of London, United Kingdom

32
Ambron E, Mas-Casadesús A, Gherri E. Hand distance modulates the electrophysiological correlates of target selection during a tactile search task. Psychophysiology 2018; 55:e13080. [DOI: 10.1111/psyp.13080]
Affiliation(s)
- Elisabetta Ambron
- Laboratory for Cognition and Neural Stimulation, Neurology Department, School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Anna Mas-Casadesús
- Human Cognitive Neuroscience, Department of Psychology, University of Edinburgh, Edinburgh, United Kingdom
- Elena Gherri
- Human Cognitive Neuroscience, Department of Psychology, University of Edinburgh, Edinburgh, United Kingdom

33
Task-irrelevant sounds influence both temporal order and apparent-motion judgments about tactile stimuli applied to crossed and uncrossed hands. Atten Percept Psychophys 2017; 80:773-783. [DOI: 10.3758/s13414-017-1476-5]
34
Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLoS One 2017; 12:e0189067. [PMID: 29228023] [PMCID: PMC5724835] [DOI: 10.1371/journal.pone.0189067]
Abstract
Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically ("palm" or "back" of the hand), or externally ("up" or "down" in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is, thus, flexibly adapted by top-down information—here, task instruction—even in the absence of developmental vision.
Affiliation(s)
- Jonathan T. W. Schubert
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Stephanie Badde
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Department of Psychology, New York University, New York, United States of America
- Brigitte Röder
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany

35
Medina S, Tamè L, Longo MR. Tactile localization biases are modulated by gaze direction. Exp Brain Res 2017; 236:31-42. [PMID: 29018928] [DOI: 10.1007/s00221-017-5105-2]
Abstract
Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations with different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of touches towards the tip of the fingers (distal bias) and towards the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked directly at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulated the internal configuration of point locations by elongating it in the radio-ulnar axis.
Affiliation(s)
- Sonia Medina
- Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK

36
Tactile Spatiotemporal Perception Is Dependent on Preparatory Alpha Rhythms in the Parieto-occipital Lobe. J Neurosci 2017; 37:9350-9352. [PMID: 28954872] [DOI: 10.1523/jneurosci.2029-17.2017]
37
Visual Experience Shapes the Neural Networks Remapping Touch into External Space. J Neurosci 2017; 37:10097-10103. [PMID: 28947578] [DOI: 10.1523/jneurosci.1213-17.2017]
Abstract
Localizing touch relies on the activation of skin-based and externally defined spatial frames of reference. Psychophysical studies have demonstrated that early visual deprivation prevents the automatic remapping of touch into external space. We used fMRI to characterize how visual experience impacts the brain circuits dedicated to the spatial processing of touch. Sighted and congenitally blind humans performed a tactile temporal order judgment (TOJ) task, either with the hands uncrossed or crossed over the body midline. Behavioral data confirmed that crossing the hands has a detrimental effect on TOJ judgments in sighted but not in early blind people. Crucially, the crossed hand posture elicited enhanced activity, when compared with the uncrossed posture, in a frontoparietal network in the sighted group only. Psychophysiological interaction analysis revealed, however, that the congenitally blind showed enhanced functional connectivity between parietal and frontal regions in the crossed versus uncrossed hand postures. Our results demonstrate that visual experience scaffolds the neural implementation of the location of touch in space.

SIGNIFICANCE STATEMENT: In daily life, we seamlessly localize touch in external space for action planning toward a stimulus making contact with the body. For efficient sensorimotor integration, the brain therefore has to compute the current position of our limbs in the external world. In the present study, we demonstrate that early visual deprivation alters brain activity in a dorsal parietofrontal network that typically supports touch localization in the sighted. Our results therefore conclusively demonstrate the intrinsic role that developmental vision plays in scaffolding the neural implementation of touch perception.
38
Świder K, Wronka E, Oosterman JM, van Rijn CM, Jongsma MLA. Influence of transient spatial attention on the P3 component and perception of painful and non-painful electric stimuli in crossed and uncrossed hands positions. PLoS One 2017; 12:e0182616. [PMID: 28873414] [PMCID: PMC5584947] [DOI: 10.1371/journal.pone.0182616]
Abstract
Recent reports show that focusing attention on the location where pain is expected can enhance its perception. Moreover, crossing the hands over the body's midline is known to impair the ability to localise stimuli and to decrease tactile and pain sensations in healthy participants. The present study investigated the role of transient spatial attention in the perception of painful and non-painful electrical stimuli under conditions in which a match or a mismatch was induced between skin-based and external frames of reference (uncrossed and crossed hand positions, respectively). We measured the subjective experience (Numerical Rating Scale scores) and the electrophysiological response elicited by brief electric stimuli by analysing the P3 component of Event-Related Potentials (ERPs). Twenty-two participants underwent eight painful and eight non-painful stimulus blocks. The electrical stimuli were applied to either the left or the right hand, held in either a crossed or uncrossed position. Each stimulus was preceded by a direction cue (leftward or rightward arrow). In 80% of the trials, the arrow correctly pointed to the spatial region where the stimulus would appear (congruent cueing). Our results indicated that congruent cues resulted in increased pain NRS scores compared to incongruent ones. For non-painful stimuli, such an effect was observed only in the uncrossed hands position. For both non-painful and painful stimuli, the P3 peak amplitudes were higher and occurred later for incongruently cued stimuli than for congruent ones. However, we found that crossing the hands substantially reduced the cueing effect on the P3 peak amplitudes elicited by painful stimuli. Taken together, our results show a strong influence of transient attention manipulations on NRS ratings and on brain activity. They also suggest that hand position may modulate the strength of the cueing effect, although differences between painful and non-painful stimuli exist.
Affiliation(s)
- Karolina Świder
- Institute of Psychology, Jagiellonian University, Kraków, Poland
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Eligiusz Wronka
- Institute of Psychology, Jagiellonian University, Kraków, Poland
- Joukje M. Oosterman
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Clementina M. van Rijn
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Marijtje L. A. Jongsma
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands

39
Gaze-centered coding of proprioceptive reach targets after effector movement: Testing the impact of online information, time of movement, and target distance. PLoS One 2017; 12:e0180782. [PMID: 28678886] [PMCID: PMC5498052] [DOI: 10.1371/journal.pone.0180782]
Abstract
In previous research, we demonstrated that spatial coding of proprioceptive reach targets depends on the presence of an effector movement (Mueller & Fiehler, Neuropsychologia, 2014, 2016). In these studies, participants were asked to reach in darkness with their right hand to a proprioceptive target (tactile stimulation on the fingertip) while their gaze was varied. They either moved their left (stimulated) hand towards a target location, or kept it stationary at that location, where they received a touch on the fingertip to which they then reached with their right hand. When the stimulated hand was moved, reach errors varied as a function of gaze relative to target, whereas reach errors were independent of gaze when the hand was kept stationary. The present study further examines whether (a) the availability of proprioceptive online information, i.e. reaching to an online versus a remembered target, (b) the time of the effector movement, i.e. before or after target presentation, or (c) the target distance from the body influences gaze-centered coding of proprioceptive reach targets. We found gaze-dependent reach errors in the conditions which included a movement of the stimulated hand, irrespective of whether proprioceptive information was available online or remembered. This suggests that an effector movement leads to gaze-centered coding for both online and remembered proprioceptive reach targets. Moreover, moving the stimulated hand before or after target presentation did not affect gaze-dependent reach errors, thus indicating a continuous spatial update of positional signals of the stimulated hand rather than of the target location per se. However, reaching to a location close to the body rather than farther away (but still within reachable space) generally decreased the influence of a gaze-centered reference frame.
40
Crollen V, Albouy G, Lepore F, Collignon O. How visual experience impacts the internal and external spatial mapping of sensorimotor functions. Sci Rep 2017; 7:1022. [PMID: 28432316] [PMCID: PMC5430802] [DOI: 10.1038/s41598-017-01158-9]
Abstract
Tactile perception and motor production share the use of internally and externally defined coordinates. To examine how visual experience affects the internal/external coding of space for touch and movement, early blind (EB) participants and sighted controls (SC) took part in two experiments. In experiment 1, participants were required to perform a temporal order judgment (TOJ) task, either with their hands in parallel or crossed over the body midline. Confirming previous demonstrations, crossing the hands led to a significant decrement in performance in SC but did not affect EB. In experiment 2, participants were trained to perform a sequence of five-finger movements. They were then tested on their ability to produce, with the same hand but with the keypad turned upside down, the learned (internal) or the mirror (external) sequence. We observed significant transfer of motor sequence knowledge in both EB and SC, irrespective of whether the representation of the sequence was internal or external. Together, these results demonstrate that visual experience differentially impacts the automatic weight attributed to internal versus external coordinates depending on task-specific spatial requirements.
Affiliation(s)
- Virginie Crollen
- Centre for Mind/Brain Science, University of Trento, Mattarello, Italy
- Geneviève Albouy
- Movement Control & Neuroplasticity Research Group, Department of Kinesiology, KU Leuven, Belgium
- Franco Lepore
- Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal, Montreal, Canada
- Olivier Collignon
- Centre for Mind/Brain Science, University of Trento, Mattarello, Italy
- Institute of Psychology (IPSY) and Institute of Neuroscience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium

41

42
Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016; 11:e0158829. [PMID: 27391805] [PMCID: PMC4938545] [DOI: 10.1371/journal.pone.0158829]
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJ) of two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand into the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli rather than crossing manipulations, have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJ while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or they both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization and TOJ, and corroborate the relevance of gaze-related coding for touch. Yet, gaze and trunk-centered coding did not account for the total size of crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
43
Abstract
In this review, we examine how tactile misperceptions provide evidence regarding body representations. First, we propose that tactile detection and localization are serial processes, in contrast to parallel processing hypotheses based on patients with numbsense. Second, we discuss how information in primary somatosensory maps projects to body size and shape representations to localize touch on the skin surface, and how responses after use-dependent plasticity reflect changes in this mapping. Third, we review situations in which our body representations are inconsistent with our actual body shape, specifically discussing phantom limb phenomena and anesthetization. We discuss problems with the traditional remapping hypothesis in amputees, factors that modulate perceived body size and shape, and how changes in perceived body form influence tactile localization. Finally, we review studies in which brain-damaged individuals perceive touch on the opposite side of the body, and demonstrate how interhemispheric mechanisms can give rise to these anomalous percepts.
Affiliation(s)
- Jared Medina
- Department of Psychology, University of Delaware, Newark, DE, USA
- H Branch Coslett
- Department of Neurology, Center for Cognitive Neuroscience, University of Pennsylvania, Philadelphia, PA, USA