1. Ujitoko Y, Takenaka Y, Hirota K. Effect of Normal Force Intensity on Tactile Motion Speed Perception Based on Spatiotemporal Cue. IEEE Transactions on Haptics 2025; 18:73-79. [PMID: 38198268] [DOI: 10.1109/toh.2024.3352042]
Abstract
While the relative motion between the skin and objects in contact with it is essential to everyday tactile experiences, our understanding of how tactile motion is perceived via human tactile function is limited. Previous studies have explored the effect of normal force on speed perception under conditions where multiple motion cues on the skin (spatiotemporal cue, tangential skin deformation cue, and slip-induced vibration cue) were integrated. However, the effect of the normal force on speed perception in terms of each motion cue remains unclear since the multiple motion cues have not been adequately separated in the previously reported experiments. In this article, we aim to elucidate the effect of normal force in situations where the speed perception of tactile motion is based solely on a spatiotemporal cue. We developed a pin-array display which allowed us to vary the intensity of the normal force without causing tangential forces or slip-induced vibrations. Using the display, we conducted two psychophysical experiments. In Experiment 1, we found that the speed of the object was perceived to be 1.12-1.14 times faster when the intensity of the normal force was doubled. In Experiment 2, we did not observe significant differences in the discriminability of tactile speed caused by differences in normal force intensity. Our experimental results are of scientific significance and offer insights for engineering applications when using haptic displays that can only provide spatiotemporal cues represented by normal forces.
2. Wang S, Shi X, Gong J, Liu W, Jin C, Sun J, Peng Y, Yang J. Artificial Retina Based on Organic Heterojunction Transistors for Mobile Recognition. Nano Letters 2024; 24:3204-3212. [PMID: 38416569] [DOI: 10.1021/acs.nanolett.4c00087]
Abstract
The flicker frequency of incident light is a critical determinant in biological vision. Nevertheless, how to simulate external light stimuli of varying frequencies and develop artificial retinal neurons capable of responding to them remains an open question. This study presents an artificial neuron comprising organic phototransistors. The triggering properties of the neurons are modulated by optical input, enabling them to execute rudimentary synaptic functions and emulate the biological characteristics of retinal neurons. The artificial retinal neuron responds differently to different incoming light frequencies, allowing it to replicate the persistent visual behavior of the human eye and facilitating image discrimination. Additionally, through seamless integration with circuitry, it can perform motion recognition on a robotic cart, preventing collisions with high-speed obstacles. The artificial retinal neuron offers a cost-effective and energy-efficient route toward future mobile robot processors.
Affiliation(s)
- Shuyang Wang
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- State Key Laboratory of Precision Manufacturing for Extreme Service Performance, College of Mechanical and Electrical Engineering, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Xiaofang Shi
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Jiaying Gong
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Wanrong Liu
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Chenxing Jin
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Jia Sun
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- State Key Laboratory of Precision Manufacturing for Extreme Service Performance, College of Mechanical and Electrical Engineering, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Yongyi Peng
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
- Junliang Yang
- Hunan Key Laboratory for Super Microstructure and Ultrafast Process, School of Physics and Electronics, Central South University, Changsha, Hunan 410083, People's Republic of China
- Hunan Key Laboratory of Nanophotonics and Devices, School of Physics and Electronics, Central South University, 932 South Lushan Road, Changsha, Hunan 410083, People's Republic of China
3. Brannick S, Vibell JF. Motion aftereffects in vision, audition, and touch, and their crossmodal interactions. Neuropsychologia 2023; 190:108696. [PMID: 37793544] [DOI: 10.1016/j.neuropsychologia.2023.108696]
4. Onuki Y, Lakbila-Kamal O, Scheffer B, Van Someren EJW, Van der Werf YD. Selective Enhancement of Post-Sleep Visual Motion Perception by Repetitive Tactile Stimulation during Sleep. J Neurosci 2022; 42:7400-7411. [PMID: 35995563] [PMCID: PMC9525164] [DOI: 10.1523/jneurosci.1512-21.2022]
Abstract
Tactile sensations can bias visual perception in the awake state, while visual sensitivity is known to be facilitated by sleep. It remains unknown, however, whether tactile sensation during sleep can bias the visual improvement after sleep. Here, we performed nap experiments in human participants (n = 56, 18 males, 38 females) to demonstrate that repetitive tactile motion stimulation on the fingertip during slow wave sleep selectively enhanced subsequent visual motion detection. The visual improvement was associated with slow wave activity. High activation at the high-beta frequency was found in the occipital electrodes after the tactile motion stimulation during sleep, indicating a visual-tactile cross-modal interaction during sleep. Furthermore, a second experiment (n = 14, 14 females), which examined whether a hand- or head-centered reference frame is dominant for the interpretation of tactile motion direction, showed that the biasing effect on visual improvement follows the hand-centered reference frame. These results suggest that tactile information can be interpreted during sleep and can induce a selective improvement of post-sleep visual motion detection.
SIGNIFICANCE STATEMENT: Tactile sensations can bias our visual perception as a form of cross-modal interaction, but this had been reported only in the awake state. Here we show that repetitive directional tactile motion stimulation on the fingertip during slow wave sleep selectively enhanced subsequent visual motion perception. Moreover, the visual improvement was positively associated with sleep slow wave activity. Tactile motion stimulation during slow wave activity increased activation at the high-beta frequency over the occipital electrodes, and the visual improvement occurred in agreement with a hand-centered reference frame. These results suggest that our sleeping brain can interpret tactile information based on a hand-centered reference frame, which can cause the sleep-dependent improvement of visual motion detection.
Affiliation(s)
- Yoshiyuki Onuki
- Department of Sleep and Cognition, Netherlands Institute for Neuroscience, an institute of the Royal Netherlands Academy of Arts and Sciences, Amsterdam, 1105BA, The Netherlands
- Oti Lakbila-Kamal
- Department of Sleep and Cognition, Netherlands Institute for Neuroscience, an institute of the Royal Netherlands Academy of Arts and Sciences, Amsterdam, 1105BA, The Netherlands
- Bo Scheffer
- Department of Sleep and Cognition, Netherlands Institute for Neuroscience, an institute of the Royal Netherlands Academy of Arts and Sciences, Amsterdam, 1105BA, The Netherlands
- Eus J W Van Someren
- Department of Sleep and Cognition, Netherlands Institute for Neuroscience, an institute of the Royal Netherlands Academy of Arts and Sciences, Amsterdam, 1105BA, The Netherlands
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Amsterdam Neuroscience, VU University Amsterdam, Amsterdam, 1081HV, The Netherlands
- Amsterdam UMC, Vrije Universiteit, Psychiatry, Amsterdam Neuroscience, Amsterdam, 1081HV, The Netherlands
- Ysbrand D Van der Werf
- Department of Anatomy and Neurosciences, Amsterdam UMC, location VU, University Medical Center, Amsterdam, 1081HZ, The Netherlands
5. Norman JF, Eaton JR, Gunter ML, Baig M. Aging and the perception of tactile speed. Sci Rep 2022; 12:5412. [PMID: 35354916] [PMCID: PMC8967820] [DOI: 10.1038/s41598-022-09493-2]
Abstract
Eighteen younger and older adults (mean ages 20.4 and 72.8 years, respectively) participated in a tactile speed matching task. On any given trial, participants felt the surfaces of rotating standard and test wheels with their index fingertip and adjusted the test wheel until its speed appeared to match that of the standard wheel. Three standard speeds were used (30, 50, and 70 cm/s). The results indicated that while the accuracy of the participants' judgments was similar for younger and older adults, the precision (i.e., reliability across repeated trials) of the older participants' judgments deteriorated significantly relative to that of the younger adults. Although adverse effects of age were obtained with regard to both the precision of tactile speed judgments and tactile acuity, there was nevertheless no significant correlation between the older adults' tactile acuities and the precision of their tactile speed judgments.
Affiliation(s)
- J Farley Norman
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, 1906 College Heights Blvd. #22030, Bowling Green, KY, 42101-2030, USA
- Center for Applied Science in Health and Aging, Western Kentucky University, Bowling Green, KY, 42101-2030, USA
- Jerica R Eaton
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, 1906 College Heights Blvd. #22030, Bowling Green, KY, 42101-2030, USA
- McKenzie L Gunter
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, 1906 College Heights Blvd. #22030, Bowling Green, KY, 42101-2030, USA
- Maheen Baig
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, 1906 College Heights Blvd. #22030, Bowling Green, KY, 42101-2030, USA
6. Event-related potential correlates of visuo-tactile motion processing in congenitally deaf humans. Neuropsychologia 2022; 170:108209. [DOI: 10.1016/j.neuropsychologia.2022.108209]
7. Gori M, Crepaldi M, Orciari L, Campus C, Merello A, Dellepiane D, Parmiggiani A. RoMAT: Robot for Multisensory Analysis and Testing of visual-tactile perceptual functions. Annu Int Conf IEEE Eng Med Biol Soc 2021; 2021:4781-4786. [PMID: 34892280] [DOI: 10.1109/embc46164.2021.9630479]
Abstract
The present work introduces a novel robotic platform suitable for investigating perception in multisensory motion tasks for individuals with and without sensory and motor disabilities. The system, called RoMAT, allows the study of how multisensory signals are integrated, taking into account the speed and direction of the stimuli. It consists of a visual and a tactile wheel mounted on two rotatable plates, moved under the participant's finger and visual observation. We validated the system by implementing a rotation discrimination task with three sensory conditions: vision only, touch only, and multisensory visual-tactile integration. Four healthy subjects were asked to report the extent of rotation after perceiving a moving stimulus generated by visual, tactile, or combined stimulation. Results suggest that precision improves when multiple sensory stimulations are presented. The new system can therefore provide fundamental input for determining the perceptual principles of motion processing, and it could be used to design screening and rehabilitation protocols, based on neuroscientific findings, for individuals with visual and motor impairments.
Clinical relevance: This research presents a novel robotic motion simulator that delivers combined or independent stimulation of the visual and tactile sensory channels.
8. Kuroki S. Motion Direction Discrimination with Tactile Random-Dot Kinematograms. i-Perception 2021; 12:20416695211004620. [PMID: 33854748] [PMCID: PMC8010832] [DOI: 10.1177/20416695211004620]
Abstract
Motion detection is a fundamental sensory function for multiple modalities, including touch, but the mechanisms underlying tactile motion detection are not well understood. While previous findings support the existence of high-level feature tracking, it remains unclear whether there also exists low-level motion sensing that directly detects a local spatiotemporal correlation in the skin-stimulation pattern. To elucidate this mechanism, we presented, on braille displays, tactile random-dot kinematograms, similar to those widely used in visual motion research, which enabled us to independently manipulate feature trackability and various parameters of local motion. We found that human observers are able to detect the direction of difficult-to-track tactile motions presented to the fingers and palms. In addition, direction-discrimination performance was better when the stimuli were presented along the fingers than across them. These results indicate that low-level motion sensing, in addition to high-level tracking, contributes to tactile motion perception.
Affiliation(s)
- Scinob Kuroki
- NTT Communication Science Laboratories, NTT Corporation, Kanagawa, Japan
9. Suzuishi Y, Hidaka S, Kuroki S. Visual motion information modulates tactile roughness perception. Sci Rep 2020; 10:13929. [PMID: 32811859] [PMCID: PMC7435275] [DOI: 10.1038/s41598-020-70831-3]
Abstract
We perceive the roughness of an object through our eyes and hands. Many crossmodal studies have reported that there is no clear visuo-tactile interaction in roughness perception using static visual cues. One exception is that the visual observation of task-irrelevant hand movements, not the texture of task-relevant objects, can enhance the performance of tactile roughness discrimination. Our study investigated whether task-irrelevant visual motion without either object roughness or bodily cues can influence tactile roughness perception. Participants were asked to touch abrasive papers while moving their hand laterally and viewing moving or static sine wave gratings without being able to see their hand, and to estimate the roughness magnitude of the tactile stimuli. Moving gratings with a low spatial frequency induced smoother roughness perceptions than static visual stimuli when the visual grating moved in the direction opposite the hand movements. The effects of visual motion did not appear when the visual stimuli had a high spatial frequency or when the participants touched the tactile stimuli passively. These results indicate that simple task-irrelevant visual movement without object roughness or bodily cues can modulate tactile roughness perception with active body movements in a spatial-frequency-selective manner.
Affiliation(s)
- Yosuke Suzuishi
- Department of Psychology, Rikkyo University, 1-2-26, Kitano, Niiza-shi, Saitama, 352-8558, Japan
- NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, 3-1, Morinosato-Wakamiya, Atsugi, Kanagawa, 243-0198, Japan
- Souta Hidaka
- Department of Psychology, Rikkyo University, 1-2-26, Kitano, Niiza-shi, Saitama, 352-8558, Japan
- Scinob Kuroki
- NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, 3-1, Morinosato-Wakamiya, Atsugi, Kanagawa, 243-0198, Japan
10. Wada M, Ikeda H, Kumagaya S. Atypical Effects of Visual Interference on Tactile Temporal Order Judgment in Individuals With Autism Spectrum Disorder. Multisens Res 2020; 34:129-151. [PMID: 33706272] [DOI: 10.1163/22134808-bja10033]
Abstract
Visual distractors interfere with tactile temporal order judgment (TOJ) at moderately short stimulus onset asynchronies (SOAs) in typically developing (TD) participants. Presentation of a rubber hand in a forward direction relative to the participant's hand enhances this effect, while an inverted rubber hand weakens it. Individuals with autism spectrum disorder (ASD) have atypical multisensory processing; however, the effects of such visual interference in ASD remain unclear. In this study, we examined the effects of visual interference on tactile TOJ in individuals with ASD. Two successive tactile stimuli were delivered to the index and ring fingers of the participant's right hand, which rested in an opaque box. A rubber hand was placed on the box in a forward or inverted direction. Concurrently, visual stimuli provided by light-emitting diodes on the fingers of the rubber hand were delivered in a congruent or incongruent order. Participants were required to judge the temporal order of the tactile stimuli regardless of the visual distractors. In the absence of a visual stimulus, participants with ASD tended to judge simultaneous stimuli as the ring finger being stimulated first, compared with TD controls, and congruent visual stimuli eliminated this bias. When incongruent visual stimuli were delivered, judgment was notably reversed in participants with ASD, regardless of the direction of the rubber hand. These findings demonstrate that visual interference has considerable effects on tactile TOJ in individuals with ASD.
Affiliation(s)
- Makoto Wada
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama, 359-8555, Japan
- Faculty of Informatics, Shizuoka University, Hamamatsu, Shizuoka 432-8011, Japan
- Hanako Ikeda
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama, 359-8555, Japan
- Shinichiro Kumagaya
- Research Center for Advanced Science and Technology, The University of Tokyo, Meguro, Tokyo 153-8904, Japan
11. Chen YP, Yeh CI, Lee TC, Huang JJ, Pei YC. Relative posture between head and finger determines perceived tactile direction of motion. Sci Rep 2020; 10:5494. [PMID: 32218502] [PMCID: PMC7099024] [DOI: 10.1038/s41598-020-62327-x]
Abstract
The hand explores the environment to obtain tactile information that can be fruitfully integrated with other functions, such as vision, audition, and movement. In theory, somatosensory signals gathered by the hand are accurately mapped in the world-centered (allocentric) reference frame, such that multimodal information signals, whether visual-tactile or motor-tactile, are perfectly aligned. However, an accumulating body of evidence indicates that the perceived tactile orientation or direction is inaccurate, yielding a surprisingly large perceptual bias. To investigate this bias, the present study presented tactile motion stimuli to healthy adult participants in a variety of finger and head postures and asked them to report the perceived direction of motion on a video screen placed on the frontoparallel plane in front of the eyes. The results showed that the perceptual bias could be divided into systematic and nonsystematic components. Systematic bias, defined as the mean difference between the perceived and veridical directions, correlated linearly with the relative posture between the finger and the head. By contrast, nonsystematic bias, defined as the residual differences in bias across stimulus directions, was highly individualized and phase-locked to the stimulus orientation presented on the skin. Overall, the findings on systematic bias indicate that the transformation bias among reference frames is dominated by the finger-to-head posture, while the highly individualized nature of nonsystematic bias reflects how information is obtained by orientation-selective units in the S1 cortex.
Affiliation(s)
- Yueh-Peng Chen
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Center of Vascularized Tissue Allograft, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan
- School of Medicine, Chang Gung University, Taoyuan, Taiwan
- Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan
- Center for Artificial Intelligence in Medicine, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan
- Chun-I Yeh
- Department of Psychology, National Taiwan University, Taipei, Taiwan
- Tsung-Chi Lee
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Jian-Jia Huang
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Center of Vascularized Tissue Allograft, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan
- School of Medicine, Chang Gung University, Taoyuan, Taiwan
- Yu-Cheng Pei
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Center of Vascularized Tissue Allograft, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan
- School of Medicine, Chang Gung University, Taoyuan, Taiwan
- Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan
12. Haladjian HH, Anstis S, Wexler M, Cavanagh P. The Tactile Quartet: Comparing Ambiguous Apparent Motion in Tactile and Visual Stimuli. Perception 2019; 49:61-80. [PMID: 31707914] [DOI: 10.1177/0301006619886237]
Abstract
In the visual quartet, alternating diagonal pairs of dots produce apparent motion horizontally or vertically, depending on proximity. Here, we studied a tactile quartet where vibrating tactors were attached to the thumbs and index fingers of both hands. Apparent motion was felt either within hands (from index finger to thumb) or between hands. Participants adjusted the distance between their hands to find the point where motion changed directions. Surprisingly, switchovers occurred when between-hand distances were as much as twice that of within-hand distances—a general bias that was also found for tactile judgments of static distances. This expansion of within-hand felt distances was again seen when lights were placed on the hands rather than vibrating tactors. Importantly, switchover points were similar when the hands were placed at different depths, indicating that representations governing tactile motion were in perceptual three-dimensional space, not retinal two-dimensional space. This was true whether the quartets were visual stimuli on the hands or were purely visual on a monitor, suggesting that proximity is generally determined in three-dimensional coordinates for motion perception. Finally, the similarity of visual and tactile results suggests a common computation for apparent motion, albeit with different built-in distance biases for separate modalities.
Affiliation(s)
- Harry H Haladjian
- Laboratoire Psychologie de la Perception, CNRS UMR 8424, Université Paris Descartes, France
- Stuart Anstis
- Department of Psychology, University of California, San Diego, CA, USA
- Mark Wexler
- Laboratoire Psychologie de la Perception, CNRS UMR 8424, Université Paris Descartes, France
- Patrick Cavanagh
- Laboratoire Psychologie de la Perception, CNRS UMR 8424, Université Paris Descartes, France
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH, USA
- Department of Psychology, York University, Glendon College, North York, ON, Canada
13. Delhaye BP, O'Donnell MK, Lieber JD, McLellan KR, Bensmaia SJ. Feeling fooled: Texture contaminates the neural code for tactile speed. PLoS Biol 2019; 17:e3000431. [PMID: 31454360] [PMCID: PMC6711498] [DOI: 10.1371/journal.pbio.3000431]
Abstract
Motion is an essential component of everyday tactile experience: most manual interactions involve relative movement between the skin and objects. Much of the research on the neural basis of tactile motion perception has focused on how direction is encoded, but less is known about how speed is. Perceived speed has been shown to be dependent on surface texture, but previous studies used only coarse textures, which span a restricted range of tangible spatial scales and provide a limited window into tactile coding. To fill this gap, we measured the ability of human observers to report the speed of natural textures, which span the range of tactile experience and engage all the known mechanisms of texture coding, scanned across the skin. In parallel experiments, we recorded the responses of single units in the nerve and in the somatosensory cortex of primates to the same textures scanned at different speeds. We found that the perception of speed is heavily influenced by texture: some textures are systematically perceived as moving faster than are others, and some textures provide a more informative signal about speed than do others. Similarly, the responses of neurons in the nerve and in cortex are strongly dependent on texture. In the nerve, although all fibers exhibit speed-dependent responses, the responses of Pacinian corpuscle-associated (PC) fibers are most strongly modulated by speed and can best account for human judgments. In cortex, approximately half of the neurons exhibit speed-dependent responses, and this subpopulation receives strong input from PC fibers. However, speed judgments seem to reflect an integration of speed-dependent and speed-independent responses such that the latter help to partially compensate for the strong texture dependence of the former.
Our ability to sense the speed at which a surface moves across our skin is highly unreliable and depends on the texture of the surface. This study shows that speed illusions can be predicted from the responses of a specific population of nerve fibers and of their downstream targets; because the skin is too sparsely innervated to compute tactile speed accurately, the nervous system relies on a heuristic to estimate it.
Affiliation(s)
- Benoit P. Delhaye
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois, United States of America
- Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
| | - Molly K. O'Donnell
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois, United States of America
| | - Justin D. Lieber
- Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois, United States of America
| | - Kristine R. McLellan
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois, United States of America
| | - Sliman J. Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois, United States of America
- Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois, United States of America
| |
Collapse
|
14
|
Moscatelli A, Scotto CR, Ernst MO. Illusory changes in the perceived speed of motion derived from proprioception and touch. J Neurophysiol 2019; 122:1555-1565. [PMID: 31314634 DOI: 10.1152/jn.00719.2018] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/30/2022] Open
Abstract
In vision, the perceived velocity of a moving stimulus differs depending on whether we pursue it with the eyes or not: A stimulus moving across the retina with the eyes stationary is perceived as being faster compared with a stimulus of the same physical speed that the observer pursues with the eyes, while its retinal motion is zero. This effect is known as the Aubert-Fleischl phenomenon. Here, we describe an analog phenomenon in touch. We asked participants to estimate the speed of a moving stimulus either from tactile motion only (i.e., motion across the skin), while keeping the hand world stationary, or from kinesthesia only by tracking the stimulus with a guided arm movement, such that the tactile motion on the finger was zero (i.e., only finger motion but no movement across the skin). Participants overestimated the velocity of the stimulus determined from tactile motion compared with kinesthesia in analogy with the visual Aubert-Fleischl phenomenon. In two follow-up experiments, we manipulated the stimulus noise by changing the texture of the touched surface. Similarly to the visual phenomenon, this significantly affected the strength of the illusion. This study supports the hypothesis of shared computations for motion processing between vision and touch.NEW & NOTEWORTHY In vision, the perceived velocity of a moving stimulus is different depending on whether we pursue it with the eyes or not, an effect known as the Aubert-Fleischl phenomenon. We describe an analog phenomenon in touch. We asked participants to estimate the speed of a moving stimulus either from tactile motion or by pursuing it with the hand. Participants overestimated the stimulus velocity measured from tactile motion compared with kinesthesia, in analogy with the visual Aubert-Fleischl phenomenon.
Collapse
Affiliation(s)
- Alessandro Moscatelli
- Department of Systems Medicine and Centre of Space Biomedicine, University of Rome Tor Vergata, Rome, Italy.,Laboratory of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy.,Cognitive Interaction Technology-Cluster of Excellence, Bielefeld University, Bielefeld, Germany
| | - Cecile R Scotto
- Centre de Recherches sur la Cognition et l'Apprentissage, Université de Poitiers-Université de Tours-Centre National de la Recherche Scientifique, Poitiers, France.,Cognitive Interaction Technology-Cluster of Excellence, Bielefeld University, Bielefeld, Germany
| | - Marc O Ernst
- Applied Cognitive Psychology, Ulm University, Ulm, Germany.,Cognitive Interaction Technology-Cluster of Excellence, Bielefeld University, Bielefeld, Germany
| |
Collapse
|
15
|
Abstract
There is an ongoing debate whether or not multisensory interactions require awareness of the sensory signals. Static visual and tactile stimuli have been shown to influence each other even in the absence of visual awareness. However, it is unclear if this finding generalizes to dynamic contexts. In the present study, we presented visual and tactile motion stimuli and induced fluctuations of visual awareness by means of binocular rivalry: two gratings which drifted in opposite directions were displayed, one to each eye. One visual motion stimulus dominated and reached awareness while the other visual stimulus was suppressed from awareness. Tactile motion stimuli were presented at random time points during the visual stimulation. The motion direction of a tactile stimulus always matched the direction of one of the concurrently presented visual stimuli. The visual gratings were differently tinted, and participants reported the color of the currently seen stimulus. Tactile motion delayed perceptual switches that ended dominance periods of congruently moving visual stimuli compared to switches during visual-only stimulation. In addition, tactile motion fostered the return to dominance of suppressed, congruently moving visual stimuli, but only if the tactile motion started at a late stage of the ongoing visual suppression period, when perceptual suppression is typically already decreasing. These results suggest that visual awareness facilitates but does not gate multisensory interactions between visual and tactile motion signals.
Collapse
|
16
|
Delhaye BP, Long KH, Bensmaia SJ. Neural Basis of Touch and Proprioception in Primate Cortex. Compr Physiol 2018; 8:1575-1602. [PMID: 30215864 PMCID: PMC6330897 DOI: 10.1002/cphy.c170033] [Citation(s) in RCA: 120] [Impact Index Per Article: 17.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/31/2022]
Abstract
The sense of proprioception allows us to keep track of our limb posture and movements and the sense of touch provides us with information about objects with which we come into contact. In both senses, mechanoreceptors convert the deformation of tissues-skin, muscles, tendons, ligaments, or joints-into neural signals. Tactile and proprioceptive signals are then relayed by the peripheral nerves to the central nervous system, where they are processed to give rise to percepts of objects and of the state of our body. In this review, we first examine briefly the receptors that mediate touch and proprioception, their associated nerve fibers, and pathways they follow to the cerebral cortex. We then provide an overview of the different cortical areas that process tactile and proprioceptive information. Next, we discuss how various features of objects-their shape, motion, and texture, for example-are encoded in the various cortical fields, and the susceptibility of these neural codes to attention and other forms of higher-order modulation. Finally, we summarize recent efforts to restore the senses of touch and proprioception by electrically stimulating somatosensory cortex. © 2018 American Physiological Society. Compr Physiol 8:1575-1602, 2018.
Collapse
Affiliation(s)
- Benoit P Delhaye
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, USA
| | - Katie H Long
- Committee on Computational Neuroscience, University of Chicago, Chicago, USA
| | - Sliman J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, USA.,Committee on Computational Neuroscience, University of Chicago, Chicago, USA
| |
Collapse
|
17
|
Churan J, Paul J, Klingenhoefer S, Bremmer F. Integration of visual and tactile information in reproduction of traveled distance. J Neurophysiol 2017; 118:1650-1663. [PMID: 28659463 DOI: 10.1152/jn.00342.2017] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2017] [Revised: 06/27/2017] [Accepted: 06/27/2017] [Indexed: 11/22/2022] Open
Abstract
In the natural world, self-motion always stimulates several different sensory modalities. Here we investigated the interplay between a visual optic flow stimulus simulating self-motion and a tactile stimulus (air flow resulting from self-motion) while human observers were engaged in a distance reproduction task. We found that adding congruent tactile information (i.e., speed of the air flow and speed of visual motion are directly proportional) to the visual information significantly improves the precision of the actively reproduced distances. This improvement, however, was smaller than predicted for an optimal integration of visual and tactile information. In contrast, incongruent tactile information (i.e., speed of the air flow and speed of visual motion are inversely proportional) did not improve subjects' precision indicating that incongruent tactile information and visual information were not integrated. One possible interpretation of the results is a link to properties of neurons in the ventral intraparietal area that have been shown to have spatially and action-congruent receptive fields for visual and tactile stimuli.NEW & NOTEWORTHY This study shows that tactile and visual information can be integrated to improve the estimates of the parameters of self-motion. This, however, happens only if the two sources of information are congruent-as they are in a natural environment. In contrast, an incongruent tactile stimulus is still used as a source of information about self-motion but it is not integrated with visual information.
Collapse
Affiliation(s)
- Jan Churan
- Department of Neurophysics, Marburg University, Marburg, Germany; and
| | - Johannes Paul
- Department of Neurophysics, Marburg University, Marburg, Germany; and
| | - Steffen Klingenhoefer
- Department of Neurophysics, Marburg University, Marburg, Germany; and.,Center for Molecular and Behavioral Neuroscience, Rutgers University, Newark, New Jersey
| | - Frank Bremmer
- Department of Neurophysics, Marburg University, Marburg, Germany; and
| |
Collapse
|
18
|
Amemiya T, Beck B, Walsh V, Gomi H, Haggard P. Visual area V5/hMT+ contributes to perception of tactile motion direction: a TMS study. Sci Rep 2017; 7:40937. [PMID: 28106123 PMCID: PMC5247673 DOI: 10.1038/srep40937] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2016] [Accepted: 12/14/2016] [Indexed: 12/18/2022] Open
Abstract
Human imaging studies have reported activations associated with tactile motion perception in visual motion area V5/hMT+, primary somatosensory cortex (SI) and posterior parietal cortex (PPC; Brodmann areas 7/40). However, such studies cannot establish whether these areas are causally involved in tactile motion perception. We delivered double-pulse transcranial magnetic stimulation (TMS) while moving a single tactile point across the fingertip, and used signal detection theory to quantify perceptual sensitivity to motion direction. TMS over both SI and V5/hMT+, but not the PPC site, significantly reduced tactile direction discrimination. Our results show that V5/hMT+ plays a causal role in tactile direction processing, and strengthen the case for V5/hMT+ serving multimodal motion perception. Further, our findings are consistent with a serial model of cortical tactile processing, in which higher-order perceptual processing depends upon information received from SI. By contrast, our results do not provide clear evidence that the PPC site we targeted (Brodmann areas 7/40) contributes to tactile direction perception.
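The signal-detection analysis mentioned above quantifies sensitivity as d′, the difference of z-transformed hit and false-alarm rates. A minimal sketch, with hypothetical rates (not the paper's data) illustrating a TMS-induced sensitivity drop:

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index d' = z(H) - z(FA)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical direction-discrimination data: TMS over V5/hMT+
# lowers the hit rate at a fixed false-alarm rate.
d_sham = d_prime(0.85, 0.15)
d_tms = d_prime(0.70, 0.15)
```

A reduction in d′ under TMS, rather than a shift in response bias, is what licenses the causal claim about perceptual sensitivity.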
Collapse
Affiliation(s)
- Tomohiro Amemiya
- Institute of Cognitive Neuroscience, University College London, Alexandra House, 17 Queen Square London, WC1N 3AZ, United Kingdom.,NTT Communication Science Laboratories, NTT Corporation, 3-1 Wakamiya, Morinosato, Atsugi-shi, Kanagawa, 243-0198, Japan
| | - Brianna Beck
- Institute of Cognitive Neuroscience, University College London, Alexandra House, 17 Queen Square London, WC1N 3AZ, United Kingdom
| | - Vincent Walsh
- Institute of Cognitive Neuroscience, University College London, Alexandra House, 17 Queen Square London, WC1N 3AZ, United Kingdom
| | - Hiroaki Gomi
- NTT Communication Science Laboratories, NTT Corporation, 3-1 Wakamiya, Morinosato, Atsugi-shi, Kanagawa, 243-0198, Japan
| | - Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, Alexandra House, 17 Queen Square London, WC1N 3AZ, United Kingdom
| |
Collapse
|
19
|
Teraoka R, Teramoto W. Touch-contingent visual motion perception: tactile events drive visual motion perception. Exp Brain Res 2016; 235:903-912. [PMID: 27915368 DOI: 10.1007/s00221-016-4850-y] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2016] [Accepted: 11/28/2016] [Indexed: 10/20/2022]
Abstract
It has recently been demonstrated that the brain rapidly forms an association between concurrently presented sound sequences and visual motion. Once this association has been formed, the associated sound sequence can drive visual motion perception. This phenomenon is known as "sound-contingent visual motion perception" (SCVM). In the present study, we addressed the possibility of a similar association involving touch instead of audition. In a 9-min exposure session, two circles placed side by side were alternately presented to produce apparent motion in a horizontal direction. The onsets of the circle presentations were synchronized with vibrotactile stimulation on two different positions of the forearm. We then quantified pre- and post-exposure perceptual changes using a motion-nulling procedure. Results showed that after prolonged exposure to visuotactile stimuli, the tactile sequence influenced visual motion perception. Notably, this effect was specific to the previously exposed visual field, thus ruling out the possibility of simple response bias. These findings suggest that SCVM-like associations occur, at least to some extent, for the other modality combinations. Furthermore, the effect did not occur when the forearm posture was changed between the exposure and test phases, suggesting that the association is formed after integrating proprioceptive information.
Collapse
Affiliation(s)
- Ryo Teraoka
- Department of Information Science and Systems Engineering, Muroran Institute of Technology, 27-1 Mizumoto-cho, Muroran, Hokkaido, 050-8585, Japan
| | - Wataru Teramoto
- Department of Psychology, Kumamoto University, 2-40-1 Kurokami, Chuo-ku, Kumamoto, 860-8555, Japan.
| |
Collapse
|
20
|
Amemiya T, Hirota K, Ikei Y. Tactile Apparent Motion on the Torso Modulates Perceived Forward Self-Motion Velocity. IEEE TRANSACTIONS ON HAPTICS 2016; 9:474-482. [PMID: 27514066 DOI: 10.1109/toh.2016.2598332] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
The present study investigated whether a tactile flow created by a matrix of vibrators in a seat pan simultaneously presented with an optical flow in peripheral vision enhances the perceived forward velocity of self-motion. A brief tactile motion stimulus consisted of four successive rows of vibration, and the interstimulus onset between the tactile rows was varied to change the velocity of the tactile motion. The results show that the forward velocity of self-motion is significantly overestimated for rapid tactile flows and underestimated for slow ones, compared with optical flow alone or non-motion vibrotactile stimulation conditions. In addition, the effect with a temporal tactile rhythm without changing the stimulus location was smaller than that with spatiotemporal tactile motion, with the interstimulus onset interval to elicit a clear sensation of tactile apparent motion. These findings suggest that spatiotemporal tactile motion is effective in inducing a change in the perceived forward velocity of self-motion.
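The manipulation described here — varying the interstimulus onset asynchrony (SOA) between vibrator rows to change tactile flow speed — reduces to a simple relation: speed is row spacing divided by SOA. A sketch with illustrative spacing and SOA values (not the paper's apparatus dimensions):

```python
# Apparent-motion speed across a row array: with vibrator rows a
# fixed distance apart, speed = spacing / SOA. Values are illustrative.
row_spacing_mm = 30.0

def tactile_speed(soa_s):
    """Apparent tactile flow speed in mm/s for a given SOA in seconds."""
    return row_spacing_mm / soa_s

fast = tactile_speed(0.05)   # short SOA -> rapid tactile flow
slow = tactile_speed(0.40)   # long SOA  -> slow tactile flow
```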
Collapse
|
21
|
Chancel M, Blanchard C, Guerraz M, Montagnini A, Kavounoudias A. Optimal visuotactile integration for velocity discrimination of self-hand movements. J Neurophysiol 2016; 116:1522-1535. [PMID: 27385802 DOI: 10.1152/jn.00883.2015] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2015] [Accepted: 07/06/2016] [Indexed: 11/22/2022] Open
Abstract
Illusory hand movements can be elicited by a textured disk or a visual pattern rotating under one's hand, while proprioceptive inputs convey immobility information (Blanchard C, Roll R, Roll JP, Kavounoudias A. PLoS One 8: e62475, 2013). Here, we investigated whether visuotactile integration can optimize velocity discrimination of illusory hand movements in line with Bayesian predictions. We induced illusory movements in 15 volunteers by visual and/or tactile stimulation delivered at six angular velocities. Participants had to compare hand illusion velocities with a 5°/s hand reference movement in an alternative forced choice paradigm. Results showed that the discrimination threshold decreased in the visuotactile condition compared with unimodal (visual or tactile) conditions, reflecting better bimodal discrimination. The perceptual strength (gain) of the illusions also increased: the stimulation required to give rise to a 5°/s illusory movement was slower in the visuotactile condition compared with each of the two unimodal conditions. The maximum likelihood estimation model satisfactorily predicted the improved discrimination threshold but not the increase in gain. When we added a zero-centered prior, reflecting immobility information, the Bayesian model did actually predict the gain increase but systematically overestimated it. Interestingly, the predicted gains better fit the visuotactile performances when a proprioceptive noise was generated by covibrating antagonist wrist muscles. These findings show that kinesthetic information of visual and tactile origins is optimally integrated to improve velocity discrimination of self-hand movements. However, a Bayesian model alone could not fully describe the illusory phenomenon pointing to the crucial importance of the omnipresent muscle proprioceptive cues with respect to other sensory cues for kinesthesia.
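The maximum-likelihood prediction this abstract tests can be sketched directly: the bimodal variance is the product of the unimodal variances over their sum, so the predicted bimodal threshold is always below the best unimodal one. The σ values below are hypothetical, not the study's data:

```python
# Maximum-likelihood cue combination: predicted bimodal discrimination
# threshold from two unimodal thresholds (sigmas are hypothetical).
def mle_sigma(sigma_v, sigma_t):
    return (sigma_v**2 * sigma_t**2 / (sigma_v**2 + sigma_t**2)) ** 0.5

sigma_vision, sigma_touch = 2.0, 3.0   # unimodal thresholds (deg/s)
sigma_bimodal = mle_sigma(sigma_vision, sigma_touch)
# Optimal integration predicts a bimodal threshold below the best
# unimodal one -- the qualitative improvement the study reports.
```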
Collapse
Affiliation(s)
- M Chancel
- LNIA UMR 7260, Aix Marseille Université-Centre National de la Recherche Scientifique (CNRS), Marseille, France; LPNC UMR 5105, Université Savoie Mont Blanc-CNRS, Chambéry, France
| | - C Blanchard
- School of Psychology, University of Nottingham, Nottingham, United Kingdom; and
| | - M Guerraz
- LPNC UMR 5105, Université Savoie Mont Blanc-CNRS, Chambéry, France
| | - A Montagnini
- INT UMR 7289, Aix Marseille Université-CNRS, Marseille, France
| | - A Kavounoudias
- LNIA UMR 7260, Aix Marseille Université-Centre National de la Recherche Scientifique (CNRS), Marseille, France;
| |
Collapse
|
22
|
Watching a real moving object expands tactile duration: the role of task-irrelevant action context for subjective time. Atten Percept Psychophys 2016; 77:2768-80. [PMID: 26276220 DOI: 10.3758/s13414-015-0975-5] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Although it is well established that action contexts can expand the perceived durations of action-related events, whether action contexts also impact the subjective duration of events unrelated to the action remains an open issue. Here we examined how the automatic implicit reactions induced by viewing task-irrelevant, real moving objects influence tactile duration judgments. Participants were asked to make temporal bisection judgments of a tactile event while seeing a potentially catchable swinging ball. Approaching movement induced a tactile-duration overestimation relative to lateral movement and to a static baseline, and receding movement produced an expansion similar in duration to that from approaching movement. Interestingly, the effect of approaching movement on the subjective tactile duration was greatly reduced when participants held lightweight objects in their hands, relative to a hands-free condition, whereas no difference was obtained in the tactile-duration estimates between static hands-free and static hands-occupied conditions. The results indicate that duration perception is determined by internal bodily states as well as by sensory evidence.
Collapse
|
23
|
Cho Y, Craig JC, Hsiao SS, Bensmaia SJ. Vision is superior to touch in shape perception even with equivalent peripheral input. J Neurophysiol 2016; 115:92-9. [PMID: 26510760 PMCID: PMC4760472 DOI: 10.1152/jn.00654.2015] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2015] [Accepted: 10/23/2015] [Indexed: 11/22/2022] Open
Abstract
Results from previous studies suggest that two-dimensional spatial patterns are processed similarly in vision and touch when the patterns are equated for effective size or when visual stimuli are blurred to mimic the spatial filtering of the skin. In the present study, we measured subjects' ability to perceive the shape of familiar and unfamiliar visual and tactile patterns to compare form processing in the two modalities. As had been previously done, the two-dimensional tactile and visual patterns were adjusted in size to stimulate an equivalent number of receptors in the two modalities. We also distorted the visual patterns, using a filter that accurately mimics the spatial filtering effected by the skin to further equate the peripheral images in the two modalities. We found that vision consistently outperformed touch regardless of the precise perceptual task and of how familiar the patterns were. Based on an examination of both the earlier and present data, we conclude that visual processing of both familiar and unfamiliar two-dimensional patterns is superior to its tactile counterpart except under very restricted conditions.
Collapse
Affiliation(s)
- Yoonju Cho
- Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland; Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland
| | - J C Craig
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana; and
| | - S S Hsiao
- Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland; Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland
| | - S J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois
| |
Collapse
|
24
|
Abstract
While the different sensory modalities are sensitive to different stimulus energies, they are often charged with extracting analogous information about the environment. Neural systems may thus have evolved to implement similar algorithms across modalities to extract behaviorally relevant stimulus information, leading to the notion of a canonical computation. In both vision and touch, information about motion is extracted from a spatiotemporal pattern of activation across a sensory sheet (in the retina and in the skin, respectively), a process that has been extensively studied in both modalities. In this essay, we examine the processing of motion information as it ascends the primate visual and somatosensory neuraxes and conclude that similar computations are implemented in the two sensory systems. A close look at the cortical areas that support vision and touch suggests that the brain uses similar computational strategies to handle different kinds of sensory inputs.
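The canonical computation this essay discusses — extracting motion from a spatiotemporal activation pattern across a sensory sheet — is classically modeled with correlation-based (Reichardt-style) detectors. A minimal sketch on a 1-D sensor array; the stimulus is a toy bar stepping one sensor per time step, not data from the essay:

```python
# Minimal Reichardt-style correlator on a 1-D "sensory sheet": motion
# is read out by correlating each sensor's current response with its
# neighbor's delayed response, in opponent (right-minus-left) fashion.
def make_stimulus(n_sensors=8, n_steps=8):
    """A single active point stepping rightward one sensor per frame."""
    frames = []
    for t in range(n_steps):
        frame = [0.0] * n_sensors
        frame[t % n_sensors] = 1.0
        frames.append(frame)
    return frames

def reichardt(frames):
    """Opponent correlation sum; positive = rightward motion."""
    signal = 0.0
    for prev, cur in zip(frames, frames[1:]):
        for i in range(len(cur) - 1):
            signal += prev[i] * cur[i + 1]   # rightward subunit
            signal -= prev[i + 1] * cur[i]   # leftward subunit
    return signal

rightward = reichardt(make_stimulus())
leftward = reichardt([list(reversed(f)) for f in make_stimulus()])
```

The same correlator works whether the "sheet" is the retina or the skin, which is the sense in which the computation is canonical across modalities.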
Collapse
|
25
|
Moscatelli A, Hayward V, Wexler M, Ernst MO. Illusory Tactile Motion Perception: An Analog of the Visual Filehne Illusion. Sci Rep 2015; 5:14584. [PMID: 26412592 PMCID: PMC4585937 DOI: 10.1038/srep14584] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2015] [Accepted: 08/17/2015] [Indexed: 11/29/2022] Open
Abstract
We continually move our body and our eyes when exploring the world, causing our sensory surfaces, the skin and the retina, to move relative to external objects. In order to estimate object motion consistently, an ideal observer would transform estimates of motion acquired from the sensory surface into fixed, world-centered estimates, by taking the motion of the sensor into account. This ability is referred to as spatial constancy. Human vision does not follow this rule strictly and is therefore subject to perceptual illusions during eye movements, where immobile objects can appear to move. Here, we investigated whether one of these, the Filehne illusion, had a counterpart in touch. To this end, observers estimated the movement of a surface from tactile slip, with a moving or with a stationary finger. We found the perceived movement of the surface to be biased if the surface was sensed while moving. This effect exemplifies a failure of spatial constancy that is similar to the Filehne illusion in vision. We quantified this illusion by using a Bayesian model with a prior for stationarity, applied previously in vision. The analogy between vision and touch points to a modality-independent solution to the spatial constancy problem.
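The Bayesian account used here — a Gaussian likelihood combined with a zero-centered prior for stationarity — predicts the bias directly: the noisier the slip estimate, the more the percept is shrunk toward "stationary". A minimal sketch with illustrative variances, not the paper's fitted values:

```python
# Posterior mean of a Gaussian likelihood times a zero-centered
# (stationarity) prior: the estimate is shrunk toward zero, and the
# shrinkage grows with likelihood noise. Variances are illustrative.
def posterior_speed(measured, sigma_like, sigma_prior):
    w = sigma_prior**2 / (sigma_prior**2 + sigma_like**2)
    return w * measured

surface_speed = 10.0   # cm/s, world-centered

# Sensing with a moving finger yields a noisier slip estimate, so the
# prior pulls the percept further toward zero -- a Filehne-like bias.
est_static_finger = posterior_speed(surface_speed, sigma_like=1.0, sigma_prior=5.0)
est_moving_finger = posterior_speed(surface_speed, sigma_like=3.0, sigma_prior=5.0)
```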
Collapse
Affiliation(s)
- Alessandro Moscatelli
- Department of Cognitive Neuroscience, University of Bielefeld, Bielefeld, Germany.,Cognitive Interaction Technology Centre of Excellence, University of Bielefeld, Bielefeld, Germany
| | - Vincent Hayward
- Sorbonne Universités, UPMC Univ Paris 06, UMR 7222, ISIR, F-75005, Paris, France
| | - Mark Wexler
- CNRS, UMR 7222, ISIR, F-75005, Paris, France.,Laboratoire Psychologie de la Perception and CNRS, Université Paris Descartes, F-75006 Paris, France
| | - Marc O Ernst
- Department of Cognitive Neuroscience, University of Bielefeld, Bielefeld, Germany.,Cognitive Interaction Technology Centre of Excellence, University of Bielefeld, Bielefeld, Germany.,Multisensory Perception and Action Group, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
| |
Collapse
|
26
|
Honeine JL, Crisafulli O, Sozzi S, Schieppati M. Processing time of addition or withdrawal of single or combined balance-stabilizing haptic and visual information. J Neurophysiol 2015; 114:3097-110. [PMID: 26334013 DOI: 10.1152/jn.00618.2015] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2015] [Accepted: 08/28/2015] [Indexed: 12/28/2022] Open
Abstract
We investigated the integration time of haptic and visual input and their interaction during stance stabilization. Eleven subjects performed four tandem-stance conditions (60 trials each). Vision, touch, and both vision and touch were added and withdrawn. Furthermore, vision was replaced with touch and vice versa. Body sway, tibialis anterior, and peroneus longus activity were measured. Following addition or withdrawal of vision or touch, an integration time period elapsed before the earliest changes in sway were observed. Thereafter, sway varied exponentially to a new steady-state while reweighting occurred. Latencies of sway changes on sensory addition ranged from 0.6 to 1.5 s across subjects, consistently longer for touch than vision, and were regularly preceded by changes in muscle activity. Addition of vision and touch simultaneously shortened the latencies with respect to vision or touch separately, suggesting cooperation between sensory modalities. Latencies following withdrawal of vision or touch or both simultaneously were shorter than following addition. When vision was replaced with touch or vice versa, adding one modality did not interfere with the effect of withdrawal of the other, suggesting that integration of withdrawal and addition were performed in parallel. The time course of the reweighting process to reach the new steady-state was also shorter on withdrawal than addition. The effects of different sensory inputs on posture stabilization illustrate the operation of a time-consuming, possibly supraspinal process that integrates and fuses modalities for accurate balance control. This study also shows the facilitatory interaction of visual and haptic inputs in integration and reweighting of stance-stabilizing inputs.
Collapse
Affiliation(s)
- Jean-Louis Honeine
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy; and Centro Studi Attività Motorie (CSAM), Fondazione Salvatore Maugeri (IRCSS), Pavia, Italy
| | - Oscar Crisafulli
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy; and
| | - Stefania Sozzi
- Centro Studi Attività Motorie (CSAM), Fondazione Salvatore Maugeri (IRCSS), Pavia, Italy
| | - Marco Schieppati
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy; and Centro Studi Attività Motorie (CSAM), Fondazione Salvatore Maugeri (IRCSS), Pavia, Italy
| |
Collapse
|
27
|
Krebber M, Harwood J, Spitzer B, Keil J, Senkowski D. Visuotactile motion congruence enhances gamma-band activity in visual and somatosensory cortices. Neuroimage 2015; 117:160-9. [DOI: 10.1016/j.neuroimage.2015.05.056] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/11/2015] [Revised: 04/15/2015] [Accepted: 05/19/2015] [Indexed: 11/16/2022] Open
|
28
|
Méndez-Balbuena I, Huidobro N, Silva M, Flores A, Trenado C, Quintanar L, Arias-Carrión O, Kristeva R, Manjarrez E. Effect of mechanical tactile noise on amplitude of visual evoked potentials: multisensory stochastic resonance. J Neurophysiol 2015; 114:2132-43. [PMID: 26156387 DOI: 10.1152/jn.00457.2015] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/12/2015] [Accepted: 07/06/2015] [Indexed: 11/22/2022] Open
Abstract
The present investigation documents the electrophysiological occurrence of multisensory stochastic resonance in the human visual pathway elicited by tactile noise. We define multisensory stochastic resonance of brain evoked potentials as the phenomenon in which an intermediate level of input noise of one sensory modality enhances the brain evoked response of another sensory modality. Here we examined this phenomenon in visual evoked potentials (VEPs) modulated by the addition of tactile noise. Specifically, we examined whether a particular level of mechanical Gaussian noise applied to the index finger can improve the amplitude of the VEP. We compared the amplitude of the positive P100 VEP component between zero noise (ZN), optimal noise (ON), and high mechanical noise (HN). The data disclosed an inverted U-like graph for all the subjects, thus demonstrating the occurrence of a multisensory stochastic resonance in the P100 VEP.
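The inverted-U signature of stochastic resonance can be reproduced with a toy threshold detector: a subthreshold input only crosses threshold when noise is added, and detection performance peaks at an intermediate noise level. All parameters below are illustrative, unrelated to the study's stimuli:

```python
import random

# Toy threshold detector: a subthreshold signal crosses threshold only
# with added noise; performance (hits minus false alarms) is highest
# at an intermediate noise level. Parameters are illustrative.
random.seed(0)
THRESHOLD = 1.0
SIGNAL = 0.8   # subthreshold input on "signal" trials

def hits_minus_false_alarms(noise_sd, n=20000):
    hits = sum(SIGNAL + random.gauss(0, noise_sd) > THRESHOLD for _ in range(n))
    fas = sum(random.gauss(0, noise_sd) > THRESHOLD for _ in range(n))
    return (hits - fas) / n

zero_noise = hits_minus_false_alarms(0.0)   # never crosses threshold
optimal_noise = hits_minus_false_alarms(0.5)
high_noise = hits_minus_false_alarms(3.0)
```

Plotting performance against noise level traces the inverted U the study reports for the P100 amplitude across zero, optimal, and high noise.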
Affiliation(s)
- Nayeli Huidobro, Instituto de Fisiología, Benemérita Universidad Autónoma de Puebla, Puebla, Mexico
- Mayte Silva, Instituto de Fisiología, Benemérita Universidad Autónoma de Puebla, Puebla, Mexico
- Amira Flores, Instituto de Fisiología, Benemérita Universidad Autónoma de Puebla, Puebla, Mexico
- Carlos Trenado, Institute of Clinical Neuroscience, Heinrich Heine University, Düsseldorf, Germany
- Luis Quintanar, Facultad de Psicología, Benemérita Universidad Autónoma de Puebla, Puebla, Mexico
- Oscar Arias-Carrión, Unidad de Trastornos del Movimiento y Sueño (TMS), Hospital General Dr. Manuel Gea González/IFC-UNAM, Mexico City, Mexico
- Rumyana Kristeva, Department of Neurology, University of Freiburg, Freiburg, Germany
- Elias Manjarrez, Instituto de Fisiología, Benemérita Universidad Autónoma de Puebla, Puebla, Mexico
|
29
|
Abstract
The manipulation of objects commonly involves motion between object and skin. In this review, we discuss the neural basis of tactile motion perception and its similarities with its visual counterpart. First, much like in vision, the perception of tactile motion relies on the processing of spatiotemporal patterns of activation across populations of sensory receptors. Second, many neurons in primary somatosensory cortex are highly sensitive to motion direction, and the response properties of these neurons draw strong analogies to those of direction-selective neurons in visual cortex. Third, tactile speed may be encoded in the strength of the response of cutaneous mechanoreceptive afferents and of a subpopulation of speed-sensitive neurons in cortex. However, both afferent and cortical responses are strongly dependent on texture as well, so it is unclear how texture and speed signals are disambiguated. Fourth, motion signals from multiple fingers must often be integrated during the exploration of objects, but the way these signals are combined is complex and remains to be elucidated. Finally, visual and tactile motion perception interact powerfully, an integration process that is likely mediated by visual association cortex.
Affiliation(s)
- Yu-Cheng Pei, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan, Republic of China; Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan, Republic of China
- Sliman J Bensmaia, Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois; Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois
|
30
|
Tactile and visual motion direction processing in hMT+/V5. Neuroimage 2014; 84:420-7. [DOI: 10.1016/j.neuroimage.2013.09.004] [Citation(s) in RCA: 48] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2013] [Revised: 08/20/2013] [Accepted: 09/03/2013] [Indexed: 11/18/2022] Open
|
31
|
The Haptic Analog of the Visual Aubert-Fleischl Phenomenon. HAPTICS: NEUROSCIENCE, DEVICES, MODELING, AND APPLICATIONS 2014. [DOI: 10.1007/978-3-662-44196-1_5] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
|
32
|
Pei YC, Chang TY, Lee TC, Saha S, Lai HY, Gomez-Ramirez M, Chou SW, Wong AMK. Cross-modal sensory integration of visual-tactile motion information: instrument design and human psychophysics. SENSORS 2013; 13:7212-23. [PMID: 23727955 PMCID: PMC3715219 DOI: 10.3390/s130607212] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 03/18/2013] [Revised: 05/22/2013] [Accepted: 05/23/2013] [Indexed: 11/23/2022]
Abstract
Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. As a haptic approach usually involves cross-modal sensory experiences, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile sensory information as well as how a robotic device infers object information emanating from both vision and touch. In the present study, we develop a novel visual-tactile cross-modal integration stimulator that consists of an LED panel to present visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth in the skin. The apparatus can present cross-modal stimuli in which the spatial locations of visual and tactile stimulations are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile directions were either congruent or incongruent, and human observers reported the perceived visual direction of motion. Results showed that perceived direction of visual motion can be biased by the direction of tactile motion when visual signals are weakened. The results also showed that the visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs, a fundamental property known for cross-modal integration.
Affiliation(s)
- Yu-Cheng Pei (corresponding author), Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan; School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan; Tel.: +886-33281200 (ext. 8146); Fax: +886-33281200 (ext. 2667)
- Ting-Yu Chang, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Tsung-Chi Lee, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Sudipta Saha, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Hsin-Yi Lai, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Manuel Gomez-Ramirez, The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, 3400 N. Charles Street, 338 Krieger Hall, Baltimore, MD 21218, USA
- Shih-Wei Chou, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Alice M. K. Wong, Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
|
33
|
Bair WN, Kiemel T, Jeka JJ, Clark JE. Development of multisensory reweighting is impaired for quiet stance control in children with developmental coordination disorder (DCD). PLoS One 2012; 7:e40932. [PMID: 22815872 PMCID: PMC3399799 DOI: 10.1371/journal.pone.0040932] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2012] [Accepted: 06/15/2012] [Indexed: 11/29/2022] Open
Abstract
Background: Developmental Coordination Disorder (DCD) is a leading movement disorder in children that commonly involves poor postural control. A multisensory integration deficit, especially the inability to adaptively reweight to changing sensory conditions, has been proposed as a possible mechanism but remains insufficiently characterized. Empirical quantification of reweighting significantly advances our understanding of its developmental onset and improves the characterization of how it differs in children with DCD compared to their typically developing (TD) peers.
Methodology/Principal Findings: Twenty children with DCD (6.6 to 11.8 years) were tested with a protocol in which a visual scene and a touch bar simultaneously oscillated medio-laterally at different frequencies and various amplitudes. Their data were compared to data on TD children (4.2 to 10.8 years) from a previous study. Gains and phases were calculated for medio-lateral responses of the head and center of mass to both sensory stimuli. Gains and phases were simultaneously fitted by linear functions of age for each amplitude condition, segment, modality, and group. Fitted gains and phases at two comparison ages (6.6 and 10.8 years) were tested for reweighting within each group and for group differences. Children with DCD reweight touch and vision at a later age (10.8 years) than their TD peers (4.2 years). Children with DCD demonstrate weak visual reweighting, no advanced multisensory fusion, and phase lags larger than those of TD children in response to both touch and vision.
Conclusions/Significance: Two developmental perspectives, postural body scheme and dorsal stream development, are offered to explain the weak visual reweighting. The lack of multisensory fusion supports the notion that optimal multisensory integration is a slow developmental process and is vulnerable in children with DCD.
Affiliation(s)
- Woei-Nan Bair, Department of Physical Therapy and Rehabilitation Science, University of Maryland, Baltimore, Baltimore, Maryland, United States of America
|
34
|
Touching motion: rTMS on the human middle temporal complex interferes with tactile speed perception. Brain Topogr 2012; 25:389-98. [PMID: 22367586 DOI: 10.1007/s10548-012-0223-4] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2011] [Accepted: 02/13/2012] [Indexed: 10/28/2022]
Abstract
Brain functional and psychophysical studies have clearly demonstrated that visual motion perception relies on the activity of the middle temporal complex (hMT+). However, recent studies have shown that hMT+ also seems to be activated during tactile motion perception, suggesting that this visual extrastriate area is involved in the processing and integration of motion irrespective of sensory modality. In the present study, we used repetitive transcranial magnetic stimulation (rTMS) to assess whether hMT+ plays a causal role in tactile motion processing. Blindfolded participants detected changes in the speed of a grid of tactile moving points with their finger (i.e. tactile modality). The experiment included three conditions: a control condition with no TMS and two TMS conditions, i.e. hMT+-rTMS and posterior parietal cortex (PPC)-rTMS. Accuracy was significantly impaired during hMT+-rTMS but not in the other two conditions (no-rTMS or PPC-rTMS); moreover, thresholds for detecting speed changes were significantly higher with hMT+-rTMS than in the control TMS conditions. These findings provide stronger evidence that hMT+ activity is involved in tactile speed processing, consistent with the hypothesis of a supramodal role for this cortical region in motion processing.
|
35
|
Merlo JL. Cross-modal congruency benefits for combined tactile and visual signaling. AMERICAN JOURNAL OF PSYCHOLOGY 2011. [DOI: 10.5406/amerjpsyc.124.4.0413] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
|
36
|
Tomassini A, Gori M, Burr D, Sandini G, Morrone MC. Perceived duration of Visual and Tactile Stimuli Depends on Perceived Speed. Front Integr Neurosci 2011; 5:51. [PMID: 21941471 PMCID: PMC3170919 DOI: 10.3389/fnint.2011.00051] [Citation(s) in RCA: 42] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/09/2011] [Accepted: 08/23/2011] [Indexed: 11/13/2022] Open
Abstract
It is known that the perceived duration of visual stimuli is strongly influenced by speed: faster moving stimuli appear to last longer. To test whether this is a general property of sensory systems we asked participants to reproduce the duration of visual and tactile gratings, and visuo-tactile gratings moving at a variable speed (3.5-15 cm/s) for three different durations (400, 600, and 800 ms). For both modalities, the apparent duration of the stimulus increased strongly with stimulus speed, more so for tactile than for visual stimuli. In addition, visual stimuli were perceived to last approximately 200 ms longer than tactile stimuli. The apparent duration of visuo-tactile stimuli lay between the unimodal estimates, as the Bayesian account predicts, but the bimodal precision of the reproduction did not show the theoretical improvement. A cross-modal speed-matching task revealed that visual stimuli were perceived to move faster than tactile stimuli. To test whether the large difference in the perceived duration of visual and tactile stimuli resulted from the difference in their perceived speed, we repeated the time reproduction task with visual and tactile stimuli matched in apparent speed. This reduced, but did not completely eliminate the difference in apparent duration. These results show that for both vision and touch, perceived duration depends on speed, pointing to common strategies of time perception.
Affiliation(s)
- Alice Tomassini, Department of Robotics, Brain and Cognitive Sciences, Istituto Italiano di Tecnologia, Genova, Italy
|
37
|
Gori M, Mazzilli G, Sandini G, Burr D. Cross-Sensory Facilitation Reveals Neural Interactions between Visual and Tactile Motion in Humans. Front Psychol 2011; 2:55. [PMID: 21734892 PMCID: PMC3110703 DOI: 10.3389/fpsyg.2011.00055] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2010] [Accepted: 03/23/2011] [Indexed: 11/13/2022] Open
Abstract
Many recent studies show that the human brain integrates information across the different senses and that stimuli of one sensory modality can enhance the perception of other modalities. Here we study the processes that mediate cross-modal facilitation and summation between visual and tactile motion. We find that while summation produced a generic, non-specific improvement of thresholds, probably reflecting higher-order interaction of decision signals, facilitation reveals a strong, direction-specific interaction, which we believe reflects sensory interactions. We measured visual and tactile velocity discrimination thresholds over a wide range of base velocities and conditions. Thresholds for both visual and tactile stimuli showed the characteristic “dipper function,” with the minimum thresholds occurring at a given “pedestal speed.” When visual and tactile coherent stimuli were combined (summation condition) the thresholds for these multisensory stimuli also showed a “dipper function” with the minimum thresholds occurring in a similar range to that for unisensory signals. However, the improvement of multisensory thresholds was weak and not directionally specific, well predicted by the maximum-likelihood estimation model (agreeing with previous research). A different technique (facilitation) did, however, reveal direction-specific enhancement. Adding a non-informative “pedestal” motion stimulus in one sensory modality (vision or touch) selectively lowered thresholds in the other, by the same amount as pedestals in the same modality. Facilitation did not occur for neutral stimuli like sounds (that would also have reduced temporal uncertainty), nor for motion in opposite direction, even in blocked trials where the subjects knew that the motion was in the opposite direction showing that the facilitation was not under subject control. Cross-sensory facilitation is strong evidence for functionally relevant cross-sensory integration at early levels of sensory processing.
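The maximum-likelihood estimation (MLE) model invoked in this abstract predicts the combined visuo-tactile threshold from the unimodal ones by inverse-variance (reliability) weighting. The following is a minimal sketch of that standard prediction; the function names and example values are illustrative, not taken from the paper.

```python
import math

def mle_sigma(sigma_vis, sigma_tac):
    """MLE-predicted discrimination threshold (sigma) for a combined
    visuo-tactile stimulus, given the two unimodal thresholds."""
    return math.sqrt(sigma_vis**2 * sigma_tac**2 /
                     (sigma_vis**2 + sigma_tac**2))

def mle_weights(sigma_vis, sigma_tac):
    """Reliability weights: each cue is weighted by its inverse variance."""
    w_vis = sigma_tac**2 / (sigma_vis**2 + sigma_tac**2)
    return w_vis, 1.0 - w_vis
```

For equal unimodal thresholds the model predicts at most a sqrt(2) improvement (mle_sigma(s, s) = s/sqrt(2)), and the combined threshold is always at least slightly below the better unimodal one, which is the "weak, not directionally specific" improvement the summation condition showed.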
Affiliation(s)
- Monica Gori, Istituto Italiano di Tecnologia, Robotics, Brain and Cognitive Sciences, Genova, Italy
|
38
|
Yau JM, Weber AI, Bensmaia SJ. Separate mechanisms for audio-tactile pitch and loudness interactions. Front Psychol 2010; 1:160. [PMID: 21887147 PMCID: PMC3157934 DOI: 10.3389/fpsyg.2010.00160] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2010] [Accepted: 09/09/2010] [Indexed: 11/13/2022] Open
Abstract
A major goal in perceptual neuroscience is to understand how signals from different sensory modalities are combined to produce stable and coherent representations. We previously investigated interactions between audition and touch, motivated by the fact that both modalities are sensitive to environmental oscillations. In our earlier study, we characterized the effect of auditory distractors on tactile frequency and intensity perception. Here, we describe the converse experiments examining the effect of tactile distractors on auditory processing. Because the two studies employ the same psychophysical paradigm, we combined their results for a comprehensive view of how auditory and tactile signals interact and how these interactions depend on the perceptual task. Together, our results show that temporal frequency representations are perceptually linked regardless of the attended modality. In contrast, audio-tactile loudness interactions depend on the attended modality: Tactile distractors influence judgments of auditory intensity, but judgments of tactile intensity are impervious to auditory distraction. Lastly, we show that audio-tactile loudness interactions depend critically on stimulus timing, while pitch interactions do not. These results reveal that auditory and tactile inputs are combined differently depending on the perceptual task. That distinct rules govern the integration of auditory and tactile signals in pitch and loudness perception implies that the two are mediated by separate neural mechanisms. These findings underscore the complexity and specificity of multisensory interactions.
Affiliation(s)
- Jeffrey M Yau, Department of Neurology, Division of Cognitive Neuroscience, Johns Hopkins University School of Medicine, Baltimore, MD, USA
|
39
|
Kim S, James TW. Enhanced effectiveness in visuo-haptic object-selective brain regions with increasing stimulus salience. Hum Brain Mapp 2010; 31:678-93. [PMID: 19830683 DOI: 10.1002/hbm.20897] [Citation(s) in RCA: 44] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
The occipital and parietal lobes contain regions that are recruited for both visual and haptic object processing. The purpose of the present study was to characterize the underlying neural mechanisms for bimodal integration of vision and haptics in these visuo-haptic object-selective brain regions to find out whether these brain regions are sites of neuronal or areal convergence. Our sensory conditions consisted of visual-only (V), haptic-only (H), and visuo-haptic (VH), which allowed us to evaluate integration using the superadditivity metric. We also presented each stimulus condition at two different levels of signal-to-noise ratio or salience. The salience manipulation allowed us to assess integration using the rule of inverse effectiveness. We were able to localize previously described visuo-haptic object-selective regions in the lateral occipital cortex (lateral occipital tactile-visual area) and the intraparietal sulcus, and also localized a new region in the left anterior fusiform gyrus. There was no evidence of superadditivity with the VH stimulus at either level of salience in any of the regions. There was, however, a strong effect of salience on multisensory enhancement: the response to the VH stimulus was more enhanced at higher salience across all regions. In other words, the regions showed enhanced integration of the VH stimulus with increasing effectiveness of the unisensory stimuli. We called the effect "enhanced effectiveness." The presence of enhanced effectiveness in visuo-haptic object-selective brain regions demonstrates neuronal convergence of visual and haptic sensory inputs for the purpose of processing object shape.
Affiliation(s)
- Sunah Kim, Cognitive Science Program, Indiana University, Bloomington, Indiana 47405, USA
|
40
|
Pei YC, Hsiao SS, Craig JC, Bensmaia SJ. Shape invariant coding of motion direction in somatosensory cortex. PLoS Biol 2010; 8:e1000305. [PMID: 20126380 PMCID: PMC2814823 DOI: 10.1371/journal.pbio.1000305] [Citation(s) in RCA: 68] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2009] [Accepted: 12/29/2009] [Indexed: 11/19/2022] Open
Abstract
A subpopulation of neurons in primate somatosensory cortex signal the direction in which objects move across the skin of the fingertips. Invariant representations of stimulus features are thought to play an important role in producing stable percepts of objects. In the present study, we assess the invariance of neural representations of tactile motion direction with respect to other stimulus properties. To this end, we record the responses evoked in individual neurons in somatosensory cortex of primates, including areas 3b, 1, and 2, by three types of motion stimuli, namely scanned bars and dot patterns, and random dot displays, presented to the fingertips of macaque monkeys. We identify a population of neurons in area 1 that is highly sensitive to the direction of stimulus motion and whose motion signals are invariant across stimulus types and conditions. The motion signals conveyed by individual neurons in area 1 can account for the ability of human observers to discriminate the direction of motion of these stimuli, as measured in paired psychophysical experiments. We conclude that area 1 contains a robust representation of motion and discuss similarities in the neural mechanisms of visual and tactile motion processing.
When we physically interact with an object, our hands convey information about the shape of the object, its texture, its compliance, and its thermal properties. This information allows us to manipulate tools and to recognize objects based on tactile exploration alone. One of the hallmarks of tactile object recognition is that it involves movement between the skin and the object. In this study, we investigate how the direction in which objects move relative to the skin is represented in the brain. Specifically, we scan a variety of stimuli, including bars and dot patterns, across the fingers of non-human primates while recording the evoked neuronal activity. We find that a population of neurons in somatosensory cortex encodes the direction of moving stimuli regardless of the shape of the stimuli, the speed at which they are scanned across the skin, or the force with which they contact the skin. We show that these neurons can account for our ability to perceive the direction of motion of tactile stimuli.
Affiliation(s)
- Yu-Cheng Pei, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America; Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital and Chang Gung University, Taiwan
- Steven S. Hsiao, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America
- James C. Craig, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America; Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Sliman J. Bensmaia, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America; Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois, United States of America
|
41
|
Merlo JL, Duley AR, Hancock PA. Cross-modal congruency benefits for combined tactile and visual signaling. AMERICAN JOURNAL OF PSYCHOLOGY 2010; 123:413-24. [DOI: 10.5406/amerjpsyc.123.4.0413] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
|
42
|
Affiliation(s)
- Mark Hollins, Department of Psychology, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599
|
43
|
|
44
|
Konkle T, Wang Q, Hayward V, Moore CI. Motion aftereffects transfer between touch and vision. Curr Biol 2009; 19:745-50. [PMID: 19361996 PMCID: PMC3398123 DOI: 10.1016/j.cub.2009.03.035] [Citation(s) in RCA: 83] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2008] [Revised: 02/15/2009] [Accepted: 03/03/2009] [Indexed: 11/21/2022]
Abstract
Current views on multisensory motion integration assume separate substrates where visual motion perceptually dominates tactile motion [1, 2]. However, recent neuroimaging findings demonstrate strong activation of visual motion processing areas by tactile stimuli [3-6], implying a potentially bidirectional relationship. To test the relationship between visual and tactile motion processing, we examined the transfer of motion aftereffects. In the well-known visual motion aftereffect, adapting to visual motion in one direction causes a subsequently presented stationary stimulus to be perceived as moving in the opposite direction [7, 8]. The existence of motion aftereffects in the tactile domain was debated [9-11], though robust tactile motion aftereffects have recently been demonstrated [12, 13]. By using a motion adaptation paradigm, we found that repeated exposure to visual motion in a given direction produced a tactile motion aftereffect, the illusion of motion in the opponent direction across the finger pad. We also observed that repeated exposure to tactile motion induces a visual motion aftereffect, biasing the perceived direction of counterphase gratings. These crossmodal aftereffects, operating both from vision to touch and from touch to vision, present strong behavioral evidence that the processing of visual and tactile motion rely on shared representations that dynamically impact modality-specific perception.
Affiliation(s)
- Talia Konkle, McGovern Institute for Brain Research and Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, 77 Massachusetts Avenue, 46-2171, Cambridge, MA 02139, USA
- Qi Wang, Department of Biomedical Engineering, Georgia Institute of Technology, 313 Ferst Drive, Atlanta, GA 30332-0535, USA
- Vincent Hayward, Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie, 4 place Jussieu, 75252 Paris, France
- Christopher I. Moore, McGovern Institute for Brain Research and Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, 77 Massachusetts Avenue, 46-2171, Cambridge, MA 02139, USA
|
45
|
Stekelenburg JJ, Vroomen J. Neural correlates of audiovisual motion capture. Exp Brain Res 2009; 198:383-90. [PMID: 19296094 PMCID: PMC2733180 DOI: 10.1007/s00221-009-1763-z] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2008] [Accepted: 02/28/2009] [Indexed: 11/28/2022]
Abstract
Visual motion can affect the perceived direction of auditory motion (i.e., audiovisual motion capture). It is debated, though, whether this effect occurs at perceptual or decisional stages. Here, we examined the neural consequences of audiovisual motion capture using the mismatch negativity (MMN), an event-related brain potential reflecting pre-attentive auditory deviance detection. In an auditory-only condition occasional changes in the direction of a moving sound (deviant) elicited an MMN starting around 150 ms. In an audiovisual condition, auditory standards and deviants were synchronized with a visual stimulus that moved in the same direction as the auditory standards. These audiovisual deviants did not evoke an MMN, indicating that visual motion reduced the perceptual difference between sound motion of standards and deviants. The inhibition of the MMN by visual motion provides evidence that auditory and visual motion signals are integrated at early sensory processing stages.
|
46
|
Occelli V, Spence C, Zampini M. The effect of sound intensity on the audiotactile crossmodal dynamic capture effect. Exp Brain Res 2008; 193:409-19. [PMID: 19011842 DOI: 10.1007/s00221-008-1637-9] [Citation(s) in RCA: 15] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/19/2008] [Accepted: 10/25/2008] [Indexed: 11/26/2022]
Abstract
We investigated the effect of varying sound intensity on the audiotactile crossmodal dynamic capture effect. Participants had to discriminate the direction of a target stream (tactile, Experiment 1; auditory, Experiment 2) while trying to ignore the direction of a distractor stream presented in a different modality (auditory, Experiment 1; tactile, Experiment 2). The distractor streams could either be spatiotemporally congruent or incongruent with respect to the target stream. In half of the trials, the participants were presented with auditory stimuli at 75 dB(A) while in the other half of the trials they were presented with auditory stimuli at 82 dB(A). Participants' performance on both tasks was significantly affected by the intensity of the sounds. Namely, the crossmodal capture of tactile motion by audition was stronger with the more intense (vs. less intense) auditory distractors (Experiment 1), whereas the capture effect exerted by the tactile distractors was stronger for less intense (than for more intense) auditory targets (Experiment 2). The crossmodal dynamic capture was larger in Experiment 1 than in Experiment 2, with a stronger congruency effect when the target streams were presented in the tactile (vs. auditory) modality. Two explanations are put forward to account for these results: an attentional biasing toward the more intense auditory stimuli, and a modulation induced by the relative perceptual weight of, respectively, the auditory and the tactile signals.
Affiliation(s)
- Valeria Occelli
- Department of Cognitive Sciences and Education, University of Trento, Rovereto, TN, Italy.
|
47
|
Nakashita S, Saito DN, Kochiyama T, Honda M, Tanabe HC, Sadato N. Tactile-visual integration in the posterior parietal cortex: a functional magnetic resonance imaging study. Brain Res Bull 2007; 75:513-25. [PMID: 18355627 DOI: 10.1016/j.brainresbull.2007.09.004] [Citation(s) in RCA: 49] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/24/2007] [Revised: 08/30/2007] [Accepted: 09/06/2007] [Indexed: 10/22/2022]
Abstract
To explore the neural substrates of visual-tactile crossmodal integration during motion direction discrimination, we conducted functional magnetic resonance imaging with 15 subjects. We initially performed independent unimodal visual and tactile experiments involving motion direction matching tasks. Visual motion discrimination activated the occipital cortex bilaterally, extending to the posterior portion of the superior parietal lobule, and the dorsal and ventral premotor cortex. Tactile motion direction discrimination activated the bilateral parieto-premotor cortices. The left superior parietal lobule, intraparietal sulcus, bilateral premotor cortices and right cerebellum were activated during both visual and tactile motion discrimination. Tactile discrimination deactivated the visual cortex including the middle temporal/V5 area. To identify the crossmodal interference of the neural activities in both the unimodal and the multimodal areas, tactile and visual crossmodal experiments with event-related designs were also performed by the same subjects who performed crossmodal tactile-visual tasks or intramodal tactile-tactile and visual-visual matching tasks within the same session. The activities detected during intramodal tasks in the visual regions (including the middle temporal/V5 area) and the tactile regions were suppressed during crossmodal conditions compared with intramodal conditions. Within the polymodal areas, the left superior parietal lobule and the premotor areas were activated by crossmodal tasks. The left superior parietal lobule was more prominently activated under congruent event conditions than under incongruent conditions. These findings suggest that a reciprocal and competitive association between the unimodal and polymodal areas underlies the interaction between motion direction-related signals received simultaneously from different sensory modalities.
Affiliation(s)
- Satoru Nakashita
- Department of Physiological Sciences, The Graduate University for Advanced Studies (Sokendai), Kanagawa 240-0193, Japan
|
48
|
Killebrew JH, Bensmaïa SJ, Dammann JF, Denchev P, Hsiao SS, Craig JC, Johnson KO. A dense array stimulator to generate arbitrary spatio-temporal tactile stimuli. J Neurosci Methods 2006; 161:62-74. [PMID: 17134760 PMCID: PMC1851669 DOI: 10.1016/j.jneumeth.2006.10.012] [Citation(s) in RCA: 44] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2006] [Revised: 10/09/2006] [Accepted: 10/09/2006] [Indexed: 11/19/2022]
Abstract
The generation and presentation of tactile stimuli presents a unique challenge. Unlike vision and audition, in which standard equipment such as monitors and audio systems can be used for most experiments, tactile stimuli and/or stimulators often have to be tailor-made for a given study. Here, we present a novel tactile stimulator designed to present arbitrary spatio-temporal stimuli to the skin. The stimulator consists of 400 pins, arrayed over a 1 cm² area, each under independent computer control. The dense array allows for an unprecedented number of stimuli to be presented within an experimental session (e.g., up to 1200 stimuli per minute) and for stimuli to be generated adaptively. The stimulator can be used in a variety of modes and can deliver indented and scanned patterns as well as stimuli defined by mathematical spatio-temporal functions (e.g., drifting sinusoids). We describe the hardware and software of the system, and discuss previous and prospective applications.
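The kind of stimulus the abstract mentions — a drifting sinusoid rendered on a dense pin array — can be expressed as a simple spatio-temporal function sampled at each pin position. A minimal sketch follows; the 20×20 grid size matches the 400 pins over 1 cm² described above, while the pitch, wavelength, drift speed, amplitude, and function name are illustrative assumptions, not parameters of the actual device.

```python
import numpy as np

def drifting_sinusoid(t, n_pins=20, pitch_mm=0.5, wavelength_mm=2.0,
                      speed_mm_s=10.0, amp_um=50.0):
    """Indentation depth (in µm) for each pin of an n_pins x n_pins array
    at time t (in s), for a sinusoidal grating drifting along the x-axis.
    All parameters here are hypothetical illustration values."""
    x = np.arange(n_pins) * pitch_mm                 # pin positions along x (mm)
    phase = 2 * np.pi * (x - speed_mm_s * t) / wavelength_mm
    row = amp_um * 0.5 * (1.0 + np.sin(phase))       # shift to non-negative indentation
    return np.tile(row, (n_pins, 1))                 # identical profile on every row

frame = drifting_sinusoid(t=0.0)
print(frame.shape)  # (20, 20)
```

Sampling this function at the display's refresh rate and sending each frame to the pin controllers yields the drifting-grating percept; any spatio-temporal function of pin position and time can be substituted in the same way.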
Affiliation(s)
- Peter Denchev
- Krieger Mind/Brain Institute, Johns Hopkins University, USA
- Steven S. Hsiao
- Krieger Mind/Brain Institute, Johns Hopkins University, USA
- Department of Neuroscience, Johns Hopkins University, USA
- James C. Craig
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- Corresponding author. Tel.: +1 812 855 3926. E-mail address: (J.C. Craig)
- Kenneth O. Johnson
- Krieger Mind/Brain Institute, Johns Hopkins University, USA
- Department of Neuroscience, Johns Hopkins University, USA
|