1
Dechaux A, Haytam-Mahsoub M, Kitazaki M, Lagarde J, Ganesh G. Multi-sensory feedback improves spatially compatible sensori-motor responses. Sci Rep 2022; 12:20253. [PMID: 36424417] [PMCID: PMC9691706] [DOI: 10.1038/s41598-022-24028-5]
Abstract
To interact with machines, from computers to cars, we need to monitor multiple sensory stimuli and respond to them with specific motor actions. It has been shown that our ability to react to a sensory stimulus depends both on the stimulus modality and on the spatial compatibility between the stimulus and the required response. However, these compatibility effects have been examined for sensory modalities individually, and rarely in scenarios requiring individuals to choose from multiple actions. Here, we compared the response times of participants who had to choose one of several spatially distinct, but compatible, responses to visual, tactile, or simultaneous visual and tactile stimuli. We observed that the presence of both tactile and visual stimuli consistently improved response times relative to when either stimulus was presented alone. While we did not observe a difference between response times to visual and tactile stimuli, spatial stimulus localization was faster for visual stimuli than for tactile stimuli.
Affiliation(s)
- A. Dechaux
- UM-CNRS Laboratoire d’Informatique de Robotique et de Microélectronique de Montpellier (LIRMM), 161, Rue Ada, Montpellier, France
- M. Haytam-Mahsoub
- UM-CNRS Laboratoire d’Informatique de Robotique et de Microélectronique de Montpellier (LIRMM), 161, Rue Ada, Montpellier, France
- M. Kitazaki
- Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Aichi, Japan
- J. Lagarde
- Euromov Digital Health in Motion (DHM) Laboratory, University of Montpellier, 700 Avenue du Pic Saint Loup, Montpellier, France
- G. Ganesh
- UM-CNRS Laboratoire d’Informatique de Robotique et de Microélectronique de Montpellier (LIRMM), 161, Rue Ada, Montpellier, France
2
Different mechanisms of magnitude and spatial representation for tactile and auditory modalities. Exp Brain Res 2021; 239:3123-3132. [PMID: 34415367] [PMCID: PMC8536643] [DOI: 10.1007/s00221-021-06196-4]
Abstract
The human brain creates a representation of the external world based on magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared by different sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems, hypothesizing that space and magnitude are combined differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus–response compatibility (SRC) to investigate these processes, assuming that performance improves when stimulus and response share common features. We designed an auditory and a tactile SRC task with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. A larger effect of magnitude over spatial congruency occurred in the tactile task. However, magnitude and space carried similar weight in the auditory task, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks was elicited by the sensory inputs. The participants' performance reversed between uncrossed and crossed hand postures in the tactile task, suggesting an internal coordinate system. In contrast, crossing the hands did not alter performance in the auditory task (i.e., an allocentric frame of reference was used). Overall, these results suggest that the interaction between space and magnitude differs in the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.
3
Bollini A, Campus C, Esposito D, Gori M. The Magnitude Effect on Tactile Spatial Representation: The Spatial-Tactile Association for Response Code (STARC) Effect. Front Neurosci 2020; 14:557063. [PMID: 33132821] [PMCID: PMC7550691] [DOI: 10.3389/fnins.2020.557063]
Abstract
The human brain uses perceptual information to create a correct representation of the external world. Converging data indicate that the perceptual processing of space and quantities is frequently based on a shared mental magnitude system, in which low and high quantities are represented in the left and right space, respectively. The present study explores how magnitude affects spatial representation in the tactile modality. We investigated these processes using stimulus-response (S-R) compatibility tasks (i.e., sensorimotor tasks that present an association or dissociation between the perception of a stimulus and the required action, generally increasing/decreasing accuracy and decreasing/increasing the subject's reaction times). In our study, participants performed a discrimination task between high- and low-frequency vibrotactile stimuli, regardless of the stimulation’s spatial position. When the response code was incompatible with the mental magnitude line (i.e., left button for high-frequency and right button for low-frequency responses), we found that participants bypassed spatial congruence, showing a magnitude S-R compatibility effect. We called this phenomenon the Spatial–Tactile Association of Response Codes (STARC) effect. Moreover, we observed that an internal frame of reference embodies the STARC effect. Indeed, the participants’ performance reversed between uncrossed- and crossed-hands postures, suggesting that spatial reference frames play a role in the process of expressing mental magnitude, at least in the tactile modality.
Affiliation(s)
- Alice Bollini
- Unit for Visually Impaired People, Istituto Italiano di Tecnologia, Genoa, Italy
- Claudio Campus
- Unit for Visually Impaired People, Istituto Italiano di Tecnologia, Genoa, Italy
- Davide Esposito
- Unit for Visually Impaired People, Istituto Italiano di Tecnologia, Genoa, Italy
- DIBRIS, Università di Genova, Genoa, Italy
- Monica Gori
- Unit for Visually Impaired People, Istituto Italiano di Tecnologia, Genoa, Italy
4
Chen S, Shi Z, Zang X, Zhu X, Assumpção L, Müller HJ, Geyer T. Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys 2020; 82:1682-1694. [PMID: 31845105] [PMCID: PMC7297845] [DOI: 10.3758/s13414-019-01907-0]
Abstract
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participants responded to the orientation of a visual target singleton presented among seven homogeneous visual distractors. Four tactile stimuli, two to different fingers of each hand, were presented either simultaneously with or prior to the visual stimuli. The identity of the stimulated fingers provided the crossmodal context cue: in half of the trials, a given visual target location was consistently paired with a given tactile configuration. The visual stimuli were presented above the unseen fingers, ensuring spatial correspondence between vision and touch. We found no evidence of crossmodal contextual cueing when the two sets of items (tactile, visual) were presented simultaneously (Experiment 1). However, a reliable crossmodal effect emerged when the tactile distractors preceded the onset of the visual stimuli by 700 ms (Experiment 2). Crossmodal cueing disappeared again when, after an initial learning phase, participants flipped their hands, making the tactile distractors appear at different positions in external space while their somatotopic positions remained unchanged (Experiment 3). In all experiments, participants were unable to explicitly discriminate learned from novel multisensory arrays. These findings indicate that search-facilitating context memory can be established across vision and touch. However, in order to guide visual search, the (predictive) tactile configurations must be remapped from their initial somatotopic format into a common external representational format.
Affiliation(s)
- Siyi Chen
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Zhuanghua Shi
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Xuelian Zang
- Center for Cognition and Brain Disorders, Institute of Psychological Sciences, Hangzhou Normal University, Hangzhou, China
- Xiuna Zhu
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Leonardo Assumpção
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Hermann J Müller
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Thomas Geyer
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
5
Low and high stimulation frequencies differentially affect automated response selection in the superior parietal cortex - implications for somatosensory area processes. Sci Rep 2020; 10:3954. [PMID: 32127632] [PMCID: PMC7054528] [DOI: 10.1038/s41598-020-61025-y]
Abstract
Response inhibition, a central facet of executive functioning, is not a homogeneous construct. Interference inhibition constitutes a subcomponent of response inhibition and refers to inhibitory control over responses that are automatically triggered by irrelevant stimulus dimensions, as measured by the Simon task. While there is evidence that area-specific modulation of tactile information affects action withholding, effects in the context of interference inhibition remain elusive. We conducted a tactile version of the Simon task with stimuli designed to be predominantly processed in the primary (40 Hz) or secondary (150 Hz) somatosensory cortex. On the basis of EEG recordings, we performed signal decomposition and source localization. The behavioral results reveal that response execution is more efficient when sensory information is mainly processed via SII, compared with SI, sensory areas during non-conflicting trials. When accounting for intermingled coding levels by temporally decomposing the EEG data, the results show that experimental variations depending on sensory area-specific processing differences specifically affect motor, and not sensory, processes. Modulations of motor-related processes are linked to activation differences in the superior parietal cortex (BA7). It is concluded that the SII cortical area, supporting cognitive preprocessing of tactile input, fosters automatic tactile information processing by facilitating stimulus-response mapping in posterior parietal regions.
6
Castro L, Soto-Faraco S, Morís Fernández L, Ruzzoli M. The breakdown of the Simon effect in cross-modal contexts: EEG evidence. Eur J Neurosci 2019; 47:832-844. [PMID: 29495127] [DOI: 10.1111/ejn.13882]
Abstract
In everyday life, we often must coordinate information across spatial locations and different senses for action. It is well known, for example, that reactions are faster when an imperative stimulus and its required response are congruent than when they are not, even if stimulus location itself is completely irrelevant to the task (the so-called Simon effect). However, because these effects have frequently been investigated in single-modality scenarios, the consequences of spatial congruence when more than one sensory modality is at play are less well known. Interestingly, at the behavioral level, the visual Simon effect vanishes in mixed (visual and tactile) modality scenarios, suggesting that irrelevant spatial information ceases to exert influence on vision. To shed some light on this surprising result, here we address the expression of irrelevant spatial information in EEG markers typical of the visual Simon effect (P300, theta power modulation, LRP) in mixed-modality contexts. Our results show no evidence that the visual-spatial information affected performance at either the behavioral or the neurophysiological level. The absence of the neural markers of visual S-R conflict in the mixed-modality scenario implies that some aspects of spatial representation that are strongly expressed in single-modality scenarios might be bypassed.
Affiliation(s)
- Leonor Castro
- Center for Brain and Cognition, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, Edifici Mercè Rodoreda, carrer Ramon Trias Fargas 25-27, 08005, Barcelona, Spain
- Salvador Soto-Faraco
- Center for Brain and Cognition, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, Edifici Mercè Rodoreda, carrer Ramon Trias Fargas 25-27, 08005, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Luis Morís Fernández
- Center for Brain and Cognition, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, Edifici Mercè Rodoreda, carrer Ramon Trias Fargas 25-27, 08005, Barcelona, Spain
- Manuela Ruzzoli
- Center for Brain and Cognition, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, Edifici Mercè Rodoreda, carrer Ramon Trias Fargas 25-27, 08005, Barcelona, Spain
7
Rahman MS, Yau JM. Somatosensory interactions reveal feature-dependent computations. J Neurophysiol 2019; 122:5-21. [DOI: 10.1152/jn.00168.2019]
Abstract
Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants’ ability to discriminate tactile cues (100–300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were only marked by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations.

NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
Affiliation(s)
- Jeffrey M. Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas
8
External coding and salience in the tactile Simon effect. Acta Psychol (Amst) 2019; 198:102874. [PMID: 31299458] [DOI: 10.1016/j.actpsy.2019.102874]
Abstract
Previous studies have demonstrated a tactile Simon effect in which stimulus codes are generated based on the stimulated hand, not on limb position in external space (the somatotopic Simon effect). However, given evidence from visual Simon effect studies demonstrating that multiple stimulus codes can be generated for a single stimulus, we examined whether multiple stimulus codes can be generated for tactile stimuli as well. In our first experiment, using four stimulators (two on each side of the hand), we found novel evidence for a hand-centered Simon effect, along with the typical somatotopic Simon effect. Next, we examined whether the potential salience of these somatotopic codes could be reduced by testing only one hand with two stimulators attached. In Experiments 2-4, we found a strong hand-centered Simon effect with a diminished somatotopic Simon effect, providing evidence that stimulus salience can change the weighting of somatosensory stimulus coding. Finally, we also found novel evidence that the hand-centered Simon effect is coded in external, not somatotopic, coordinates. Furthermore, the diminished somatotopic Simon effect when testing only one hand provides evidence that salience is an important factor in modulating the tactile Simon effect.
9
Pérusseau-Lambert A, Anastassova M, Boukallel M, Chetouani M, Grynszpan O. The social Simon effect in the tactile sensory modality: a negative finding. Cogn Process 2019; 20:299-307. [PMID: 30993409] [DOI: 10.1007/s10339-019-00911-4]
Abstract
This study investigates whether users activate cognitive representations of their partner's action when they are involved in tactile collaborative tasks. The social Simon effect is a spatial stimulus-response interference induced by the mere presence of a partner in a go/nogo task. It has been extensively studied in the visual and auditory sensory modalities, but never before in the tactile modality. We compared the performance of 28 participants in three tasks: (1) a standard Simon task, where participants responded to two different tactile stimuli applied to their fingertips with either their left or right foot; (2) an individual go/nogo task, where participants responded to only one stimulus; and (3) a social go/nogo task, where they again responded to only one stimulus but were partnered with another person who responded to the complementary stimulus. The interference effect due to spatial incongruence between the side on which participants received the stimulus and the foot used to answer increased significantly in the standard Simon task compared with the social go/nogo task. Such a difference was not observed between the social and individual go/nogo tasks. Performance was nevertheless enhanced in the social go/nogo task, but irrespective of stimulus-response congruency. This study is the first to report a negative result for the social Simon effect in the tactile modality. The results suggest that the cognitive representation of the co-actor is weaker in this modality.
Affiliation(s)
- Alix Pérusseau-Lambert
- Institut des Systèmes Intelligents et de Robotique (ISIR), Sorbonne Université, CNRS, 4 place Jussieu, 75252, Paris CEDEX 05, France
- CEA, LIST, Sensorial and Ambient Interfaces Laboratory, 91191, Gif-Sur-Yvette CEDEX, France
- Margarita Anastassova
- CEA, LIST, Sensorial and Ambient Interfaces Laboratory, 91191, Gif-Sur-Yvette CEDEX, France
- Mehdi Boukallel
- CEA, LIST, Sensorial and Ambient Interfaces Laboratory, 91191, Gif-Sur-Yvette CEDEX, France
- Mohamed Chetouani
- Institut des Systèmes Intelligents et de Robotique (ISIR), Sorbonne Université, CNRS, 4 place Jussieu, 75252, Paris CEDEX 05, France
- Ouriel Grynszpan
- Institut des Systèmes Intelligents et de Robotique (ISIR), Sorbonne Université, CNRS, 4 place Jussieu, 75252, Paris CEDEX 05, France
10
Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Curr Biol 2019; 29:1491-1497.e4. [PMID: 30955931] [DOI: 10.1016/j.cub.2019.02.060]
Abstract
Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb (hand or foot), and reported which of the four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.
Affiliation(s)
- Stephanie Badde
- Department of Psychology and Center of Neural Sciences, New York University, 6 Washington Place, New York, NY 10003, USA
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany
11
Mahani MAN, Bausenhart KM, Ahmadabadi MN, Ulrich R. Multimodal Simon Effect: A Multimodal Extension of the Diffusion Model for Conflict Tasks. Front Hum Neurosci 2019; 12:507. [PMID: 30687039] [PMCID: PMC6333713] [DOI: 10.3389/fnhum.2018.00507]
Abstract
In conflict tasks, such as the Simon task, it is usually investigated how task-irrelevant information affects the processing of task-relevant information. In the present experiments, we extended the Simon task to a multimodal setup in which task-irrelevant information emerged from two sensory modalities. Specifically, in Experiment 1, participants responded to the identity of letters presented at a left, right, or central position with a left- or right-hand response. Additional tactile stimulation occurred at a left, right, or central position on the horizontal body plane. Response congruency of the visual and tactile stimulation was varied orthogonally. In Experiment 2, the tactile stimulation was replaced by auditory stimulation. In both experiments, the visual task-irrelevant information produced congruency effects such that responses were slower and less accurate in incongruent than in congruent conditions. Furthermore, in Experiment 1, such congruency effects, albeit smaller, were also observed for the tactile task-irrelevant stimulation. In Experiment 2, the auditory task-irrelevant stimulation produced the smallest effects. Specifically, the longest reaction times emerged in the neutral condition, while incongruent and congruent conditions differed only numerically. This suggests that, in the co-presence of multiple sources of task-irrelevant information, location processing is more strongly determined by visual and tactile spatial information than by auditory spatial information. An extended version of the Diffusion Model for Conflict Tasks (DMC) was fitted to the results of both experiments. This Multimodal Diffusion Model for Conflict Tasks (MDMC), and a model variant involving faster processing in the neutral visual condition (FN-MDMC), provided reasonable fits to the observed data. These model fits support the notion that multimodal task-irrelevant information superimposes across sensory modalities and automatically affects the controlled processing of task-relevant information.
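The superimposition idea behind this family of diffusion models for conflict tasks can be illustrated with a toy simulation: a constant controlled drift plus the time derivative of a transient, gamma-shaped automatic activation that aids responding on congruent trials and opposes it on incongruent trials. This is a minimal sketch only; the pulse shape and every parameter value below are illustrative assumptions, not the fitted values from this study.

```python
import math
import random

def automatic_activation(t, zeta=30.0, tau=50.0):
    # Transient automatic activation: a rescaled gamma-like pulse that
    # peaks at t = tau (value zeta) and then decays back toward zero.
    return zeta * (t / tau) * math.exp(1.0 - t / tau)

def simulate_rt(congruent, mu_c=0.5, bound=75.0, sigma=4.0, dt=1.0,
                max_t=1500.0, rng=random):
    # First-passage time (ms) of a diffusion whose drift superimposes the
    # controlled rate mu_c and the derivative of the automatic pulse.
    sign = 1.0 if congruent else -1.0  # pulse helps or opposes the response
    x, t = 0.0, 0.0
    while t < max_t:
        da = (automatic_activation(t + dt) - automatic_activation(t)) / dt
        x += (mu_c + sign * da) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if abs(x) >= bound:  # reaching either boundary ends the trial
            return t
    return max_t

rng = random.Random(1)
cong = [simulate_rt(True, rng=rng) for _ in range(2000)]
incong = [simulate_rt(False, rng=rng) for _ in range(2000)]
effect = sum(incong) / len(incong) - sum(cong) / len(cong)
print(f"simulated congruency effect: {effect:.1f} ms")
```

On congruent trials the early automatic pulse pushes the accumulator toward the correct boundary, so mean first-passage time is shorter and the printed congruency effect should come out positive; on incongruent trials the same pulse briefly pulls the process toward the wrong boundary before the controlled drift recovers.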
Affiliation(s)
- Mohammad-Ali Nikouei Mahani
- Cognition and Perception, Department of Psychology, University of Tübingen, Tübingen, Germany
- Cognitive Systems Lab, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, Iran
- Karin Maria Bausenhart
- Cognition and Perception, Department of Psychology, University of Tübingen, Tübingen, Germany
- Majid Nili Ahmadabadi
- Cognitive Systems Lab, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, Iran
- Rolf Ulrich
- Cognition and Perception, Department of Psychology, University of Tübingen, Tübingen, Germany
12
Zheng W, Chen L. The Roles of Attentional Shifts and Attentional Reengagement in Resolving The Spatial Compatibility Effect in Tactile Simon-like Tasks. Sci Rep 2018; 8:8760. [PMID: 29884800] [PMCID: PMC5993732] [DOI: 10.1038/s41598-018-27114-9]
Abstract
The Simon effect refers to the acceleration of choice responses when the target position and response location are consistent compared with scenarios in which they are inconsistent, even if the target position is not relevant to the response. Here, we provide the first demonstration that the tactile Simon-like effect operates in an attention-shifting manner. In unimodal scenarios (Experiments 1-4), for the tactile direction task, the spatial compatibility effect was absent in the focused-attention condition but maintained in the divided-attention condition. For the tactile localization task, this pattern was reversed: the spatial compatibility effect occurred for the focused-attention condition but was reduced/absent in the divided-attention condition. In the audiotactile interaction scenario (Experiment 5), the reaction times (RTs) for discriminating the tactile motion direction were prolonged; however, a spatial compatibility effect was not observed. We propose that the temporal course of resolving conflicts between spatial codes during attentional shifts, including attentional reengagement, may account for the tactile Simon-like effect.
Affiliation(s)
- Wanting Zheng
- School of Ophthalmology & Optometry, School of Biomedical Engineering, Wenzhou Medical University, Wenzhou, 325035, China
- Lihan Chen
- School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, 100871, China
- Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
13
Sensory neural pathways revisited to unravel the temporal dynamics of the Simon effect: A model-based cognitive neuroscience approach. Neurosci Biobehav Rev 2017; 77:48-57. [DOI: 10.1016/j.neubiorev.2017.02.023]
14
Medina J, DePasquale C. Influence of the body schema on mirror-touch synesthesia. Cortex 2017; 88:53-65. [DOI: 10.1016/j.cortex.2016.12.013]
15
Tamè L, Wühle A, Petri CD, Pavani F, Braun C. Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain Cogn 2016; 111:25-33. [PMID: 27816777] [DOI: 10.1016/j.bandc.2016.10.005]
Abstract
Localizing tactile stimuli on our body requires sensory information to be represented in multiple frames of reference along the sensory pathways. These reference frames include the representation of sensory information in skin coordinates, in which the spatial relationship of skin regions is maintained. The organization of the primary somatosensory cortex matches such a somatotopic reference frame. In contrast, higher-order representations are based on external coordinates, in which body posture and gaze direction are taken into account in order to localize touch in other meaningful ways according to task demands. Dominance of one representation over the other, or the use of multiple representations with different weights, is thought to depend on contextual factors of cognitive and/or sensory origin. However, it is unclear in which situations one reference frame takes precedence over another, or when different reference frames are used jointly. The study of tactile mislocalizations at the fingers has shown a key role for the somatotopic frame of reference, both when touches are delivered unilaterally to a single hand and when they are delivered bilaterally to both hands. Here, we took advantage of a well-established tactile mislocalization paradigm to investigate whether the reference frame used to integrate bilateral tactile stimuli changes as a function of the spatial relationship between the two hands. Specifically, supra-threshold interference stimuli were applied to the index or little finger of the left hand 200 ms before the application of a test stimulus to a finger of the right hand. Crucially, different hand postures were adopted (uncrossed or crossed). Results show that introducing a change in hand posture triggered the concurrent use of somatotopic and external reference frames when processing bilateral touch at the fingers. This demonstrates that both somatotopic and external reference frames can be used concurrently to localize tactile stimuli on the fingers.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London, UK.
- Anja Wühle
- MEG-Centre, University of Tübingen, Germany.
- Francesco Pavani
- Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Centre, Lyon, France.
- Christoph Braun
- MEG-Centre, University of Tübingen, Germany; Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany.
16
Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. [PMID: 27327353 PMCID: PMC4975087 DOI: 10.1080/02643294.2016.1168791]
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet the activity of neurons in primary somatosensory cortex merely mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical representation when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
Affiliation(s)
- Stephanie Badde
- Department of Psychology, New York University, New York, NY, USA.
- Tobias Heed
- Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany.