1. Mastria G, Bertoni T, Perrin H, Akulenko N, Risso G, Akselrod M, Guanziroli E, Molteni F, Hagmann P, Bassolino M, Serino A. Body ownership alterations in stroke emerge from reduced proprioceptive precision and damage to the frontoparietal network. Med 2025; 6:100536. PMID: 39532102. DOI: 10.1016/j.medj.2024.10.013.
Abstract
BACKGROUND: Stroke patients often experience alterations in their subjective feeling of ownership over the affected limb, which can hinder motor function and interfere with rehabilitation. In this study, we aimed to disentangle the complex relationship between sensory impairment, body ownership (BO), and motor control in stroke patients.
METHODS: We recruited 20 stroke patients with unilateral upper limb sensory deficits and 35 age-matched controls. Participants performed a virtual reality reaching task with a varying displacement between their real, unseen hand and a visible virtual hand. We measured reaching errors and subjective ownership ratings as indicators of hand ownership. Reaching errors were modeled with a probabilistic causal inference model in which ownership of the virtual hand is inferred from the congruency between visual and proprioceptive inputs and used to weight the visual adjustment of reaching movements.
FINDINGS: Stroke patients were more likely to experience ownership over an incongruent virtual hand and to integrate it into their motor plans. The model explained this tendency as a decreased capability to detect visuo-proprioceptive incongruences, in proportion to the degree of proprioceptive deficit. Lesion analysis further revealed that BO alterations not fully explained by the proprioceptive deficit are linked to frontoparietal network damage, suggesting a disruption of higher-level multisensory integration.
CONCLUSIONS: Collectively, our results show that BO alterations in stroke patients can be quantitatively predicted and explained within a computational framework as the result of sensory loss and higher-level multisensory integration deficits.
FUNDING: Swiss National Science Foundation (163951).
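The causal inference logic summarized in this abstract can be illustrated with a short sketch. The code below is not the authors' model; it is a minimal, generic Bayesian causal-inference scheme in which the probability that the seen virtual hand and the felt real hand share a common cause is computed from their disparity and then used to scale the visual correction of the reach. All parameter names (sigma_v, sigma_p, sigma_prior) and the Gaussian assumptions are illustrative.

import numpy as np

def p_common_cause(disparity, sigma_v, sigma_p, sigma_prior, p_prior=0.5):
    # Posterior probability that the seen (virtual) and felt (real) hand share
    # a common cause. Under a common cause the visuo-proprioceptive disparity
    # is distributed as N(0, sigma_v^2 + sigma_p^2); under independent causes
    # (independent Gaussian priors on the two positions) its spread widens by
    # 2 * sigma_prior^2. These distributional choices are assumptions.
    var_c = sigma_v**2 + sigma_p**2
    var_i = sigma_v**2 + sigma_p**2 + 2 * sigma_prior**2
    like_c = np.exp(-disparity**2 / (2 * var_c)) / np.sqrt(2 * np.pi * var_c)
    like_i = np.exp(-disparity**2 / (2 * var_i)) / np.sqrt(2 * np.pi * var_i)
    return p_prior * like_c / (p_prior * like_c + (1 - p_prior) * like_i)

def visual_reach_correction(disparity, sigma_v, sigma_p, sigma_prior):
    # The inferred ownership probability scales a reliability-weighted visual
    # correction: the more the virtual hand is "owned", the more the reach is
    # pulled toward the seen (displaced) hand position.
    p_own = p_common_cause(disparity, sigma_v, sigma_p, sigma_prior)
    visual_weight = sigma_p**2 / (sigma_v**2 + sigma_p**2)
    return p_own * visual_weight * disparity

# Increasing proprioceptive noise (sigma_p), as after a sensory stroke, raises
# the common-cause probability for the same disparity, i.e. more illusory
# ownership and a larger deviation of the reach.
print(visual_reach_correction(4.0, sigma_v=0.5, sigma_p=1.0, sigma_prior=10.0))
print(visual_reach_correction(4.0, sigma_v=0.5, sigma_p=3.0, sigma_prior=10.0))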
Affiliation(s)
- Giulio Mastria: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
- Tommaso Bertoni: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
- Henri Perrin: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
- Nikita Akulenko: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
- Gaia Risso: Institute of Health, School of Health Sciences, HES-SO Valais-Wallis, 1950 Sion, Switzerland; The Sense Innovation & Research Center, 1950 Sion and Lausanne, Switzerland
- Michel Akselrod: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
- Eleonora Guanziroli: Valduce Hospital "Villa Beretta" Rehabilitation Center, 23845 Costa Masnaga, Italy
- Franco Molteni: Valduce Hospital "Villa Beretta" Rehabilitation Center, 23845 Costa Masnaga, Italy
- Patric Hagmann: Connectomics Lab, Department of Radiology, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
- Michela Bassolino: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland; Institute of Health, School of Health Sciences, HES-SO Valais-Wallis, 1950 Sion, Switzerland; The Sense Innovation & Research Center, 1950 Sion and Lausanne, Switzerland
- Andrea Serino: MySpace Lab, Department of Clinical Neurosciences, Lausanne University Hospital, University of Lausanne, 1011 Lausanne, Switzerland
2. Aurucci GV, Preatoni G, Risso G, Raspopovic S. Amputees but not healthy subjects optimally integrate non-spatially matched visuo-tactile stimuli. iScience 2025; 28:111685. PMID: 39886468. PMCID: PMC11780163. DOI: 10.1016/j.isci.2024.111685.
Abstract
Our brain combines sensory inputs into a unified percept, which is enhanced when stimuli originate from the same location. Following amputation, distorted body representations may disrupt visuo-tactile integration at the amputated leg. We aimed to uncover the principles guiding optimal and cognitively efficient visuo-tactile integration at both the intact and the amputated leg. To this end, we designed a VR electro-stimulating platform to assess the functional and cognitive correlates of visuo-tactile integration in two amputees and sixteen healthy subjects performing a two-alternative forced choice (2AFC) task. Amputees optimally integrated non-spatially matched stimuli at the amputated leg but not at the intact leg (tactile cue at the stump/thigh and visual cue under the virtual foot), whereas healthy controls integrated only spatially matched visuo-tactile stimuli. Optimal integration also reduced 2AFC reaction times and was corroborated by a reduction in EEG-based mental workload. These findings offer insights into multisensory integration processes and open new perspectives on brain plasticity in amputees.
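As a point of reference for what "optimal integration" means operationally, here is a generic maximum-likelihood cue-combination sketch (the standard benchmark in this literature, not necessarily the exact analysis used in the paper): the predicted precision of the combined visuo-tactile estimate is derived from the unimodal thresholds, and optimality is assessed by comparing measured bimodal thresholds against this prediction. Variable names and example values are assumptions.

import numpy as np

def predicted_bimodal_sigma(sigma_v, sigma_t):
    # Maximum-likelihood prediction for the combined visuo-tactile estimate:
    # reliabilities (inverse variances) add, so the combined sigma is always
    # below the best unimodal sigma.
    return np.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))

def cue_weights(sigma_v, sigma_t):
    # Relative weight given to vision under optimal integration; touch gets
    # the complement.
    w_v = (1.0 / sigma_v**2) / (1.0 / sigma_v**2 + 1.0 / sigma_t**2)
    return w_v, 1.0 - w_v

# Hypothetical unimodal discrimination thresholds from 2AFC psychometric fits.
sigma_vision, sigma_touch = 1.2, 2.0
print(predicted_bimodal_sigma(sigma_vision, sigma_touch))  # ~1.03, below 1.2
print(cue_weights(sigma_vision, sigma_touch))              # vision weighted more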
Affiliation(s)
- Giuseppe Valerio Aurucci: Laboratory for Neuroengineering, Department of Health Science and Technology, Institute for Robotics and Intelligent Systems, ETH Zürich, 8092 Zürich, Switzerland
- Greta Preatoni: Laboratory for Neuroengineering, Department of Health Science and Technology, Institute for Robotics and Intelligent Systems, ETH Zürich, 8092 Zürich, Switzerland
- Gaia Risso: Institute of Health, School of Health Sciences, HES-SO Valais-Wallis, 1950 Sion, Switzerland; The Sense Innovation & Research Center, 1950 Sion and Lausanne, Switzerland
- Stanisa Raspopovic: Laboratory for Neuroengineering, Department of Health Science and Technology, Institute for Robotics and Intelligent Systems, ETH Zürich, 8092 Zürich, Switzerland
3. Errante A, Rossi Sebastiano A, Castellani N, Rozzi S, Fogassi L, Garbarini F. Shared body representation constraints in human and non-human primates behavior. Cortex 2024; 181:179-193. PMID: 39550836. DOI: 10.1016/j.cortex.2024.10.011.
Abstract
Previous studies have indicated that the sense of body ownership (SBO; i.e., the feeling that our body parts belong to us) can be experimentally modulated in humans. Here, we examined SBO from a cross-species perspective, investigating whether the bottom-up and top-down constraints that allow SBO to be built in humans also operate in monkeys. To this aim, one monkey and a cohort of humans (N = 20) performed a paradigm combining the well-known rubber hand illusion (RHI), which induces embodiment of a fake hand, with a hand-identification reaching task borrowed from the clinical evaluation of patients with SBO disorders. The task consisted of reaching one's own hand with the other hand while a fake hand was presented in different conditions controlling for bottom-up (synchronicity of the visuo-tactile stimulation) and top-down (congruency of the fake hand position relative to the monkey's body) SBO constraints. Spatiotemporal kinematic features of these self-directed movements were measured. Our results show that, when the monkey aimed at its own hand, the trajectory of self-directed movements was attracted by the position of the hand believed to be its own (i.e., the fake hand), as in humans. Interestingly, this effect was present only when both bottom-up and top-down constraints were met. Moreover, in the monkey, besides the displacement of the movement trajectory, other kinematic parameters (velocity peak, deceleration phase) also showed sensitivity to the embodiment effect. Overall, if replicated in a larger sample of monkeys, these results would support the view that human and non-human primates share similar body representation constraints, which modulate motor behavior in both species.
Affiliation(s)
- A Errante: Department of Medicine and Surgery, University of Parma, Parma, Italy
- N Castellani: MANIBUS Lab, Psychology Department, University of Turin, Turin, Italy; MoMiLab, IMT School for Advanced Studies, Lucca, Italy
- S Rozzi: Department of Medicine and Surgery, University of Parma, Parma, Italy
- L Fogassi: Department of Medicine and Surgery, University of Parma, Parma, Italy
- F Garbarini: MANIBUS Lab, Psychology Department, University of Turin, Turin, Italy; Neuroscience Institute of Turin (NIT), Turin, Italy
4. Limanowski J, Adams RA, Kilner J, Parr T. The Many Roles of Precision in Action. Entropy (Basel) 2024; 26:790. PMID: 39330123. PMCID: PMC11431491. DOI: 10.3390/e26090790.
Abstract
Active inference describes (Bayes-optimal) behaviour as motivated by the minimisation of the surprise of one's sensory observations, through the optimisation of a generative model (of the hidden causes of one's sensory data) in the brain. One of active inference's key appeals is its conceptualisation of precision as biasing neuronal communication and, thus, inference within generative models. The importance of precision in perceptual inference is evident: many studies have demonstrated that correct precision estimates are essential for normal (healthy) sensation and perception. Here, we highlight the many roles precision plays in action, i.e., the key processes that rely on adequate precision estimates, from decision making and action planning to the initiation and control of muscle movement itself. We focus on the recent development of hierarchical, "mixed" models: generative models spanning multiple levels of discrete and continuous inference. These kinds of models open up new perspectives on a unified description of hierarchical computation, and its implementation, in action. We show how such models reflect the many roles of precision in action, from planning to execution, and the pathologies that arise when precision estimation goes wrong. We also discuss the potential biological implementation of the associated message passing, focusing on the role of neuromodulatory systems in mediating different kinds of precision.
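To make the role of precision concrete, the snippet below gives a deliberately minimal, continuous-state sketch of precision-weighted belief updating (a single level, Gaussian assumptions); the hierarchical "mixed" models discussed in the paper are far richer, and the parameter values here are purely illustrative.

def precision_weighted_update(mu, sensory_obs, prior_pred, pi_sensory, pi_prior, lr=0.1):
    # One gradient step on a Gaussian free energy: the belief mu is driven by
    # two prediction errors, each weighted by its precision. High sensory
    # precision lets the input dominate; high prior precision attenuates it.
    eps_sensory = sensory_obs - mu   # bottom-up (sensory) prediction error
    eps_prior = prior_pred - mu      # top-down (prior) prediction error
    return mu + lr * (pi_sensory * eps_sensory + pi_prior * eps_prior)

mu = 0.0
for _ in range(100):
    mu = precision_weighted_update(mu, sensory_obs=1.0, prior_pred=0.0,
                                   pi_sensory=4.0, pi_prior=1.0)
print(mu)  # converges to the precision-weighted average (4*1 + 1*0)/5 = 0.8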
Affiliation(s)
- Jakub Limanowski: Institute of Psychology, University of Greifswald, 17487 Greifswald, Germany
- Rick A. Adams: Institute of Cognitive Neuroscience, University College London, London WC1N 3AZ, UK; Centre for Medical Image Computing, University College London, London WC1N 6LJ, UK
- James Kilner: Institute of Cognitive Neuroscience, University College London, London WC1N 3AZ, UK
- Thomas Parr: Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX1 4AL, UK
5. Peviani VC, Miller LE, Medendorp WP. Biases in hand perception are driven by somatosensory computations, not a distorted hand model. Curr Biol 2024; 34:2238-2246.e5. PMID: 38718799. DOI: 10.1016/j.cub.2024.04.010.
Abstract
To sense and interact with objects in the environment, we effortlessly configure our fingertips at desired locations. It is therefore reasonable to assume that the underlying control mechanisms rely on accurate knowledge about the structure and spatial dimensions of our hand and fingers. This intuition, however, is challenged by years of research showing drastic biases in the perception of finger geometry [1-5]. This perceptual bias has been taken as evidence that the brain's internal representation of the body's geometry is distorted [6], leading to an apparent paradox regarding the skillfulness of our actions [7]. Here, we propose an alternative explanation of the biases in hand perception: they are the result of the Bayesian integration of noisy, but unbiased, somatosensory signals about finger geometry and posture. To address this hypothesis, we combined Bayesian reverse engineering with behavioral experimentation on joint and fingertip localization of the index finger. We modeled the Bayesian integration either in sensory or in space-based coordinates, showing that the latter model variant led to biases in finger perception despite accurate representation of finger length. Behavioral measures of joint and fingertip localization responses showed similar biases, which were well fitted by the space-based, but not the sensory-based, model variant. The space-based model variant also outperformed a distorted hand model with built-in geometric biases. In total, our results suggest that perceptual distortions of finger geometry do not reflect a distorted hand model but originate from near-optimal Bayesian inference on somatosensory signals.
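A toy version of the mechanism described here, under assumptions of our own (a single Gaussian prior over fingertip location in external space and an unbiased, noisy sensory estimate), shows how purely space-based Bayesian shrinkage can produce systematic localization biases without any distortion of the represented finger length. It is a sketch of the general idea, not the authors' model variant.

import numpy as np

rng = np.random.default_rng(0)

def mean_localization_bias(true_pos, sigma_sense, prior_mean, sigma_prior, n=100000):
    # Posterior-mean localization: an unbiased but noisy spatial estimate is
    # combined with a Gaussian prior over plausible locations, shrinking
    # responses toward the prior mean and producing a systematic bias even
    # though the sensory signal itself is unbiased.
    obs = true_pos + rng.normal(0.0, sigma_sense, n)
    w_obs = sigma_prior**2 / (sigma_prior**2 + sigma_sense**2)
    estimates = w_obs * obs + (1.0 - w_obs) * prior_mean
    return estimates.mean() - true_pos

# Positions farther from the prior mean (e.g. the fingertip vs. the knuckle,
# in cm along the finger axis) acquire larger biases, so the perceived
# fingertip-to-knuckle distance shrinks even though each signal is unbiased.
print(mean_localization_bias(true_pos=3.0, sigma_sense=2.0, prior_mean=5.0, sigma_prior=3.0))
print(mean_localization_bias(true_pos=9.0, sigma_sense=2.0, prior_mean=5.0, sigma_prior=3.0))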
Affiliation(s)
- Valeria C Peviani: Donders Institute for Cognition and Behavior, Radboud University, Nijmegen 6525 GD, the Netherlands
- Luke E Miller: Donders Institute for Cognition and Behavior, Radboud University, Nijmegen 6525 GD, the Netherlands
- W Pieter Medendorp: Donders Institute for Cognition and Behavior, Radboud University, Nijmegen 6525 GD, the Netherlands
6. O'Kane SH, Chancel M, Ehrsson HH. Hierarchical and dynamic relationships between body part ownership and full-body ownership. Cognition 2024; 246:105697. PMID: 38364444. DOI: 10.1016/j.cognition.2023.105697.
Abstract
What is the relationship between experiencing individual body parts and the whole body as one's own? We theorised that body part ownership is driven primarily by the perceptual binding of visual and somatosensory signals from specific body parts, whereas full-body ownership depends on a more global binding process based on multisensory information from several body segments. To examine this hypothesis, we used a bodily illusion and asked participants to rate illusory changes in ownership over five different parts of a mannequin's body and the mannequin as a whole, while we manipulated the synchrony or asynchrony of visual and tactile stimuli delivered to three different body parts. We found that body part ownership was driven primarily by local visuotactile synchrony and could be experienced relatively independently of full-body ownership. Full-body ownership depended on the number of synchronously stimulated parts in a nonlinear manner, with the strongest full-body ownership illusion occurring when all parts received synchronous stimulation. Additionally, full-body ownership influenced body part ownership for nonstimulated body parts, and skin conductance responses provided physiological evidence supporting an interaction between body part and full-body ownership. We conclude that body part and full-body ownership correspond to different processes and propose a hierarchical probabilistic model to explain the relationship between part and whole in the context of multisensory awareness of one's own body.
Affiliation(s)
- Sophie H O'Kane: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Marie Chancel: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden; Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, 38000 Grenoble, France
- H Henrik Ehrsson: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
7. Hapuarachchi H, Ishimoto H, Kitazaki M. Temporal visuomotor synchrony induces embodiment towards an avatar with biomechanically impossible arm movements. Iperception 2023; 14:20416695231211699. PMID: 37969571. PMCID: PMC10631331. DOI: 10.1177/20416695231211699.
Abstract
Visuomotor synchrony in time and space induces a sense of embodiment towards virtual bodies experienced in first-person view using virtual reality (VR). Here, we investigated whether temporal visuomotor synchrony affects avatar embodiment even when the movements of the virtual arms are spatially altered from those of the user in a non-human-like manner. In a within-subjects VR experiment, participants performed a reaching task controlling an avatar whose lower arms bent from the elbow joints in inverted, biomechanically impossible directions. They performed the reaching task with this "unnatural avatar" as well as with a "natural avatar" whose arm movements and positions spatially matched those of the user. The reaching tasks were performed with and without a one-second delay between the real and virtual movements. While the senses of body ownership and agency towards the unnatural avatar were significantly lower than those towards the natural avatar, temporal visuomotor synchrony significantly increased the sense of embodiment towards both the unnatural and the natural avatar. These results suggest that temporal visuomotor synchrony is crucial for inducing embodiment even when the spatial match between the real and virtual limbs is disrupted by movements outside the pre-existing cognitive representations of the human body.
Affiliation(s)
- Harin Hapuarachchi: Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
- Hiroki Ishimoto: Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
- Michiteru Kitazaki: Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
8. Carranza E, Bertoni T, Mastria G, Boos A, Bassolino M, Serino A, Pirondini E. Feasibility and Validation of a Robotic-Based Multisensory Integration Assessment in Healthy Controls and a Stroke Patient. IEEE Int Conf Rehabil Robot 2023; 2023:1-6. PMID: 37941286. DOI: 10.1109/icorr58425.2023.10304735.
Abstract
After experiencing brain damage, stroke patients commonly suffer from motor and sensory impairments that impact their ability to perform volitional movements. Visuo-proprioceptive integration is a critical component of voluntary movement, allowing for accurate movements and a sense of ownership over one's body. While recent studies have increased our understanding of the balance between visual compensation and proprioceptive deficits in stroke patients, quantitative methods for studying multisensory integration are still lacking. This study aimed to evaluate the feasibility of adapting a 3D visuo-proprioceptive disparity (VPD) task into a 2D setting using an upper-limb robotic platform for moderate to severe chronic stroke patients. To assess the implementation of the 2D task, a cohort of unimpaired healthy participants performed the VPD task in both a 3D and 2D environment. We used a computational Bayesian model to predict errors in visuo-proprioceptive integration and compared the model's predictions to real behavioral data. Our findings indicated that the behavioral trends observed in the 2D and 3D tasks were similar, and the model accurately predicted behavior. We then evaluated the feasibility of our task to assess post-stroke deficits in a patient with severe motor and sensory deficits. Ultimately, this work may help to improve our understanding of the significance of visuo-proprioceptive integration and aid in the development of better rehabilitation therapies for improving sensorimotor outcomes in stroke patients.