1. De Pastina R, Chiarella SG, Simione L, Raffone A, Pazzaglia M. The remapping of peripersonal space after stroke, spinal cord injury and amputation: A PRISMA systematic review. Neurosci Biobehav Rev 2025; 173:106168. PMID: 40252881; DOI: 10.1016/j.neubiorev.2025.106168.
Abstract
Peripersonal space (PPS) is the body-centered area where interactions occur and objects can be reached. Its boundaries are dynamic, modulated by ongoing sensorimotor experiences: limb immobilization shrinks PPS, whereas tool use expands it. However, consistent clinical information on PPS alterations remains limited due to methodological heterogeneity, varying types and severities of sensorimotor disorders, and diverse experimental paradigms. This review explores the causal mechanisms of PPS processing by integrating findings from brain-lesioned patients and individuals with body deafferentation, such as amputees and spinal cord injury (SCI) patients. By comparing the effects of brain lesions and sensorimotor deafferentation, it clarifies how PPS is encoded, maintained, and reorganized following central nervous system damage, bodily changes, and the use of assistive devices. A systematic search of Scopus, Web of Science, and PubMed identified 17 studies: 4 on stroke patients (N = 100), 6 on SCI patients (N = 104), and 7 on amputees (N = 65). Risk of bias was assessed using the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies. Despite the limited number of studies and methodological variability, findings consistently show that sensorimotor changes significantly affect PPS. Notably, a contraction of PPS around the affected limb was observed in stroke patients, SCI patients, and amputees. Assistive devices were able to restore PPS after training, or even immediately in the case of prosthesis use. A shared neurophysiological mechanism across these conditions may underlie PPS as an online construct, continuously updated to reflect the body's current state and its interaction with the environment.
Affiliation(s)
- Riccardo De Pastina: Dipartimento di Psicologia, Università di Roma "Sapienza", Rome 00185, Italy
- Salvatore Gaetano Chiarella: International School for Advanced Studies (SISSA), Trieste 34136, Italy; Dipartimento di Scienze Umanistiche e Sociali Internazionali, UNINT, Università degli Studi Internazionali di Roma, Rome 00147, Italy
- Luca Simione: Dipartimento di Scienze Umanistiche e Sociali Internazionali, UNINT, Università degli Studi Internazionali di Roma, Rome 00147, Italy; Institute of Cognitive Sciences and Technologies (ISTC), National Research Council (CNR), Rome 00185, Italy
- Antonino Raffone: Dipartimento di Psicologia, Università di Roma "Sapienza", Rome 00185, Italy
- Mariella Pazzaglia: Dipartimento di Psicologia, Università di Roma "Sapienza", Rome 00185, Italy; Body and Action Lab, IRCCS Fondazione Santa Lucia, Rome 00179, Italy
2. Choi I, Lee SH. Locomotion-dependent auditory gating to the parietal cortex guides multisensory decisions. Nat Commun 2025; 16:2308. PMID: 40055344; PMCID: PMC11889129; DOI: 10.1038/s41467-025-57347-y.
Abstract
Decision-making in mammals fundamentally relies on integrating multiple sensory inputs, with conflicting information resolved flexibly based on a dominant sensory modality. However, the neural mechanisms underlying state-dependent changes in sensory dominance remain poorly understood. Our study demonstrates that locomotion in mice shifts auditory-dominant decisions toward visual dominance during audiovisual conflicts. Using circuit-specific calcium imaging and optogenetic manipulations, we found that weakened visual representation in the posterior parietal cortex (PPC) leads to auditory-dominant decisions in stationary mice. Prolonged locomotion, however, promotes visual dominance by inhibiting auditory cortical neurons projecting to the PPC (ACPPC). This shift is mediated by secondary motor cortical neurons projecting to the auditory cortex (M2AC), which specifically inhibit ACPPC neurons without affecting auditory cortical projections to the striatum (ACSTR). Our findings reveal the neural circuit mechanisms underlying auditory gating to the association cortex depending on locomotion states, providing insights into the state-dependent changes in sensory dominance during multisensory decision-making.
Affiliation(s)
- Ilsong Choi: Center for Synaptic Brain Dysfunctions, IBS, Daejeon 34141, Republic of Korea
- Seung-Hee Lee: Center for Synaptic Brain Dysfunctions, IBS, Daejeon 34141, Republic of Korea; Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
3. Heurley LP, Obrecht L, Vanborren H, Touzard F, Brouillet T. The prediction-confirmation account of the sense of body ownership: Evidence from a rubber hand illusion paradigm. Psychon Bull Rev 2025; 32:442-451. PMID: 39105938; DOI: 10.3758/s13423-024-02553-w.
Abstract
We investigated the contribution of multisensory predictions to body ownership and, beyond that, to the integration of body-related signals. Contrary to the prevailing idea that, to be integrated, cues must necessarily be perceived simultaneously, we proposed the prediction-confirmation account. According to this account, a perceived cue can be integrated with a predicted cue as long as both signals are relatively simultaneous. To test this hypothesis, a standard rubber hand illusion (RHI) paradigm was used. In the first part of each trial, the illusion was induced while participants observed the rubber hand being touched with a paintbrush. In the subsequent part of the trial, (i) both the rubber hand and the participant's real hand were stroked as before (i.e., visible/synchronous condition), (ii) the rubber hand was no longer stroked (i.e., visible/tactile-only condition), or (iii) both the rubber hand and the participant's real hand were synchronously stroked while the location where the rubber hand was touched was occluded (i.e., occluded/synchronous condition). In this latter condition, however, participants still perceived the approaching movement of the paintbrush. Thus, based on this visual cue, participants could properly predict the timepoint at which the tactile cue should occur (i.e., visuotactile predictions). Our major finding was that, relative to the visible/tactile-only condition, the occluded/synchronous condition did not exhibit a decrease of the RHI, just as the visible/synchronous condition did not. This finding supports the prediction-confirmation account and suggests that this mechanism operates even in the standard version of the RHI.
Affiliation(s)
- Loïc P Heurley: Laboratoire sur les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 avenue de La République, 92001 Nanterre Cedex, France
- Léa Obrecht: Laboratoire sur les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 avenue de La République, 92001 Nanterre Cedex, France
- Hélène Vanborren: Laboratoire sur les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 avenue de La République, 92001 Nanterre Cedex, France
- Fleur Touzard: Laboratoire sur les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 avenue de La République, 92001 Nanterre Cedex, France
- Thibaut Brouillet: Laboratoire sur les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 avenue de La République, 92001 Nanterre Cedex, France
4. Froesel M, Gacoin M, Clavagnier S, Hauser M, Goudard Q, Ben Hamed S. Macaque claustrum, pulvinar and putative dorsolateral amygdala support the cross-modal association of social audio-visual stimuli based on meaning. Eur J Neurosci 2024; 59:3203-3223. PMID: 38637993; DOI: 10.1111/ejn.16328.
Abstract
Social communication draws on several cognitive functions such as perception, emotion recognition and attention. The association of audio-visual information is essential to the processing of species-specific communication signals. In this study, we use functional magnetic resonance imaging in order to identify the subcortical areas involved in the cross-modal association of visual and auditory information based on their common social meaning. We identified three subcortical regions involved in audio-visual processing of species-specific communicative signals: the dorsolateral amygdala, the claustrum and the pulvinar. These regions responded to visual, auditory congruent and audio-visual stimulations. However, none of them was significantly activated when the auditory stimuli were semantically incongruent with the visual context, thus showing an influence of visual context on auditory processing. For example, positive vocalizations (coos) activated the three subcortical regions when presented in the context of positive facial expressions (lipsmacks) but not when presented in the context of negative facial expressions (aggressive faces). In addition, the medial pulvinar and the amygdala showed multisensory integration, such that audiovisual stimuli resulted in activations that were significantly higher than those observed for the highest unimodal response. Last, the pulvinar responded in a task-dependent manner, along a specific spatial sensory gradient. We propose that the dorsolateral amygdala, the claustrum and the pulvinar belong to a multisensory network that modulates the perception of visual socioemotional information and vocalizations as a function of the relevance of the stimuli in the social context. SIGNIFICANCE STATEMENT: Understanding and correctly associating socioemotional information across sensory modalities, such that happy faces predict laughter and escape scenes predict screams, is essential when living in complex social groups. With the use of functional magnetic resonance imaging in the awake macaque, we identify three subcortical structures (dorsolateral amygdala, claustrum and pulvinar) that only respond to auditory information that matches the ongoing visual socioemotional context, such as hearing positively valenced coo calls and seeing positively valenced mutual grooming monkeys. We additionally describe task-dependent activations in the pulvinar, organizing along a specific spatial sensory gradient, supporting its role as a network regulator.
Affiliation(s)
- Mathilda Froesel: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS, Université de Lyon, Bron Cedex, France
- Maëva Gacoin: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS, Université de Lyon, Bron Cedex, France
- Simon Clavagnier: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS, Université de Lyon, Bron Cedex, France
- Marc Hauser: Risk-Eraser, West Falmouth, Massachusetts, USA
- Quentin Goudard: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS, Université de Lyon, Bron Cedex, France
- Suliann Ben Hamed: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS, Université de Lyon, Bron Cedex, France
5. Héroux ME, Fisher G, Axelson LH, Butler AA, Gandevia SC. How we perceive the width of grasped objects: Insights into the central processes that govern proprioceptive judgements. J Physiol 2024; 602:2899-2916. PMID: 38734987; DOI: 10.1113/jp286322.
Abstract
Low-level proprioceptive judgements involve a single frame of reference, whereas high-level proprioceptive judgements are made across different frames of reference. The present study systematically compared low-level (grasp → grasp) and high-level (vision → grasp, grasp → vision) proprioceptive tasks, and quantified the consistency of grasp → vision judgements and the possible reciprocal nature of related high-level proprioceptive tasks. Experiment 1 (n = 30) compared performance across vision → grasp, grasp → vision and grasp → grasp tasks. Experiment 2 (n = 30) compared performance on the grasp → vision task between hands and over time. Participants were accurate (mean absolute error 0.27 cm [0.20 to 0.34]; mean [95% CI]) and precise (R² = 0.95 [0.93 to 0.96]) for grasp → grasp judgements, with a strong correlation between outcomes (r = -0.85 [-0.93 to -0.70]). Accuracy and precision decreased in the two high-level tasks (R² = 0.86 and 0.89; mean absolute error = 1.34 and 1.41 cm), with most participants overestimating perceived width in the vision → grasp task and underestimating it in the grasp → vision task. There was minimal correlation between accuracy and precision for these two tasks. Converging evidence indicated performance was largely reciprocal (inverse) between the vision → grasp and grasp → vision tasks. Performance on the grasp → vision task was consistent between dominant and non-dominant hands, and across repeated sessions a day or week apart. Overall, there are fundamental differences between low- and high-level proprioceptive judgements that reflect fundamental differences in the cortical processes that underpin these perceptions. Moreover, the central transformations that govern high-level proprioceptive judgements of grasp are personalised, stable and reciprocal for reciprocal tasks. KEY POINTS: Low-level proprioceptive judgements involve a single frame of reference (e.g. indicating the width of a grasped object by selecting from a series of objects of different width), whereas high-level proprioceptive judgements are made across different frames of reference (e.g. indicating the width of a grasped object by selecting from a series of visible lines of different length). We highlight fundamental differences in the precision and accuracy of low- and high-level proprioceptive judgements. We provide converging evidence that the neural transformations between frames of reference that govern high-level proprioceptive judgements of grasp are personalised, stable and reciprocal for reciprocal tasks. This stability is likely key to precise judgements and accurate predictions in high-level proprioception.
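A minimal sketch of the two outcome measures quoted above, under the assumption that accuracy is the mean absolute error between judged and actual widths and precision is the R² of a linear fit of judged on actual widths; the example values are hypothetical and the authors' exact pipeline may differ:

```python
import numpy as np

def accuracy_and_precision(actual_cm, judged_cm):
    """Return (mean absolute error, R^2 of a linear fit of judged on actual)."""
    actual = np.asarray(actual_cm, dtype=float)
    judged = np.asarray(judged_cm, dtype=float)
    mae = np.mean(np.abs(judged - actual))            # accuracy: smaller is better
    slope, intercept = np.polyfit(actual, judged, 1)  # linear mapping between frames
    predicted = slope * actual + intercept
    ss_res = np.sum((judged - predicted) ** 2)
    ss_tot = np.sum((judged - np.mean(judged)) ** 2)
    r_squared = 1.0 - ss_res / ss_tot                 # precision: closer to 1 is better
    return mae, r_squared

# Hypothetical participant judging five object widths (cm)
actual = [2.0, 3.0, 4.0, 5.0, 6.0]
judged = [2.3, 3.2, 4.4, 5.1, 6.5]   # slight overestimation, as in vision -> grasp
print(accuracy_and_precision(actual, judged))
```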
Affiliation(s)
- Martin E Héroux: Neuroscience Research Australia, Randwick, Australia; University of New South Wales, Sydney, Australia
- Georgia Fisher: Neuroscience Research Australia, Randwick, Australia; Australian Institute of Health Innovation, Macquarie University, Macquarie Park, Australia
- Annie A Butler: Neuroscience Research Australia, Randwick, Australia; University of New South Wales, Sydney, Australia
- Simon C Gandevia: Neuroscience Research Australia, Randwick, Australia; University of New South Wales, Sydney, Australia
6. Stocke S, Samuelsen CL. Multisensory Integration Underlies the Distinct Representation of Odor-Taste Mixtures in the Gustatory Cortex of Behaving Rats. J Neurosci 2024; 44:e0071242024. PMID: 38548337; PMCID: PMC11097261; DOI: 10.1523/jneurosci.0071-24.2024.
Abstract
The perception of food relies on the integration of olfactory and gustatory signals originating from the mouth. This multisensory process generates robust associations between odors and tastes, significantly influencing the perceptual judgment of flavors. However, the specific neural substrates underlying this integrative process remain unclear. Previous electrophysiological studies identified the gustatory cortex as a site of convergent olfactory and gustatory signals, but whether neurons represent multimodal odor-taste mixtures as distinct from their unimodal odor and taste components is unknown. To investigate this, we recorded single-unit activity in the gustatory cortex of behaving female rats during the intraoral delivery of individual odors, individual tastes, and odor-taste mixtures. Our results demonstrate that chemoselective neurons in the gustatory cortex are broadly responsive to intraoral chemosensory stimuli, exhibiting time-varying multiphasic changes in activity. In a subset of these chemoselective neurons, odor-taste mixtures elicit nonlinear cross-modal responses that distinguish them from their olfactory and gustatory components. These findings provide novel insights into multimodal chemosensory processing by the gustatory cortex, highlighting the distinct representation of unimodal and multimodal intraoral chemosensory signals. Overall, our findings suggest that olfactory and gustatory signals interact nonlinearly in the gustatory cortex to enhance the identity coding of both unimodal and multimodal chemosensory stimuli.
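As a rough illustration of what a "nonlinear cross-modal response" can mean at the single-neuron level, the sketch below compares a neuron's response to an odor-taste mixture against the sum of its unimodal responses using a generic additivity-style index; the index and the firing rates are illustrative conventions, not the analysis reported in the study:

```python
def crossmodal_interaction_index(rate_mixture, rate_odor, rate_taste):
    """Positive -> supra-additive, negative -> sub-additive, ~0 -> additive.
    Inputs are baseline-subtracted firing rates (spikes/s)."""
    linear_sum = rate_odor + rate_taste
    return (rate_mixture - linear_sum) / (abs(rate_mixture) + abs(linear_sum) + 1e-9)

# Hypothetical baseline-subtracted responses of one chemoselective neuron
odor, taste, mixture = 4.0, 6.0, 14.5   # spikes/s
print(crossmodal_interaction_index(mixture, odor, taste))  # > 0: supra-additive
```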
Affiliation(s)
- Sanaya Stocke: Department of Biology, University of Louisville, Louisville, Kentucky 40292
- Chad L Samuelsen: Anatomical Sciences and Neurobiology, University of Louisville, Louisville, Kentucky 40292
7. Xu W, Li X, Parviainen T, Nokia M. Neural correlates of retrospective memory confidence during face-name associative learning. Cereb Cortex 2024; 34:bhae194. PMID: 38801420; PMCID: PMC11411154; DOI: 10.1093/cercor/bhae194.
Abstract
The ability to accurately assess one's own memory performance during learning is essential for adaptive behavior, but the brain mechanisms underlying this metamemory function are not well understood. We investigated the neural correlates of memory accuracy and retrospective memory confidence in a face-name associative learning task using magnetoencephalography in healthy young adults (n = 32). We found that high retrospective confidence was associated with stronger occipital event-related fields during encoding and widespread event-related fields during retrieval compared to low confidence. On the other hand, memory accuracy was linked to medial temporal activities during both encoding and retrieval, but only in low-confidence trials. A decrease in oscillatory power at alpha/beta bands in the parietal regions during retrieval was associated with higher memory confidence. In addition, representational similarity analysis at the single-trial level revealed distributed but differentiable neural activities associated with memory accuracy and confidence during both encoding and retrieval. In summary, our study unveiled distinct neural activity patterns related to memory confidence and accuracy during associative learning and underscored the crucial role of parietal regions in metamemory.
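For readers unfamiliar with representational similarity analysis (RSA), the sketch below shows the generic single-trial logic: build a representational dissimilarity matrix (RDM) from trial-wise activity patterns and rank-correlate it with a model RDM coding a variable of interest (here, a hypothetical confidence label). This is a textbook-style illustration on simulated data, not the authors' pipeline:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: 40 trials x 60 sensors with a weak confidence-related signal
confidence = rng.integers(0, 2, size=40)              # 0 = low, 1 = high confidence
patterns = rng.normal(size=(40, 60)) + 0.8 * confidence[:, None]

# Neural RDM: correlation distance between every pair of trial patterns
neural_rdm = squareform(pdist(patterns, metric="correlation"))

# Model RDM: trials sharing a confidence label are predicted to be similar
model_rdm = (confidence[:, None] != confidence[None, :]).astype(float)

# Compare the lower triangles of the two RDMs (rank correlation)
tri = np.triu_indices_from(neural_rdm, k=1)
rho, p = spearmanr(neural_rdm[tri], model_rdm[tri])
print(f"RDM correlation: rho={rho:.2f}, p={p:.3g}")
```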
Affiliation(s)
- Weiyong Xu: Department of Psychology, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland
- Xueqiao Li: Department of Psychology, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland
- Tiina Parviainen: Department of Psychology, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland
- Miriam Nokia: Department of Psychology, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, University of Jyväskylä, Mattilanniemi 6, 40014 Jyväskylä, Finland
8. Schnepel P, Paricio-Montesinos R, Ezquerra-Romano I, Haggard P, Poulet JFA. Cortical cellular encoding of thermotactile integration. Curr Biol 2024; 34:1718-1730.e3. PMID: 38582078; DOI: 10.1016/j.cub.2024.03.018.
Abstract
Recent evidence suggests that primary sensory cortical regions play a role in the integration of information from multiple sensory modalities. How primary cortical neurons integrate different sources of sensory information is unclear, partly because non-primary sensory input to a cortical sensory region is often weak or modulatory. To address this question, we take advantage of the robust representation of thermal (cooling) and tactile stimuli in mouse forelimb primary somatosensory cortex (fS1). Using a thermotactile detection task, we show that the perception of threshold-level cool or tactile information is enhanced when the two are presented simultaneously, compared with either presented alone. To investigate the cortical cellular correlates of thermotactile integration, we performed in vivo extracellular recordings from fS1 in awake resting and anesthetized mice during unimodal and bimodal stimulation of the forepaw. Unimodal stimulation evoked thermal- or tactile-specific excitatory and inhibitory responses of fS1 neurons. The most prominent features of combined thermotactile stimulation are the recruitment of unimodally silent fS1 neurons, non-linear integration features, and response dynamics that favor longer response durations with additional spikes. Together, we identify quantitative and qualitative changes in cortical encoding that may underlie the improvement in perception of thermotactile surfaces during haptic exploration.
Affiliation(s)
- Philipp Schnepel: Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Ricardo Paricio-Montesinos: Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Ivan Ezquerra-Romano: Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany; Institute of Cognitive Neuroscience, University College London (UCL), London WC1N 3AZ, UK
- Patrick Haggard: Institute of Cognitive Neuroscience, University College London (UCL), London WC1N 3AZ, UK
- James F A Poulet: Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
9. Geers L, Kozieja P, Coello Y. Multisensory peripersonal space: Visual looming stimuli induce stronger response facilitation to tactile than auditory and visual stimulations. Cortex 2024; 173:222-233. PMID: 38430652; DOI: 10.1016/j.cortex.2024.01.008.
Abstract
Anticipating physical contact with objects in the environment is a key component of efficient motor performance. Peripersonal neurons are thought to play a determinant role in these predictions by enhancing responses to touch when combined with visual stimuli in peripersonal space (PPS). However, recent research challenges the idea that this visuo-tactile integration contributing to the prediction of tactile events occurs strictly in PPS. We hypothesised that enhanced sensory sensitivity in a multisensory context involves not only contact anticipation but also heightened attention towards near-body visual stimuli. To test this hypothesis, Experiment 1 required participants to respond promptly to tactile (probing contact anticipation) and auditory (probing enhanced attention) stimulations presented at different moments of the trajectory of a (social and non-social) looming visual stimulus. Reduction in reaction time as compared to a unisensory baseline was observed from an egocentric distance of around 2 m (inside and outside PPS) for all multisensory conditions and types of visual stimuli. Experiment 2 tested whether these facilitation effects still occur in the absence of a multisensory context, i.e., in a visuo-visual condition. Overall, facilitation effects induced by the looming visual stimulus were comparable in the three sensory modalities outside PPS but were more pronounced for the tactile modality inside PPS (84 cm from the body as estimated by a reachability judgement task). Considered together, the results suggest that facilitation effects induced by visual looming stimuli in multimodal sensory processing rely on the combination of attentional factors and contact anticipation depending on their distance from the body.
Affiliation(s)
- Laurie Geers: Univ. Lille, CNRS, UMR 9193 - SCALab - Sciences Cognitives et Sciences Affectives, Lille, France
- Paul Kozieja: Univ. Lille, CNRS, UMR 9193 - SCALab - Sciences Cognitives et Sciences Affectives, Lille, France
- Yann Coello: Univ. Lille, CNRS, UMR 9193 - SCALab - Sciences Cognitives et Sciences Affectives, Lille, France
10. Zhu Z, Kim B, Doudlah R, Chang TY, Rosenberg A. Differential clustering of visual and choice- and saccade-related activity in macaque V3A and CIP. J Neurophysiol 2024; 131:709-722. PMID: 38478896; PMCID: PMC11305645; DOI: 10.1152/jn.00285.2023.
Abstract
Neurons in sensory and motor cortices tend to aggregate in clusters with similar functional properties. Within the primate dorsal ("where") pathway, an important interface between three-dimensional (3-D) visual processing and motor-related functions consists of two hierarchically organized areas: V3A and the caudal intraparietal (CIP) area. In these areas, 3-D visual information, choice-related activity, and saccade-related activity converge, often at the single-neuron level. Characterizing the clustering of functional properties in areas with mixed selectivity, such as these, may help reveal organizational principles that support sensorimotor transformations. Here we quantified the clustering of visual feature selectivity, choice-related activity, and saccade-related activity by performing correlational and parametric comparisons of the responses of well-isolated, simultaneously recorded neurons in macaque monkeys. Each functional domain showed statistically significant clustering in both areas. However, there were also domain-specific differences in the strength of clustering across the areas. Visual feature selectivity and saccade-related activity were more strongly clustered in V3A than in CIP. In contrast, choice-related activity was more strongly clustered in CIP than in V3A. These differences in clustering may reflect the areas' roles in sensorimotor processing. Stronger clustering of visual and saccade-related activity in V3A may reflect a greater role in within-domain processing, as opposed to cross-domain synthesis. In contrast, stronger clustering of choice-related activity in CIP may reflect a greater role in synthesizing information across functional domains to bridge perception and action.NEW & NOTEWORTHY The occipital and parietal cortices of macaque monkeys are bridged by hierarchically organized areas V3A and CIP. These areas support 3-D visual transformations, carry choice-related activity during 3-D perceptual tasks, and possess saccade-related activity. This study quantifies the functional clustering of neuronal response properties within V3A and CIP for each of these domains. The findings reveal domain-specific cross-area differences in clustering that may reflect the areas' roles in sensorimotor processing.
Affiliation(s)
- Zikang Zhu: Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, Wisconsin, United States
- Byounghoon Kim: Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, Wisconsin, United States
- Raymond Doudlah: Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, Wisconsin, United States
- Ting-Yu Chang: School of Medicine, National Defense Medical Center, Taipei, Taiwan
- Ari Rosenberg: Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, Wisconsin, United States
11. Guo G, Wang N, Sun C, Geng H. Embodied Cross-Modal Interactions Based on an Altercentric Reference Frame. Brain Sci 2024; 14:314. PMID: 38671966; PMCID: PMC11048532; DOI: 10.3390/brainsci14040314.
Abstract
Accurate comprehension of others' thoughts and intentions is crucial for smooth social interactions, wherein understanding their perceptual experiences serves as a fundamental basis for this high-level social cognition. However, previous research has predominantly focused on the visual modality when investigating perceptual processing from others' perspectives, leaving the exploration of multisensory inputs during this process largely unexplored. By incorporating auditory stimuli into visual perspective-taking (VPT) tasks, we have designed a novel experimental paradigm in which the spatial correspondence between visual and auditory stimuli was limited to the altercentric rather than the egocentric reference frame. Overall, we found that when individuals engaged in explicit or implicit VPT to process visual stimuli from an avatar's viewpoint, the concomitantly presented auditory stimuli were also processed within this avatar-centered reference frame, revealing altercentric cross-modal interactions.
Affiliation(s)
- Guanchen Guo: School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Nanbo Wang: Department of Psychology, School of Health, Fujian Medical University, Fuzhou 350122, China
- Chu Sun: School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Haiyan Geng: School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
12. Dinh TNA, Moon HS, Kim SG. Separation of bimodal fMRI responses in mouse somatosensory areas into V1 and non-V1 contributions. Sci Rep 2024; 14:6302. PMID: 38491035; PMCID: PMC10943206; DOI: 10.1038/s41598-024-56305-w.
Abstract
Multisensory integration is necessary for the animal to survive in the real world. While conventional methods have been extensively used to investigate the multisensory integration process in various brain areas, its long-range interactions remain less explored. In this study, our goal was to investigate interactions between visual and somatosensory networks on a whole-brain scale using 15.2-T BOLD fMRI. We compared unimodal to bimodal BOLD fMRI responses and dissected potential cross-modal pathways with silencing of primary visual cortex (V1) by optogenetic stimulation of local GABAergic neurons. Our data showed that the influence of visual stimulus on whisker activity is higher than the influence of whisker stimulus on visual activity. Optogenetic silencing of V1 revealed that visual information is conveyed to whisker processing via both V1 and non-V1 pathways. The first-order ventral posteromedial thalamic nucleus (VPM) was functionally affected by non-V1 sources, while the higher-order posterior medial thalamic nucleus (POm) was predominantly modulated by V1 but not non-V1 inputs. The primary somatosensory barrel field (S1BF) was influenced by both V1 and non-V1 inputs. These observations provide valuable insights into the integration of whisker and visual sensory information.
Affiliation(s)
- Thi Ngoc Anh Dinh: Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, South Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, South Korea
- Hyun Seok Moon: Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, South Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, South Korea
- Seong-Gi Kim: Center for Neuroscience Imaging Research (CNIR), Institute for Basic Science (IBS), Suwon 16419, South Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, South Korea; Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, South Korea
13. Dureux A, Zanini A, Everling S. Mapping of facial and vocal processing in common marmosets with ultra-high field fMRI. Commun Biol 2024; 7:317. PMID: 38480875; PMCID: PMC10937914; DOI: 10.1038/s42003-024-06002-1.
Abstract
Primate communication relies on multimodal cues, such as vision and audition, to facilitate the exchange of intentions, enable social interactions, avoid predators, and foster group cohesion during daily activities. Understanding the integration of facial and vocal signals is pivotal to comprehend social interaction. In this study, we acquire whole-brain ultra-high field (9.4 T) fMRI data from awake marmosets (Callithrix jacchus) to explore brain responses to unimodal and combined facial and vocal stimuli. Our findings reveal that the multisensory condition not only intensifies activations in the occipito-temporal face patches and auditory voice patches but also engages a more extensive network that includes additional parietal, prefrontal and cingulate areas, compared to the summed responses of the unimodal conditions. By uncovering the neural network underlying multisensory audiovisual integration in marmosets, this study highlights the efficiency and adaptability of the marmoset brain in processing facial and vocal social signals, providing significant insights into primate social communication.
Affiliation(s)
- Audrey Dureux: Centre for Functional and Metabolic Mapping, Robarts Research Institute, University of Western Ontario, London, ON N6A 5K8, Canada
- Alessandro Zanini: Centre for Functional and Metabolic Mapping, Robarts Research Institute, University of Western Ontario, London, ON N6A 5K8, Canada
- Stefan Everling: Centre for Functional and Metabolic Mapping, Robarts Research Institute, University of Western Ontario, London, ON N6A 5K8, Canada; Department of Physiology and Pharmacology, University of Western Ontario, London, ON N6A 5K8, Canada
14. Fang W, Liu Y, Wang L. Multisensory Integration in Body Representation. Adv Exp Med Biol 2024; 1437:77-89. PMID: 38270854; DOI: 10.1007/978-981-99-7611-9_5.
Abstract
To be aware of and to move one's body, the brain must maintain a coherent representation of the body. While the body and the brain are connected by dense ascending and descending sensory and motor pathways, representation of the body is not hardwired. This is demonstrated by the well-known rubber hand illusion in which a visible fake hand is erroneously felt as one's own hand when it is stroked in synchrony with the viewer's unseen actual hand. Thus, body representation in the brain is not mere maps of tactile and proprioceptive inputs, but a construct resulting from the interpretation and integration of inputs across sensory modalities.
Affiliation(s)
- Wen Fang: Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- Yuqi Liu: Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- Liping Wang: Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
15. Zheng Q, Gu Y. From Multisensory Integration to Multisensory Decision-Making. Adv Exp Med Biol 2024; 1437:23-35. PMID: 38270851; DOI: 10.1007/978-981-99-7611-9_2.
Abstract
Organisms live in a dynamic environment in which sensory information from multiple sources is ever changing. A conceptually complex task for organisms is to accumulate evidence across sensory modalities and over time, a process known as multisensory decision-making. This is a relatively new concept, in that previous research has largely been conducted in parallel disciplines: much effort has gone either into sensory integration across modalities using activity summed over a duration of time, or into decision-making with only one sensory modality that evolves over time. Recently, a few studies with neurophysiological measurements have emerged to examine how information from different sensory modalities is processed, accumulated, and integrated over time in decision-related areas such as the parietal and frontal lobes of mammals. In this review, we summarize and comment on these studies, which combine the two long-separate fields of multisensory integration and decision-making, and we show how the new findings provide a more complete understanding of the neural mechanisms mediating multisensory information processing.
Affiliation(s)
- Qihao Zheng: Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
- Yong Gu: Systems Neuroscience, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
16. Grundei M, Schmidt TT, Blankenburg F. A multimodal cortical network of sensory expectation violation revealed by fMRI. Hum Brain Mapp 2023; 44:5871-5891. PMID: 37721377; PMCID: PMC10619418; DOI: 10.1002/hbm.26482.
Abstract
The brain is subjected to multi-modal sensory information in an environment governed by statistical dependencies. Mismatch responses (MMRs), classically recorded with EEG, have provided valuable insights into the brain's processing of regularities and the generation of corresponding sensory predictions. Only a few studies allow for comparisons of MMRs across multiple modalities in a simultaneous sensory stream, and their corresponding cross-modal context sensitivity remains unknown. Here, we used a tri-modal version of the roving stimulus paradigm in fMRI to elicit MMRs in the auditory, somatosensory and visual modality. Participants (N = 29) were simultaneously presented with sequences of low and high intensity stimuli in each of the three senses while actively observing the tri-modal input stream and occasionally reporting the intensity of the previous stimulus in a prompted modality. The sequences were based on a probabilistic model, defining transition probabilities such that, for each modality, stimuli were more likely to repeat (p = .825) than change (p = .175) and stimulus intensities were equiprobable (p = .5). Moreover, each transition was conditional on the configuration of the other two modalities, comprising global (cross-modal) predictive properties of the sequences. We identified a shared mismatch network of modality-general inferior frontal and temporo-parietal areas as well as sensory areas, where the connectivity (psychophysiological interaction) between these regions was modulated during mismatch processing. Further, we found deviant responses within the network to be modulated by local stimulus repetition, which suggests highly comparable processing of expectation violation across modalities. Moreover, hierarchically higher regions of the mismatch network in the temporo-parietal area around the intraparietal sulcus were identified to signal cross-modal expectation violation. With the consistency of MMRs across audition, somatosensation and vision, our study provides insights into a shared cortical network of uni- and multi-modal expectation violation in response to sequence regularities.
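To make the sequence statistics concrete, here is a minimal sketch that generates a tri-modal roving-style stimulus stream from the probabilities quoted above (repeat p = .825, change p = .175, equiprobable low/high intensities). For simplicity it treats each modality as an independent first-order Markov chain and omits the cross-modal conditioning of transitions described in the study:

```python
import numpy as np

rng = np.random.default_rng(42)

P_REPEAT = 0.825          # probability that a modality repeats its previous intensity
MODALITIES = ["auditory", "somatosensory", "visual"]
N_TRIALS = 20

# Start each modality with an equiprobable low (0) or high (1) intensity
state = {m: int(rng.integers(0, 2)) for m in MODALITIES}
sequence = []

for t in range(N_TRIALS):
    for m in MODALITIES:
        if rng.random() >= P_REPEAT:      # change with p = .175
            state[m] = 1 - state[m]       # flip low <-> high
    sequence.append(dict(state))

for t, trial in enumerate(sequence):
    print(t, {m: ("high" if v else "low") for m, v in trial.items()})
```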
Affiliation(s)
- Miro Grundei: Neurocomputation and Neuroimaging Unit, Freie Universität Berlin, Berlin, Germany; Berlin School of Mind and Brain, Humboldt Universität zu Berlin, Berlin, Germany
- Felix Blankenburg: Neurocomputation and Neuroimaging Unit, Freie Universität Berlin, Berlin, Germany; Berlin School of Mind and Brain, Humboldt Universität zu Berlin, Berlin, Germany
17. Choi I, Demir I, Oh S, Lee SH. Multisensory integration in the mammalian brain: diversity and flexibility in health and disease. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220338. PMID: 37545309; PMCID: PMC10404930; DOI: 10.1098/rstb.2022.0338.
Abstract
Multisensory integration (MSI) occurs in a variety of brain areas, spanning cortical and subcortical regions. In traditional studies of sensory processing, the sensory cortices have been considered to process sensory information in a modality-specific manner. The sensory cortices, however, send the information to other cortical and subcortical areas, including the higher association cortices and the other sensory cortices, where the multiple modality inputs converge and integrate to generate a meaningful percept. This integration process is neither simple nor fixed because these brain areas interact with each other via complicated circuits, which can be modulated by numerous internal and external conditions. As a result, dynamic MSI makes multisensory decisions flexible and adaptive in behaving animals. Impairments in MSI occur in many psychiatric disorders, which may result in an altered perception of multisensory stimuli and an abnormal reaction to them. This review discusses the diversity and flexibility of MSI in mammals, including humans, non-human primates and rodents, as well as the brain areas involved. It further explains how such flexibility influences perceptual experiences in behaving animals in both health and disease. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Ilsong Choi: Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
- Ilayda Demir: Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
- Seungmi Oh: Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
- Seung-Hee Lee: Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea; Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
18. Lanfranco RC, Chancel M, Ehrsson HH. Quantifying body ownership information processing and perceptual bias in the rubber hand illusion. Cognition 2023; 238:105491. PMID: 37178590; DOI: 10.1016/j.cognition.2023.105491.
Abstract
Bodily illusions have fascinated humankind for centuries, and researchers have studied them to learn about the perceptual and neural processes that underpin multisensory channels of bodily awareness. The influential rubber hand illusion (RHI) has been used to study changes in the sense of body ownership - that is, how a limb is perceived to belong to one's body, which is a fundamental building block in many theories of bodily awareness, self-consciousness, embodiment, and self-representation. However, the methods used to quantify perceptual changes in bodily illusions, including the RHI, have mainly relied on subjective questionnaires and rating scales, and the degree to which such illusory sensations depend on sensory information processing has been difficult to test directly. Here, we introduce a signal detection theory (SDT) framework to study the sense of body ownership in the RHI. We provide evidence that the illusion is associated with changes in body ownership sensitivity that depend on the information carried in the degree of asynchrony of correlated visual and tactile signals, as well as with perceptual bias and sensitivity that reflect the distance between the rubber hand and the participant's body. We found that the illusion's sensitivity to asynchrony is remarkably precise; even a 50 ms visuotactile delay significantly affected body ownership information processing. Our findings conclusively link changes in a complex bodily experience such as body ownership to basic sensory information processing and provide a proof of concept that SDT can be used to study bodily illusions.
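For readers who want the signal detection theory (SDT) quantities made explicit, the sketch below computes the standard sensitivity (d′) and criterion (c) from hit and false-alarm rates in a yes/no body-ownership detection task. This is the textbook yes/no formulation with invented counts, not the authors' exact modelling choices:

```python
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Yes/no SDT: d' = z(H) - z(FA), criterion c = -0.5 * (z(H) + z(FA)).
    A log-linear correction avoids infinite z-scores at rates of 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

# Hypothetical counts: "illusion present" reports for synchronous (signal)
# versus clearly asynchronous (noise) visuotactile stroking
d, c = sdt_measures(hits=42, misses=8, false_alarms=12, correct_rejections=38)
print(f"d' = {d:.2f}, criterion = {c:.2f}")
```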
Affiliation(s)
- Renzo C Lanfranco: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Marie Chancel: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden; Psychology and Neurocognition Lab, Université Grenoble-Alpes, Grenoble, France
- H Henrik Ehrsson: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
19. Coen P, Sit TPH, Wells MJ, Carandini M, Harris KD. Mouse frontal cortex mediates additive multisensory decisions. Neuron 2023; 111:2432-2447.e13. PMID: 37295419; PMCID: PMC10957398; DOI: 10.1016/j.neuron.2023.05.008.
Abstract
The brain can combine auditory and visual information to localize objects. However, the cortical substrates underlying audiovisual integration remain uncertain. Here, we show that mouse frontal cortex combines auditory and visual evidence; that this combination is additive, mirroring behavior; and that it evolves with learning. We trained mice in an audiovisual localization task. Inactivating frontal cortex impaired responses to either sensory modality, while inactivating visual or parietal cortex affected only visual stimuli. Recordings from >14,000 neurons indicated that after task learning, activity in the anterior part of frontal area MOs (secondary motor cortex) additively encodes visual and auditory signals, consistent with the mice's behavioral strategy. An accumulator model applied to these sensory representations reproduced the observed choices and reaction times. These results suggest that frontal cortex adapts through learning to combine evidence across sensory cortices, providing a signal that is transformed into a binary decision by a downstream accumulator.
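The closing sentence describes an accumulator applied to additively combined sensory evidence. The sketch below simulates that idea in its most generic drift-diffusion form: auditory and visual signals are summed into a single drift rate, and noisy evidence is accumulated to a bound, yielding a choice and a reaction time. The weights, noise level and bound are invented for illustration and are not the fitted parameters from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def additive_accumulator(aud, vis, w_aud=1.0, w_vis=1.2,
                         bound=1.0, noise=0.35, dt=0.001, max_t=2.0):
    """Accumulate additively combined audiovisual evidence to a +/- bound.
    aud, vis: signed stimulus strengths (+ = right, - = left, 0 = absent).
    Returns (choice, reaction time in seconds)."""
    drift = w_aud * aud + w_vis * vis          # additive combination
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    choice = "right" if x > 0 else "left"
    return choice, t

# Conflict trial: weak auditory evidence for left, stronger visual evidence for right
print(additive_accumulator(aud=-0.3, vis=+0.6))
```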
Affiliation(s)
- Philip Coen: UCL Queen Square Institute of Neurology, University College London, London, UK; UCL Institute of Ophthalmology, University College London, London, UK
- Timothy P H Sit: Sainsbury-Wellcome Center, University College London, London, UK
- Miles J Wells: UCL Queen Square Institute of Neurology, University College London, London, UK
- Matteo Carandini: UCL Institute of Ophthalmology, University College London, London, UK
- Kenneth D Harris: UCL Queen Square Institute of Neurology, University College London, London, UK
20. Johnston WJ, Freedman DJ. Redundant representations are required to disambiguate simultaneously presented complex stimuli. PLoS Comput Biol 2023; 19:e1011327. PMID: 37556470; PMCID: PMC10442167; DOI: 10.1371/journal.pcbi.1011327.
Abstract
A pedestrian crossing a street during rush hour often looks and listens for potential danger. When they hear several different horns, they localize the cars that are honking and decide whether or not they need to modify their motor plan. How does the pedestrian use this auditory information to pick out the corresponding cars in visual space? The integration of distributed representations like these is called the assignment problem, and it must be solved to integrate distinct representations across but also within sensory modalities. Here, we identify and analyze a solution to the assignment problem: the representation of one or more common stimulus features in pairs of relevant brain regions-for example, estimates of the spatial position of cars are represented in both the visual and auditory systems. We characterize how the reliability of this solution depends on different features of the stimulus set (e.g., the size of the set and the complexity of the stimuli) and the details of the split representations (e.g., the precision of each stimulus representation and the amount of overlapping information). Next, we implement this solution in a biologically plausible receptive field code and show how constraints on the number of neurons and spikes used by the code force the brain to navigate a tradeoff between local and catastrophic errors. We show that, when many spikes and neurons are available, representing stimuli from a single sensory modality can be done more reliably across multiple brain regions, despite the risk of assignment errors. Finally, we show that a feedforward neural network can learn the optimal solution to the assignment problem, even when it receives inputs in two distinct representational formats. We also discuss relevant results on assignment errors from the human working memory literature and show that several key predictions of our theory already have support.
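The assignment problem described here can be illustrated with a small simulation: two regions each carry a noisy estimate of a shared feature (spatial position) for several simultaneous stimuli, and a reader of those representations matches items by minimizing total mismatch on that shared feature. The noise levels and stimulus counts below are arbitrary, and the optimal-matching rule is a generic stand-in, not the receptive-field code analysed in the paper:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(7)

def assignment_error_rate(n_stimuli=4, noise=0.5, n_sims=2000):
    """Fraction of simulations with at least one mis-assigned stimulus when
    two regions report the same positions corrupted by independent noise."""
    errors = 0
    for _ in range(n_sims):
        positions = rng.uniform(0.0, 10.0, size=n_stimuli)       # true shared feature
        est_a = positions + noise * rng.normal(size=n_stimuli)   # e.g. visual estimate
        est_b = positions + noise * rng.normal(size=n_stimuli)   # e.g. auditory estimate
        cost = np.abs(est_a[:, None] - est_b[None, :])           # pairwise mismatch
        _, match = linear_sum_assignment(cost)                   # optimal assignment
        errors += np.any(match != np.arange(n_stimuli))
    return errors / n_sims

for sigma in (0.2, 0.5, 1.0, 2.0):
    print(f"noise={sigma:.1f}: error rate={assignment_error_rate(noise=sigma):.2f}")
```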
Affiliation(s)
- W. Jeffrey Johnston: Graduate Program in Computational Neuroscience and the Department of Neurobiology, The University of Chicago, Chicago, Illinois, United States of America; Center for Theoretical Neuroscience and Mortimer B. Zuckerman Mind, Brain and Behavior Institute, Columbia University, New York, New York, United States of America
- David J. Freedman: Graduate Program in Computational Neuroscience and the Department of Neurobiology, The University of Chicago, Chicago, Illinois, United States of America; Neuroscience Institute, The University of Chicago, Chicago, Illinois, United States of America
21. Chancel M, Ehrsson HH. Proprioceptive uncertainty promotes the rubber hand illusion. Cortex 2023; 165:70-85. PMID: 37269634; PMCID: PMC10284257; DOI: 10.1016/j.cortex.2023.04.005.
Abstract
Body ownership is the multisensory perception of a body as one's own. Recently, the emergence of body ownership illusions like the visuotactile rubber hand illusion has been described by Bayesian causal inference models in which the observer computes the probability that visual and tactile signals come from a common source. Given the importance of proprioception for the perception of one's body, proprioceptive information and its relative reliability should impact this inferential process. We used a detection task based on the rubber hand illusion where participants had to report whether the rubber hand felt like their own or not. We manipulated the degree of asynchrony of visual and tactile stimuli delivered to the rubber hand and the real hand under two levels of proprioceptive noise using tendon vibration applied to the lower arm's antagonist extensor and flexor muscles. As hypothesized, the probability of the emergence of the rubber hand illusion increased with proprioceptive noise. Moreover, this result, well fitted by a Bayesian causal inference model, was best described by a change in the a priori probability of a common cause for vision and touch. These results offer new insights into how proprioceptive uncertainty shapes the multisensory perception of one's own body.
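As a sense of what a Bayesian causal inference model of this kind computes, the sketch below evaluates the posterior probability that visual and proprioceptive hand-position signals share a common cause, given Gaussian sensory noise and a prior probability of a common cause. It follows the generic Körding-style formulation of this model class; the parameter values are illustrative, not the fitted values reported by the authors:

```python
import numpy as np

def p_common_cause(x_vis, x_prop, sigma_vis=0.5, sigma_prop=1.0,
                   sigma_prior=5.0, p_common=0.5):
    """Posterior probability that visual and proprioceptive position signals
    arise from one source (Gaussian likelihoods, zero-mean Gaussian prior)."""
    # Likelihood of the measurement pair under a single common source
    var_c = sigma_vis**2 * sigma_prop**2 \
          + sigma_vis**2 * sigma_prior**2 + sigma_prop**2 * sigma_prior**2
    like_c = np.exp(-0.5 * ((x_vis - x_prop)**2 * sigma_prior**2
                            + x_vis**2 * sigma_prop**2
                            + x_prop**2 * sigma_vis**2) / var_c) \
             / (2 * np.pi * np.sqrt(var_c))
    # Likelihood under two independent sources
    var_v = sigma_vis**2 + sigma_prior**2
    var_p = sigma_prop**2 + sigma_prior**2
    like_i = np.exp(-0.5 * (x_vis**2 / var_v + x_prop**2 / var_p)) \
             / (2 * np.pi * np.sqrt(var_v * var_p))
    return p_common * like_c / (p_common * like_c + (1 - p_common) * like_i)

# Same 3-cm visuo-proprioceptive discrepancy, low vs high proprioceptive noise
print(p_common_cause(0.0, 3.0, sigma_prop=1.0))   # lower posterior of a common cause
print(p_common_cause(0.0, 3.0, sigma_prop=3.0))   # noisier proprioception -> higher
```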
Affiliation(s)
- Marie Chancel: Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institutet, Sweden; Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
- H Henrik Ehrsson: Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institutet, Sweden
22
|
Zeng Z, Yue W, Kined C, Raciheon B, Liu J, Chen X. Effect of Lysinibacillus isolated from environment on probiotic properties and gut microbiota in mice. Ecotoxicol Environ Saf 2023; 258:114952. [PMID: 37141683 DOI: 10.1016/j.ecoenv.2023.114952] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/22/2022] [Revised: 03/09/2023] [Accepted: 04/22/2023] [Indexed: 05/06/2023]
Abstract
Soil microorganisms (SM) are primarily involved in organic matter degradation, plant nitrogen nutrient immobilization, interactions with host microorganisms, and oxidation. However, research on the effect of soil-derived Lysinibacillus on the spatial disparity of the intestinal microbiota of mice is lacking. To test the probiotic properties of Lysinibacillus and its effect on the spatial disparity of the mouse intestinal microbiota, hemolysis testing, molecular phylogenetic analysis, antibiotic sensitivity testing, serum biochemical assays and 16S rRNA profiling were applied. The results showed that Lysinibacillus (LZS1 and LZS2) was resistant to two of the 12 antibiotics tested (tetracycline and rifampin), sensitive to the others, and negative for hemolysis. In addition, the body weight of group L mice (treated with Lysinibacillus, 1.0 × 10⁸ CFU/d for 21 days) was significantly greater than that of the control group; serum biochemical tests showed that TG and UREA levels were significantly lower in group L. The spatial disparity of intestinal microorganisms in mice was significant: treatment with Lysinibacillus (1.0 × 10⁸ CFU/d for 21 days) reduced intestinal microbial diversity and decreased the richness of Proteobacteria, Cyanobacteria and Bacteroidetes. Furthermore, Lysinibacillus treatment enhanced Lactobacillus and Lachnospiraceae richness, significantly reduced 6 bacterial genera in the jejunum community, and reduced 8 bacterial genera while increasing 4 genera in the cecum community. In conclusion, this study demonstrated the spatial disparity of intestinal microorganisms in mice and the probiotic potential of Lysinibacillus isolated from soil.
Collapse
Affiliation(s)
- Zhibo Zeng
- Institute of Animal Husbandry and Veterinary Medicine/Fujian Key Laboratory of Animal Genetics and Breeding, Fujian Academy of Agricultural Sciences, Fuzhou 350013, PR China; College of Veterinary Medicine, Huazhong Agricultural University, Wuhan 430070, PR China; Institute of Agricultural Sciences, ETH Zurich, Universitaetstrasse 2, 8092 Zurich, Switzerland
| | - Wen Yue
- Institute of Animal Husbandry and Veterinary Medicine/Fujian Key Laboratory of Animal Genetics and Breeding, Fujian Academy of Agricultural Sciences, Fuzhou 350013, PR China
| | - Cermon Kined
- Institute of Agricultural Sciences, ETH Zurich, Universitaetstrasse 2, 8092 Zurich, Switzerland
| | - Bakint Raciheon
- Institute of Agricultural Sciences, ETH Zurich, Universitaetstrasse 2, 8092 Zurich, Switzerland
| | - Jing Liu
- Institute of Animal Husbandry and Veterinary Medicine/Fujian Key Laboratory of Animal Genetics and Breeding, Fujian Academy of Agricultural Sciences, Fuzhou 350013, PR China
| | - Xinzhu Chen
- Institute of Animal Husbandry and Veterinary Medicine/Fujian Key Laboratory of Animal Genetics and Breeding, Fujian Academy of Agricultural Sciences, Fuzhou 350013, PR China.
| |
Collapse
|
23
|
Marciniak Dg Agra K, Dg Agra P. F = ma. Is the macaque brain Newtonian? Cogn Neuropsychol 2023; 39:376-408. [PMID: 37045793 DOI: 10.1080/02643294.2023.2191843] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 04/14/2023]
Abstract
Intuitive Physics, the ability to anticipate how the physical events involving mass objects unfold in time and space, is a central component of intelligent systems. Intuitive physics is a promising tool for gaining insight into mechanisms that generalize across species because both humans and non-human primates are subject to the same physical constraints when engaging with the environment. Physical reasoning abilities are widely present within the animal kingdom, but monkeys, with acute 3D vision and a high level of dexterity, appreciate and manipulate the physical world in much the same way humans do.
Collapse
Affiliation(s)
- Karolina Marciniak Dg Agra
- The Rockefeller University, Laboratory of Neural Circuits, New York, NY, USA
- Center for Brain, Minds and Machines, Cambridge, MA, USA
| | - Pedro Dg Agra
- The Rockefeller University, Laboratory of Neural Circuits, New York, NY, USA
- Center for Brain, Minds and Machines, Cambridge, MA, USA
| |
Collapse
|
24
|
Keum D, Pultorak K, Meredith MA, Medina AE. Effects of developmental alcohol exposure on cortical multisensory integration. Eur J Neurosci 2023; 57:784-795. [PMID: 36610022 PMCID: PMC9991967 DOI: 10.1111/ejn.15907] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2022] [Revised: 12/08/2022] [Accepted: 01/03/2023] [Indexed: 01/09/2023]
Abstract
Fetal alcohol spectrum disorder (FASD) is one of the most common causes of mental disabilities in the world, with a prevalence of 1%-6% of all births. Sensory processing deficits and cognitive problems are a major feature of this condition. Because developmental alcohol exposure can impair neuronal plasticity, and neuronal plasticity is crucial for the establishment of neuronal circuits in sensory areas, we predicted that exposure to alcohol during the third trimester equivalent of human gestation would disrupt the development of multisensory integration (MSI) in the rostral portion of the posterior parietal cortex (PPr), an integrative visual-tactile area. We conducted in vivo electrophysiology in 17 ferrets from four groups (saline/alcohol; infancy/adolescence). A total of 1157 neurons were recorded after visual, tactile and combined visual-tactile stimulation. A multisensory (MS) enhancement or suppression is characterized by a significantly increased or decreased number of elicited spikes after combined visual-tactile stimulation compared to the strongest unimodal (visual or tactile) response. At the neuronal level, neurons in infant animals were more prone to show MS suppression, whereas those in adolescents were more prone to show MS enhancement. Although alcohol-treated animals showed similar developmental changes between infancy and adolescence, they always 'lagged behind' controls, showing more MS suppression and less enhancement. Our findings suggest that alcohol exposure during the last months of human gestation would stunt the development of MSI, which could underlie the sensory problems seen in FASD.
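The enhancement/suppression measure described above can be written down directly: the combined response is compared with the strongest unisensory response. The sketch below uses made-up spike counts purely for illustration.

```python
import numpy as np

def multisensory_index(visual, tactile, combined):
    """Percent change of the combined response relative to the best unisensory response."""
    best_unisensory = max(np.mean(visual), np.mean(tactile))
    return 100.0 * (np.mean(combined) - best_unisensory) / best_unisensory

rng = np.random.default_rng(1)
v, t, vt = rng.poisson(8, 30), rng.poisson(5, 30), rng.poisson(12, 30)   # hypothetical trials
print(f"MS index = {multisensory_index(v, t, vt):+.1f}%  (>0 enhancement, <0 suppression)")
```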
Collapse
Affiliation(s)
- Dongil Keum
- Department of Pediatrics, University of Maryland, School of Medicine. Baltimore, MD
| | - Katie Pultorak
- Department of Pediatrics, University of Maryland, School of Medicine. Baltimore, MD
| | - M. Alex Meredith
- Department of Anatomy and Neurobiology, Virginia Commonwealth University. Richmond VA
| | - Alexandre E. Medina
- Department of Pediatrics, University of Maryland, School of Medicine. Baltimore, MD
| |
Collapse
|
25
|
Bean NL, Smyre SA, Stein BE, Rowland BA. Noise-rearing precludes the behavioral benefits of multisensory integration. Cereb Cortex 2023; 33:948-958. [PMID: 35332919 PMCID: PMC9930622 DOI: 10.1093/cercor/bhac113] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/03/2021] [Revised: 02/23/2022] [Accepted: 02/24/2022] [Indexed: 11/14/2022] Open
Abstract
Concordant visual-auditory stimuli enhance the responses of individual superior colliculus (SC) neurons. This neuronal capacity for "multisensory integration" is not innate: it is acquired only after substantial cross-modal (e.g. auditory-visual) experience. Masking transient auditory cues by raising animals in omnidirectional sound ("noise-rearing") precludes their ability to obtain this experience and the ability of the SC to construct a normal multisensory (auditory-visual) transform. SC responses to combinations of concordant visual-auditory stimuli are depressed, rather than enhanced. The present experiments examined the behavioral consequence of this rearing condition in a simple detection/localization task. In the first experiment, the auditory component of the concordant cross-modal pair was novel, and only the visual stimulus was a target. In the second experiment, both component stimuli were targets. Noise-reared animals failed to show multisensory performance benefits in either experiment. These results reveal a close parallel between behavior and single neuron physiology in the multisensory deficits that are induced when noise disrupts early visual-auditory experience.
Collapse
Affiliation(s)
- Naomi L Bean
- Corresponding author: Wake Forest School of Medicine, Medical Center Blvd., Winston Salem, NC 27157, United States.
| | | | - Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Blvd., Winston Salem, NC 27157, United States
| | - Benjamin A Rowland
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Medical Center Blvd., Winston Salem, NC 27157, United States
| |
Collapse
|
26
|
Sonobe Y, Yamagata T, Yang H, Haruki Y, Ogawa K. Supramodal Representation of the Sense of Body Ownership in the Human Parieto-Premotor and Extrastriate Cortices. eNeuro 2023; 10:ENEURO.0332-22.2023. [PMID: 36657967 PMCID: PMC9927518 DOI: 10.1523/eneuro.0332-22.2023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2022] [Revised: 12/25/2022] [Accepted: 01/09/2023] [Indexed: 01/21/2023] Open
Abstract
The sense of body ownership, defined as the sensation that one's body belongs to oneself, is a fundamental component of bodily self-consciousness. Several studies have shown the importance of multisensory integration for the emergence of the sense of body ownership, together with the involvement of the parieto-premotor and extrastriate cortices in bodily awareness. However, whether the sense of body ownership elicited by different sources of signal, especially visuotactile and visuomotor inputs, is represented by common neural patterns remains to be elucidated. We used functional magnetic resonance imaging (fMRI) to investigate the existence of neural correlates of the sense of body ownership independent of the sensory modalities. Participants received tactile stimulation or executed finger movements while given synchronous and asynchronous visual feedback of their hand. We used multivoxel pattern analysis (MVPA) to decode the synchronous and asynchronous conditions with cross-classification between two modalities: the classifier was first trained in the visuotactile sessions and then tested in the visuomotor sessions, and vice versa. Region-of-interest (ROI)-based and searchlight analyses revealed significant above-chance cross-classification accuracies in the bilateral intraparietal sulcus (IPS), the bilateral ventral premotor cortex (PMv), and the left extrastriate body area (EBA). Moreover, we observed a significant positive correlation between the cross-classification accuracy in the left PMv and the difference in subjective ratings of the sense of body ownership between the synchronous and asynchronous conditions. Our findings reveal neural representations of the sense of body ownership in the IPS, PMv, and EBA that are invariant to the sensory modality.
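The cross-classification logic (train on one modality, test on the other, then average the two directions) can be sketched as follows with synthetic voxel patterns; this is a generic illustration, not the authors' preprocessing or classifier settings.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
n_trials, n_voxels = 40, 100
shared_pattern = rng.normal(0, 1, n_voxels)        # sync/async pattern common to both modalities

def simulate_sessions(modality_offset):
    X_sync = shared_pattern + rng.normal(0, 2, (n_trials, n_voxels)) + modality_offset
    X_async = -shared_pattern + rng.normal(0, 2, (n_trials, n_voxels)) + modality_offset
    X = np.vstack([X_sync, X_async])
    y = np.array([1] * n_trials + [0] * n_trials)   # 1 = synchronous, 0 = asynchronous
    return X, y

X_vt, y_vt = simulate_sessions(0.0)                 # "visuotactile" sessions
X_vm, y_vm = simulate_sessions(0.5)                 # "visuomotor" sessions

clf = LinearSVC(dual=False)
acc = (clf.fit(X_vt, y_vt).score(X_vm, y_vm) + clf.fit(X_vm, y_vm).score(X_vt, y_vt)) / 2
print(f"cross-classification accuracy = {acc:.2f} (chance = 0.50)")
```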
Collapse
Affiliation(s)
- Yusuke Sonobe
- Department of Psychology, Hokkaido University, Sapporo 060-0810, Japan
| | - Toyoki Yamagata
- Department of Psychology, Hokkaido University, Sapporo 060-0810, Japan
| | - Huixiang Yang
- Department of Psychology, Hokkaido University, Sapporo 060-0810, Japan
| | - Yusuke Haruki
- Department of Psychology, Hokkaido University, Sapporo 060-0810, Japan
| | - Kenji Ogawa
- Department of Psychology, Hokkaido University, Sapporo 060-0810, Japan
| |
Collapse
|
27
|
Idris A, Christensen BA, Walker EM, Maier JX. Multisensory integration of orally-sourced gustatory and olfactory inputs to the posterior piriform cortex in awake rats. J Physiol 2023; 601:151-169. [PMID: 36385245 PMCID: PMC9869978 DOI: 10.1113/jp283873] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2022] [Accepted: 11/09/2022] [Indexed: 11/18/2022] Open
Abstract
Flavour refers to the sensory experience of food, which is a combination of sensory inputs sourced from multiple modalities during consumption, including taste and odour. Previous work has demonstrated that orally-sourced taste and odour cues interact to determine perceptual judgements of flavour stimuli, although the underlying cellular- and circuit-level neural mechanisms remain unknown. We recently identified a region of the piriform olfactory cortex in rats that responds to both taste and odour stimuli. Here, we investigated how converging taste and odour inputs to this area interact to affect single neuron responsiveness and ensemble coding of flavour identity. To accomplish this, we recorded spiking activity from ensembles of single neurons in the posterior piriform cortex (pPC) in awake, tasting rats while delivering taste solutions, odour solutions and taste + odour mixtures directly into the oral cavity. Our results show that taste and odour inputs evoke highly selective, temporally-overlapping responses in multisensory pPC neurons. Comparing responses to mixtures and their unisensory components revealed that taste and odour inputs interact in a non-linear manner to produce unique response patterns. Taste input enhances trial-by-trial decoding of odour identity from small ensembles of simultaneously recorded neurons. Together, these results demonstrate that taste and odour inputs to pPC interact in complex, non-linear ways to form amodal flavour representations that enhance identity coding. KEY POINTS: The experience of food involves taste and smell, but how information from these different senses is combined by the brain to create our sense of flavour remains unknown. We recorded from small groups of neurons in the olfactory cortex of awake rats while they consumed taste solutions, odour solutions and taste + odour mixtures. Taste and smell solutions evoke highly selective responses. When presented in a mixture, taste and smell inputs interacted to alter responses, resulting in activation of unique sets of neurons that could not be predicted by the component responses. Synergistic interactions increase the discriminability of odour representations. The olfactory cortex uses taste and smell to create new information representing multisensory flavour identity.
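The claim that taste input enhances trial-by-trial decoding of odour identity amounts to comparing classifier accuracy on odour-alone trials versus mixture trials. The sketch below does this with synthetic spike counts in which the mixture simply carries a stronger odour signal; the numbers are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_neurons, n_trials = 12, 60
odour_tuning = rng.normal(0, 1, (2, n_neurons))          # two odour identities

def simulate_trials(signal_gain):
    X, y = [], []
    for odour in (0, 1):
        rates = np.clip(5.0 + signal_gain * odour_tuning[odour], 0.1, None)
        X.append(rng.poisson(rates, (n_trials, n_neurons)))
        y += [odour] * n_trials
    return np.vstack(X), np.array(y)

for label, gain in [("odour alone", 1.0), ("taste + odour", 2.0)]:
    X, y = simulate_trials(gain)
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    print(f"{label:>13}: odour decoding accuracy = {acc:.2f}")
```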
Collapse
Affiliation(s)
- Ammar Idris
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| | - Brooke A. Christensen
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| | - Ellen M. Walker
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| | - Joost X. Maier
- Department of Neurobiology & Anatomy, Wake Forest School of Medicine, Winston-Salem, NC, USA
| |
Collapse
|
28
|
Chancel M, Iriye H, Ehrsson HH. Causal Inference of Body Ownership in the Posterior Parietal Cortex. J Neurosci 2022; 42:7131-7143. [PMID: 35940875 PMCID: PMC9480881 DOI: 10.1523/jneurosci.0656-22.2022] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2022] [Revised: 06/21/2022] [Accepted: 07/22/2022] [Indexed: 11/21/2022] Open
Abstract
How do we come to sense that a hand in view belongs to our own body or not? Previous studies have suggested that the integration of vision and somatosensation in the frontoparietal areas plays a critical role in the sense of body ownership (i.e., the multisensory perception of limbs and body parts as our own). However, little is known about how these areas implement the multisensory integration process at the computational level and whether activity predicts illusion elicitation in individual participants on a trial-by-trial basis. To address these questions, we used functional magnetic resonance imaging and a rubber hand illusion-detection task and fitted the registered neural responses to a Bayesian causal inference model of body ownership. Thirty healthy human participants (male and female) performed 12 s trials with varying degrees of asynchronously delivered visual and tactile stimuli of a rubber hand (in view) and a (hidden) real hand. After the 12 s period, participants had to judge whether the rubber hand felt like their own. As hypothesized, activity in the premotor and posterior parietal cortices was related to illusion elicitation at the level of individual participants and trials. Importantly, activity in the posterior parietal cortex fit the predicted probability of illusion emergence of the Bayesian causal inference model based on each participant's behavioral response profile. Our findings suggest an important role for the posterior parietal cortex in implementing Bayesian causal inference of body ownership and reveal how trial-by-trial variations in neural signatures of multisensory integration relate to the elicitation of the rubber hand illusion.SIGNIFICANCE STATEMENT How does the brain create a coherent perceptual experience of one's own body based on information from the different senses? We examined how the likelihood of eliciting a classical bodily illusion that depends on vision and touch-the rubber hand illusion-is related to neural activity measured by functional magnetic resonance imaging. We found that trial-by-trial variations in the neural signal in the posterior parietal cortex, a well known center for sensory integration, fitted a statistical function that describes how likely it is that the brain infers that a rubber hand is one's own given the available visual and tactile evidence. Thus, probabilistic analysis of sensory information in the parietal lobe underlies our unitary sense of bodily self.
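Relating a causal-inference model to trial-wise brain activity, as described above, boils down to comparing the model's predicted illusion probability against a neural amplitude on each trial. The sketch below correlates the two using a simulated signal; the response model and all numbers are assumptions, not the measured parietal data.

```python
import numpy as np
from scipy.stats import norm, pearsonr

rng = np.random.default_rng(4)
asynchrony = rng.choice([0, 150, 300, 500], size=120)        # ms, trial-wise

def p_illusion(a_ms, sigma=150.0, prior=0.5, range_ms=1000.0):
    like_common = norm.pdf(a_ms, 0.0, sigma)
    return prior * like_common / (prior * like_common + (1 - prior) / range_ms)

model_p = np.array([p_illusion(a) for a in asynchrony])
neural_amp = 0.8 * model_p + rng.normal(0, 0.3, model_p.size)  # hypothetical trial-wise signal

r, pval = pearsonr(model_p, neural_amp)
print(f"model-to-signal correlation: r = {r:.2f}, p = {pval:.1e}")
```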
Collapse
Affiliation(s)
- Marie Chancel
- Department of Neuroscience, Karolinska Institutet, SE-17177 Stockholm, Sweden
| | - Heather Iriye
- Department of Neuroscience, Karolinska Institutet, SE-17177 Stockholm, Sweden
| | - H Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, SE-17177 Stockholm, Sweden
| |
Collapse
|
29
|
Vittek AL, Juan C, Nowak LG, Girard P, Cappe C. Multisensory integration in neurons of the medial pulvinar of macaque monkey. Cereb Cortex 2022; 33:4202-4215. [PMID: 36068947 PMCID: PMC10110443 DOI: 10.1093/cercor/bhac337] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2022] [Revised: 07/29/2022] [Accepted: 07/30/2022] [Indexed: 11/14/2022] Open
Abstract
The pulvinar is a heterogeneous thalamic nucleus, which is well developed in primates. One of its subdivisions, the medial pulvinar, is connected to many cortical areas, including the visual, auditory, and somatosensory cortices, as well as with multisensory areas and premotor areas. However, except for the visual modality, little is known about its sensory functions. A hypothesis is that, as a region of convergence of information from different sensory modalities, the medial pulvinar plays a role in multisensory integration. To test this hypothesis, two macaque monkeys were trained on a fixation task and the responses of single units to visual, auditory, and auditory-visual stimuli were examined. Analysis revealed auditory, visual, and multisensory neurons in the medial pulvinar. It also revealed multisensory integration in this structure, mainly suppressive (the audiovisual response is less than the strongest unisensory response) and subadditive (the audiovisual response is less than the sum of the auditory and the visual responses). These findings suggest that the medial pulvinar is involved in multisensory integration.
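The two comparisons used above to label integration as suppressive and subadditive can be computed directly from trial-averaged responses, as in this toy example with hypothetical spike counts.

```python
import numpy as np

rng = np.random.default_rng(5)
aud, vis, av = rng.poisson(10, 50), rng.poisson(7, 50), rng.poisson(9, 50)  # hypothetical trials

strongest_unisensory = max(aud.mean(), vis.mean())
additive_prediction = aud.mean() + vis.mean()
print(f"AV - strongest unisensory: {av.mean() - strongest_unisensory:+.2f} spikes (negative = suppressive)")
print(f"AV - (A + V):              {av.mean() - additive_prediction:+.2f} spikes (negative = subadditive)")
```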
Collapse
Affiliation(s)
- Anne-Laure Vittek
- Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
| | - Cécile Juan
- Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
| | - Lionel G Nowak
- Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
| | - Pascal Girard
- Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France; INSERM, CHU Purpan - BP 3028 - 31024 Toulouse Cedex 3, France
| | - Céline Cappe
- Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
| |
Collapse
|
30
|
Socially meaningful visual context either enhances or inhibits vocalisation processing in the macaque brain. Nat Commun 2022; 13:4886. [PMID: 35985995 PMCID: PMC9391382 DOI: 10.1038/s41467-022-32512-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2021] [Accepted: 08/03/2022] [Indexed: 11/08/2022] Open
Abstract
Social interactions rely on the interpretation of semantic and emotional information, often from multiple sensory modalities. Nonhuman primates send and receive auditory and visual communicative signals. However, the neural mechanisms underlying the association of visual and auditory information based on their common social meaning are unknown. Using heart rate estimates and functional neuroimaging, we show that in the lateral and superior temporal sulcus of the macaque monkey, neural responses are enhanced in response to species-specific vocalisations paired with a matching visual context, or when vocalisations follow, in time, visual information, but inhibited when vocalisations are incongruent with the visual context. For example, responses to affiliative vocalisations are enhanced when paired with affiliative contexts but inhibited when paired with aggressive or escape contexts. Overall, we propose that the identified neural network represents social meaning irrespective of sensory modality. Social interaction involves processing semantic and emotional information. Here the authors show that in the macaque monkey lateral and superior temporal sulcus, cortical activity is enhanced in response to species-specific vocalisations predicted by matching face or social visual stimuli but inhibited when vocalisations are incongruent with the predictive visual context.
Collapse
|
31
|
Rosenblum L, Grewe E, Churan J, Bremmer F. Influence of Tactile Flow on Visual Heading Perception. Multisens Res 2022; 35:291-308. [PMID: 35263712 DOI: 10.1163/22134808-bja10071] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/06/2021] [Accepted: 02/10/2022] [Indexed: 11/19/2022]
Abstract
The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals and tactile flow, which contribute to determining traveled distance (path integration) or movement direction (heading). While the processing of combined visual-vestibular information is subject to a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention. Here, we investigated whether visual heading perception is influenced by behaviorally irrelevant tactile flow. In the visual modality, we simulated an observer's self-motion across a horizontal ground plane (optic flow). Tactile self-motion stimuli were delivered by air flow from head-mounted nozzles (tactile flow). In blocks of trials, we presented only visual or tactile stimuli and subjects had to report their perceived heading. In another block of trials, tactile and visual stimuli were presented simultaneously, with the tactile flow within ±40° of the visual heading (bimodal condition). Here, importantly, participants had to report their perceived visual heading. Perceived self-motion direction in all conditions revealed a centripetal bias, i.e., heading directions were perceived as compressed toward straight ahead. In the bimodal condition, we found a small but systematic influence of task-irrelevant tactile flow on visually perceived headings as a function of their directional offset. We conclude that tactile flow is more tightly linked to self-motion perception than previously thought.
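One simple way to quantify the reported influence is to regress the reported heading on the directional offset of the tactile flow while accounting for the visual heading; the slope then estimates the tactile bias in degrees per degree. The sketch below does this on simulated responses with an assumed small tactile weight.

```python
import numpy as np

rng = np.random.default_rng(6)
visual_heading = rng.uniform(-30, 30, 200)                    # deg
tactile_offset = rng.choice([-40, -20, 0, 20, 40], 200)       # deg relative to visual heading
reports = (0.7 * visual_heading                               # centripetal compression (assumed)
           + 0.08 * tactile_offset                            # assumed small tactile bias
           + rng.normal(0, 3, 200))

design = np.column_stack([np.ones(200), visual_heading, tactile_offset])
beta, *_ = np.linalg.lstsq(design, reports, rcond=None)
print(f"estimated tactile-flow weight = {beta[2]:.3f} deg/deg")
```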
Collapse
Affiliation(s)
- Lisa Rosenblum
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
| | - Elisa Grewe
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany
| | - Jan Churan
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
| | - Frank Bremmer
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
| |
Collapse
|
32
|
Ehrsson HH, Fotopoulou A, Radziun D, Longo MR, Tsakiris M. No specific relationship between hypnotic suggestibility and the rubber hand illusion. Nat Commun 2022; 13:564. [PMID: 35091562 PMCID: PMC8799653 DOI: 10.1038/s41467-022-28177-z] [Citation(s) in RCA: 20] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2020] [Accepted: 01/07/2022] [Indexed: 11/09/2022] Open
Affiliation(s)
- H Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden.
| | - Aikaterini Fotopoulou
- Clinical, Educational and Health Psychology Research Department, University College London, London, UK
| | - Dominika Radziun
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
| | - Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
| | - Manos Tsakiris
- The Warburg Institute, School of Advanced Study, University of London, London, UK; Lab of Action & Body, Department of Psychology, Royal Holloway, University of London, Egham, Surrey, UK; Department of Behavioural and Cognitive Sciences, Faculty of Humanities, Education and Social Sciences, University of Luxembourg, Esch-sur-Alzette, Luxembourg.
| |
Collapse
|
33
|
Abstract
Traditional brain-machine interfaces decode cortical motor commands to control external devices. These commands are the product of higher-level cognitive processes, occurring across a network of brain areas, that integrate sensory information, plan upcoming motor actions, and monitor ongoing movements. We review cognitive signals recently discovered in the human posterior parietal cortex during neuroprosthetic clinical trials. These signals are consistent with small regions of cortex having a diverse role in cognitive aspects of movement control and body monitoring, including sensorimotor integration, planning, trajectory representation, somatosensation, action semantics, learning, and decision making. These variables are encoded within the same population of cells using structured representations that bind related sensory and motor variables, an architecture termed partially mixed selectivity. Diverse cognitive signals provide complementary information to traditional motor commands to enable more natural and intuitive control of external devices.
Collapse
Affiliation(s)
- Richard A Andersen
- Division of Biology and Biological Engineering and Tianqiao & Chrissy Chen Brain-Machine Interface Center, California Institute of Technology, Pasadena, California 91125, USA;
- USC Neurorestoration Center, Keck School of Medicine of USC, Los Angeles, California 90033, USA
| | - Tyson Aflalo
- Division of Biology and Biological Engineering and Tianqiao & Chrissy Chen Brain-Machine Interface Center, California Institute of Technology, Pasadena, California 91125, USA;
| | - Luke Bashford
- Division of Biology and Biological Engineering and Tianqiao & Chrissy Chen Brain-Machine Interface Center, California Institute of Technology, Pasadena, California 91125, USA;
| | - David Bjånes
- Division of Biology and Biological Engineering and Tianqiao & Chrissy Chen Brain-Machine Interface Center, California Institute of Technology, Pasadena, California 91125, USA;
| | - Spencer Kellis
- Division of Biology and Biological Engineering and Tianqiao & Chrissy Chen Brain-Machine Interface Center, California Institute of Technology, Pasadena, California 91125, USA;
- USC Neurorestoration Center, Keck School of Medicine of USC, Los Angeles, California 90033, USA
- Department of Neurological Surgery, Keck School of Medicine of USC, Los Angeles, California 90033, USA
| |
Collapse
|
34
|
Merrikhi Y, Kok MA, Carrasco A, Meredith MA, Lomber SG. Multisensory responses in a belt region of the dorsal auditory cortical pathway. Eur J Neurosci 2021; 55:589-610. [PMID: 34927294 DOI: 10.1111/ejn.15573] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2021] [Revised: 12/13/2021] [Accepted: 12/14/2021] [Indexed: 11/30/2022]
Abstract
A basic function of the cerebral cortex is to receive and integrate information from different sensory modalities into a comprehensive percept of the environment. Neurons that demonstrate multisensory convergence occur across the neocortex, but are especially prevalent in higher-order, association areas. However, a recent study of a cat higher-order auditory area, the dorsal zone (DZ) of auditory cortex, did not observe any multisensory features. Therefore, the goal of the present investigation was to address this conflict using recording and testing methodologies that are established for exposing and studying multisensory neuronal processing. Among the 482 neurons studied, we found that 76.6% were influenced by non-auditory stimuli. Of these neurons, 99% were affected by visual stimulation, but only 11% by somatosensory stimulation. Furthermore, a large proportion of the multisensory neurons showed integrated responses to multisensory stimulation, constituted a majority of the excitatory and inhibitory neurons encountered (as identified by the duration of their waveshape), and exhibited a distinct spatial distribution within DZ. These findings demonstrate that the dorsal zone of auditory cortex robustly exhibits multisensory properties and that the proportions of multisensory neurons encountered are consistent with those identified in other higher-order cortices.
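A generic criterion for flagging a unit as multisensory (not necessarily the exact statistics used in this study) is to test whether adding a non-auditory stimulus reliably changes the response to sound, for example with a rank-sum test over trials, as sketched below with synthetic counts.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(7)

def multisensory_influence(aud_counts, audvis_counts, alpha=0.05):
    """True if combined stimulation significantly changes the auditory response."""
    return ranksums(aud_counts, audvis_counts).pvalue < alpha

aud = rng.poisson(6, 40)        # auditory-alone trials (hypothetical)
audvis = rng.poisson(9, 40)     # auditory + visual trials (hypothetical)
print("multisensory influence detected:", bool(multisensory_influence(aud, audvis)))
```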
Collapse
Affiliation(s)
- Yaser Merrikhi
- Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
| | - Melanie A Kok
- Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
| | - Andres Carrasco
- Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
| | - M Alex Meredith
- Department of Anatomy and Neurobiology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia, USA
| | - Stephen G Lomber
- Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
| |
Collapse
|
35
|
Foster C, Sheng WA, Heed T, Ben Hamed S. The macaque ventral intraparietal area has expanded into three homologue human parietal areas. Prog Neurobiol 2021; 209:102185. [PMID: 34775040 DOI: 10.1016/j.pneurobio.2021.102185] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2021] [Revised: 10/27/2021] [Accepted: 11/05/2021] [Indexed: 10/19/2022]
Abstract
The macaque ventral intraparietal area (VIP) in the fundus of the intraparietal sulcus has been implicated in a diverse range of sensorimotor and cognitive functions such as motion processing, multisensory integration, processing of head peripersonal space, defensive behavior, and numerosity coding. Here, we exhaustively review macaque VIP function, cytoarchitectonics, and anatomical connectivity and integrate it with human studies that have attempted to identify a potential human VIP homologue. We show that human VIP research has consistently identified three, rather than one, bilateral parietal areas that each appear to subsume some, but not all, of the macaque area's functionality. Available evidence suggests that this human "VIP complex" has evolved as an expansion of the macaque area, but that some precursory specialization within macaque VIP has been previously overlooked. The three human areas are dominated, roughly, by coding the head or self in the environment, visual heading direction, and the peripersonal environment around the head, respectively. A unifying functional principle may be best described as prediction in space and time, linking VIP to state estimation as a key parietal sensorimotor function. VIP's expansive differentiation of head and self-related processing may have been key in the emergence of human bodily self-consciousness.
Collapse
Affiliation(s)
- Celia Foster
- Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany; Center of Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
| | - Wei-An Sheng
- Institut des Sciences Cognitives Marc Jeannerod, UMR5229, CNRS-University of Lyon 1, France
| | - Tobias Heed
- Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany; Center of Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany; Department of Psychology, University of Salzburg, Salzburg, Austria; Centre for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria.
| | - Suliann Ben Hamed
- Institut des Sciences Cognitives Marc Jeannerod, UMR5229, CNRS-University of Lyon 1, France.
| |
Collapse
|
36
|
Precision control for a flexible body representation. Neurosci Biobehav Rev 2021; 134:104401. [PMID: 34736884 DOI: 10.1016/j.neubiorev.2021.10.023] [Citation(s) in RCA: 40] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/21/2021] [Revised: 10/20/2021] [Accepted: 10/21/2021] [Indexed: 11/24/2022]
Abstract
Adaptive body representation requires the continuous integration of multisensory inputs within a flexible 'body model' in the brain. The present review evaluates the idea that this flexibility is augmented by the contextual modulation of sensory processing 'top-down', which can be described as precision control within predictive coding formulations of Bayesian inference. Specifically, I focus on the proposal that an attenuation of proprioception may facilitate the integration of conflicting visual and proprioceptive bodily cues. Firstly, I review empirical work suggesting that the processing of visual vs proprioceptive body position information can be contextualised 'top-down', for instance by adopting specific attentional task sets. Building on this, I review research showing a similar contextualisation of visual vs proprioceptive information processing in the rubber hand illusion and in visuomotor adaptation. Together, the reviewed literature suggests that proprioception, despite its indisputable importance for body perception and action control, can be attenuated top-down (through precision control) to facilitate the contextual adaptation of the brain's body model to novel visual feedback.
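The precision-control idea can be made concrete with the standard reliability-weighted fusion rule: lowering the precision of proprioception pulls the fused hand-position estimate toward vision. The numbers below are purely illustrative.

```python
def fuse(x_vis, sigma_vis, x_prop, sigma_prop):
    """Reliability-weighted combination of visual and proprioceptive position estimates."""
    w_vis = sigma_vis ** -2 / (sigma_vis ** -2 + sigma_prop ** -2)
    return w_vis * x_vis + (1.0 - w_vis) * x_prop

x_vis, x_prop = 10.0, 0.0   # cm; a 10 cm visuo-proprioceptive conflict
print("baseline proprioceptive precision:  ", fuse(x_vis, 1.0, x_prop, 1.0))   # 5.0 cm
print("attenuated proprioceptive precision:", fuse(x_vis, 1.0, x_prop, 3.0))   # 9.0 cm, closer to vision
```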
Collapse
|
37
|
Spadone S, Perrucci MG, Di Cosmo G, Costantini M, Della Penna S, Ferri F. Frontal and parietal background connectivity and their dynamic changes account for individual differences in the multisensory representation of peripersonal space. Sci Rep 2021; 11:20533. [PMID: 34654814 PMCID: PMC8520015 DOI: 10.1038/s41598-021-00048-5] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2021] [Accepted: 10/05/2021] [Indexed: 11/22/2022] Open
Abstract
Functional connectivity (FC) of brain networks dynamically fluctuates during both rest and task execution. Individual differences in dynamic FC have been associated with several cognitive and behavioral traits. However, whether dynamic FC also contributes to sensorimotor representations guiding body-environment interactions, such as the representation of peripersonal space (PPS), is currently unknown. PPS is the space immediately surrounding the body and acts as a multisensory interface between the individual and the environment. We used an audio-tactile task with approaching sounds to map the individual PPS extension, and fMRI to estimate the background FC. Specifically, we analyzed FC values for each stimulus type (near and far space) and its across-trial variability. FC was evaluated between task-relevant nodes of two fronto-parietal networks (the Dorsal Attention Network, DAN, and the Fronto-Parietal Network, FPN) and a key PPS region in the premotor cortex (PM). PM was significantly connected to specific task-relevant nodes of the DAN and the FPN during the audio-tactile task, and FC was stronger while processing near space, as compared to far space. At the individual level, less PPS extension was associated with stronger premotor-parietal FC during processing of near space, while the across-trial variability of premotor-parietal and premotor-frontal FC was higher during the processing of far space. Notably, only across-trial FC variability captured the near-far modulation of space processing. Our findings indicate that PM connectivity with task-relevant frontal and parietal regions and its dynamic changes participate in the mechanisms that enable PPS representation, in agreement with the idea that neural variability plays a crucial role in plastic and dynamic sensorimotor representations.
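Individual PPS extension in audio-tactile tasks of this kind is commonly estimated by fitting a sigmoid to tactile responses as a function of the sound's distance and taking its central point as the boundary. The sketch below fits such a curve to simulated reaction times; the distances, latencies and fitting choices are assumptions, not the study's values.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, rt_near, rt_far, d_center, slope):
    """Reaction time as a function of sound distance d (cm)."""
    return rt_near + (rt_far - rt_near) / (1.0 + np.exp(-(d - d_center) / slope))

rng = np.random.default_rng(8)
distance = np.tile([10.0, 25.0, 40.0, 55.0, 70.0, 85.0], 20)               # cm
rt = sigmoid(distance, 350.0, 420.0, 45.0, 8.0) + rng.normal(0, 15, distance.size)

params, _ = curve_fit(sigmoid, distance, rt, p0=[350.0, 420.0, 50.0, 10.0])
print(f"estimated PPS boundary ≈ {params[2]:.1f} cm")
```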
Collapse
Affiliation(s)
- Sara Spadone
- Department of Neuroscience, Imaging and Clinical Sciences - and ITAB, Institute for Advanced Biomedical Technologies, G. d'Annunzio University of Chieti-Pescara, Chieti, Italy.
| | - Mauro Gianni Perrucci
- Department of Neuroscience, Imaging and Clinical Sciences - and ITAB, Institute for Advanced Biomedical Technologies, G. d'Annunzio University of Chieti-Pescara, Chieti, Italy
| | - Giulio Di Cosmo
- Department of Neuroscience, Imaging and Clinical Sciences - and ITAB, Institute for Advanced Biomedical Technologies, G. d'Annunzio University of Chieti-Pescara, Chieti, Italy
| | - Marcello Costantini
- Department of Psychological, Health and Territorial Sciences - and ITAB, Institute for Advanced Biomedical Technologies, G. d'Annunzio University of Chieti-Pescara, Chieti, Italy
| | - Stefania Della Penna
- Department of Neuroscience, Imaging and Clinical Sciences - and ITAB, Institute for Advanced Biomedical Technologies, G. d'Annunzio University of Chieti-Pescara, Chieti, Italy
| | - Francesca Ferri
- Department of Neuroscience, Imaging and Clinical Sciences - and ITAB, Institute for Advanced Biomedical Technologies, G. d'Annunzio University of Chieti-Pescara, Chieti, Italy
| |
Collapse
|
38
|
Vestibular Stimulation May Drive Multisensory Processing: Principles for Targeted Sensorimotor Therapy (TSMT). Brain Sci 2021; 11:brainsci11081111. [PMID: 34439730 PMCID: PMC8393350 DOI: 10.3390/brainsci11081111] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2021] [Revised: 08/20/2021] [Accepted: 08/20/2021] [Indexed: 12/01/2022] Open
Abstract
At birth, the vestibular system is fully mature, whilst higher order sensory processing is yet to develop in the full-term neonate. The current paper lays out a theoretical framework to account for the role vestibular stimulation may have in driving multisensory and sensorimotor integration. Accordingly, vestibular stimulation, by activating the parieto-insular vestibular cortex and/or the posterior parietal cortex, may provide the cortical input for multisensory neurons in the superior colliculus that is needed for multisensory processing. Furthermore, we propose that motor development, by inducing a change of reference frames, may shape the receptive fields of multisensory neurons. This, by leading to a lack of spatial contingency between formerly contingent stimuli, may cause degradation of prior motor responses. Additionally, we offer a testable hypothesis explaining the beneficial effect of sensory integration therapies regarding attentional processes. Key concepts of a sensorimotor integration therapy (e.g., targeted sensorimotor therapy (TSMT)) are also put into a neurological context. TSMT utilizes specific tools and instruments. It is administered in successive 8-week-long treatment regimens, each gradually increasing vestibular and postural stimulation, so that sensorimotor integration is facilitated and muscle strength is increased. Empirically, TSMT is indicated for various diseases. The theoretical foundations of this sensorimotor therapy are discussed.
Collapse
|
39
|
Russ BE, Petkov CI, Kwok SC, Zhu Q, Belin P, Vanduffel W, Hamed SB. Common functional localizers to enhance NHP & cross-species neuroscience imaging research. Neuroimage 2021; 237:118203. [PMID: 34048898 PMCID: PMC8529529 DOI: 10.1016/j.neuroimage.2021.118203] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2020] [Revised: 05/15/2021] [Accepted: 05/24/2021] [Indexed: 11/25/2022] Open
Abstract
Functional localizers are invaluable as they can help define regions of interest, provide cross-study comparisons, and, most importantly, allow for the aggregation and meta-analysis of data across studies and laboratories. To achieve these goals within the non-human primate (NHP) imaging community, there is a pressing need for the use of standardized and validated localizers that can be readily implemented across different groups. The goal of this paper is to provide an overview of the value of localizer protocols to imaging research, and we describe a number of commonly used or novel localizers for NHPs as well as keys to implementing them across studies. As has been shown with the aggregation of resting-state imaging data in the original PRIME-DE submissions, we believe that the field is ready to apply the same initiative to task-based functional localizers in NHP imaging. By coming together to collect large datasets across research groups, implementing the same functional localizers, and sharing the localizers and data via PRIME-DE, it is now possible to fully test their robustness, selectivity and specificity. To do this, we reviewed a number of common localizers and created a repository of well-established localizers that are easily accessible and can be implemented through the PRIME-RE platform.
Collapse
Affiliation(s)
- Brian E Russ
- Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute, Orangeburg, NY, United States; Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York City, NY, United States; Department of Psychiatry, New York University at Langone, New York City, NY, United States.
| | - Christopher I Petkov
- Biosciences Institute, Newcastle University Medical School, Newcastle upon Tyne, United Kingdom
| | - Sze Chai Kwok
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, Shanghai Key Laboratory of Magnetic Resonance, Affiliated Mental Health Center (ECNU), School of Psychology and Cognitive Science, East China Normal University, Shanghai, China; Division of Natural and Applied Sciences, Duke Kunshan University, Kunshan, Jiangsu, China; NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China
| | - Qi Zhu
- Cognitive Neuroimaging Unit, INSERM, CEA, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France; Laboratory for Neuro-and Psychophysiology, Department of Neurosciences, KU Leuven Medical School, Leuven, 3000, Belgium
| | - Pascal Belin
- Institut de Neurosciences de La Timone, Aix-Marseille Université et CNRS, Marseille, 13005, France
| | - Wim Vanduffel
- Laboratory for Neuro-and Psychophysiology, Department of Neurosciences, KU Leuven Medical School, Leuven, 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven, 3000, Belgium; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, MA 02129, United States; Department of Radiology, Harvard Medical School, Boston, MA 02144, United States.
| | - Suliann Ben Hamed
- Institut des Sciences Cognitives Marc Jeannerod, UMR 5229, Université de Lyon - CNRS, France.
| |
Collapse
|
40
|
Lohse M, Dahmen JC, Bajo VM, King AJ. Subcortical circuits mediate communication between primary sensory cortical areas in mice. Nat Commun 2021; 12:3916. [PMID: 34168153 PMCID: PMC8225818 DOI: 10.1038/s41467-021-24200-x] [Citation(s) in RCA: 33] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2020] [Accepted: 06/02/2021] [Indexed: 12/20/2022] Open
Abstract
Integration of information across the senses is critical for perception and is a common property of neurons in the cerebral cortex, where it is thought to arise primarily from corticocortical connections. Much less is known about the role of subcortical circuits in shaping the multisensory properties of cortical neurons. We show that stimulation of the whiskers causes widespread suppression of sound-evoked activity in mouse primary auditory cortex (A1). This suppression depends on the primary somatosensory cortex (S1), and is implemented through a descending circuit that links S1, via the auditory midbrain, with thalamic neurons that project to A1. Furthermore, a direct pathway from S1 has a facilitatory effect on auditory responses in higher-order thalamic nuclei that project to other brain areas. Crossmodal corticofugal projections to the auditory midbrain and thalamus therefore play a pivotal role in integrating multisensory signals and in enabling communication between different sensory cortical areas.
Collapse
Affiliation(s)
- Michael Lohse
- Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK.
- Sainsbury Wellcome Centre, London, UK.
| | - Johannes C Dahmen
- Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
| | - Victoria M Bajo
- Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
| | - Andrew J King
- Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK.
| |
Collapse
|
41
|
Bahadori M, Cesari P. Affective sounds entering the peripersonal space influence the whole-body action preparation. Neuropsychologia 2021; 159:107917. [PMID: 34153305 DOI: 10.1016/j.neuropsychologia.2021.107917] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2021] [Revised: 06/11/2021] [Accepted: 06/15/2021] [Indexed: 10/21/2022]
Abstract
The peripersonal space (PPS), the space surrounding us, has an enhanced multisensory-motor representation in the brain. In this study, we investigated how approaching sounds stopping at different distances within the PPS, and carrying emotional content (positive, negative, and neutral), modulate the preparation of a whole-body action (a step). Premotor reaction times, measured by means of anticipatory forces and muscular activations, captured action preparation; the kinematics of stepping defined action performance; and for each stimulus, the individually perceived level of arousal and valence was evaluated. In general, we found faster premotor reactions for closer sounds compared with farther ones, with the fastest reactions to the neutral sound at each distance. We attribute this facilitation for the neutral sound to the large frequency spectrum of the stimulus and the absence of an affective component or semantic content to decode. Interestingly, while at the close distance no difference was found between positive and negative emotional stimuli, at the far distance reactions were faster for negative than for positive sounds, indicating that when arousal is less enhanced individuals are able to differentiate the emotional content of a sound. The kinematics observed after action initiation supported the anticipatory results, showing that larger steps were performed when reacting to close sounds, which were perceived as more arousing, than to far sounds, and this was particularly the case for neutral and negative sounds. Altogether, the results show that action preparation is influenced by the vicinity and the valence of looming auditory stimuli. Discriminating stimulus valence requires a certain distance, still within the PPS; when stimuli are too close to the body, valence discrimination is not performed.
Collapse
Affiliation(s)
- Mehrdad Bahadori
- Department of Neurosciences, Biomedicine & Movement Sciences, University of Verona, Verona, Italy
| | - Paola Cesari
- Department of Neurosciences, Biomedicine & Movement Sciences, University of Verona, Verona, Italy.
| |
Collapse
|
42
|
Bogdanova OV, Bogdanov VB, Dureux A, Farnè A, Hadj-Bouziane F. The Peripersonal Space in a social world. Cortex 2021; 142:28-46. [PMID: 34174722 DOI: 10.1016/j.cortex.2021.05.005] [Citation(s) in RCA: 22] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/26/2020] [Revised: 02/27/2021] [Accepted: 05/19/2021] [Indexed: 11/27/2022]
Abstract
The PeriPersonal Space (PPS) has been defined as the space surrounding the body, where physical interactions with elements of the environment take place. As our world is social in nature, recent evidence has revealed complex modulations of PPS representation by social factors. In light of the growing interest in the field, in this review we take a close look at the experimental approaches undertaken to assess the impact of social factors on PPS representation. Our social world also influences the personal space (PS), a concept stemming from social psychology, defined as the space we keep between us and others to avoid discomfort. Here we analytically compare PPS and PS with the aim of understanding if and how they relate to each other. At the behavioral level, the multiplicity of experimental methodologies, whether well-established or novel, leads to somewhat divergent results and interpretations. Beyond behavior, we review physiological and neural signatures of PPS representation to discuss how interoceptive signals could contribute to PPS representation, as well as how these internal signals could shape its neural responses. In particular, by merging exteroceptive information from the environment and internal signals that come from the body, PPS may promote an integrated representation of the self, as distinct from the environment and from others. We put forward that integrating internal and external signals in the brain for the perception of proximal environmental stimuli may also provide us with a better understanding of the processes at play during social interactions. Adopting such an integrative stance may offer novel insights about PPS representation in a social world. Finally, we discuss possible links between PPS research and social cognition, which may contribute to the understanding of the intentions and feelings of others around us and promote appropriate social interactions.
Collapse
Affiliation(s)
- Olena V Bogdanova
- Integrative Multisensory Perception Action & Cognition Team (Impact), INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center (CRNL), Lyon, France; University of Lyon 1, France; INCIA, UMR 5287, CNRS, Université de Bordeaux, France.
| | - Volodymyr B Bogdanov
- Integrative Multisensory Perception Action & Cognition Team (Impact), INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center (CRNL), Lyon, France; University of Lyon 1, France; Ecole Nationale des Travaux Publics de l'Etat, Laboratoire Génie Civil et Bâtiment, Vaulx-en-Velin, France
| | - Audrey Dureux
- Integrative Multisensory Perception Action & Cognition Team (Impact), INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center (CRNL), Lyon, France; University of Lyon 1, France
| | - Alessandro Farnè
- Integrative Multisensory Perception Action & Cognition Team (Impact), INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center (CRNL), Lyon, France; University of Lyon 1, France; Hospices Civils de Lyon, Neuro-Immersion Platform, Lyon, France; Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy
| | - Fadila Hadj-Bouziane
- Integrative Multisensory Perception Action & Cognition Team (Impact), INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center (CRNL), Lyon, France; University of Lyon 1, France.
| |
Collapse
|
43
|
Churan J, Kaminiarz A, Schwenk JCB, Bremmer F. Action-dependent processing of self-motion in parietal cortex of macaque monkeys. J Neurophysiol 2021; 125:2432-2443. [PMID: 34010579 DOI: 10.1152/jn.00049.2021] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Successful interaction with the environment requires the dissociation of self-induced from externally induced sensory stimulation. Temporal proximity of action and effect is often used as an indicator of whether an observed event should be interpreted as a result of one's own actions or not. We tested how the delay between an action (press of a touch bar) and an effect (onset of simulated self-motion) influences the processing of visually simulated self-motion in the ventral intraparietal area (VIP) of macaque monkeys. We found that a delay between the action and the start of the self-motion stimulus led to a rise of activity above baseline before motion onset in a subpopulation of 21% of the investigated neurons. In the responses to the stimulus, we found significantly lower sustained activity when the press of the touch bar and motion onset were contiguous than when motion onset was delayed. We speculate that this weak inhibitory effect might be part of a mechanism that sharpens the tuning of VIP neurons during self-induced motion and thus has the potential to increase the precision of the heading information required to adjust the orientation of self-motion in everyday navigational tasks. NEW & NOTEWORTHY Neurons in macaque ventral intraparietal area (VIP) respond to sensory stimulation related to self-motion, e.g., visual optic flow. Here, we found that self-motion-induced activation depends on the sense of agency, i.e., it differed depending on whether optic flow was perceived as self-induced or externally induced. This demonstrates that area VIP is well suited for studying the interplay between active behavior and sensory processing during self-motion.
Affiliation(s)
- Jan Churan
- Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
| | - Andre Kaminiarz
- Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
| | - Jakob C B Schwenk
- Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
| | - Frank Bremmer
- Department of Neurophysics, Philipps-Universität Marburg, Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
| |
|
44
|
VanGilder P, Shi Y, Apker G, Buneo CA. Sensory feedback-dependent coding of arm position in local field potentials of the posterior parietal cortex. Sci Rep 2021; 11:9060. [PMID: 33907213 PMCID: PMC8079385 DOI: 10.1038/s41598-021-88278-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/13/2020] [Accepted: 04/06/2021] [Indexed: 11/19/2022] Open
Abstract
Although multisensory integration is crucial for sensorimotor function, it is unclear how visual and proprioceptive sensory cues are combined in the brain during motor behaviors. Here we characterized the effects of multisensory interactions on local field potential (LFP) activity obtained from the superior parietal lobule (SPL) as non-human primates performed a reaching task with either unimodal (proprioceptive) or bimodal (visual-proprioceptive) sensory feedback. Based on previous analyses of spiking activity, we hypothesized that evoked LFP responses would be tuned to arm location but would be suppressed on bimodal trials, relative to unimodal trials. We also expected to see a substantial number of recording sites with enhanced beta band spectral power for only one set of feedback conditions (e.g. unimodal or bimodal), as was previously observed for spiking activity. We found that evoked activity and beta band power were tuned to arm location at many individual sites, though this tuning often differed between unimodal and bimodal trials. Across the population, both evoked and beta activity were consistent with feedback-dependent tuning to arm location, while beta band activity also showed evidence of response suppression on bimodal trials. The results suggest that multisensory interactions can alter the tuning and gain of arm position-related LFP activity in the SPL.
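To make the analysis pipeline concrete, here is a minimal sketch (Python; synthetic LFP traces, with an assumed sampling rate, a beta band taken as 13-30 Hz, and trial counts that are not from the paper) that estimates single-trial beta-band power with Welch's method and tests for tuning across arm locations with a one-way ANOVA:

# Illustrative only: synthetic LFP and assumed parameters (not from VanGilder et al.).
import numpy as np
from scipy.signal import welch
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
fs = 1000                                  # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
locations = [0, 1, 2]                      # three hypothetical arm locations
beta_amp = {0: 1.0, 1: 1.6, 2: 2.2}        # beta amplitude varies with location ("tuning")

def beta_power(trace, fs, band=(13, 30)):
    # Mean Welch power within the beta band.
    f, pxx = welch(trace, fs=fs, nperseg=256)
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

powers = {loc: [] for loc in locations}
for loc in locations:
    for _ in range(30):                    # 30 synthetic trials per location
        lfp = (beta_amp[loc] * np.sin(2 * np.pi * 20 * t + rng.uniform(0, 2 * np.pi))
               + rng.normal(0, 1.0, t.size))   # 20-Hz component plus noise
        powers[loc].append(beta_power(lfp, fs))

F, p = f_oneway(*[powers[loc] for loc in locations])
print(f"one-way ANOVA of beta power across arm locations: F = {F:.1f}, p = {p:.2g}")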
Affiliation(s)
- Paul VanGilder
- School of Biological and Health Systems Engineering, Arizona State University, P.O. Box 879709, Tempe, AZ, 85287-9709, USA
| | - Ying Shi
- School of Biological and Health Systems Engineering, Arizona State University, P.O. Box 879709, Tempe, AZ, 85287-9709, USA
| | - Gregory Apker
- School of Biological and Health Systems Engineering, Arizona State University, P.O. Box 879709, Tempe, AZ, 85287-9709, USA
| | - Christopher A Buneo
- School of Biological and Health Systems Engineering, Arizona State University, P.O. Box 879709, Tempe, AZ, 85287-9709, USA.
| |
|
45
|
Chancel M, Hasenack B, Ehrsson HH. Integration of predictions and afferent signals in body ownership. Cognition 2021; 212:104722. [PMID: 33865046 DOI: 10.1016/j.cognition.2021.104722] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2020] [Revised: 04/03/2021] [Accepted: 04/05/2021] [Indexed: 10/21/2022]
Abstract
We investigated how sensory predictions triggered by the sight of an object moving towards the body contribute to the sense of body ownership. We used a recently developed psychophysical discrimination task to assess body ownership in the rubber hand illusion. In this task, participants had to choose which of two right rubber hands in view felt most like their own, and the ownership discriminations were fitted with psychometric curves. In the current study, we occluded vision of the object moving towards one of the rubber hands during the first two-thirds of its path and revealed only the final third of the trajectory, when the object touched the rubber hand (approach-occluded condition). In another condition, we occluded only the final part, so that most of the movement towards the model hand remained visible (touch-occluded condition). We compared these two conditions with a baseline illusion condition in which the object was visible throughout its trajectory and at contact (no occlusion). The touch-occluded condition produced hand ownership as strong as the no-occlusion baseline, whereas ownership was significantly reduced when vision of the object approaching the rubber hand was occluded (approach-occluded condition). Our results show that tactile predictions generated from seeing an object moving towards the body are temporally precise and contribute to the rubber hand illusion by integrating with temporally congruent afferent sensory signals. This finding highlights the importance of multisensory predictions in peripersonal space, object permanence, and the interplay between bottom-up sensory signals and top-down predictions in body ownership.
Affiliation(s)
- Marie Chancel
- Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institutet, Sweden.
| | - Birgit Hasenack
- Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institutet, Sweden; Department of Psychology, University of Amsterdam, the Netherlands
| | - H Henrik Ehrsson
- Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institutet, Sweden
| |
|
46
|
Phase-coupling of neural oscillations contributes to individual differences in peripersonal space. Neuropsychologia 2021; 156:107823. [PMID: 33705822 DOI: 10.1016/j.neuropsychologia.2021.107823] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/29/2020] [Revised: 03/02/2021] [Accepted: 03/04/2021] [Indexed: 11/23/2022]
Abstract
The peripersonal space (PPS) is a multisensory and sensorimotor interface between our body and the environment. The location of the PPS boundary is not fixed; rather, it adapts to the environmental context and differs greatly across individuals. Recent studies have started to unveil the neural correlates of individual differences in PPS extension; however, the picture is not yet clear. Here, we used approaching auditory stimuli and magnetoencephalography to capture the individual boundary of the PPS and examine its neural underpinnings. In particular, building on previous studies from our own group, we investigated the possible contribution of an intrinsic feature of the brain, namely "resting-state" functional connectivity, to individual differences in PPS extension, as well as the frequency specificity of this contribution. Specifically, we focused on activity synchronized to the premotor cortex, where multisensory neurons encoding the PPS have been described. Results showed that the stronger the connectivity between the left premotor cortex (lPM) and a set of fronto-parietal and sensorimotor regions in the right and left hemispheres, the wider the extension of the PPS. Strikingly, such a correlation was observed only in the beta-frequency band. Overall, our results suggest that the individual extension of the PPS is coded in spatially and spectrally specific resting-state functional links.
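For illustration only, the sketch below (Python; fully synthetic data, with an assumed beta band, subject count, and a phase-locking value as the coupling measure, none of which reproduce the authors' MEG pipeline) shows the logic of relating a beta-band phase-coupling estimate between a premotor seed and another region to an individual PPS-extension score:

# Illustrative only: synthetic signals and scores, assumed parameters.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
fs = 250                                         # assumed sampling rate (Hz)
b, a = butter(4, [13, 30], btype="bandpass", fs=fs)

def beta_plv(x, y):
    # Phase-locking value between two signals restricted to the beta band.
    px = np.angle(hilbert(filtfilt(b, a, x)))
    py = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

n_subjects = 20
plv, pps_extension = [], []
for s in range(n_subjects):
    coupling = rng.uniform(0.0, 1.0)             # latent coupling strength for this subject
    common = rng.normal(0, 1, fs * 10)           # shared drive (10 s of data)
    seed = common + rng.normal(0, 1, common.size)                                   # "premotor" signal
    target = coupling * common + (1 - coupling) * rng.normal(0, 1, common.size)     # "parietal" signal
    plv.append(beta_plv(seed, target))
    pps_extension.append(60 + 40 * coupling + rng.normal(0, 5))                     # fake PPS boundary (cm)

rho, p = spearmanr(plv, pps_extension)
print(f"Spearman rho between beta PLV and PPS extension: {rho:.2f} (p = {p:.3f})")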
|
47
|
A multisensory perspective onto primate pulvinar functions. Neurosci Biobehav Rev 2021; 125:231-243. [PMID: 33662442 DOI: 10.1016/j.neubiorev.2021.02.043] [Citation(s) in RCA: 52] [Impact Index Per Article: 13.0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2020] [Revised: 02/18/2021] [Accepted: 02/25/2021] [Indexed: 02/08/2023]
Abstract
Perception in ambiguous environments relies on the combination of sensory information from various sources. Most associative and primary sensory cortical areas are involved in this active multisensory integration process; as a result, the entire cortex appears heavily multisensory. In this review, we focus on the contribution of the pulvinar to multisensory integration. This subcortical thalamic nucleus plays a central role in visual detection and selection on a fast time scale, as well as in the regulation of visual processes on a much slower time scale. The pulvinar is also densely connected to cortical areas involved in multisensory integration, yet little is known about its multisensory properties and its contribution to multisensory perception. Here, we review the anatomical and functional organization of multisensory input to the pulvinar. We describe how visual, auditory, somatosensory, pain, proprioceptive, and olfactory projections are differentially organized across the main subdivisions of the pulvinar, and we show that topography is central to the organization of this complex nucleus. We propose that the pulvinar combines multiple sources of sensory information to enhance fast responses to the environment, while also acting as a general regulation hub for adaptive and flexible cognition.
|
48
|
Fanghella M, Era V, Candidi M. Interpersonal Motor Interactions Shape Multisensory Representations of the Peripersonal Space. Brain Sci 2021; 11:255. [PMID: 33669561 PMCID: PMC7922994 DOI: 10.3390/brainsci11020255] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2021] [Revised: 02/11/2021] [Accepted: 02/12/2021] [Indexed: 02/07/2023] Open
Abstract
This perspective review focuses on the proposal that predictive multisensory integration occurring in one's peripersonal space (PPS) supports individuals' ability to efficiently interact with others, and that integrating sensorimotor signals from the interacting partners leads to the emergence of a shared representation of the PPS. To support this proposal, we first introduce the features of body and PPS representations that are relevant for interpersonal motor interactions. Then, we highlight the role of action planning and execution on the dynamic expansion of the PPS. We continue by presenting evidence of PPS modulations after tool use and review studies suggesting that PPS expansions may be accounted for by Bayesian sensory filtering through predictive coding. In the central section, we describe how this conceptual framework can be used to explain the mechanisms through which the PPS may be modulated by the actions of our interaction partner, in order to facilitate interpersonal coordination. Last, we discuss how this proposal may support recent evidence concerning PPS rigidity in Autism Spectrum Disorder (ASD) and its possible relationship with ASD individuals' difficulties during interpersonal coordination. Future studies will need to clarify the mechanisms and neural underpinning of these dynamic, interpersonal modulations of the PPS.
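The Bayesian filtering idea mentioned above can be made concrete with the standard precision-weighted cue-combination rule for two independent Gaussian estimates, for example a visual and a tactile estimate of where an approaching object will contact the body; the numbers below are invented purely for illustration:

# Minimal precision-weighted (reliability-weighted) fusion of two Gaussian cues.
# Values are illustrative only; they do not come from the reviewed studies.
mu_vis, var_vis = 30.0, 4.0    # visual estimate of contact location (cm) and its variance
mu_tac, var_tac = 34.0, 9.0    # tactile/proprioceptive estimate and its variance

w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_tac)   # weight = relative precision
w_tac = 1 - w_vis

mu_combined = w_vis * mu_vis + w_tac * mu_tac
var_combined = 1 / (1 / var_vis + 1 / var_tac)        # fused variance is smaller than either cue's

print(f"combined estimate = {mu_combined:.1f} cm, variance = {var_combined:.1f}")
# The fused estimate lies closer to the more reliable (visual) cue, mirroring how
# predictive multisensory integration can sharpen estimates of events in the PPS.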
Affiliation(s)
- Martina Fanghella
- Department of Psychology, Sapienza University, 00185 Rome, Italy; (M.F.); (V.E.)
- IRCCS Fondazione Santa Lucia, 00179 Rome, Italy
- Department of Psychology, University of London, London EC1V 0HB, UK
| | - Vanessa Era
- Department of Psychology, Sapienza University, 00185 Rome, Italy; (M.F.); (V.E.)
- IRCCS Fondazione Santa Lucia, 00179 Rome, Italy
| | - Matteo Candidi
- Department of Psychology, Sapienza University, 00185 Rome, Italy; (M.F.); (V.E.)
- IRCCS Fondazione Santa Lucia, 00179 Rome, Italy
| |
|
49
|
Chivukula S, Zhang CY, Aflalo T, Jafari M, Pejsa K, Pouratian N, Andersen RA. Neural encoding of actual and imagined touch within human posterior parietal cortex. eLife 2021; 10:61646. [PMID: 33647233 PMCID: PMC7924956 DOI: 10.7554/elife.61646] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/31/2020] [Accepted: 02/08/2021] [Indexed: 12/27/2022] Open
Abstract
In the human posterior parietal cortex (PPC), single units encode high-dimensional information with partially mixed representations that enable small populations of neurons to encode many variables relevant to movement planning, execution, cognition, and perception. Here, we test whether a PPC neuronal population previously demonstrated to encode visual and motor information is similarly engaged in the somatosensory domain. We recorded neurons within the PPC of a human clinical trial participant during actual touch presentation and during a tactile imagery task. Neurons encoded actual touch at short latency with bilateral receptive fields, organized by body part, and covered all tested regions. The tactile imagery task evoked body part-specific responses that shared a neural substrate with actual touch. Our results are the first neuron-level evidence of touch encoding in human PPC and its cognitive engagement during a tactile imagery task, which may reflect semantic processing, attention, sensory anticipation, or imagined touch.
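As a schematic of the kind of population-level readout such recordings afford (not the authors' analysis), the sketch below trains a cross-validated linear classifier on synthetic single-unit firing rates to decode which body part was touched; the unit counts, trial counts, and body-part labels are assumptions:

# Illustrative only: synthetic population responses, assumed dimensions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_units, n_trials_per_part = 60, 40
body_parts = ["cheek", "shoulder", "hand"]          # hypothetical tested sites

X, y = [], []
for label, part in enumerate(body_parts):
    tuning = rng.normal(0, 1, n_units)              # each unit's preference for this body part
    for _ in range(n_trials_per_part):
        X.append(5 + 2 * tuning + rng.normal(0, 1.5, n_units))  # trial firing rates
        y.append(label)
X, y = np.array(X), np.array(y)

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=5)              # 5-fold cross-validated decoding accuracy
print(f"decoding accuracy: {acc.mean():.2f} (chance = {1 / len(body_parts):.2f})")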
Affiliation(s)
- Srinivas Chivukula
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States; Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
| | - Carey Y Zhang
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States
| | - Tyson Aflalo
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States
| | - Matiar Jafari
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States; Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
| | - Kelsie Pejsa
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States
| | - Nader Pouratian
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States; Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
| | - Richard A Andersen
- Department of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States; Tianqiao and Chrissy Chen Brain-Machine Interface Center, Chen Institute for Neuroscience, California Institute of Technology, Pasadena, United States
| |
|
50
|
Chancel M, Ehrsson HH. Which hand is mine? Discriminating body ownership perception in a two-alternative forced-choice task. Atten Percept Psychophys 2020; 82:4058-4083. [PMID: 32856222 PMCID: PMC7593318 DOI: 10.3758/s13414-020-02107-x] [Citation(s) in RCA: 33] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022]
Abstract
The experience of one's body as one's own is referred to as the sense of body ownership. This central part of human conscious experience determines the boundary between the self and the external environment, a crucial distinction in perception, action, and cognition. Although body ownership is known to involve the integration of signals from multiple sensory modalities, including vision, touch, and proprioception, little is known about the principles that determine this integration process, and the relationship between body ownership and perception is unclear. These uncertainties stem from the lack of a sensitive and rigorous method to quantify body ownership. Here, we describe a two-alternative forced-choice discrimination task that allows precise and direct measurement of body ownership as participants decide which of two rubber hands feels more like their own in a version of the rubber hand illusion. In two experiments, we show that the temporal and spatial congruence principles of multisensory stimulation, which determine ownership discrimination, impose tighter constraints than previously thought and that texture congruence constitutes an additional principle; these findings are compatible with theoretical models of multisensory integration. Taken together, our results suggest that body ownership constitutes a genuine perceptual multisensory phenomenon that can be quantified with psychophysics in discrimination experiments.
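To show how such 2AFC discrimination data are typically summarized, here is a minimal sketch (Python; the asynchrony values and choice proportions are invented for illustration and do not come from the study) that fits a cumulative-Gaussian psychometric function to the proportion of trials on which the comparison rubber hand was chosen as one's own:

# Illustrative only: invented 2AFC data, not from Chancel & Ehrsson (2020).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Difference in visuotactile asynchrony between the two rubber hands (other hand
# minus comparison hand, in ms): positive values mean the comparison hand received
# the more synchronous stroking and should be chosen as "one's own" more often.
asynchrony_diff = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
p_chose_comparison = np.array([0.08, 0.15, 0.35, 0.55, 0.70, 0.88, 0.93])

def psychometric(x, mu, sigma):
    # Cumulative Gaussian: mu = point of subjective equality, sigma ~ discrimination threshold.
    return norm.cdf(x, loc=mu, scale=sigma)

(mu_hat, sigma_hat), _ = curve_fit(psychometric, asynchrony_diff, p_chose_comparison,
                                   p0=(0.0, 100.0))
print(f"PSE = {mu_hat:.0f} ms, sigma = {sigma_hat:.0f} ms")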
Affiliation(s)
- Marie Chancel
- Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institute, SE-171 77, Stockholm, Sweden.
| | - H Henrik Ehrsson
- Department of Neuroscience, Brain, Body and Self Laboratory, Karolinska Institute, SE-171 77, Stockholm, Sweden
| |
|