1
Ma S, Zhou Y, Wan T, Ren Q, Yan J, Fan L, Yuan H, Chan M, Chai Y. Bioinspired In-Sensor Multimodal Fusion for Enhanced Spatial and Spatiotemporal Association. Nano Lett 2024; 24:7091-7099. [PMID: 38804877 DOI: 10.1021/acs.nanolett.4c01727] [Indexed: 05/29/2024]
Abstract
Multimodal perception can capture more precise and comprehensive information than unimodal approaches. However, current sensory systems typically merge multimodal signals at computing terminals after parallel processing and transmission, which risks losing spatial association information and requires time stamps to maintain temporal coherence for time-series data. Here we demonstrate bioinspired in-sensor multimodal fusion, which effectively enhances comprehensive perception and reduces data transfer between sensory terminals and computation units. By adopting floating-gate phototransistors with reconfigurable photoresponse plasticity, we realize agile spatial and spatiotemporal fusion under nonvolatile and volatile photoresponse modes. To achieve optimal spatial estimation, we integrate spatial information from visual-tactile signals. For dynamic events, we capture and fuse spatiotemporal information from visual-audio signals in real time, realizing a dance-music synchronization recognition task without a time-stamping process. This in-sensor multimodal fusion approach offers the potential to simplify multimodal integration systems, extending the in-sensor computing paradigm.
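The "optimal spatial estimation" from visual-tactile signals referenced in this abstract is conventionally modeled as reliability-weighted (maximum-likelihood) cue fusion. A minimal illustrative sketch of that standard model, not the authors' device-level implementation; the function name and readings are hypothetical:

```python
def fuse_estimates(mu_v, var_v, mu_t, var_t):
    """Maximum-likelihood fusion of two independent Gaussian position
    estimates (e.g. visual and tactile): each cue is weighted by its
    inverse variance, and the fused variance is never larger than
    either unimodal variance."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_t)  # visual weight
    mu = w_v * mu_v + (1.0 - w_v) * mu_t               # fused position
    var = 1.0 / (1.0 / var_v + 1.0 / var_t)            # fused variance
    return mu, var

# Hypothetical readings: vision reports 10.0 (var 1.0), touch 13.0 (var 2.0)
mu, var = fuse_estimates(10.0, 1.0, 13.0, 2.0)  # mu = 11.0, var ≈ 0.667
```

Because vision here is twice as reliable as touch, the fused estimate lands closer to the visual reading, and the fused variance falls below both unimodal variances.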
Affiliation(s)
- Sijie Ma
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Yue Zhou
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Tianqing Wan
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Qinqi Ren
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Jianmin Yan
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Lingwei Fan
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Huanmei Yuan
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong 999077, People's Republic of China
- Mansun Chan
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong 999077, People's Republic of China
- Yang Chai
- Department of Applied Physics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
- Joint Research Centre of Microelectronics, The Hong Kong Polytechnic University, Kowloon, Hong Kong 999077, People's Republic of China
2
Fitzek MP, Mecklenburg J, Overeem LH, Lange KS, Siebert A, Triller P, Neeb L, Dreier JP, Kondziella D, Reuter U, Raffaelli B. Alice in Wonderland Syndrome (AIWS): prevalence and characteristics in adults with migraine. J Neurol 2024. [PMID: 38822148 DOI: 10.1007/s00415-024-12471-5] [Received: 03/20/2024] [Revised: 05/17/2024] [Accepted: 05/22/2024] [Indexed: 06/02/2024]
Abstract
OBJECTIVE Alice in Wonderland Syndrome (AIWS) is a sensory disorder characterized by distorted somatosensory and/or visual perception. Additionally, distortion of time perception and symptoms of derealization/depersonalization may occur. AIWS is frequently associated with migraine. However, its prevalence and clinical characteristics remain poorly understood. Here, we investigated the prevalence and features of AIWS in individuals with migraine. We hypothesized that AIWS is more frequent in migraine patients with aura than in those without aura. METHODS This was a prospective cross-sectional cohort study conducted at a tertiary headache center. Participants with migraine filled out questionnaires providing details on demographics, headache, AIWS characteristics and the occurrence of transient visual phenomena such as fragmented vision. RESULTS Of 808 migraine patients, 133 individuals (16.5%, mean age 44.4 ± 13.3 years, 87% women) reported AIWS symptoms during their lifetime. Micro- and/or teleopsia (72.9%) were most frequent, followed by micro- and/or macrosomatognosia (49.6%) and macro- and/or pelopsia (38.3%), lasting on average half an hour. AIWS symptoms occurred in association with headache in 65.1% of individuals, and 53.7% had their first AIWS episode at the age of 18 years or earlier. Migraine patients with aura were more likely to report AIWS symptoms than those without aura (19.5% vs. 14.1%, p = 0.04). Participants with AIWS reported a higher incidence of 17 of the 22 investigated visual phenomena. CONCLUSION AIWS symptoms appear to be a common lifetime phenomenon in migraine patients. The correlation and clinical parallels between AIWS and migraine aura could indicate shared underlying pathomechanisms.
Affiliation(s)
- Mira P Fitzek
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Junior Clinician Scientist Program, Berlin Institute of Health at Charité (BIH), Berlin, Germany
- Jasper Mecklenburg
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Lucas H Overeem
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Kristin S Lange
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Clinician Scientist Program, Berlin Institute of Health at Charité (BIH), Berlin, Germany
- Anke Siebert
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Paul Triller
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Lars Neeb
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Helios Global Health, Berlin, Germany
- Jens P Dreier
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Center for Stroke Research, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany
- Department of Experimental Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Einstein Center for Neurosciences Berlin, Berlin, Germany
- Daniel Kondziella
- Department of Neurology, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
- Department of Clinical Medicine, University of Copenhagen, Copenhagen, Denmark
- Uwe Reuter
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Universitätsmedizin Greifswald, Greifswald, Germany
- Bianca Raffaelli
- Department of Neurology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117, Berlin, Germany
- Clinician Scientist Program, Berlin Institute of Health at Charité (BIH), Berlin, Germany
3
Klaffehn AL, Herbort O, Pfister R. The fusion point of temporal binding: Promises and perils of multisensory accounts. Cogn Psychol 2024; 151:101662. [PMID: 38772251 DOI: 10.1016/j.cogpsych.2024.101662] [Received: 11/07/2022] [Revised: 04/12/2024] [Accepted: 04/18/2024] [Indexed: 05/23/2024]
Abstract
Performing an action to initiate a consequence in the environment triggers the perceptual illusion of temporal binding: actions and their following effects are perceived to occur closer in time than they do outside the action-effect relationship. Here we ask whether temporal binding can be explained in terms of multisensory integration, by assuming either complete multisensory fusion or partial integration of the two events. We gathered two datasets featuring a wide range of action-effect delays, a key factor influencing integration. We then tested the fit of a computational model of multisensory integration, the statistically optimal cue integration (SOCI) model. Qualitative aspects of the data at the group level indeed followed the principles of a multisensory account. By contrast, quantitative evidence from a comprehensive model evaluation indicated that temporal binding cannot be reduced to multisensory integration. Rather, multisensory integration should be seen as one of several component processes underlying temporal binding at the individual level.
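The fusion-versus-partial-integration distinction the abstract draws can be illustrated with a toy scheme in which each event's perceived time is pulled toward a reliability-weighted fused time. This is only a hypothetical sketch of the idea, not the SOCI model's actual parameterization; the coupling parameter and all values are invented for illustration:

```python
def perceived_times(t_action, t_effect, var_a, var_e, coupling):
    """Toy partial-integration scheme: both events are pulled toward a
    reliability-weighted fused time. coupling = 1 gives full fusion
    (both events perceived at one moment); coupling = 0 gives no
    binding. Intermediate values compress the perceived interval."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_e)
    fused = w_a * t_action + (1.0 - w_a) * t_effect
    p_a = t_action + coupling * (fused - t_action)
    p_e = t_effect + coupling * (fused - t_effect)
    return p_a, p_e

# Hypothetical trial: keypress at 0 ms, tone at 250 ms, equal reliabilities
p_a, p_e = perceived_times(0.0, 250.0, 100.0, 100.0, coupling=0.5)
# the perceived action-effect interval shrinks from 250 ms to 125 ms
```

The "fusion point" of the title corresponds to coupling = 1, where the two events collapse onto a single perceived moment; the paper's argument is that real data fall short of this limit in ways a pure integration account cannot capture.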
Affiliation(s)
- Roland Pfister
- Trier University, Germany; Institute for Cognitive and Affective Neuroscience (ICAN), University of Trier, Germany
4
Mudrik L, Hirschhorn R, Korisky U. Taking consciousness for real: Increasing the ecological validity of the study of conscious vs. unconscious processes. Neuron 2024; 112:1642-1656. [PMID: 38653247 PMCID: PMC11100345 DOI: 10.1016/j.neuron.2024.03.031] [Received: 02/15/2024] [Revised: 03/23/2024] [Accepted: 03/29/2024] [Indexed: 04/25/2024]
Abstract
The study of consciousness has developed well-controlled, rigorous methods for manipulating and measuring consciousness. Yet, in the process, experimental paradigms have grown farther away from everyday conscious and unconscious processes, which raises concerns about ecological validity. In this review, we suggest that the field can benefit from adopting a more ecological approach, akin to other fields of cognitive science, where this approach has challenged existing hypotheses, yielded stronger effects, and enabled new research questions. We argue that such a move is critical for studying consciousness, where experimental paradigms tend to be artificial and small effect sizes are relatively prevalent. We identify three paths for doing so (changing the stimuli and experimental settings, changing the measures, and changing the research questions themselves) and review works that have already started implementing such approaches. While acknowledging the inherent challenges, we call for increasing ecological validity in consciousness studies.
Affiliation(s)
- Liad Mudrik
- School of Psychological Sciences, Tel Aviv University, Tel Aviv, Israel; Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Rony Hirschhorn
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Uri Korisky
- School of Psychological Sciences, Tel Aviv University, Tel Aviv, Israel
5
Davies-Owen J, Roberts H, Scott M, Thomas A, Sen S, Sethna S, Roberts C, Giesbrecht T, Fallon N. Beauty is in the nose of the beholder: Fragrance modulates attractiveness, confidence and femininity ratings and neural responses to faces of self and others. Behav Brain Res 2024; 465:114932. [PMID: 38437921 DOI: 10.1016/j.bbr.2024.114932] [Received: 08/09/2023] [Revised: 02/27/2024] [Accepted: 02/28/2024] [Indexed: 03/06/2024]
Abstract
Previous research has investigated the cross-modal influence of olfactory stimuli on the perception and evaluation of faces. However, little is known about the neural dynamics underpinning this multisensory perception, and no research has examined perception of images of oneself and others in the presence of fragrances. This study investigated the neural mechanisms of olfactory-visual processing using electroencephalography (EEG) and subjective evaluations of self- and other-images. Twenty-two female participants evaluated images of female actors and of themselves while being exposed to the fragrance of a commercially available body wash or to clean air delivered via olfactometer. Participants rated faces for attractiveness, femininity, confidence and glamorousness on visual analogue scales. EEG data were recorded, and event-related potentials (ERPs) associated with the onset of face stimuli were analysed to assess effects of fragrance presence on face processing and interactions between fragrance and self-other image type. Subjective ratings of confidence, attractiveness and femininity increased for both image types in the pleasant-fragrance condition relative to clean air. ERP components covering early-to-late stages of face processing were modulated by the presence of fragrance. Findings also revealed a cross-modal fragrance-face interaction, with the pleasant fragrance particularly affecting ERPs to self-images in mid-latency components. For the first time, an effect of pleasant fragrance on face perception was observed in the N1 component, suggesting an impact within 100 ms. The pleasant fragrance also had a greater impact on subsequent neural processing for self-faces relative to other-faces. The findings have implications for understanding multisensory integration during evaluations of oneself and others.
Affiliation(s)
- Jennifer Davies-Owen
- Department of Psychology, Institute of Population Health, Faculty of Health and Life Sciences, University of Liverpool, Liverpool, United Kingdom
- Hannah Roberts
- Department of Psychology, Institute of Population Health, Faculty of Health and Life Sciences, University of Liverpool, Liverpool, United Kingdom
- Margaret Scott
- Unilever Research & Development, Port Sunlight, United Kingdom
- Anna Thomas
- Unilever Research & Development, Port Sunlight, United Kingdom
- Soumitra Sen
- Unilever Research & Development, Mumbai UIPL, India
- Carl Roberts
- Department of Psychology, Institute of Population Health, Faculty of Health and Life Sciences, University of Liverpool, Liverpool, United Kingdom
- Timo Giesbrecht
- Unilever Research & Development, Port Sunlight, United Kingdom
- Nicholas Fallon
- Department of Psychology, Institute of Population Health, Faculty of Health and Life Sciences, University of Liverpool, Liverpool, United Kingdom
6
Hermosillo RJM, Moore LA, Feczko E, Miranda-Domínguez Ó, Pines A, Dworetsky A, Conan G, Mooney MA, Randolph A, Graham A, Adeyemo B, Earl E, Perrone A, Carrasco CM, Uriarte-Lopez J, Snider K, Doyle O, Cordova M, Koirala S, Grimsrud GJ, Byington N, Nelson SM, Gratton C, Petersen S, Feldstein Ewing SW, Nagel BJ, Dosenbach NUF, Satterthwaite TD, Fair DA. A precision functional atlas of personalized network topography and probabilities. Nat Neurosci 2024; 27:1000-1013. [PMID: 38532024 PMCID: PMC11089006 DOI: 10.1038/s41593-024-01596-5] [Received: 02/14/2022] [Accepted: 02/08/2024] [Indexed: 03/28/2024]
Abstract
Although the general location of functional neural networks is similar across individuals, there is vast person-to-person topographic variability. To capture this, we implemented precision brain mapping functional magnetic resonance imaging methods to establish an open-source, method-flexible set of precision functional network atlases: the Masonic Institute for the Developing Brain (MIDB) Precision Brain Atlas. This evolving resource comprises 53,273 individual-specific network maps from more than 9,900 individuals across ages and cohorts, including the Adolescent Brain Cognitive Development study, the Developmental Human Connectome Project and others. We also generated probabilistic network maps across multiple ages and integration zones (using a new overlapping mapping technique, Overlapping MultiNetwork Imaging). Using regions of high network invariance improved the reproducibility of executive-function statistical maps in brain-wide associations compared to group-average-based parcellations. Finally, we provide a potential use case for probabilistic maps in targeted neuromodulation. The atlas is expandable to alternative datasets through an online interface, and we encourage the scientific community to explore and contribute to understanding human brain function more precisely.
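A probabilistic network map of the kind described here is, at its core, the fraction of individuals assigned to each network at each vertex, with "invariance" highest where everyone agrees. A minimal sketch of that computation on toy data (the function name, array shapes and values are hypothetical, not the MIDB pipeline):

```python
import numpy as np

def probabilistic_atlas(labels):
    """labels: (n_subjects, n_vertices) array of integer network
    assignments, one individual-specific map per row. Returns an
    (n_networks, n_vertices) array of assignment probabilities and a
    per-vertex invariance score (probability of the modal network)."""
    n_networks = int(labels.max()) + 1
    probs = np.stack([(labels == k).mean(axis=0) for k in range(n_networks)])
    invariance = probs.max(axis=0)  # 1.0 where all subjects agree
    return probs, invariance

# Toy data: 4 subjects, 3 vertices, networks coded 0/1
labels = np.array([[0, 1, 1],
                   [0, 1, 0],
                   [0, 1, 1],
                   [0, 0, 1]])
probs, invariance = probabilistic_atlas(labels)
# vertex 0 is fully invariant (everyone network 0); vertices 1-2 are 75/25
```

Thresholding the invariance score is one simple way to define the high-agreement regions that the abstract reports improve reproducibility relative to group-average parcellations.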
Affiliation(s)
- Robert J M Hermosillo
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Lucille A Moore
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Eric Feczko
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Óscar Miranda-Domínguez
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Adam Pines
- Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
- Penn Lifespan Informatics and Neuroimaging Center, University of Pennsylvania, Philadelphia, PA, USA
- Ally Dworetsky
- Department of Radiology, Washington University School of Medicine, St. Louis, MO, USA
- Department of Psychology, Northwestern University, Evanston, IL, USA
- Department of Psychology, Florida State University, Tallahassee, FL, USA
- Gregory Conan
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Psychiatry, Oregon Health & Science University, Portland, OR, USA
- Michael A Mooney
- Department of Psychiatry, Oregon Health & Science University, Portland, OR, USA
- Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University, Portland, OR, USA
- Knight Cancer Institute, Oregon Health & Science University, Portland, OR, USA
- Center for Mental Health Innovation, Oregon Health & Science University, Portland, OR, USA
- Anita Randolph
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Alice Graham
- Department of Psychiatry, Oregon Health & Science University, Portland, OR, USA
- Babatunde Adeyemo
- Department of Neurology, Washington University School of Medicine, St. Louis, MO, USA
- Eric Earl
- Data Science and Sharing Team, National Institute of Mental Health, Bethesda, MD, USA
- Anders Perrone
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Cristian Morales Carrasco
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Kathy Snider
- Department of Psychiatry, Oregon Health & Science University, Portland, OR, USA
- Olivia Doyle
- Department of Psychiatry, Oregon Health & Science University, Portland, OR, USA
- Michaela Cordova
- Joint Doctoral Program in Clinical Psychology, San Diego State University, San Diego, CA, USA
- Joint Doctoral Program in Clinical Psychology, University of California San Diego, San Diego, CA, USA
- Sanju Koirala
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Institute of Child Development, University of Minnesota, Minneapolis, MN, USA
- Gracie J Grimsrud
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Nora Byington
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Steven M Nelson
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Caterina Gratton
- Department of Psychology, Northwestern University, Evanston, IL, USA
- Department of Psychology, Florida State University, Tallahassee, FL, USA
- Department of Psychological and Brain Sciences, Washington University School of Medicine, St. Louis, MO, USA
- Steven Petersen
- Department of Radiology, Washington University School of Medicine, St. Louis, MO, USA
- Department of Neurology, Washington University School of Medicine, St. Louis, MO, USA
- Department of Psychological and Brain Sciences, Washington University School of Medicine, St. Louis, MO, USA
- Department of Neuroscience, Washington University School of Medicine, St. Louis, MO, USA
- Department of Biomedical Engineering, Washington University School of Medicine, St. Louis, MO, USA
- Bonnie J Nagel
- Department of Psychiatry, Oregon Health & Science University, Portland, OR, USA
- Nico U F Dosenbach
- Department of Neurology, Washington University School of Medicine, St. Louis, MO, USA
- Theodore D Satterthwaite
- Penn Lifespan Informatics and Neuroimaging Center, University of Pennsylvania, Philadelphia, PA, USA
- Department of Psychiatry, University of Pennsylvania, Philadelphia, PA, USA
- Damien A Fair
- Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN, USA
- Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Institute of Child Development, University of Minnesota, Minneapolis, MN, USA
7
O'Kane SH, Chancel M, Ehrsson HH. Hierarchical and dynamic relationships between body part ownership and full-body ownership. Cognition 2024; 246:105697. [PMID: 38364444 DOI: 10.1016/j.cognition.2023.105697] [Received: 04/25/2023] [Revised: 12/12/2023] [Accepted: 12/13/2023] [Indexed: 02/18/2024]
Abstract
What is the relationship between experiencing individual body parts and the whole body as one's own? We theorised that body part ownership is driven primarily by the perceptual binding of visual and somatosensory signals from specific body parts, whereas full-body ownership depends on a more global binding process based on multisensory information from several body segments. To examine this hypothesis, we used a bodily illusion and asked participants to rate illusory changes in ownership over five different parts of a mannequin's body and the mannequin as a whole, while we manipulated the synchrony or asynchrony of visual and tactile stimuli delivered to three different body parts. We found that body part ownership was driven primarily by local visuotactile synchrony and could be experienced relatively independently of full-body ownership. Full-body ownership depended on the number of synchronously stimulated parts in a nonlinear manner, with the strongest full-body ownership illusion occurring when all parts received synchronous stimulation. Additionally, full-body ownership influenced body part ownership for nonstimulated body parts, and skin conductance responses provided physiological evidence supporting an interaction between body part and full-body ownership. We conclude that body part and full-body ownership correspond to different processes and propose a hierarchical probabilistic model to explain the relationship between part and whole in the context of multisensory awareness of one's own body.
Affiliation(s)
- Sophie H O'Kane
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Marie Chancel
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden; Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, 38000 Grenoble, France
- H Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
8
Jia T, Sun J, McGeady C, Ji L, Li C. Enhancing Brain-Computer Interface Performance by Incorporating Brain-to-Brain Coupling. Cyborg Bionic Syst 2024; 5:0116. [PMID: 38680535 PMCID: PMC11052607 DOI: 10.34133/cbsystems.0116] [Received: 11/15/2023] [Accepted: 03/24/2024] [Indexed: 05/01/2024]
Abstract
Human cooperation relies on key features of social interaction to reach desirable outcomes. Similarly, human-robot interaction may benefit from integrating factors found in human-human interaction. In this paper, we investigate brain-to-brain coupling during motor imagery (MI)-based brain-computer interface (BCI) training using eye-contact and hand-touch interaction. Twelve pairs of friends (experimental group) and 10 pairs of strangers (control group) were recruited for MI-based BCI tests with concurrent electroencephalography (EEG) hyperscanning. Event-related desynchronization (ERD) was estimated to measure cortical activation, and interbrain functional connectivity was assessed using multilevel statistical analysis. Furthermore, we compared BCI classification performance under different social interaction conditions. In the experimental group, greater ERD was found around the contralateral sensorimotor cortex under social interaction conditions compared with MI without any social interaction. Notably, EEG channels with decreased power were mainly distributed around the frontal, central, and occipital regions. A significant increase in interbrain coupling was also found under social interaction conditions. BCI decoding accuracies were significantly improved in the eye-contact condition and the eye-and-hand-contact condition compared with the no-interaction condition. However, for the strangers' group, no positive effects were observed when comparing cortical activations between interaction and no-interaction conditions. These findings indicate that social interaction can improve neural synchronization between familiar partners, with enhanced brain activations and brain-to-brain coupling. This study may provide a novel method for enhancing MI-based BCI performance in conjunction with neural synchronization between users.
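The ERD measure used here is conventionally quantified as the percentage band-power change in an activity window relative to a pre-event reference window, with negative values indicating desynchronization. A minimal sketch of that convention, assuming band-power envelopes as input (the function name, window slices and values are hypothetical, not this study's exact pipeline):

```python
import numpy as np

def erd_percent(band_power, ref, act):
    """band_power: (n_trials, n_samples) band-power envelope for one
    channel (e.g. squared, band-pass-filtered EEG). ref/act: slices
    selecting the reference and activity windows.
    Returns ERD% = (A - R) / R * 100; negative values indicate
    desynchronization (a power decrease during the task)."""
    r = band_power[:, ref].mean()
    a = band_power[:, act].mean()
    return (a - r) / r * 100.0

# Toy envelope: power halves from the reference to the activity window
power = np.array([[2.0, 2.0, 1.0, 1.0],
                  [2.0, 2.0, 1.0, 1.0]])
erd = erd_percent(power, ref=slice(0, 2), act=slice(2, 4))  # -50.0
```

A stronger (more negative) ERD over the contralateral sensorimotor channels is the cortical-activation marker the abstract reports as enhanced under social interaction.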
Affiliation(s)
- Tianyu Jia
- Lab of Intelligent and Biomimetic Machinery, Department of Mechanical Engineering, Tsinghua University, Beijing, China
- Department of Bioengineering, Imperial College London, London, UK
- Jingyao Sun
- Lab of Intelligent and Biomimetic Machinery, Department of Mechanical Engineering, Tsinghua University, Beijing, China
- Ciarán McGeady
- Department of Bioengineering, Imperial College London, London, UK
- Linhong Ji
- Lab of Intelligent and Biomimetic Machinery, Department of Mechanical Engineering, Tsinghua University, Beijing, China
- Chong Li
- Lab of Intelligent and Biomimetic Machinery, Department of Mechanical Engineering, Tsinghua University, Beijing, China
- School of Clinical Medicine, Tsinghua University, Beijing, China
- Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing, China
9
Schnepel P, Paricio-Montesinos R, Ezquerra-Romano I, Haggard P, Poulet JFA. Cortical cellular encoding of thermotactile integration. Curr Biol 2024; 34:1718-1730.e3. [PMID: 38582078 DOI: 10.1016/j.cub.2024.03.018] [Received: 04/02/2023] [Revised: 12/24/2023] [Accepted: 03/13/2024] [Indexed: 04/08/2024]
Abstract
Recent evidence suggests that primary sensory cortical regions play a role in integrating information from multiple sensory modalities. How primary cortical neurons integrate different sources of sensory information is unclear, partly because non-primary sensory input to a cortical sensory region is often weak or modulatory. To address this question, we take advantage of the robust representation of thermal (cooling) and tactile stimuli in mouse forelimb primary somatosensory cortex (fS1). Using a thermotactile detection task, we show that perception of threshold-level cool or tactile stimuli is enhanced when they are presented simultaneously compared with either presented alone. To investigate the cortical cellular correlates of thermotactile integration, we performed in vivo extracellular recordings from fS1 in awake resting and anesthetized mice during unimodal and bimodal stimulation of the forepaw. Unimodal stimulation evoked thermal- or tactile-specific excitatory and inhibitory responses of fS1 neurons. The most prominent features of combined thermotactile stimulation are the recruitment of unimodally silent fS1 neurons, non-linear integration features, and response dynamics that favor longer response durations with additional spikes. Together, we identify quantitative and qualitative changes in cortical encoding that may underlie the improvement in perception of thermotactile surfaces during haptic exploration.
Affiliation(s)
- Philipp Schnepel
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Ricardo Paricio-Montesinos
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Ivan Ezquerra-Romano
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany; Institute of Cognitive Neuroscience, University College London (UCL), London WC1N 3AZ, UK
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London (UCL), London WC1N 3AZ, UK
- James F A Poulet
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
10
Weiler S, Rahmati V, Isstas M, Wutke J, Stark AW, Franke C, Graf J, Geis C, Witte OW, Hübener M, Bolz J, Margrie TW, Holthoff K, Teichert M. A primary sensory cortical interareal feedforward inhibitory circuit for tacto-visual integration. Nat Commun 2024; 15:3081. [PMID: 38594279 PMCID: PMC11003985 DOI: 10.1038/s41467-024-47459-2]
Abstract
Tactile sensation and vision are often both utilized for the exploration of objects that are within reach, though it is not known whether or how these two distinct sensory systems combine such information. Here in mice, we used a combination of stereo photogrammetry for 3D reconstruction of the whisker array, brain-wide anatomical tracing and functional connectivity analysis to explore the possibility of tacto-visual convergence in sensory space and within the circuitry of the primary visual cortex (VISp). Strikingly, we find that stimulation of the contralateral whisker array suppresses visually evoked activity in a tacto-visual sub-region of VISp whose visual space representation closely overlaps with the whisker search space. This suppression is mediated by local fast-spiking interneurons that receive a direct cortico-cortical input predominantly from layer 6 neurons located in the posterior primary somatosensory barrel cortex (SSp-bfd). These data demonstrate functional convergence within and between two primary sensory cortical areas for multisensory object detection and recognition.
Affiliation(s)
- Simon Weiler
- Sainsbury Wellcome Centre for Neuronal Circuits and Behaviour, University College London, 25 Howland Street, London, W1T 4JG, UK
- Vahid Rahmati
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
- Marcel Isstas
- Friedrich Schiller University Jena, Institute of General Zoology and Animal Physiology, Erbertstraße 1, 07743, Jena, Germany
- Johann Wutke
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
- Andreas Walter Stark
- Friedrich Schiller University Jena, Institute of Applied Optics and Biophysics, Fröbelstieg 1, 07743, Jena, Germany
- Christian Franke
- Friedrich Schiller University Jena, Institute of Applied Optics and Biophysics, Fröbelstieg 1, 07743, Jena, Germany
- Friedrich Schiller University Jena, Jena Center for Soft Matter, Philosophenweg 7, 07743, Jena, Germany
- Friedrich Schiller University Jena, Abbe Center of Photonics, Albert-Einstein-Straße 6, 07745, Jena, Germany
- Jürgen Graf
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
- Christian Geis
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
- Otto W Witte
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
- Mark Hübener
- Max Planck Institute for Biological Intelligence, Am Klopferspitz 18, 82152, Martinsried, Germany
- Jürgen Bolz
- Friedrich Schiller University Jena, Institute of General Zoology and Animal Physiology, Erbertstraße 1, 07743, Jena, Germany
- Troy W Margrie
- Sainsbury Wellcome Centre for Neuronal Circuits and Behaviour, University College London, 25 Howland Street, London, W1T 4JG, UK
- Knut Holthoff
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
- Manuel Teichert
- Jena University Hospital, Department of Neurology, Am Klinikum 1, 07747, Jena, Germany
11
Mazo C, Baeta M, Petreanu L. Auditory cortex conveys non-topographic sound localization signals to visual cortex. Nat Commun 2024; 15:3116. [PMID: 38600132 PMCID: PMC11006897 DOI: 10.1038/s41467-024-47546-4]
Abstract
Spatiotemporally congruent sensory stimuli are fused into a unified percept. The auditory cortex (AC) sends projections to the primary visual cortex (V1), which could provide signals for binding spatially corresponding audio-visual stimuli. However, whether AC inputs in V1 encode sound location remains unknown. Using two-photon axonal calcium imaging and a speaker array, we measured the auditory spatial information transmitted from AC to layer 1 of V1. AC conveys information about the location of ipsilateral and contralateral sound sources to V1. Sound location could be accurately decoded by sampling AC axons in V1, providing a substrate for making location-specific audiovisual associations. However, AC inputs were not retinotopically arranged in V1, and audio-visual modulations of V1 neurons did not depend on the spatial congruency of the sound and light stimuli. The non-topographic sound localization signals provided by AC might allow the association of specific audiovisual spatial patterns in V1 neurons.
Affiliation(s)
- Camille Mazo
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Margarida Baeta
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Leopoldo Petreanu
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
12
Kayser C, Debats N, Heuer H. Both stimulus-specific and configurational features of multiple visual stimuli shape the spatial ventriloquism effect. Eur J Neurosci 2024; 59:1770-1788. [PMID: 38230578 DOI: 10.1111/ejn.16251]
Abstract
Studies on multisensory perception often focus on simplistic conditions in which one single stimulus is presented per modality. Yet, in everyday life, we usually encounter multiple signals per modality. To understand how multiple signals within and across the senses are combined, we extended the classical audio-visual spatial ventriloquism paradigm to combine two visual stimuli with one sound. The individual visual stimuli presented in the same trial differed in their relative timing and spatial offsets to the sound, allowing us to contrast their individual and combined influence on sound localization judgements. We find that the ventriloquism bias is not dominated by a single visual stimulus but rather is shaped by the collective multisensory evidence. In particular, the contribution of an individual visual stimulus to the ventriloquism bias depends not only on its own relative spatio-temporal alignment to the sound but also on the spatio-temporal alignment of the other visual stimulus. We propose that this pattern of multi-stimulus multisensory integration reflects the evolution of evidence for sensory causal relations during individual trials, calling for the need to extend established models of multisensory causal inference to more naturalistic conditions. Our data also suggest that this pattern of multisensory interactions extends to the ventriloquism aftereffect, a bias in sound localization observed in unisensory judgements following a multisensory stimulus.
Affiliation(s)
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Nienke Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
13
Jiang M, Zeng Z. Memristive Bionic Memory Circuit Implementation and Its Application in Multisensory Mutual Associative Learning Networks. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2024; 18:308-321. [PMID: 37831580 DOI: 10.1109/tbcas.2023.3324574]
Abstract
Memory is vital and indispensable for organisms and brain-inspired intelligence to gain complete sensation and cognition of the environment. In this work, a memristive bionic memory circuit inspired by the human memory model is proposed, which includes 1) receptor and sensory neuron (SN), 2) short-term memory (STM) module, and 3) long-term memory (LTM) module. By leveraging the in-memory computing characteristic of memristors, various functions such as sensation, learning, forgetting, recall, consolidation, reconsolidation, retrieval, and reset are realized. Besides, a multisensory mutual associative learning network is constructed with several bionic memory units to memorize and associate sensory information of different modalities bidirectionally. Beyond association establishment, enhancement, and extinction, we also mimicked multisensory integration to manifest the synthetic process of information from different sensory channels. According to the simulation results in PSPICE, the proposed circuit achieves high robustness, low area overhead, and low power consumption. Combining associative memory with the human memory model, this work provides a possible idea for further research in associative learning networks.
14
Huntley MK, Nguyen A, Albrecht MA, Marinovic W. Tactile cues are more intrinsically linked to motor timing than visual cues in visual-tactile sensorimotor synchronization. Atten Percept Psychophys 2024; 86:1022-1037. [PMID: 38263510 PMCID: PMC11062975 DOI: 10.3758/s13414-023-02828-9]
Abstract
Many tasks require precise synchronization with external sensory stimuli, such as driving a car. This study investigates whether combined visual-tactile information provides additional benefits to movement synchrony over separate visual and tactile stimuli and explores the relationship with the temporal binding window for multisensory integration. In Experiment 1, participants completed a sensorimotor synchronization task to examine movement variability and a simultaneity judgment task to measure the temporal binding window. Results showed similar synchronization variability between visual-tactile and tactile-only stimuli, but significantly lower variability than with visual-only stimuli. In Experiment 2, participants completed a visual-tactile sensorimotor synchronization task with cross-modal stimuli presented inside (stimulus-onset asynchrony 80 ms) and outside (stimulus-onset asynchrony 400 ms) the temporal binding window to examine temporal accuracy of movement execution. Participants synchronized their movement with the first stimulus in the cross-modal pair, either the visual or tactile stimulus. Results showed significantly greater temporal accuracy when only one stimulus was presented inside the window and the second stimulus was outside the window than when both stimuli were presented inside the window, with movement execution being more accurate when attending to the tactile stimulus. Overall, these findings indicate there may be a modality-specific benefit to sensorimotor synchronization performance, such that tactile cues are weighted more strongly than visual information as tactile information is more intrinsically linked to motor timing than visual information. Further, our findings indicate that the visual-tactile temporal binding window is related to the temporal accuracy of movement execution.
Affiliation(s)
- Michelle K Huntley
- School of Population Health, Curtin University, Perth, Western Australia, Australia
- School of Psychology and Public Health, La Trobe University, Wodonga, Victoria, Australia
- An Nguyen
- School of Population Health, Curtin University, Perth, Western Australia, Australia
- Matthew A Albrecht
- Western Australia Centre for Road Safety Research, School of Psychological Science, University of Western Australia, Perth, Western Australia, Australia
- Welber Marinovic
- School of Population Health, Curtin University, Perth, Western Australia, Australia
15
Marsicano G, Bertini C, Ronconi L. Alpha-band sensory entrainment improves audiovisual temporal acuity. Psychon Bull Rev 2024; 31:874-885. [PMID: 37783899 DOI: 10.3758/s13423-023-02388-x]
Abstract
Visual and auditory stimuli are transmitted from the environment to sensory cortices with different timing, requiring the brain to encode when sensory inputs must be segregated or integrated into a single percept. The probability that different audiovisual (AV) stimuli are integrated into a single percept even when presented asynchronously is reflected in the construct of the temporal binding window (TBW). There is a strong interest in testing whether it is possible to broaden or shrink the TBW by using different neuromodulatory approaches that can speed up or slow down ongoing alpha oscillations, which have been repeatedly hypothesized to be an important determinant of the TBW's size. Here, we employed a web-based sensory entrainment protocol combined with a simultaneity judgment task using simple flash-beep stimuli. The aim was to test whether AV temporal acuity could be modulated trial by trial by synchronizing ongoing neural oscillations in the prestimulus period to a rhythmic sensory stream presented in the upper (∼12 Hz) or lower (∼8.5 Hz) alpha range. As a control, we implemented a nonrhythmic condition where only the first and the last entrainers were employed. Results show that upper alpha entrainment shrinks the AV TBW and improves AV temporal acuity when compared with lower alpha and control conditions. Our findings represent a proof of concept of the efficacy of sensory entrainment to improve AV temporal acuity in a trial-by-trial manner, and they strengthen the idea that alpha oscillations may reflect the temporal unit of AV temporal binding.
Affiliation(s)
- Gianluca Marsicano
- Department of Psychology, University of Bologna, Viale Berti Pichat 5, 40121, Bologna, Italy
- Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, 47023, Cesena, Italy
- Caterina Bertini
- Department of Psychology, University of Bologna, Viale Berti Pichat 5, 40121, Bologna, Italy
- Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, 47023, Cesena, Italy
- Luca Ronconi
- School of Psychology, Vita-Salute San Raffaele University, Via Olgettina 58, 20132, Milan, Italy
- Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy
16
Oude Lohuis MN, Marchesi P, Olcese U, Pennartz CMA. Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex. Nat Neurosci 2024; 27:758-771. [PMID: 38307971 DOI: 10.1038/s41593-023-01564-5]
Abstract
Primary sensory cortices respond to crossmodal stimuli-for example, auditory responses are found in primary visual cortex (V1). However, it remains unclear whether these responses reflect sensory inputs or behavioral modulation through sound-evoked body movement. We address this controversy by showing that sound-evoked activity in V1 of awake mice can be dissociated into auditory and behavioral components with distinct spatiotemporal profiles. The auditory component began at approximately 27 ms, was found in superficial and deep layers and originated from auditory cortex. Sound-evoked orofacial movements correlated with V1 neural activity starting at approximately 80-100 ms and explained auditory frequency tuning. Visual, auditory and motor activity were expressed by different laminar profiles and largely segregated subsets of neuronal populations. During simultaneous audiovisual stimulation, visual representations remained dissociable from auditory-related and motor-related activity. This three-fold dissociability of auditory, motor and visual processing is central to understanding how distinct inputs to visual cortex interact to support vision.
Affiliation(s)
- Matthijs N Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Pietro Marchesi
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Cyriel M A Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
17
Kreyenmeier P, Bhuiyan I, Gian M, Chow HM, Spering M. Smooth pursuit inhibition reveals audiovisual enhancement of fast movement control. J Vis 2024; 24:3. [PMID: 38558158 PMCID: PMC10996987 DOI: 10.1167/jov.24.4.3]
Abstract
The sudden onset of a visual object or event elicits an inhibition of eye movements at latencies approaching the minimum delay of visuomotor conductance in the brain. Typically, information presented via multiple sensory modalities, such as sound and vision, evokes stronger and more robust responses than unisensory information. Whether and how multisensory information affects ultra-short latency oculomotor inhibition is unknown. In two experiments, we investigate smooth pursuit and saccadic inhibition in response to multisensory distractors. Observers tracked a horizontally moving dot and were interrupted by an unpredictable visual, auditory, or audiovisual distractor. Distractors elicited a transient inhibition of pursuit eye velocity and catch-up saccade rate within ∼100 ms of their onset. Audiovisual distractors evoked stronger oculomotor inhibition than visual- or auditory-only distractors, indicating multisensory response enhancement. Multisensory response enhancement magnitudes were equal to the linear sum of responses to component stimuli. These results demonstrate that multisensory information affects eye movements even at ultra-short latencies, establishing a lower time boundary for multisensory-guided behavior. We conclude that oculomotor circuits must have privileged access to sensory information from multiple modalities, presumably via a fast, subcortical pathway.
Affiliation(s)
- Philipp Kreyenmeier
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Graduate Program in Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
- Ishmam Bhuiyan
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Mathew Gian
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Hiu Mei Chow
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Department of Psychology, St. Thomas University, Fredericton, New Brunswick, Canada
- Miriam Spering
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Graduate Program in Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
- Djavad Mowafaghian Center for Brain Health, University of British Columbia, Vancouver, British Columbia, Canada
- Institute for Computing, Information, and Cognitive Systems, University of British Columbia, Vancouver, British Columbia, Canada
18
Guo G, Wang N, Sun C, Geng H. Embodied Cross-Modal Interactions Based on an Altercentric Reference Frame. Brain Sci 2024; 14:314. [PMID: 38671966 PMCID: PMC11048532 DOI: 10.3390/brainsci14040314]
Abstract
Accurate comprehension of others' thoughts and intentions is crucial for smooth social interactions, wherein understanding their perceptual experiences serves as a fundamental basis for this high-level social cognition. However, previous research has predominantly focused on the visual modality when investigating perceptual processing from others' perspectives, leaving the exploration of multisensory inputs during this process largely unexplored. By incorporating auditory stimuli into visual perspective-taking (VPT) tasks, we have designed a novel experimental paradigm in which the spatial correspondence between visual and auditory stimuli was limited to the altercentric rather than the egocentric reference frame. Overall, we found that when individuals engaged in explicit or implicit VPT to process visual stimuli from an avatar's viewpoint, the concomitantly presented auditory stimuli were also processed within this avatar-centered reference frame, revealing altercentric cross-modal interactions.
Affiliation(s)
- Guanchen Guo
- School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Nanbo Wang
- Department of Psychology, School of Health, Fujian Medical University, Fuzhou 350122, China
- Chu Sun
- School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Haiyan Geng
- School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
19
Jiang P, Kent C, Rossiter J. Towards sensory substitution and augmentation: Mapping visual distance to audio and tactile frequency. PLoS One 2024; 19:e0299213. [PMID: 38530828 DOI: 10.1371/journal.pone.0299213]
Abstract
Multimodal perception is the predominant means by which individuals experience and interact with the world. However, sensory dysfunction or loss can significantly impede this process. In such cases, cross-modality research offers valuable insight into how we can compensate for these sensory deficits through sensory substitution. Although sight and hearing are both used to estimate the distance to an object (e.g., by visual size and sound volume) and the perception of distance is an important element in navigation and guidance, it is not widely studied in cross-modal research. We investigate the relationship between audio and vibrotactile frequencies (in the ranges 47-2,764 Hz and 10-99 Hz, respectively) and distances uniformly distributed in the range 1-12 m. In our experiments, participants mapped the distance (represented by an image of a model at that distance) to a frequency by adjusting a virtual tuning knob. The results revealed that the majority (more than 76%) of participants demonstrated a strong negative monotonic relationship between frequency and distance, across both vibrotactile (represented by a natural log function) and auditory domains (represented by an exponential function). However, a subgroup of participants showed the opposite positive linear relationship between frequency and distance. The strong cross-modal sensory correlation could contribute to the development of assistive robotic technologies and devices to augment human perception. This work provides a foundation for future assistive human-robot interaction (HRI) applications where a mapping between distance and frequency is needed, for example for people with vision or hearing loss, drivers with loss of focus or response delay, doctors undertaking teleoperation surgery, and users in augmented reality (AR) or virtual reality (VR) environments.
Affiliation(s)
- Pingping Jiang
- Department of Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
- Christopher Kent
- School of Psychological Science, University of Bristol, Bristol, United Kingdom
- Jonathan Rossiter
- Department of Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
20
Radziun D, Korczyk M, Szwed M, Ehrsson HH. Are blind individuals immune to bodily illusions? Somatic rubber hand illusion in the blind revisited. Behav Brain Res 2024; 460:114818. [PMID: 38135190 DOI: 10.1016/j.bbr.2023.114818]
Abstract
Multisensory awareness of one's own body relies on the integration of signals from various sensory modalities such as vision, touch, and proprioception. But how do blind individuals perceive their bodies without visual cues, and does the brain of a blind person integrate bodily senses differently from a sighted person? To address this question, we aimed to replicate the only two previous studies on this topic, which claimed that blind individuals do not experience the somatic rubber hand illusion, a bodily illusion triggered by the integration of correlated tactile and proprioceptive signals from the two hands. We used a larger sample size than the previous studies and added Bayesian analyses to examine statistical evidence in favor of the lack of an illusion effect. Moreover, we employed tests to investigate whether enhanced tactile acuity and cardiac interoceptive accuracy in blind individuals could also explain the weaker illusion. We tested 36 blind individuals and 36 age- and sex-matched sighted volunteers. The results show that blind individuals do not experience the somatic rubber hand illusion based on questionnaire ratings and behavioral measures that assessed changes in hand position sense toward the location of the rubber hand. This conclusion is supported by Bayesian evidence in favor of the null hypothesis. The findings confirm that blind individuals do not experience the somatic rubber hand illusion, indicating that lack of visual experience leads to permanent changes in multisensory bodily perception. In summary, our study suggests that changes in multisensory integration of tactile and proprioceptive signals may explain why blind individuals are "immune" to the nonvisual version of the rubber hand illusion.
Affiliation(s)
- Dominika Radziun
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, the Netherlands
- Marcin Szwed
- Institute of Psychology, Jagiellonian University, Kraków, Poland
- H Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
21
Evers K, Farisco M, Pennartz CMA. Assessing the commensurability of theories of consciousness: On the usefulness of common denominators in differentiating, integrating and testing hypotheses. Conscious Cogn 2024; 119:103668. [PMID: 38417198 DOI: 10.1016/j.concog.2024.103668]
Abstract
How deep is the current diversity in the panoply of theories to define consciousness, and to what extent do these theories share common denominators? Here we first examine to what extent different theories are commensurable (or comparable) along particular dimensions. We posit logical (and, when applicable, empirical) commensurability as a necessary condition for identifying common denominators among different theories. By consequence, dimensions for inclusion in a set of logically and empirically commensurable theories of consciousness can be proposed. Next, we compare a limited subset of neuroscience-based theories in terms of commensurability. This analysis does not yield a denominator that might serve to define a minimally unifying model of consciousness. Theories that seem to be akin by one denominator can be remote by another. We suggest a methodology of comparing different theories via multiple probing questions, allowing us to discern overall (dis)similarities between theories. Despite very different background definitions of consciousness, we conclude that, if attention is paid to the search for a common methodological approach to brain-consciousness relationships, it should be possible in principle to overcome the current Babylonian confusion of tongues and eventually integrate and merge different theories.
Collapse
Affiliation(s)
- K Evers
- Centre for Research Ethics and Bioethics, Uppsala University, Uppsala, Sweden.
| | - M Farisco
- Centre for Research Ethics and Bioethics, Uppsala University, Uppsala, Sweden; Bioethics Unit, Biogem, Molecular Biology and Molecular Genetics Research Institute, Ariano Irpino (AV), Italy
| | - C M A Pennartz
- Department of Cognitive and Systems Neuroscience, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands; Research Priority Area, Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
| |
Collapse
|
22
|
Keum D, Medina AE. The effect of developmental alcohol exposure on multisensory integration is larger in deeper cortical layers. Alcohol 2024:S0741-8329(24)00032-6. [PMID: 38417561 DOI: 10.1016/j.alcohol.2024.02.006] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2023] [Revised: 02/23/2024] [Accepted: 02/23/2024] [Indexed: 03/01/2024]
Abstract
Fetal Alcohol Spectrum Disorders (FASD) are one of the most common causes of mental disability in the world. Despite efforts to increase public awareness of the risks of drinking during pregnancy, epidemiological studies indicate a prevalence of 1-6% in all births. There is growing evidence that deficits in sensory processing may contribute to social problems observed in FASD. Multisensory (MS) integration occurs when a combination of inputs from two sensory modalities leads to enhancement or suppression of neuronal firing. MS enhancement is usually linked to processes that facilitate cognition and reaction time, whereas MS suppression has been linked to filtering unwanted sensory information. The rostral portion of the posterior parietal cortex (PPr) of the ferret is an area that shows robust visual-tactile integration and displays both MS enhancement and suppression. Recently, our lab demonstrated that ferrets exposed to alcohol during the "third trimester equivalent" of human gestation show less MS enhancement and more MS suppression in PPr than controls. Here we complement these findings by comparing in vivo electrophysiological recordings from channels located in shallow and deep cortical layers. We observed that while the effects of alcohol (less MS enhancement and more MS suppression) were found in all layers, the magnitude of these effects was more pronounced in putative layers V-VI. These findings extend our knowledge of the sensory deficits of FASD.
Collapse
Affiliation(s)
- Dongil Keum
- Department of Pediatrics, University of Maryland School of Medicine, 655 Baltimore St., Baltimore, MD 21230
| | - Alexandre E Medina
- Department of Pediatrics, University of Maryland School of Medicine, 655 Baltimore St., Baltimore, MD 21230.
| |
Collapse
|
23
|
Yang H, Cai B, Tan W, Luo L, Zhang Z. Pitch Improvement in Attentional Blink: A Study across Audiovisual Asymmetries. Behav Sci (Basel) 2024; 14:145. [PMID: 38392498 PMCID: PMC10885858 DOI: 10.3390/bs14020145] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2023] [Revised: 02/07/2024] [Accepted: 02/16/2024] [Indexed: 02/24/2024] Open
Abstract
Attentional blink (AB) is a phenomenon in which the perception of a second target is impaired when it appears within 200-500 ms after the first target. Sound affects the AB, and an asymmetry appears during audiovisual integration, but it is not known whether this is related to the tonal representation of sound. The aim of the present study was to investigate the effect of audiovisual asymmetry on attentional blink and whether the presentation of pitch improves the ability to detect a target during an AB that is accompanied by audiovisual asymmetry. The results showed that as the lag increased, the subject's target recognition improved and the pitch produced further improvements. These improvements exhibited a significant asymmetry across the audiovisual channel. Our findings could contribute to better utilization of audiovisual integration resources to improve attentional transients and auditory recognition decline, which could be useful in areas such as driving and education.
Collapse
Affiliation(s)
- Haoping Yang
- School of Physical Education and Sports Science, Soochow University, Suzhou 215021, China
- Suzhou Cognitive Psychology Co-Operative Society, Soochow University, Suzhou 215021, China
| | - Biye Cai
- School of Physical Education and Sports Science, Soochow University, Suzhou 215021, China
| | - Wenjie Tan
- Suzhou Cognitive Psychology Co-Operative Society, Soochow University, Suzhou 215021, China
- Department of Physical Education, South China University of Technology, Guangzhou 518100, China
| | - Li Luo
- School of Physical Education and Sports Science, Soochow University, Suzhou 215021, China
| | - Zonghao Zhang
- School of Physical Education and Sports Science, Soochow University, Suzhou 215021, China
| |
Collapse
|
24
|
Wilson KM, Arquilla AM, Hussein M, Rosales-Torres KM, Chan MG, Saltzman W. Effects of reproductive status on behavioral and neural responses to isolated pup stimuli in female California mice. Behav Brain Res 2024; 457:114727. [PMID: 37871656 DOI: 10.1016/j.bbr.2023.114727] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2023] [Revised: 10/17/2023] [Accepted: 10/18/2023] [Indexed: 10/25/2023]
Abstract
The transition to motherhood in mammals is marked by changes in females' perception of and responsiveness to sensory stimuli from infants. Our understanding of maternally induced sensory plasticity relies most heavily on studies in uniparental, promiscuous house mice and rats, which may not be representative of rodent species with different life histories. We exposed biparental, monogamous California mouse (Peromyscus californicus) mothers and ovariectomized virgin females to one of four acoustic and olfactory stimulus combinations (Control: clean cotton and white noise; Call: clean cotton and pup vocalizations; Odor: pup-scented cotton and white noise; Call + Odor: pup-scented cotton and pup vocalizations) and quantified females' behavior and Fos expression in select brain regions. Behavior did not differ between mothers and ovariectomized virgins. Among mothers, however, those exposed to the Control condition took the longest to sniff the odor stimulus, and mothers exposed to the Odor condition were quicker to sniff the odor ball compared to those in the Call condition. Behavior did not differ among ovariectomized virgins exposed to the different conditions. Fos expression differed across conditions only in the anterior hypothalamic nucleus (AHN), which responds to aversive stimuli: among mothers, the Control condition elicited the highest AHN Fos and Call + Odor elicited the lowest. Among ovariectomized virgin females, Call elicited the lowest Fos in the AHN. Thus, reproductive status in California mice alters females' behavioral responses to stimuli from pups, especially odors, and results in the inhibition of defense circuitry in response to pup stimuli.
Collapse
Affiliation(s)
- Kerianne M Wilson
- Department of Biology, Pomona College, Claremont, CA, USA; Department of Evolution, Ecology, and Organismal Biology, University of California Riverside, Riverside, CA, USA.
| | - April M Arquilla
- Department of Evolution, Ecology, and Organismal Biology, University of California Riverside, Riverside, CA, USA
| | - Manal Hussein
- Department of Evolution, Ecology, and Organismal Biology, University of California Riverside, Riverside, CA, USA
| | - Kelsey M Rosales-Torres
- Department of Evolution, Ecology, and Organismal Biology, University of California Riverside, Riverside, CA, USA
| | - May G Chan
- Department of Evolution, Ecology, and Organismal Biology, University of California Riverside, Riverside, CA, USA
| | - Wendy Saltzman
- Department of Evolution, Ecology, and Organismal Biology, University of California Riverside, Riverside, CA, USA; Neuroscience Graduate Program, University of California Riverside, Riverside, CA, USA
| |
Collapse
|
25
|
Loskutova E, Butler JS, Setti A, O'Brien C, Loughman J. Ability to Process Multisensory Information Is Impaired in Open Angle Glaucoma. J Glaucoma 2024; 33:78-86. [PMID: 37974328 DOI: 10.1097/ijg.0000000000002331] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2023] [Accepted: 10/09/2023] [Indexed: 11/19/2023]
Abstract
PRCIS Patients with glaucoma demonstrated deficiencies in their ability to process multisensory information when compared with controls, with those deficiencies being related to glaucoma severity. Impaired multisensory integration (MSI) may affect the quality of life in individuals with glaucoma and may contribute to the increased prevalence of falls and driving safety concerns. Therapeutic possibilities to influence cognition in glaucoma should be explored. PURPOSE Glaucoma is a neurodegenerative disease of the optic nerve that has also been linked to cognitive health decline. This study explored MSI as a function of glaucoma status and severity. METHODS MSI was assessed in 37 participants with open angle glaucoma relative to 18 age-matched healthy controls. The sound-induced flash illusion was used to assess MSI efficiency. Participants were presented with various combinations of simultaneous visual and/or auditory stimuli and were required to indicate the number of visual stimuli observed for each of the 96 total presentations. Central retinal sensitivity was assessed as an indicator of glaucoma severity (MAIA; CenterVue). RESULTS Participants with glaucoma performed with equivalent capacity to healthy controls on unisensory trials (F1,53 = 2.222, P = 0.142). Both groups performed equivalently on congruent multisensory trials involving equal numbers of auditory and visual stimuli (F1,53 = 1.032, P = 0.314). For incongruent presentations, that is, 2 beeps and 1 flash stimulus, individuals with glaucoma demonstrated a greater influence of the incongruent beeps when judging the number of flashes, indicating less efficient MSI relative to age-matched controls (F1,53 = 11.45, P < 0.002). In addition, MSI performance was positively correlated with retinal sensitivity (F3,49 = 4.042, P < 0.025, adjusted R² = 0.15). CONCLUSIONS Individuals with open angle glaucoma exhibited MSI deficiencies that relate to disease severity. The type of deficiencies observed was similar to those observed among older individuals with cognitive impairment and balance issues. Impaired MSI may, therefore, be relevant to the increased prevalence of falls observed among individuals with glaucoma, a concept that merits further investigation.
Collapse
Affiliation(s)
- Ekaterina Loskutova
- Centre for Eye Research Ireland, School of Physics, Clinical & Optometric Sciences, Technological University Dublin, Dublin, Ireland
| | - John S Butler
- Centre for Eye Research Ireland, School of Mathematical Sciences, Technological University Dublin, Dublin, Ireland
| | - Annalisa Setti
- School of Applied Psychology, University College Cork, Cork, Ireland
| | - Colm O'Brien
- Department of Ophthalmology, Mater Misericordiae University Hospital, Dublin, Ireland
| | - James Loughman
- Centre for Eye Research Ireland, School of Physics, Clinical & Optometric Sciences, Technological University Dublin, Dublin, Ireland
| |
Collapse
|
26
|
Crucianelli L, Reader AT, Ehrsson HH. Subcortical contributions to the sense of body ownership. Brain 2024; 147:390-405. [PMID: 37847057 PMCID: PMC10834261 DOI: 10.1093/brain/awad359] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2023] [Revised: 09/01/2023] [Accepted: 10/03/2023] [Indexed: 10/18/2023] Open
Abstract
The sense of body ownership (i.e. the feeling that our body or its parts belong to us) plays a key role in bodily self-consciousness and is believed to stem from multisensory integration. Experimental paradigms such as the rubber hand illusion have been developed to allow the controlled manipulation of body ownership in laboratory settings, providing effective tools for investigating malleability in the sense of body ownership and the boundaries that distinguish self from other. Neuroimaging studies of body ownership converge on the involvement of several cortical regions, including the premotor cortex and posterior parietal cortex. However, relatively less attention has been paid to subcortical structures that may also contribute to body ownership perception, such as the cerebellum and putamen. Here, on the basis of neuroimaging and neuropsychological observations, we provide an overview of relevant subcortical regions and consider their potential role in generating and maintaining a sense of ownership over the body. We also suggest novel avenues for future research targeting the role of subcortical regions in making sense of the body as our own.
Collapse
Affiliation(s)
- Laura Crucianelli
- Department of Biological and Experimental Psychology, Queen Mary University of London, London E1 4DQ, UK
- Department of Neuroscience, Karolinska Institutet, Stockholm 171 65, Sweden
| | - Arran T Reader
- Department of Psychology, Faculty of Natural Sciences, University of Stirling, Stirling FK9 4LA, UK
| | - H Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, Stockholm 171 65, Sweden
| |
Collapse
|
27
|
Qian Q, Cai S, Zhang X, Huang J, Chen Y, Wang A, Zhang M. Seeing is believing: Larger Colavita effect in school-aged children with attention-deficit/hyperactivity disorder. J Exp Child Psychol 2024; 238:105798. [PMID: 37844345 DOI: 10.1016/j.jecp.2023.105798] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2023] [Revised: 08/07/2023] [Accepted: 09/25/2023] [Indexed: 10/18/2023]
Abstract
Attention-deficit/hyperactivity disorder (ADHD) is a common neurodevelopmental disorder that leads to visually relevant compensatory activities and cognitive strategies in children. Previous studies have identified difficulties with audiovisual integration in children with ADHD, but the characteristics of the visual dominance effect when processing multisensory stimuli are not clear in children with ADHD. The current study used the Colavita paradigm to explore the visual dominance effect in school-aged children with ADHD. The results found that, compared with typically developing children, children with ADHD had a higher proportion of "visual-auditory" trials and a lower proportion of "simultaneous" trials. The study also found that the proportion of visual-auditory trials in children with ADHD decreased as their Swanson, Nolan, and Pelham-IV rating scale (SNAP-IV) inattention scores increased. The results showed that school-aged children with ADHD had a larger Colavita effect, which decreased with the severity of inattentive symptoms. This may be due to an overreliance on visual information and an abnormal integration time window. The connection between multisensory cognitive processing performance and clinical symptoms found in the current study provides empirical and theoretical support for the knowledge base of multisensory and cognitive abilities in disorders.
Collapse
Affiliation(s)
- Qinyue Qian
- Department of Psychology, Soochow University, Suzhou, Jiangsu 215123, China; Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, Jiangsu 215123, China
| | - Shizhong Cai
- Department of Child and Adolescent Healthcare, Children's Hospital of Soochow University, Suzhou, Jiangsu 215003, China
| | - Xianghui Zhang
- Department of Psychology, Soochow University, Suzhou, Jiangsu 215123, China; Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, Jiangsu 215123, China
| | - Jie Huang
- Department of Psychology, Soochow University, Suzhou, Jiangsu 215123, China; Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, Jiangsu 215123, China
| | - Yan Chen
- Department of Child and Adolescent Healthcare, Children's Hospital of Soochow University, Suzhou, Jiangsu 215003, China.
| | - Aijun Wang
- Department of Psychology, Soochow University, Suzhou, Jiangsu 215123, China; Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, Jiangsu 215123, China.
| | - Ming Zhang
- Department of Psychology, Soochow University, Suzhou, Jiangsu 215123, China; Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, Jiangsu 215123, China; Department of Psychology, Suzhou University of Science and Technology, Suzhou, Jiangsu 215011, China; Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama 700-8530, Japan.
| |
Collapse
|
28
|
Xiong X, Dai L, Chen W, Lu J, Hu C, Zhao H, Ke J. Dynamics and concordance alterations of regional brain function indices in vestibular migraine: a resting-state fMRI study. J Headache Pain 2024; 25:1. [PMID: 38178029 PMCID: PMC10768112 DOI: 10.1186/s10194-023-01705-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2023] [Accepted: 12/12/2023] [Indexed: 01/06/2024] Open
Abstract
BACKGROUND Prior MRI studies on vestibular migraine (VM) have revealed abnormalities in static regional intrinsic brain activity (iBA) and dynamic functional connectivity between brain regions or networks. However, the temporal variation and concordance of regional iBA measures remain to be explored. METHODS 57 VM patients during the interictal period were compared to 88 healthy controls (HC) in this resting-state functional magnetic resonance imaging (fMRI) study. The dynamics and concordance of regional iBA indices, including amplitude of low-frequency fluctuations (ALFF) and regional homogeneity (ReHo), were examined by utilizing sliding time-window analysis. Partial correlation analyses were performed between clinical parameters and resting-state fMRI indices in brain areas showing significant group differences. RESULTS The VM group showed increased ALFF and ReHo dynamics, as well as increased temporal concordance between ALFF and ReHo in the bilateral paracentral lobule and supplementary motor area relative to the HC group. We also found decreased ReHo dynamics in the right temporal pole, and decreased ALFF dynamics in the right cerebellum posterior lobe, bilateral angular gyrus and middle occipital gyrus (MOG) in the VM group compared with the HC group. Moreover, a positive correlation was observed between ALFF dynamics in the left MOG and vertigo disease duration across all VM patients. CONCLUSION Temporal dynamics and concordance of regional iBA indices were altered in the motor cortex, cerebellum, occipital and temporoparietal cortex, which may contribute to disrupted multisensory processing and vestibular control in patients with VM. ALFF dynamics in the left MOG may be a useful biomarker for evaluating vertigo burden in this disorder.
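The sliding time-window analysis used for these "dynamics" indices can be illustrated schematically: a regional metric is computed within each window, and its variability across windows is the dynamics index. A toy sketch follows; the per-window metric (a simple mean) and the window parameters are placeholders, not the study's actual ALFF/ReHo pipeline.

```python
import numpy as np

def sliding_window_dynamics(signal, win, step):
    """Standard deviation of a windowed metric over time: one simple 'dynamics' index.

    The per-window metric here is just the mean; in an fMRI pipeline it would be
    ALFF or ReHo computed within each window.
    """
    starts = range(0, len(signal) - win + 1, step)
    per_window = [signal[s:s + win].mean() for s in starts]
    return float(np.std(per_window))

# A signal whose level shifts halfway through has non-zero dynamics;
# a flat signal has zero.
shifting = np.concatenate([np.zeros(50), np.ones(50)])
flat = np.ones(100)
```

Window length and step control the trade-off between temporal resolution and the stability of each per-window estimate.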
Collapse
Affiliation(s)
- Xing Xiong
- Department of Radiology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China
- Institute of Medical Imaging, Soochow University, Suzhou, Jiangsu Province, People's Republic of China
| | - Lingling Dai
- Department of Radiology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China
- Institute of Medical Imaging, Soochow University, Suzhou, Jiangsu Province, People's Republic of China
| | - Wen Chen
- Department of Radiology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China
- Institute of Medical Imaging, Soochow University, Suzhou, Jiangsu Province, People's Republic of China
| | - Jiajie Lu
- Department of Neurology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China
| | - Chunhong Hu
- Department of Radiology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China
- Institute of Medical Imaging, Soochow University, Suzhou, Jiangsu Province, People's Republic of China
| | - Hongru Zhao
- Department of Neurology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China.
| | - Jun Ke
- Department of Radiology, The First Affiliated Hospital of Soochow University, Suzhou, 215006, Jiangsu, China.
- Institute of Medical Imaging, Soochow University, Suzhou, Jiangsu Province, People's Republic of China.
| |
Collapse
|
29
|
Marusic U, Mahoney JR. Editorial: The intersection of cognitive, motor, and sensory processing in aging: links to functional outcomes, volume II. Front Aging Neurosci 2024; 15:1340547. [PMID: 38239490 PMCID: PMC10794332 DOI: 10.3389/fnagi.2023.1340547] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2023] [Accepted: 12/07/2023] [Indexed: 01/22/2024] Open
Affiliation(s)
- Uros Marusic
- Institute for Kinesiology Research, Science and Research Centre Koper, Koper, Slovenia
- Department of Health Sciences, Alma Mater Europaea - ECM, Maribor, Slovenia
| | - Jeannette R. Mahoney
- Division of Cognitive and Motor Aging, Albert Einstein College of Medicine, Bronx, NY, United States
| |
Collapse
|
30
|
Fang W, Liu Y, Wang L. Multisensory Integration in Body Representation. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2024; 1437:77-89. [PMID: 38270854 DOI: 10.1007/978-981-99-7611-9_5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/26/2024]
Abstract
To be aware of and to move one's body, the brain must maintain a coherent representation of the body. While the body and the brain are connected by dense ascending and descending sensory and motor pathways, representation of the body is not hardwired. This is demonstrated by the well-known rubber hand illusion in which a visible fake hand is erroneously felt as one's own hand when it is stroked in synchrony with the viewer's unseen actual hand. Thus, body representation in the brain is not mere maps of tactile and proprioceptive inputs, but a construct resulting from the interpretation and integration of inputs across sensory modalities.
Collapse
Affiliation(s)
- Wen Fang
- Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China.
| | - Yuqi Liu
- Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
| | - Liping Wang
- Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
| |
Collapse
|
31
|
Nikbakht N. More Than the Sum of Its Parts: Visual-Tactile Integration in the Behaving Rat. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2024; 1437:37-58. [PMID: 38270852 DOI: 10.1007/978-981-99-7611-9_3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/26/2024]
Abstract
We experience the world by constantly integrating cues from multiple modalities to form unified sensory percepts. Once familiar with multimodal properties of an object, we can recognize it regardless of the modality involved. In this chapter we will examine the case of a visual-tactile orientation categorization experiment in rats. We will explore the involvement of the cerebral cortex in recognizing objects through multiple sensory modalities. In the orientation categorization task, rats learned to examine and judge the orientation of a raised, black and white grating using touch, vision, or both. Their multisensory performance was better than the predictions of linear models for cue combination, indicating synergy between the two sensory channels. Neural recordings made from a candidate associative cortical area, the posterior parietal cortex (PPC), reflected the principal neuronal correlates of the behavioral results: PPC neurons encoded both graded information about the object and categorical information about the animal's decision. Intriguingly, single neurons showed identical responses under each of the three modality conditions, providing a substrate for a neural circuit in the cortex that is involved in modality-invariant processing of objects.
Collapse
Affiliation(s)
- Nader Nikbakht
- Massachusetts Institute of Technology, Cambridge, MA, USA.
| |
Collapse
|
32
|
Joshi SD, Ruffini G, Nuttall HE, Watson DG, Braithwaite JJ. Optimised Multi-Channel Transcranial Direct Current Stimulation (MtDCS) Reveals Differential Involvement of the Right-Ventrolateral Prefrontal Cortex (rVLPFC) and Insular Complex in those Predisposed to Aberrant Experiences. Conscious Cogn 2024; 117:103610. [PMID: 38056338 DOI: 10.1016/j.concog.2023.103610] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2023] [Revised: 11/20/2023] [Accepted: 11/22/2023] [Indexed: 12/08/2023]
Abstract
Research has shown a prominent role for cortical hyperexcitability underlying aberrant perceptions, hallucinations, and distortions in human conscious experience - even in neurotypical groups. The rVLPFC has been identified as an important structure in mediating cognitive affective states / feeling conscious states. The current study examined the involvement of the rVLPFC in mediating cognitive affective states in those predisposed to aberrant experiences in the neurotypical population. Participants completed two trait-based measures: (i) the Cortical Hyperexcitability Index_II (CHi_II, a proxy measure of cortical hyperexcitability) and (ii) two factors from the Cambridge Depersonalisation Scale (CDS). An optimised 7-channel MtDCS montage for stimulation conditions (Anodal, Cathodal and Sham) was created targeting the rVLPFC in a single-blind study. At the end of each stimulation session, participants completed a body-threat task (BTAB) while skin conductance responses (SCRs) and psychological responses were recorded. Participants with signs of increasing cortical hyperexcitability showed significant suppression of SCRs in the Cathodal stimulation relative to the Anodal and Sham conditions. Those high on the trait-based measures of depersonalisation-like experiences failed to show reliable effects. Collectively, the findings suggest that baseline brain states can mediate the effects of neurostimulation, which would be missed via sample-level averaging and without appropriate measures for stratifying individual differences.
Collapse
|
33
|
Jones SA, Noppeney U. Multisensory Integration and Causal Inference in Typical and Atypical Populations. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2024; 1437:59-76. [PMID: 38270853 DOI: 10.1007/978-981-99-7611-9_4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/26/2024]
Abstract
Multisensory perception is critical for effective interaction with the environment, but human responses to multisensory stimuli vary across the lifespan and appear changed in some atypical populations. In this review chapter, we consider multisensory integration within a normative Bayesian framework. We begin by outlining the complex computational challenges of multisensory causal inference and reliability-weighted cue integration, and discuss whether healthy young adults behave in accordance with normative Bayesian models. We then compare their behaviour with various other human populations (children, older adults, and those with neurological or neuropsychiatric disorders). In particular, we consider whether the differences seen in these groups are due only to changes in their computational parameters (such as sensory noise or perceptual priors), or whether the fundamental computational principles (such as reliability weighting) underlying multisensory perception may also be altered. We conclude by arguing that future research should aim explicitly to differentiate between these possibilities.
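The reliability-weighted cue integration discussed in this chapter has a standard closed form under the normative Bayesian framework: each cue is weighted by its inverse variance, and the fused estimate is more reliable than either cue alone. A minimal sketch, with illustrative values not taken from the chapter:

```python
import numpy as np

def fuse_cues(mu_a, sigma_a, mu_b, sigma_b):
    """Reliability-weighted (inverse-variance) combination of two sensory cues."""
    w_a, w_b = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
    sigma = np.sqrt(1.0 / (w_a + w_b))  # always smaller than either input sigma
    return mu, sigma

# A precise visual cue (sigma = 1) pulls the fused estimate much closer
# to itself than a noisy auditory cue (sigma = 3).
mu, sigma = fuse_cues(0.0, 1.0, 4.0, 3.0)  # mu ≈ 0.4, near the visual cue
```

Changes in the computational parameters the chapter mentions (e.g. higher sensory noise in a given group) correspond here to larger sigmas, which shift the weighting without altering the integration rule itself.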
Collapse
Affiliation(s)
- Samuel A Jones
- Department of Psychology, Nottingham Trent University, Nottingham, UK.
| | - Uta Noppeney
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
| |
Collapse
|
34
|
Azizi Z, Hirst RJ, O' Dowd A, McCrory C, Kenny RA, Newell FN, Setti A. Evidence for an association between allostatic load and multisensory integration in middle-aged and older adults. Arch Gerontol Geriatr 2024; 116:105155. [PMID: 37597376 DOI: 10.1016/j.archger.2023.105155] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2023] [Revised: 06/30/2023] [Accepted: 08/03/2023] [Indexed: 08/21/2023]
Abstract
Multisensory integration, the ability of the brain to integrate information from different sensory modalities, is critical for responding to environmental stimuli. While older adults show changes in multisensory integration with age, the impact of allostatic load (AL) (i.e., the effect of exposure to chronic stress, which can accelerate ageing) on multisensory perception remains understudied. We explored the relationship between multisensory integration and AL in 1,358 adults aged 50+ from The Irish Longitudinal Study on Ageing by performing a Sound Induced Flash Illusion (SIFI) task at multiple audio-visual temporal asynchronies. The AL score was created using a battery of biomarkers representing the activity of four major physiological systems: immunological, cardiovascular, metabolic, and renal. The number of biomarkers for which a participant was categorised in the highest risk quartile using sex-specific cutoffs was used to produce an overall AL score. We accounted for medication use when calculating our AL score. We analysed the accuracy of illusion trials on a SIFI task using generalised logistic mixed effects regression models adjusted for a number of covariates. Cross-sectional and longitudinal results revealed that lower accuracy in integration (i.e., higher SIFI susceptibility with larger temporal asynchronies) was associated with higher AL. This confirmed the distinct patterns of multisensory integration in ageing.
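The AL scoring rule described above (the count of biomarkers falling in the highest-risk quartile, with sex-specific cutoffs) can be sketched directly. The marker names and cutoff values below are hypothetical, and for simplicity every marker is assumed to be riskier at higher values; the study's actual battery and medication adjustment are not reproduced here.

```python
# Hypothetical sex-specific 75th-percentile cutoffs, one marker per
# physiological system (immunological, cardiovascular, metabolic, renal).
CUTOFFS = {
    ("crp", "F"): 3.0,
    ("systolic_bp", "F"): 140.0,
    ("hba1c", "F"): 5.7,
    ("creatinine", "F"): 1.1,
}

def allostatic_load(biomarkers, sex, cutoffs):
    """AL score = number of markers at or above the sex-specific high-risk cutoff."""
    return sum(value >= cutoffs[(marker, sex)] for marker, value in biomarkers.items())

# crp and hba1c exceed their cutoffs, so this participant scores 2 of 4.
score = allostatic_load(
    {"crp": 4.2, "systolic_bp": 128.0, "hba1c": 6.1, "creatinine": 0.9},
    "F",
    CUTOFFS,
)
```

Counting threshold exceedances rather than averaging raw values makes the score robust to the very different units the individual biomarkers are measured in.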
Collapse
Affiliation(s)
- Zahra Azizi
- School of Psychology and Institute of Neuroscience, Trinity College Dublin, Ireland; The Irish Longitudinal Study on Ageing, Trinity College Dublin, Ireland; School of Applied Psychology, University College Cork, Ireland.
| | - Rebecca J Hirst
- School of Psychology and Institute of Neuroscience, Trinity College Dublin, Ireland; The Irish Longitudinal Study on Ageing, Trinity College Dublin, Ireland
| | - Alan O' Dowd
- School of Psychology and Institute of Neuroscience, Trinity College Dublin, Ireland; The Irish Longitudinal Study on Ageing, Trinity College Dublin, Ireland
| | - Cathal McCrory
- Department of Medical Gerontology, Trinity College Dublin, Dublin, Ireland
| | - Rose Anne Kenny
- The Irish Longitudinal Study on Ageing, Trinity College Dublin, Ireland; Mercer Institute for Successful Ageing, St. James Hospital, Dublin, Ireland
| | - Fiona N Newell
- School of Psychology and Institute of Neuroscience, Trinity College Dublin, Ireland
| | - Annalisa Setti
- The Irish Longitudinal Study on Ageing, Trinity College Dublin, Ireland; School of Applied Psychology, University College Cork, Ireland
| |
Collapse
|
35
|
Zheng Q, Gu Y. From Multisensory Integration to Multisensory Decision-Making. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2024; 1437:23-35. [PMID: 38270851 DOI: 10.1007/978-981-99-7611-9_2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/26/2024]
Abstract
Organisms live in a dynamic environment in which sensory information from multiple sources is ever changing. A conceptually complex task for organisms is to accumulate evidence across sensory modalities and over time, a process known as multisensory decision-making. This concept is relatively new, in that previous research has largely been conducted in two parallel disciplines: much effort has gone either into sensory integration across modalities, using activity summed over a duration of time, or into decision-making with only one sensory modality that evolves over time. Recently, a few studies with neurophysiological measurements have emerged to examine how information from different sensory modalities is processed, accumulated, and integrated over time in decision-related areas such as the parietal or frontal lobes in mammals. In this review, we summarize and comment on these studies, which combine the two long-separate fields of multisensory integration and decision-making, and show how the new findings provide a more complete understanding of the neural mechanisms mediating multisensory information processing.
Collapse
Affiliation(s)
- Qihao Zheng
- Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
| | - Yong Gu
- Systems Neuroscience, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China.
| |
Collapse
|
36
|
Wang X, Tang X, Wang A, Zhang M. Non-spatial inhibition of return attenuates audiovisual integration owing to modality disparities. Atten Percept Psychophys 2023:10.3758/s13414-023-02825-y. [PMID: 38127253 DOI: 10.3758/s13414-023-02825-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/29/2023] [Indexed: 12/23/2023]
Abstract
Although previous studies have investigated the relationship between inhibition of return (IOR) and multisensory integration, the influence of non-spatial IOR has not been explored. The present study aimed to investigate the influence of non-spatial IOR on audiovisual integration by using a "prime-neutral cue-target" paradigm. In Experiment 1, which manipulated prime validity and target modality, the targets were positioned centrally, revealing significant non-spatial IOR effects in the visual, auditory, and audiovisual modalities. Analysis of relative multisensory response enhancement (rMRE) indicated substantial audiovisual integration enhancement in both valid and invalid target conditions, with weaker enhancement for valid targets than for invalid targets. In Experiment 2, the targets were positioned above and below fixation to rule out repetition blindness (RB); this experiment replicated the results of Experiment 1. Notably, in both experiments, rMRE for valid targets correlated with modality differences, indicating that differences in signal strength between the visual and auditory modalities contributed to the reduction in audiovisual integration; the absence of this correlation for invalid targets suggests that attention may play a significant role in this process. The present study highlights how non-spatial IOR reduces audiovisual integration and sheds light on the complex interaction between attention and multisensory integration.
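The rMRE measure mentioned above can be sketched with one common formulation from the multisensory reaction-time literature, which compares the multisensory response against the faster of the two unimodal responses. This is an assumption for illustration; the paper's exact computation may differ, and the reaction-time values below are made up.

```python
def rmre(rt_av, rt_a, rt_v):
    """Relative multisensory response enhancement (percent):
    how much faster the audiovisual response is than the faster
    of the two unimodal responses (a common formulation; the
    paper's exact definition may differ)."""
    fastest_unimodal = min(rt_a, rt_v)
    return 100.0 * (fastest_unimodal - rt_av) / fastest_unimodal

# illustrative mean reaction times in ms
enhancement = rmre(rt_av=380.0, rt_a=450.0, rt_v=420.0)  # ~9.5% enhancement
```

A positive value indicates a multisensory benefit relative to the best unimodal condition; weaker rMRE for valid targets is then a smaller such benefit.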
Collapse
Affiliation(s)
- Xiaoxue Wang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, People's Republic of China
| | - Xiaoyu Tang
- School of Psychology, Liaoning Normal University, Dalian, China
| | - Aijun Wang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, People's Republic of China.
| | - Ming Zhang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, People's Republic of China.
- Department of Psychology, Suzhou University of Science and Technology, Suzhou, China.
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan.
| |
Collapse
|
37
|
Ahmed F, Nidiffer AR, Lalor EC. The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario. Front Hum Neurosci 2023; 17:1283206. [PMID: 38162285 PMCID: PMC10754997 DOI: 10.3389/fnhum.2023.1283206] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2023] [Accepted: 11/20/2023] [Indexed: 01/03/2024] Open
Abstract
Seeing the speaker's face greatly improves our speech comprehension in noisy environments. This is due to the brain's ability to combine the auditory and the visual information around us, a process known as multisensory integration. Selective attention also strongly influences what we comprehend in scenarios with multiple speakers-an effect known as the cocktail-party phenomenon. However, the interaction between attention and multisensory integration is not fully understood, especially when it comes to natural, continuous speech. In a recent electroencephalography (EEG) study, we explored this issue and showed that multisensory integration is enhanced when an audiovisual speaker is attended compared to when that speaker is unattended. Here, we extend that work to investigate how this interaction varies depending on a person's gaze behavior, which affects the quality of the visual information they have access to. To do so, we recorded EEG from 31 healthy adults as they performed selective attention tasks in several paradigms involving two concurrently presented audiovisual speakers. We then modeled how the recorded EEG related to the audio speech (envelope) of the presented speakers. Crucially, we compared two classes of model - one that assumed underlying multisensory integration (AV) versus another that assumed two independent unisensory audio and visual processes (A+V). This comparison revealed evidence of strong attentional effects on multisensory integration when participants were looking directly at the face of an audiovisual speaker. This effect was not apparent when the speaker's face was in the peripheral vision of the participants. Overall, our findings suggest a strong influence of attention on multisensory integration when high fidelity visual (articulatory) speech information is available. 
More generally, this suggests that the interplay between attention and multisensory integration during natural audiovisual speech is dynamic and adapts to the specific task and environment.
Collapse
Affiliation(s)
| | | | - Edmund C. Lalor
- Department of Biomedical Engineering, Department of Neuroscience, and Del Monte Institute for Neuroscience, and Center for Visual Science, University of Rochester, Rochester, NY, United States
| |
Collapse
|
38
|
Spomer AM, Conner BC, Schwartz MH, Lerner ZF, Steele KM. Audiovisual biofeedback amplifies plantarflexor adaptation during walking among children with cerebral palsy. J Neuroeng Rehabil 2023; 20:164. [PMID: 38062454 PMCID: PMC10704679 DOI: 10.1186/s12984-023-01279-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2023] [Accepted: 11/01/2023] [Indexed: 12/18/2023] Open
Abstract
BACKGROUND Biofeedback is a promising noninvasive strategy to enhance gait training among individuals with cerebral palsy (CP). Commonly, biofeedback systems are designed to guide movement correction using audio, visual, or sensorimotor (i.e., tactile or proprioceptive) cues, each of which has demonstrated measurable success in CP. However, it is currently unclear how the modality of biofeedback may influence user response which has significant implications if systems are to be consistently adopted into clinical care. METHODS In this study, we evaluated the extent to which adolescents with CP (7M/1F; 14 [12.5,15.5] years) adapted their gait patterns during treadmill walking (6 min/modality) with audiovisual (AV), sensorimotor (SM), and combined AV + SM biofeedback before and after four acclimation sessions (20 min/session) and at a two-week follow-up. Both biofeedback systems were designed to target plantarflexor activity on the more-affected limb, as these muscles are commonly impaired in CP and impact walking function. SM biofeedback was administered using a resistive ankle exoskeleton and AV biofeedback displayed soleus activity from electromyography recordings during gait. At every visit, we measured the time-course response to each biofeedback modality to understand how the rate and magnitude of gait adaptation differed between modalities and following acclimation. RESULTS Participants significantly increased soleus activity from baseline using AV + SM (42.8% [15.1, 59.6]), AV (28.5% [19.2, 58.5]), and SM (10.3% [3.2, 15.2]) biofeedback, but the rate of soleus adaptation was faster using AV + SM biofeedback than either modality alone. Further, SM-only biofeedback produced small initial increases in plantarflexor activity, but these responses were transient within and across sessions (p > 0.11). Following multi-session acclimation and at the two-week follow-up, responses to AV and AV + SM biofeedback were maintained. 
CONCLUSIONS This study demonstrated that AV biofeedback was critical to increase plantarflexor engagement during walking, but that combining AV and SM modalities further amplified the rate of gait adaptation. Beyond improving our understanding of how individuals may differentially prioritize distinct forms of afferent information, outcomes from this study may inform the design and selection of biofeedback systems for use in clinical care.
Collapse
Affiliation(s)
- Alyssa M Spomer
- Department of Mechanical Engineering, University of Washington, Seattle, WA, USA.
- Gillette Children's, 200 University Avenue East, Stop 490105, St. Paul, MN, 55101, USA.
| | - Benjamin C Conner
- College of Medicine - Phoenix, University of Arizona, Phoenix, AZ, USA
| | - Michael H Schwartz
- Department of Orthopedic Surgery, University of Minnesota, Minneapolis, MN, USA
- Gillette Children's, 200 University Avenue East, Stop 490105, St. Paul, MN, 55101, USA
| | - Zachary F Lerner
- Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
- Department of Mechanical Engineering, Northern Arizona University, Flagstaff, AZ, USA
| | - Katherine M Steele
- Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
| |
Collapse
|
39
|
Shan L, Yuan L, Zhang B, Ma J, Xu X, Gu F, Jiang Y, Dai J. Neural Integration of Audiovisual Sensory Inputs in Macaque Amygdala and Adjacent Regions. Neurosci Bull 2023; 39:1749-1761. [PMID: 36920645 PMCID: PMC10661144 DOI: 10.1007/s12264-023-01043-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2023] [Accepted: 02/13/2023] [Indexed: 03/16/2023] Open
Abstract
Integrating multisensory inputs to generate accurate perception and guide behavior is among the most critical functions of the brain. Subcortical regions such as the amygdala are involved in sensory processing including vision and audition, yet their roles in multisensory integration remain unclear. In this study, we systematically investigated the function of neurons in the amygdala and adjacent regions in integrating audiovisual sensory inputs using a semi-chronic multi-electrode array and multiple combinations of audiovisual stimuli. From a sample of 332 neurons, we showed the diverse response patterns to audiovisual stimuli and the neural characteristics of bimodal over unimodal modulation, which could be classified into four types with differentiated regional origins. Using the hierarchical clustering method, neurons were further clustered into five groups and associated with different integrating functions and sub-regions. Finally, regions distinguishing congruent and incongruent bimodal sensory inputs were identified. Overall, visual processing dominates audiovisual integration in the amygdala and adjacent regions. Our findings shed new light on the neural mechanisms of multisensory integration in the primate brain.
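The hierarchical clustering step described above — grouping 332 neurons into five clusters from their response features — can be sketched with SciPy. The feature matrix here is random placeholder data standing in for the per-neuron response measures (responses to A, V, and AV stimulus combinations); the study's actual features, linkage method, and distance metric are not specified in the abstract.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# hypothetical feature matrix: 332 neurons x 6 response features
# (placeholder for responses to audiovisual stimulus combinations)
features = rng.normal(size=(332, 6))

# Ward linkage on the neuron-by-feature matrix, cut into 5 clusters
Z = linkage(features, method="ward")
labels = fcluster(Z, t=5, criterion="maxclust")
```

Each neuron receives a cluster label in 1..5, which could then be cross-tabulated against recording sub-region to associate clusters with regional origins.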
Collapse
Affiliation(s)
- Liang Shan
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
| | - Liu Yuan
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
| | - Bo Zhang
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Key Laboratory of Brain Science, Zunyi Medical University, Zunyi, 563000, China
| | - Jian Ma
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
| | - Xiao Xu
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
| | - Fei Gu
- University of Chinese Academy of Sciences, Beijing, 100049, China
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
| | - Yi Jiang
- University of Chinese Academy of Sciences, Beijing, 100049, China.
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China.
- Chinese Institute for Brain Research, Beijing, 102206, China.
| | - Ji Dai
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China.
- Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China.
- University of Chinese Academy of Sciences, Beijing, 100049, China.
- Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen, 518055, China.
| |
Collapse
|
40
|
Alemi R, Wolfe J, Neumann S, Manning J, Towler W, Koirala N, Gracco VL, Deroche M. Audiovisual integration in children with cochlear implants revealed through EEG and fNIRS. Brain Res Bull 2023; 205:110817. [PMID: 37989460 DOI: 10.1016/j.brainresbull.2023.110817] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2023] [Revised: 09/22/2023] [Accepted: 11/13/2023] [Indexed: 11/23/2023]
Abstract
Sensory deprivation can offset the balance of audio versus visual information in multimodal processing. Such a phenomenon could persist for children born deaf, even after they receive cochlear implants (CIs), and could potentially explain why one modality is given priority over the other. Here, we recorded cortical responses to a single speaker uttering two syllables, presented in audio-only (A), visual-only (V), and audio-visual (AV) modes. Electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) were successively recorded in seventy-five school-aged children. Twenty-five were children with normal hearing (NH) and fifty wore CIs, among whom 26 had relatively high language abilities (HL) comparable to those of NH children, while 24 others had low language abilities (LL). In EEG data, visual-evoked potentials were captured in occipital regions, in response to V and AV stimuli, and they were accentuated in the HL group compared to the LL group (the NH group being intermediate). Close to the vertex, auditory-evoked potentials were captured in response to A and AV stimuli and reflected a differential treatment of the two syllables but only in the NH group. None of the EEG metrics revealed any interaction between group and modality. In fNIRS data, each modality induced a corresponding activity in visual or auditory regions, but no group difference was observed in A, V, or AV stimulation. The present study did not reveal any sign of abnormal AV integration in children with CI. An efficient multimodal integrative network (at least for rudimentary speech materials) is clearly not a sufficient condition to exhibit good language and literacy.
Collapse
Affiliation(s)
- Razieh Alemi
- Department of Psychology, Concordia University, 7141 Sherbrooke St. West, Montreal, Quebec H4B 1R6, Canada.
| | - Jace Wolfe
- Oberkotter Foundation, Oklahoma City, OK, USA
| | - Sara Neumann
- Hearts for Hearing Foundation, 11500 Portland Av., Oklahoma City, OK 73120, USA
| | - Jacy Manning
- Hearts for Hearing Foundation, 11500 Portland Av., Oklahoma City, OK 73120, USA
| | - Will Towler
- Hearts for Hearing Foundation, 11500 Portland Av., Oklahoma City, OK 73120, USA
| | - Nabin Koirala
- Haskins Laboratories, 300 George St., New Haven, CT 06511, USA
| | | | - Mickael Deroche
- Department of Psychology, Concordia University, 7141 Sherbrooke St. West, Montreal, Quebec H4B 1R6, Canada
| |
Collapse
|
41
|
Soballa P, Frings C, Schmalbrock P, Merz S. Multisensory integration reduces landmark distortions for tactile but not visual targets. J Neurophysiol 2023; 130:1403-1413. [PMID: 37910559 DOI: 10.1152/jn.00282.2023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/24/2023] [Revised: 10/24/2023] [Accepted: 10/25/2023] [Indexed: 11/03/2023] Open
Abstract
Target localization is influenced by the presence of additionally presented nontargets, termed landmarks. In both the visual and tactile modality, these landmarks led to systematic distortions of target localizations often resulting in a shift toward the landmark. This shift has been attributed to averaging the spatial memory of both stimuli. Crucially, everyday experiences often rely on multiple modalities, and multisensory research suggests that inputs from different senses are optimally integrated, not averaged, for accurate perception, resulting in more reliable perception of cross-modal compared with uni-modal stimuli. As this could also lead to a reduced influence of the landmark, we wanted to test whether landmark distortions would be reduced when presented in a different modality or whether landmark distortions were unaffected by the modalities presented. In two experiments (each n = 30) tactile or visual targets were paired with tactile or visual landmarks. Experiment 1 showed that targets were less shifted toward landmarks from the different than the same modality, which was more pronounced for tactile than for visual targets. Experiment 2 aimed to replicate this pattern with increased visual uncertainty to rule out that smaller localization shifts of visual targets due to low uncertainty had led to the results. Still, landmark modality influenced localization shifts for tactile but not visual targets. The data pattern for tactile targets is not in line with memory averaging but seems to reflect the effects of multisensory integration, whereas visual targets were less prone to landmark distortions and do not appear to benefit from multisensory integration.NEW & NOTEWORTHY In the present study, we directly tested the predictions of two different accounts, namely, spatial memory averaging and multisensory integration, concerning the degree of landmark distortions of targets across modalities. 
We showed that landmark distortions were reduced across modalities compared to distortions within modalities, which is in line with multisensory integration. Crucially, this pattern was more pronounced for tactile than for visual targets.
Collapse
Affiliation(s)
- Paula Soballa
- Department of Psychology, University of Trier, Germany
| | | | | | - Simon Merz
- Department of Psychology, University of Trier, Germany
| |
Collapse
|
42
|
Sun X, Fu Q. The Visual Advantage Effect in Comparing Uni-Modal and Cross-Modal Probabilistic Category Learning. J Intell 2023; 11:218. [PMID: 38132836 PMCID: PMC10744040 DOI: 10.3390/jintelligence11120218] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/28/2023] [Revised: 11/14/2023] [Accepted: 11/23/2023] [Indexed: 12/23/2023] Open
Abstract
People rely on multiple learning systems to complete weather prediction (WP) tasks with visual cues. However, how people perform in audio and audiovisual modalities remains elusive. The present research investigated how cue modality influences performance in probabilistic category learning and conscious awareness of the category knowledge acquired. A modified weather prediction task was adopted, in which the cues included two dimensions from visual, auditory, or audiovisual modalities. The results of all three experiments revealed better performance in the visual modality relative to the audio and audiovisual modalities. Moreover, participants primarily acquired unconscious knowledge in the audio and audiovisual modalities, while conscious knowledge was acquired in the visual modality. Interestingly, factors such as the amount of training, the complexity of visual stimuli, and the number of objects to which the two cues belonged influenced the amount of conscious knowledge acquired but did not change the visual advantage effect. These findings suggest that individuals can learn probabilistic cue-category associations across different modalities, but a robust visual advantage persists. Specifically, visual associations can be learned more effectively and are more likely to become conscious. The possible causes and implications of these effects are discussed.
Collapse
Affiliation(s)
- Xunwei Sun
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China;
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100083, China
- Beijing Key Laboratory of Behavior and Mental Health, School of Psychological and Cognitive Sciences, Peking University, Beijing 100080, China
| | - Qiufang Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China;
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100083, China
| |
Collapse
|
43
|
Powell HJ, He JL, Khalil N, Wodka EL, DeRonda A, Edden RAE, Vasa RA, Mostofsky SH, Puts NA. Perceptual alterations in the relationship between sensory reactivity, intolerance of uncertainty, and anxiety in autistic children with and without ADHD. Dev Psychopathol 2023:1-13. [PMID: 37990408 DOI: 10.1017/s0954579423001360] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2023]
Abstract
Sensory differences and anxiety disorders are highly prevalent in autistic individuals with and without ADHD. Studies have shown that sensory differences and anxiety are associated and that intolerance of uncertainty (IU) plays an important role in this relationship. However, it is unclear as to how different levels of the sensory processing pathway (i.e., perceptual, affective, or behavioral) contribute. Here, we used psychophysics to assess how alterations in tactile perception contribute to questionnaire measures of sensory reactivity, IU, and anxiety. Thirty-eight autistic children (aged 8-12 years; 27 with co-occurring ADHD) were included. Consistent with previous findings, mediation analyses showed that child-reported IU fully mediated an association between parent-reported sensory reactivity and parent-reported anxiety and that anxiety partially mediated an association between sensory reactivity and IU. Of the vibrotactile thresholds, only simultaneous frequency discrimination (SFD) thresholds correlated with sensory reactivity. Interestingly, we found that sensory reactivity fully mediated an association between SFD threshold and anxiety, and between SFD threshold and IU. Taken together, those findings suggest a mechanistic pathway whereby tactile perceptual alterations contribute to sensory reactivity at the affective level, leading in turn to increased IU and anxiety. This stepwise association can inform potential interventions for IU and anxiety in autism.
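The mediation logic in the abstract (e.g., sensory reactivity mediating the association between SFD threshold and anxiety) can be sketched with a product-of-coefficients approach. This is a generic Baron-and-Kenny-style illustration on synthetic data, not the study's analysis pipeline; variable names and the fully-mediated toy relationship are assumptions.

```python
import numpy as np

def simple_mediation(x, m, y):
    """Product-of-coefficients mediation sketch:
    a = effect of X on M; b = effect of M on Y controlling for X;
    indirect = a * b; direct = effect of X on Y controlling for M."""
    def ols(predictors, target):
        # ordinary least squares with an intercept column
        X = np.column_stack([np.ones_like(target)] + predictors)
        return np.linalg.lstsq(X, target, rcond=None)[0]
    a = ols([x], m)[1]            # X -> M
    b = ols([x, m], y)[2]         # M -> Y, controlling for X
    direct = ols([x, m], y)[1]    # X -> Y, controlling for M
    total = ols([x], y)[1]        # total X -> Y
    return {"a": a, "b": b, "indirect": a * b, "direct": direct, "total": total}

# synthetic check: M fully mediates the X -> Y association
rng = np.random.default_rng(0)
x = rng.normal(size=500)                      # e.g., tactile threshold
m = 2.0 * x + 0.01 * rng.normal(size=500)     # e.g., sensory reactivity
y = 3.0 * m                                   # e.g., anxiety score
res = simple_mediation(x, m, y)
```

Full mediation shows up as a near-zero direct effect with the indirect effect (a*b) accounting for essentially the whole total effect; in practice the coefficients would be tested with bootstrapped confidence intervals.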
Collapse
Affiliation(s)
- Helen J Powell
- Department of Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology, and Neuroscience, King's College London, London, UK
| | - Jason L He
- Department of Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology, and Neuroscience, King's College London, London, UK
| | - Nermin Khalil
- Department of Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology, and Neuroscience, King's College London, London, UK
| | - Ericka L Wodka
- Center for Autism and Related Disorders, Kennedy Krieger Institute, Baltimore, MD, USA
- Department of Psychiatry and Behavioural Sciences, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Alyssa DeRonda
- Center for Neurodevelopmental and Imaging Research, Kennedy Krieger Institute, Baltimore, MD, USA
| | - Richard A E Edden
- Department of Radiology and Radiological Science, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
- F.M. Kirby Research Center for Functional Brain Imaging, Kennedy Krieger Institute, Baltimore, MD, USA
| | - Roma A Vasa
- Center for Autism and Related Disorders, Kennedy Krieger Institute, Baltimore, MD, USA
- Department of Psychiatry and Behavioural Sciences, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Stewart H Mostofsky
- Department of Psychiatry and Behavioural Sciences, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Center for Neurodevelopmental and Imaging Research, Kennedy Krieger Institute, Baltimore, MD, USA
- Department of Neurology, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
| | - Nicolaas A Puts
- Department of Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology, and Neuroscience, King's College London, London, UK
- MRC Centre for Neurodevelopmental Disorders, King's College London, London, UK
| |
Collapse
|
44
|
Zhao H, Zhang Y, Han L, Qian W, Wang J, Wu H, Li J, Dai Y, Zhang Z, Bowen CR, Yang Y. Intelligent Recognition Using Ultralight Multifunctional Nano-Layered Carbon Aerogel Sensors with Human-Like Tactile Perception. NANO-MICRO LETTERS 2023; 16:11. [PMID: 37943399 PMCID: PMC10635924 DOI: 10.1007/s40820-023-01216-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/23/2023] [Accepted: 09/11/2023] [Indexed: 11/10/2023]
Abstract
Humans perceive our complex world through multi-sensory fusion. Under limited visual conditions, people can sense a variety of tactile signals to identify objects accurately and rapidly. However, replicating this unique capability in robots remains a significant challenge. Here, we present a new form of ultralight multifunctional tactile nano-layered carbon aerogel sensor that provides pressure, temperature, material recognition and 3D location capabilities, combined with multimodal supervised learning algorithms for object recognition. The sensor exhibits human-like pressure (0.04-100 kPa) and temperature (21.5-66.2 °C) detection, millisecond response times (11 ms), a pressure sensitivity of 92.22 kPa-1 and triboelectric durability of over 6000 cycles. The algorithm is general and accommodates a range of application scenarios. The tactile system can identify common foods in a kitchen scene with 94.63% accuracy and explore the topographic and geomorphic features of a Mars scene with 100% accuracy. This sensing approach empowers robots with versatile tactile perception to advance future society toward heightened sensing, recognition and intelligence.
Collapse
Affiliation(s)
- Huiqi Zhao
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, People's Republic of China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, People's Republic of China
| | - Yizheng Zhang
- Tencent Robotics X, Shenzhen, 518054, People's Republic of China
| | - Lei Han
- Tencent Robotics X, Shenzhen, 518054, People's Republic of China
| | - Weiqi Qian
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, People's Republic of China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, People's Republic of China
| | - Jiabin Wang
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, People's Republic of China
- Center on Nanoenergy Research, School of Physical Science and Technology, Guangxi University, Nanning, 530004, People's Republic of China
| | - Heting Wu
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, People's Republic of China
| | - Jingchen Li
- Tencent Robotics X, Shenzhen, 518054, People's Republic of China
| | - Yuan Dai
- Tencent Robotics X, Shenzhen, 518054, People's Republic of China.
| | - Zhengyou Zhang
- Tencent Robotics X, Shenzhen, 518054, People's Republic of China
| | - Chris R Bowen
- Department of Mechanical Engineering, University of Bath, Bath, BA2 7AK, UK
| | - Ya Yang
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, People's Republic of China.
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, People's Republic of China.
- Center on Nanoenergy Research, School of Physical Science and Technology, Guangxi University, Nanning, 530004, People's Republic of China.
| |
Collapse
|
45
|
Antono JE, Dang S, Auksztulewicz R, Pooresmaeili A. Distinct Patterns of Connectivity between Brain Regions Underlie the Intra-Modal and Cross-Modal Value-Driven Modulations of the Visual Cortex. J Neurosci 2023; 43:7361-7375. [PMID: 37684031 PMCID: PMC10621764 DOI: 10.1523/jneurosci.0355-23.2023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/24/2023] [Revised: 07/30/2023] [Accepted: 08/26/2023] [Indexed: 09/10/2023] Open
Abstract
Past reward associations may be signaled from different sensory modalities; however, it remains unclear how different types of reward-associated stimuli modulate sensory perception. In this human fMRI study (female and male participants), a visual target was simultaneously presented with either an intra- (visual) or a cross-modal (auditory) cue that was previously associated with rewards. We hypothesized that, depending on the sensory modality of the cues, distinct neural mechanisms underlie the value-driven modulation of visual processing. Using a multivariate approach, we confirmed that reward-associated cues enhanced the target representation in early visual areas and identified the brain valuation regions. Then, using an effective connectivity analysis, we tested three possible patterns of connectivity that could underlie the modulation of the visual cortex: a direct pathway from the frontal valuation areas to the visual areas, a mediated pathway through the attention-related areas, and a mediated pathway that additionally involved sensory association areas. We found evidence for the third model, demonstrating that the reward-related information in both sensory modalities is communicated across the valuation and attention-related brain regions. Additionally, the superior temporal areas were recruited when reward was cued cross-modally. The strongest dissociation between the intra- and cross-modal reward-driven effects was observed at the level of the feedforward and feedback connections of the visual cortex estimated from the winning model. These results suggest that, in the presence of previously rewarded stimuli from different sensory modalities, a combination of domain-general and domain-specific mechanisms is recruited across the brain to adjust visual perception.SIGNIFICANCE STATEMENT Reward has a profound effect on perception, but it is not known whether shared or disparate mechanisms underlie the reward-driven effects across sensory modalities. 
In this human fMRI study, we examined the reward-driven modulation of the visual cortex by visual (intra-modal) and auditory (cross-modal) reward-associated cues. Using a model-based approach to identify the most plausible pattern of inter-regional effective connectivity, we found that higher-order areas involved in the valuation and attentional processing were recruited by both types of rewards. However, the pattern of connectivity between these areas and the early visual cortex was distinct between the intra- and cross-modal rewards. This evidence suggests that, to effectively adapt to the environment, reward signals may recruit both domain-general and domain-specific mechanisms.
Collapse
Affiliation(s)
- Jessica Emily Antono
- Perception and Cognition Lab, European Neuroscience Institute Goettingen-A Joint Initiative of the University Medical Center Goettingen and the Max-Planck-Society, Goettingen, 37077, Germany
| | - Shilpa Dang
- Perception and Cognition Lab, European Neuroscience Institute Goettingen-A Joint Initiative of the University Medical Center Goettingen and the Max-Planck-Society, Goettingen, 37077, Germany
- School of Artificial Intelligence and Data Science, Indian Institute of Technology Jodhpur, Karwar, Jodhpur 342030, India
| | - Ryszard Auksztulewicz
- Center for Cognitive Neuroscience Berlin, Free University Berlin, Berlin, 14195, Germany
| | - Arezoo Pooresmaeili
- Perception and Cognition Lab, European Neuroscience Institute Goettingen-A Joint Initiative of the University Medical Center Goettingen and the Max-Planck-Society, Goettingen, 37077, Germany
| |
Collapse
|
46
|
Engelen T, Solcà M, Tallon-Baudry C. Interoceptive rhythms in the brain. Nat Neurosci 2023; 26:1670-1684. [PMID: 37697110 DOI: 10.1038/s41593-023-01425-1] [Citation(s) in RCA: 4] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2022] [Accepted: 08/08/2023] [Indexed: 09/13/2023]
Abstract
Sensing internal bodily signals, or interoception, is fundamental to maintain life. However, interoception should not be viewed as an isolated domain, as it interacts with exteroception, cognition and action to ensure the integrity of the organism. Focusing on cardiac, respiratory and gastric rhythms, we review evidence that interoception is anatomically and functionally intertwined with the processing of signals from the external environment. Interactions arise at all stages, from the peripheral transduction of interoceptive signals to sensory processing and cortical integration, in a network that extends beyond core interoceptive regions. Interoceptive rhythms contribute to functions ranging from perceptual detection up to sense of self, or conversely compete with external inputs. Renewed interest in interoception revives long-standing issues on how the brain integrates and coordinates information in distributed regions, by means of oscillatory synchrony, predictive coding or multisensory integration. Considering interoception and exteroception in the same framework paves the way for biological modes of information processing specific to living organisms.
Collapse
Affiliation(s)
- Tahnée Engelen
- Cognitive and Computational Neuroscience Laboratory, Inserm, Ecole Normale Supérieure PSL University, Paris, France
| | - Marco Solcà
- Cognitive and Computational Neuroscience Laboratory, Inserm, Ecole Normale Supérieure PSL University, Paris, France
| | - Catherine Tallon-Baudry
- Cognitive and Computational Neuroscience Laboratory, Inserm, Ecole Normale Supérieure PSL University, Paris, France.
| |
Collapse
|
47
|
Feenders G. Attentional capture or multisensory integration? (Commentary on Bean et al., 2021). Eur J Neurosci 2023; 58:3714-3718. [PMID: 37697730 DOI: 10.1111/ejn.16131] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2021] [Accepted: 08/10/2023] [Indexed: 09/13/2023]
Affiliation(s)
- Gesa Feenders
- Animal Physiology and Behaviour Group, Cluster of Excellence Hearing4all, Department of Neuroscience, School of Medicine and Health Sciences, University of Oldenburg, Oldenburg, Germany
| |
Collapse
|
48
|
Gat A, Pechuk V, Peedikayil-Kurien S, Karimi S, Goldman G, Sela S, Lubliner J, Krieg M, Oren-Suissa M. Integration of spatially opposing cues by a single interneuron guides decision-making in C. elegans. Cell Rep 2023; 42:113075. [PMID: 37691148 DOI: 10.1016/j.celrep.2023.113075] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2023] [Revised: 07/11/2023] [Accepted: 08/16/2023] [Indexed: 09/12/2023] Open
Abstract
The capacity of animals to respond to hazardous stimuli in their surroundings is crucial for their survival. In mammals, complex evaluations of the environment require large numbers and different subtypes of neurons. The nematode C. elegans avoids hazardous chemicals it encounters by reversing its direction of movement. How does the worm's compact nervous system process the spatial information and direct the change in motion? We show here that a single interneuron, AVA, receives glutamatergic excitatory and inhibitory signals from head and tail sensory neurons, respectively. AVA integrates the spatially distinct and opposing cues, and its output instructs the animal's behavioral decision. We further find that the differential activation of AVA stems from distinct localization of inhibitory and excitatory glutamate-gated receptors along AVA's process and from different threshold sensitivities of the sensory neurons. Our results thus uncover a cellular mechanism that mediates spatial computation of nociceptive cues for efficient decision-making in C. elegans.
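The integration scheme described in this abstract, one unit summing an excitatory head input against an inhibitory tail input to gate a reversal, can be caricatured as a weighted-sum threshold unit. This is a toy sketch for intuition only; the weights, threshold and function name are illustrative assumptions, not values from the cited study:

```python
def ava_decision(head_input, tail_input, w_exc=1.0, w_inh=1.0, threshold=0.5):
    """Toy single-interneuron integrator.

    The unit sums an excitatory (head) and an inhibitory (tail)
    glutamatergic drive; when the net drive crosses threshold,
    the animal reverses, otherwise it keeps moving forward.
    All parameters are illustrative, not measured values.
    """
    drive = w_exc * head_input - w_inh * tail_input
    return "reverse" if drive > threshold else "forward"

# A head-only hazard drives a reversal; an equal tail signal cancels it.
print(ava_decision(1.0, 0.0))  # reverse
print(ava_decision(1.0, 1.0))  # forward
```

In this caricature, the paper's two mechanisms (receptor localization along AVA's process and differing sensory thresholds) would correspond to the relative weights and the threshold parameter.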
Collapse
Affiliation(s)
- Asaf Gat
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel; Department of Molecular Neuroscience, Weizmann Institute of Science, Rehovot 7610001, Israel
| | - Vladyslava Pechuk
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel; Department of Molecular Neuroscience, Weizmann Institute of Science, Rehovot 7610001, Israel
| | - Sonu Peedikayil-Kurien
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel; Department of Molecular Neuroscience, Weizmann Institute of Science, Rehovot 7610001, Israel
| | - Shadi Karimi
- Neurophotonics and Mechanical Systems Biology, ICFO (Institut de Ciències Fotòniques), The Barcelona Institute of Science and Technology, 08860 Castelldefels, Barcelona, Spain
| | - Gal Goldman
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel
| | - Sapir Sela
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel; Department of Molecular Neuroscience, Weizmann Institute of Science, Rehovot 7610001, Israel
| | - Jazz Lubliner
- Department of Molecular Neuroscience, Weizmann Institute of Science, Rehovot 7610001, Israel
| | - Michael Krieg
- Neurophotonics and Mechanical Systems Biology, ICFO (Institut de Ciències Fotòniques), The Barcelona Institute of Science and Technology, 08860 Castelldefels, Barcelona, Spain
| | - Meital Oren-Suissa
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot 7610001, Israel; Department of Molecular Neuroscience, Weizmann Institute of Science, Rehovot 7610001, Israel.
| |
Collapse
|
49
|
Choi I, Demir I, Oh S, Lee SH. Multisensory integration in the mammalian brain: diversity and flexibility in health and disease. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220338. [PMID: 37545309 PMCID: PMC10404930 DOI: 10.1098/rstb.2022.0338] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2023] [Accepted: 04/30/2023] [Indexed: 08/08/2023] Open
Abstract
Multisensory integration (MSI) occurs in a variety of brain areas, spanning cortical and subcortical regions. In traditional studies of sensory processing, the sensory cortices have been considered to process sensory information in a modality-specific manner. The sensory cortices, however, send the information to other cortical and subcortical areas, including the higher association cortices and the other sensory cortices, where inputs from multiple modalities converge and are integrated to generate a meaningful percept. This integration process is neither simple nor fixed, because these brain areas interact with each other via complicated circuits that can be modulated by numerous internal and external conditions. As a result, dynamic MSI makes multisensory decisions flexible and adaptive in behaving animals. Impairments in MSI occur in many psychiatric disorders, which may result in an altered perception of multisensory stimuli and an abnormal reaction to them. This review discusses the diversity and flexibility of MSI in mammals, including humans, primates and rodents, as well as the brain areas involved. It further explains how such flexibility influences perceptual experiences in behaving animals in both health and disease. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Collapse
Affiliation(s)
- Ilsong Choi
- Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
| | - Ilayda Demir
- Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
| | - Seungmi Oh
- Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
| | - Seung-Hee Lee
- Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
- Department of Biological Sciences, KAIST, Daejeon 34141, Republic of Korea
| |
Collapse
|
50
|
Zeng Z, Zhang C, Gu Y. Visuo-vestibular heading perception: a model system to study multi-sensory decision making. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220334. [PMID: 37545303 PMCID: PMC10404926 DOI: 10.1098/rstb.2022.0334] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2022] [Accepted: 05/15/2023] [Indexed: 08/08/2023] Open
Abstract
Integrating noisy signals across time as well as across sensory modalities, a process named multi-sensory decision making (MSDM), is an essential strategy for making more accurate and sensitive decisions in complex environments. Although this field is just emerging, recent work from different perspectives, including computational theory, psychophysical behaviour and neurophysiology, has begun to shed new light on MSDM. In the current review, we focus on MSDM using visuo-vestibular heading as a model system. Combining well-controlled behavioural paradigms on virtual-reality systems, single-unit recordings, causal manipulations and computational theory based on spiking activity, recent progress reveals that vestibular signals carry complex temporal dynamics in many brain regions, including unisensory, multi-sensory and sensory-motor association areas. This poses a challenge for the brain, which must integrate such cues across time and with other sensory modalities, such as optic flow, which mainly carries a motion velocity signal. In addition, new evidence from higher-level decision-related areas, mostly in the posterior and frontal/prefrontal regions, helps revise conventional views on how signals from different sensory modalities are processed, combined and accumulated moment by moment through neural circuits to form a unified, optimal perceptual decision. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
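The "unified, optimal perceptual decision" referred to here is commonly formalized in the cue-integration literature as maximum-likelihood (inverse-variance-weighted) fusion of Gaussian cue estimates. A minimal sketch of that standard model, assuming Gaussian cues with known variances (the numbers are illustrative, not data from the cited review):

```python
def fuse_cues(mu_vis, var_vis, mu_ves, var_ves):
    """Maximum-likelihood fusion of two noisy heading estimates.

    Each cue (visual, vestibular) is modeled as a Gaussian estimate;
    the optimal combined estimate weights each cue by its reliability
    (inverse variance), and the fused variance is lower than either
    single-cue variance.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_ves)
    w_ves = 1.0 - w_vis
    mu = w_vis * mu_vis + w_ves * mu_ves
    var = 1.0 / (1.0 / var_vis + 1.0 / var_ves)
    return mu, var

# Illustrative headings (degrees): equal reliabilities give a simple
# average and halve the variance.
mu, var = fuse_cues(10.0, 4.0, 14.0, 4.0)
print(mu, var)  # 12.0 2.0
```

When one cue is degraded (its variance grows), its weight shrinks accordingly, which is the behavioural signature typically tested in visuo-vestibular heading experiments.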
Collapse
Affiliation(s)
- Zhao Zeng
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
| | - Ce Zhang
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
| | - Yong Gu
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
| |
Collapse
|