1
Augière T, Simoneau M, Mercier C. Visuotactile integration in individuals with fibromyalgia. Front Hum Neurosci 2024; 18:1390609. [PMID: 38826615 PMCID: PMC11140151 DOI: 10.3389/fnhum.2024.1390609] [Received: 02/23/2024] [Accepted: 04/29/2024] [Indexed: 06/04/2024]
Abstract
Our brain constantly integrates afferent information, such as visual and tactile signals, to perceive the world around us. According to the maximum-likelihood estimation (MLE) model, imprecise information is weighted less than precise information, making the multisensory percept as precise as possible. Individuals with fibromyalgia (FM), a chronic pain syndrome, show alterations in the integration of tactile information. This could lead to a decreased weight of tactile information in a multisensory percept, or to a general disruption of multisensory integration that makes it less beneficial. To assess multisensory integration, 15 participants with FM and 18 pain-free controls performed a temporal-order judgment task in which they received pairs of sequential visual, tactile (unisensory conditions), or visuotactile (multisensory condition) stimulations on the index finger and thumb of the non-dominant hand and had to determine which finger was stimulated first. The task enabled us to measure the precision and accuracy of the percept in each condition. Results indicate an increase in precision in the visuotactile condition compared with the unisensory conditions in controls only, although we found no intergroup differences. The observed visuotactile precision was correlated with the precision predicted by the MLE model in both groups, suggesting optimal integration. Finally, the weights of the sensory information did not differ between groups; however, in the FM group, higher pain intensity was associated with a smaller tactile weight. This study shows no alteration of visuotactile integration in individuals with FM, though pain may influence tactile weight in these participants.
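The MLE model referenced here has a standard closed form: each cue is weighted by its relative reliability (inverse variance), and the combined estimate is at least as precise as the best single cue. A minimal sketch with illustrative numbers (not values from the study):

```python
import math

def mle_combine(sigma_v, sigma_t):
    """Maximum-likelihood (reliability-weighted) combination of two cues.

    Reliability is inverse variance; the weights are normalized
    reliabilities, and the combined standard deviation is never larger
    than that of the more precise cue.
    """
    r_v, r_t = 1.0 / sigma_v ** 2, 1.0 / sigma_t ** 2  # reliabilities
    w_v = r_v / (r_v + r_t)                            # visual weight
    w_t = r_t / (r_v + r_t)                            # tactile weight
    sigma_vt = math.sqrt(1.0 / (r_v + r_t))            # predicted multisensory SD
    return w_v, w_t, sigma_vt

# Illustrative numbers only: a precise visual cue, a noisier tactile cue.
w_v, w_t, sigma_vt = mle_combine(sigma_v=10.0, sigma_t=20.0)
print(round(w_v, 2), round(w_t, 2), round(sigma_vt, 2))  # 0.8 0.2 8.94
```

In the study's terms, the weights compared between groups correspond to `w_v` and `w_t`, and the MLE-predicted visuotactile precision corresponds to `sigma_vt`.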
Affiliation(s)
- Tania Augière
- Center for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), CIUSSS de la Capitale-Nationale, Quebec, QC, Canada
- School of Rehabilitation Sciences, Faculty of Medicine, Laval University, Quebec, QC, Canada
- Martin Simoneau
- Center for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), CIUSSS de la Capitale-Nationale, Quebec, QC, Canada
- Department of Kinesiology, Faculty of Medicine, Laval University, Quebec, QC, Canada
- Catherine Mercier
- Center for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), CIUSSS de la Capitale-Nationale, Quebec, QC, Canada
- School of Rehabilitation Sciences, Faculty of Medicine, Laval University, Quebec, QC, Canada
2
Li Y, Wang J, Liang J, Zhu C, Zhang Z, Luo W. The impact of degraded vision on emotional perception of audiovisual stimuli: An event-related potential study. Neuropsychologia 2024; 194:108785. [PMID: 38159799 DOI: 10.1016/j.neuropsychologia.2023.108785] [Received: 06/25/2023] [Revised: 12/25/2023] [Accepted: 12/27/2023] [Indexed: 01/03/2024]
Abstract
Emotion recognition becomes challenging when visual signals are degraded in real-life scenarios. Recently, researchers have conducted many studies on the distinct neural activity elicited by clear versus degraded audiovisual stimuli. These findings addressed the "how" question, but the precise stage at which the distinct activity occurs remains unknown. It is therefore crucial to use event-related potentials (ERPs) to explore the "when" question, that is, the time course of the neural activity elicited by degraded audiovisual stimuli. In the present research, we established two conditions: clear auditory + degraded visual (AcVd) and clear auditory + clear visual (AcVc) multisensory conditions. We enlisted 31 participants to evaluate the emotional valence of audiovisual stimuli. The resulting data were analyzed using ERPs in the time domain and microstate analysis. Current results suggest that degraded vision impairs the early-stage processing of audiovisual stimuli, with the superior parietal lobule (SPL) regulating audiovisual processing in a top-down fashion. Additionally, our findings indicate that negative and positive stimuli elicit a greater early posterior negativity (EPN) than neutral stimuli, pointing toward motivation-related attentional regulation. In sum, in the early stage of emotional audiovisual processing, the degraded visual signal affected the perception of the physical attributes of audiovisual stimuli and further influenced emotion extraction, leading to a different allocation of top-down attentional resources at later stages.
Affiliation(s)
- Yuchen Li
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China; Institute of Psychology, Shandong Second Medical University, Weifang, 216053, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, 116029, China
- Jing Wang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, 116029, China
- Junyu Liang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China; School of Psychology, South China Normal University, Guangzhou, 510631, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, 116029, China
- Chuanlin Zhu
- School of Educational Science, Yangzhou University, Yangzhou, 225002, China.
- Zhao Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China; Institute of Psychology, Shandong Second Medical University, Weifang, 216053, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, 116029, China.
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, 116029, China.
3
Lee J, Park S. Multi-modal Representation of the Size of Space in the Human Brain. J Cogn Neurosci 2024; 36:340-361. [PMID: 38010320 DOI: 10.1162/jocn_a_02092] [Indexed: 11/29/2023]
Abstract
To estimate the size of an indoor space, we must analyze the visual boundaries that limit the spatial extent and acoustic cues from reflected interior surfaces. We used fMRI to examine how the brain processes the geometric size of indoor scenes when various types of sensory cues are presented individually or together. Specifically, we asked whether the size of space is represented in a modality-specific way or in an integrative way that combines multimodal cues. In a block-design study, images or sounds depicting small- and large-sized indoor spaces were presented. Visual stimuli were real-world pictures of empty spaces that were small or large. Auditory stimuli were sounds convolved with different reverberations. Using a multivoxel pattern classifier, we asked whether the two sizes of space could be classified in visual, auditory, and visual-auditory combined conditions. We identified both sensory-specific and multimodal representations of the size of space. To further investigate the nature of the multimodal region, we specifically examined whether it contained multimodal information in a coexistent or integrated form. We found that the angular gyrus and the right medial frontal gyrus had modality-integrated representations, displaying sensitivity to the match in the spatial size information conveyed through image and sound. Background functional connectivity analysis further demonstrated that the connection between sensory-specific regions and modality-integrated regions increases in the multimodal condition compared with single-modality conditions. Our results suggest that spatial size perception relies on both sensory-specific and multimodal representations, as well as their interplay during multimodal perception.
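The abstract does not specify the classifier, so as a minimal illustration of multivoxel pattern classification in general, the sketch below trains a nearest-centroid classifier on made-up two-voxel patterns for "small" vs. "large" spaces (all data, labels, and names are illustrative):

```python
def fit_centroids(patterns, labels):
    """Average the training voxel patterns within each class."""
    grouped = {}
    for pat, lab in zip(patterns, labels):
        grouped.setdefault(lab, []).append(pat)
    # Transpose each class's patterns and average per voxel.
    return {lab: [sum(vox) / len(vox) for vox in zip(*pats)]
            for lab, pats in grouped.items()}

def classify(centroids, pattern):
    """Assign the class whose centroid is nearest (squared Euclidean)."""
    return min(centroids,
               key=lambda lab: sum((c - p) ** 2
                                   for c, p in zip(centroids[lab], pattern)))

# Made-up two-voxel patterns: "small" rooms drive voxel 0, "large" voxel 1.
train_pats = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
train_labs = ["small", "small", "large", "large"]
cents = fit_centroids(train_pats, train_labs)
print(classify(cents, [0.8, 0.3]))  # → small
```

Real MVPA pipelines cross-validate across fMRI runs and use many more voxels; this only shows the core pattern-to-label step.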
4
Choi I, Demir I, Oh S, Lee SH. Multisensory integration in the mammalian brain: diversity and flexibility in health and disease. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220338. [PMID: 37545309 PMCID: PMC10404930 DOI: 10.1098/rstb.2022.0338] [Received: 02/03/2023] [Accepted: 04/30/2023] [Indexed: 08/08/2023]
Abstract
Multisensory integration (MSI) occurs in a variety of brain areas, spanning cortical and subcortical regions. The sensory cortices have traditionally been considered to process sensory information in a modality-specific manner. The sensory cortices, however, send this information to other cortical and subcortical areas, including the higher association cortices and the other sensory cortices, where inputs from multiple modalities converge and are integrated to generate a meaningful percept. This integration process is neither simple nor fixed, because these brain areas interact with each other via complicated circuits that can be modulated by numerous internal and external conditions. As a result, dynamic MSI makes multisensory decisions flexible and adaptive in behaving animals. Impairments in MSI occur in many psychiatric disorders and may result in an altered perception of multisensory stimuli and an abnormal reaction to them. This review discusses the diversity and flexibility of MSI in mammals, including humans, primates and rodents, as well as the brain areas involved. It further explains how such flexibility influences perceptual experiences in behaving animals in both health and disease. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Ilsong Choi
- Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
- Ilayda Demir
- Department of biological sciences, KAIST, Daejeon 34141, Republic of Korea
- Seungmi Oh
- Department of biological sciences, KAIST, Daejeon 34141, Republic of Korea
- Seung-Hee Lee
- Center for Synaptic Brain Dysfunctions, Institute for Basic Science (IBS), Daejeon 34141, Republic of Korea
- Department of biological sciences, KAIST, Daejeon 34141, Republic of Korea
5
Guo A, Yang W, Yang X, Lin J, Li Z, Ren Y, Yang J, Wu J. Audiovisual n-Back Training Alters the Neural Processes of Working Memory and Audiovisual Integration: Evidence of Changes in ERPs. Brain Sci 2023; 13:992. [PMID: 37508924 PMCID: PMC10377064 DOI: 10.3390/brainsci13070992] [Received: 04/14/2023] [Revised: 05/15/2023] [Accepted: 05/16/2023] [Indexed: 07/30/2023]
Abstract
(1) Background: This study investigates whether audiovisual n-back training produces training effects on working memory and transfer effects on perceptual processing. (2) Methods: Before and after training, participants were tested using the audiovisual n-back task (1-, 2-, or 3-back) to detect training effects, and the audiovisual discrimination task to detect transfer effects. (3) Results: For the training effect, the behavioral results show that training led to greater accuracy and faster response times. Stronger training gains in accuracy and response time were observed in the training group for the 3- and 2-back tasks than for the 1-back task. Event-related potential (ERP) data revealed an enhancement of P300 in the frontal and central regions across all working memory levels after training. Training also enhanced N200 in the central region in the 3-back condition. For the transfer effect, greater audiovisual integration in the frontal and central regions at an early stage (80-120 ms) was observed in the training group at post-test relative to pre-test. (4) Conclusion: Our findings provide evidence that audiovisual n-back training enhances the neural processes underlying working memory and demonstrate a positive influence of higher cognitive functions on lower cognitive functions.
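The n-back task itself has a simple definition: a trial is a target when the current stimulus matches the one presented n trials earlier. A minimal sketch of the target rule (the stimulus sequence is illustrative):

```python
def nback_targets(stimuli, n):
    """Mark each trial that repeats the stimulus presented n trials back."""
    return [i >= n and stimuli[i] == stimuli[i - n]
            for i in range(len(stimuli))]

seq = ["A", "B", "A", "C", "A", "D"]
print(nback_targets(seq, 2))  # trials 2 and 4 are 2-back targets
```

Raising `n` from 1 to 3, as in the study's difficulty levels, increases the number of items that must be held and updated in working memory.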
Affiliation(s)
- Ao Guo
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama 700-8530, Japan
- Weiping Yang
- Department of Psychology, Faculty of Education, Hubei University, Wuhan 430062, China
- Brain and Cognition Research Center (BCRC), Faculty of Education, Hubei University, Wuhan 430062, China
- Xiangfu Yang
- Department of Psychology, Faculty of Education, Hubei University, Wuhan 430062, China
- Jinfei Lin
- Department of Psychology, Faculty of Education, Hubei University, Wuhan 430062, China
- Zimo Li
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama 700-8530, Japan
- Yanna Ren
- Department of Psychology, College of Humanities and Management, Guizhou University of Traditional Chinese Medicine, Guiyang 550003, China
- Jiajia Yang
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama 700-8530, Japan
- Applied Brain Science Lab., Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama 700-8530, Japan
- Jinglong Wu
- School of Medical Technology, Beijing Institute of Technology, Beijing 100811, China
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama 700-8530, Japan
6
Wen P, Landy MS, Rokers B. Identifying cortical areas that underlie the transformation from 2D retinal to 3D head-centric motion signals. Neuroimage 2023; 270:119909. [PMID: 36801370 PMCID: PMC10061442 DOI: 10.1016/j.neuroimage.2023.119909] [Received: 10/21/2022] [Revised: 01/26/2023] [Accepted: 01/28/2023] [Indexed: 02/18/2023]
Abstract
Accurate motion perception requires that the visual system integrate the 2D retinal motion signals received by the two eyes into a single representation of 3D motion. However, most experimental paradigms present the same stimulus to the two eyes, signaling motion limited to a 2D fronto-parallel plane. Such paradigms are unable to dissociate the representation of 3D head-centric motion signals (i.e., 3D object motion relative to the observer) from the associated 2D retinal motion signals. Here, we used stereoscopic displays to present separate motion signals to the two eyes and examined their representation in visual cortex using fMRI. Specifically, we presented random-dot motion stimuli that specified various 3D head-centric motion directions. We also presented control stimuli, which matched the motion energy of the retinal signals, but were inconsistent with any 3D motion direction. We decoded motion direction from BOLD activity using a probabilistic decoding algorithm. We found that 3D motion direction signals can be reliably decoded in three major clusters in the human visual system. Critically, in early visual cortex (V1-V3), we found no significant difference in decoding performance between stimuli specifying 3D motion directions and the control stimuli, suggesting that these areas represent the 2D retinal motion signals, rather than 3D head-centric motion itself. In voxels in and surrounding hMT and IPS0 however, decoding performance was consistently superior for stimuli that specified 3D motion directions compared to control stimuli. Our results reveal the parts of the visual processing hierarchy that are critical for the transformation of retinal into 3D head-centric motion signals and suggest a role for IPS0 in their representation, in addition to its sensitivity to 3D object structure and static depth.
Affiliation(s)
- Puti Wen
- Psychology, New York University Abu Dhabi, United Arab Emirates.
- Michael S Landy
- Department of Psychology and Center for Neural Science, New York University, United States
- Bas Rokers
- Psychology, New York University Abu Dhabi, United Arab Emirates; Department of Psychology and Center for Neural Science, New York University, United States
7
Scheliga S, Kellermann T, Lampert A, Rolke R, Spehr M, Habel U. Neural correlates of multisensory integration in the human brain: an ALE meta-analysis. Rev Neurosci 2023; 34:223-245. [PMID: 36084305 DOI: 10.1515/revneuro-2022-0065] [Received: 06/01/2022] [Accepted: 07/22/2022] [Indexed: 02/07/2023]
Abstract
Previous fMRI research identified the superior temporal sulcus as a central integration area for audiovisual stimuli. However, less is known about a general multisensory integration network across the senses. We therefore conducted an activation likelihood estimation (ALE) meta-analysis across multiple sensory modalities to identify a common brain network. We included 49 studies covering all Aristotelian senses, i.e., auditory, visual, tactile, gustatory, and olfactory stimuli. The analysis revealed significant activation in the bilateral superior temporal gyrus, middle temporal gyrus, thalamus, right insula, and left inferior frontal gyrus. We assume these regions to be part of a general multisensory integration network comprising different functional roles. Here, the thalamus operates as a first subcortical relay, projecting sensory information to higher cortical integration centers in the superior temporal gyrus/sulcus, while conflict-processing brain regions such as the insula and inferior frontal gyrus facilitate the integration of incongruent information. We additionally performed meta-analytic connectivity modelling and found that each brain region showed co-activations within the identified multisensory integration network. By including multiple sensory modalities in our meta-analysis, the results may therefore provide evidence for a common brain network that supports different functional roles for multisensory integration.
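ALE combines the studies' modeled activation (MA) maps at each voxel as the probability that at least one study activates it, ALE = 1 − Π(1 − MA_i). A 1-D toy sketch of this union rule (the Gaussian widths, amplitudes, foci, and grid are illustrative, not values from the meta-analysis):

```python
import math

def modeled_activation(center, sigma, size, amp=0.5):
    """1-D stand-in for a study's modeled-activation (MA) map: a Gaussian
    probability bump around the reported focus (amp keeps values below 1)."""
    return [amp * math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))
            for x in range(size)]

def ale_map(ma_maps):
    """ALE score per voxel: probability that at least one study activates
    it, i.e. the union 1 - prod(1 - MA_i) across studies."""
    scores = []
    for voxel in zip(*ma_maps):          # iterate voxel-wise across maps
        p_none = 1.0
        for ma in voxel:
            p_none *= 1.0 - ma
        scores.append(1.0 - p_none)
    return scores

# Two toy "studies" reporting nearby foci on a 30-voxel line.
maps = [modeled_activation(10, 2.0, 30), modeled_activation(12, 2.0, 30)]
ale = ale_map(maps)
print(max(range(30), key=lambda x: ale[x]))  # peak lands between the foci
```

Real ALE additionally thresholds these scores against a null distribution of randomly relocated foci; only the map-combination step is shown here.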
Affiliation(s)
- Sebastian Scheliga
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Thilo Kellermann
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- JARA-Institute Brain Structure Function Relationship, Pauwelsstraße 30, 52074 Aachen, Germany
- Angelika Lampert
- Institute of Physiology, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Roman Rolke
- Department of Palliative Medicine, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Marc Spehr
- Department of Chemosensation, RWTH Aachen University, Institute for Biology, Worringerweg 3, 52074 Aachen, Germany
- Ute Habel
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- JARA-Institute Brain Structure Function Relationship, Pauwelsstraße 30, 52074 Aachen, Germany
8
Yang W, Yang X, Guo A, Li S, Li Z, Lin J, Ren Y, Yang J, Wu J, Zhang Z. Audiovisual integration of the dynamic hand-held tool at different stimulus intensities in aging. Front Hum Neurosci 2022; 16:968987. [PMID: 36590067 PMCID: PMC9794578 DOI: 10.3389/fnhum.2022.968987] [Received: 06/14/2022] [Accepted: 11/15/2022] [Indexed: 12/23/2022]
Abstract
Introduction: Compared with younger adults, audiovisual integration in older adults appears more complex and unstable. Previous research has found that stimulus intensity is one of the most important factors influencing audiovisual integration. Methods: The present study compared differences in audiovisual integration between older and younger adults using dynamic hand-held tool stimuli, such as a hammer hitting the floor, and examined the effects of stimulus intensity on audiovisual integration. The intensity of the visual and auditory stimuli was regulated by modulating the contrast level and sound pressure level. Results: Behavioral results showed that both older and younger adults responded faster and with higher hit rates to audiovisual stimuli than to visual or auditory stimuli alone. Event-related potentials (ERPs) further revealed that during the early stage of 60-100 ms, in the low-intensity condition, audiovisual integration over the anterior brain region was greater in older adults than in younger adults; in the high-intensity condition, audiovisual integration over the right hemisphere was greater in younger adults than in older adults. Moreover, in older adults, audiovisual integration was greater in the low-intensity condition than in the high-intensity condition during the 60-100 ms, 120-160 ms, and 220-260 ms periods, showing inverse effectiveness. In younger adults, audiovisual integration did not differ across intensity conditions. Discussion: The results suggest an age-related dissociation between high- and low-intensity conditions in the audiovisual integration of dynamic hand-held tool stimuli. Older adults showed greater audiovisual integration in the lower-intensity condition, which may reflect the activation of compensatory mechanisms.
Affiliation(s)
- Weiping Yang
- Department of Psychology, Faculty of Education, Hubei University, Wuhan, China
- Brain and Cognition Research Center (BCRC), Faculty of Education, Hubei University, Wuhan, China
- Xiangfu Yang
- Department of Psychology, Faculty of Education, Hubei University, Wuhan, China
- Ao Guo
- Department of Psychology, Faculty of Education, Hubei University, Wuhan, China
- Shengnan Li
- Department of Psychology, Faculty of Education, Hubei University, Wuhan, China
- Zimo Li
- Department of Psychology, Faculty of Education, Hubei University, Wuhan, China
- Jinfei Lin
- Department of Psychology, Faculty of Education, Hubei University, Wuhan, China
- Yanna Ren
- Department of Psychology, College of Humanities and Management, Guizhou University of Traditional Chinese Medicine, Guiyang, China
- Jiajia Yang
- Applied Brain Science Lab, Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Jinglong Wu
- Applied Brain Science Lab, Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Research Center for Medical Artificial Intelligence, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong, China
- Zhilin Zhang
- Research Center for Medical Artificial Intelligence, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong, China
9
Takatsuru Y, Motegi S, Nishikata T, Sato H, Yonemochi K. Frontal medial cortex and angular gyrus functional connectivity is related to sex and age differences in odor sensitivity. J Neuroimaging 2022; 32:611-616. [PMID: 35355361 DOI: 10.1111/jon.12994] [Received: 11/17/2021] [Revised: 02/23/2022] [Accepted: 03/13/2022] [Indexed: 11/30/2022]
Abstract
BACKGROUND AND PURPOSE: Odor preference is one of the key factors in the rehabilitation of swallowing function. Sensitivity to odor, however, differs between the sexes and decreases with age, and these differences rely on brain neuronal circuits. It remains unclear which neuronal circuits determine the sex and age differences in odor sensitivity. In this study, we carried out both an odor sensitivity test and functional MRI (fMRI) to identify the key neuronal circuits involved. METHODS: Healthy volunteers (28 males, aged 27-62 years, and 30 females, aged 21-59 years) participated in this study. A subset (seven males and seven females) underwent fMRI. We prepared five odorous test substances and presented each at 1-minute intervals. After 5 minutes of questioning about food intake, the subjects were asked to recall each of the presented test substances from a list. In the fMRI study, all subjects underwent 15-minute prestimulation, peppermint-odor stimulation, and poststimulation sessions. RESULTS: The odor test score was significantly higher in females than in males and showed an age-dependent decrease. We found four functional connections whose strength differed significantly between males and females. One of them, the functional connectivity between the frontal medial cortex (MedFC) and the left angular gyrus (AG.l), showed an age-dependent change. CONCLUSIONS: The MedFC-AG.l functional connectivity is one of the important neuronal circuits underlying sex- and age-dependent odor sensitivity.
Affiliation(s)
- Yusuke Takatsuru
- Division of Multidimensional Clinical Medicine, Department of Nutrition and Health Sciences, Toyo University, Itakura, Japan
- Department of Medicine, Johmoh Hospital, Maebashi, Japan
- Shunichi Motegi
- Department of Radiological Sciences, International University of Health and Welfare, Otawara, Japan
- Hideyasu Sato
- Department of Food Life Sciences, Toyo University, Itakura, Japan
- Keita Yonemochi
- Department of Radiological Technology, Gunma Prefectural College of Health Sciences, Maebashi, Japan
10
Liang K, Wang W, Lei X, Zeng H, Gong W, Lou C, Chen L. Odor-induced sound localization bias under unilateral intranasal trigeminal stimulation. Chem Senses 2022; 47:bjac029. [PMID: 36326595 DOI: 10.1093/chemse/bjac029] [Indexed: 11/06/2022]
Abstract
As a stereo odor cue, internostril odor influx could help us in many spatial tasks, including localization and navigation. Studies have also revealed that this benefit could be modulated by the asymmetric concentrations of both influxes (left nose vs right nose). The interaction between olfaction and vision, such as in object recognition and visual direction judgment, has been documented; however, little has been revealed about the impact of odor cues on sound localization. Here we adopted the ventriloquist paradigm in auditory-odor interactions and investigated sound localization with the concurrent unilateral odor influx. Specifically, we teased apart both the "nature" of the odors (pure olfactory stimulus vs. mixed olfactory/trigeminal stimulus) and the location of influx (left nose vs. right nose) and examined sound localization with the method of constant stimuli. Forty-one participants, who passed the Chinese Smell Identification Test, perceived sounds with different azimuths (0°, 5°, 10°, and 20° unilaterally deflected from the sagittal plane by head-related transfer function) and performed sound localization (leftward or rightward) tasks under concurrent, different unilateral odor influxes (10% v/v phenylethyl alcohol, PEA, as pure olfactory stimulus, 1% m/v menthol as mixed olfactory/trigeminal stimulus, and propylene glycol as the control). Meanwhile, they reported confidence levels of the judgments. Results suggested that unilateral PEA influx did not affect human sound localization judgments. However, unilateral menthol influx systematically biased the perceived sound localization, shifting toward the odor source. Our study provides evidence that unilateral odor influx could bias perceived sound localization only when the odor activates the trigeminal nerves.
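With the method of constant stimuli, the leftward/rightward judgments at each azimuth are typically summarized by a psychometric function whose point of subjective equality (PSE) captures localization bias. A minimal sketch using a cumulative Gaussian (all parameter values are illustrative; this is not the authors' fitting procedure):

```python
import math

def p_rightward(azimuth, pse, jnd):
    """Cumulative-Gaussian psychometric function: probability of a
    'rightward' judgment for a sound at `azimuth` degrees.
    `pse` is the point of subjective equality (localization bias);
    `jnd` sets the slope (discrimination precision)."""
    return 0.5 * (1.0 + math.erf((azimuth - pse) / (jnd * math.sqrt(2.0))))

# An unbiased observer judges a sound straight ahead rightward half the
# time.  A bias toward, say, a right-nostril trigeminal odor would show
# up as a leftward PSE shift, raising p(rightward) for the same sound.
print(p_rightward(0.0, pse=0.0, jnd=5.0))         # exactly 0.5
print(p_rightward(0.0, pse=-3.0, jnd=5.0) > 0.5)  # True
```

Fitting this function to the observed response proportions at the tested azimuths (0°, 5°, 10°, 20°) yields the PSE shift that quantifies the odor-induced bias.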
Affiliation(s)
- Kun Liang
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Beijing Key Laboratory of Behaviour and Mental Health, Peking University, Beijing, China
- Wu Wang
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Beijing Key Laboratory of Behaviour and Mental Health, Peking University, Beijing, China
- Xiao Lei
- Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China
- Huanke Zeng
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Beijing Key Laboratory of Behaviour and Mental Health, Peking University, Beijing, China
- Wenxiao Gong
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Beijing Key Laboratory of Behaviour and Mental Health, Peking University, Beijing, China
- Chunmiao Lou
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Beijing Key Laboratory of Behaviour and Mental Health, Peking University, Beijing, China
- Lihan Chen
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Beijing Key Laboratory of Behaviour and Mental Health, Peking University, Beijing, China
- Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
| |
Collapse
|
11
Liang X, Koh CL, Yeh CH, Goodin P, Lamp G, Connelly A, Carey LM. Predicting Post-Stroke Somatosensory Function from Resting-State Functional Connectivity: A Feasibility Study. Brain Sci 2021; 11:1388. [PMID: 34827387 PMCID: PMC8615819 DOI: 10.3390/brainsci11111388] [Received: 08/30/2021] [Revised: 10/07/2021] [Accepted: 10/18/2021] [Indexed: 12/02/2022]
Abstract
Accumulating evidence shows that brain functional deficits may be impacted by damage to remote brain regions. Recent advances in neuroimaging suggest that stroke impairment can be better predicted from disruption to brain networks than from lesion locations or volumes alone. Our aim was to explore the feasibility of predicting post-stroke somatosensory function from brain functional connectivity through the application of machine learning techniques. Somatosensory impairment was measured using the Tactile Discrimination Test. Functional connectivity was employed to model global brain function. Behavioral measures and MRI were collected at the same timepoint. Two machine learning models (linear regression and support vector regression) were chosen to predict somatosensory impairment from the disrupted networks. With two engineered feature pools (low-order plus high-order functional connectivity, or low-order functional connectivity only), four predictive models were built and evaluated. Forty-three chronic stroke survivors participated in this study. Results showed that the regression model employing both low-order and high-order functional connectivity predicted outcomes with a correlation of r = 0.54 (p = 0.0002). A machine learning predictive approach involving high- and low-order modelling is feasible for predicting residual somatosensory function in stroke patients from functional brain networks.
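The evaluation logic, predicting a behavioral score from connectivity features and scoring the model by the Pearson correlation between predicted and observed scores, can be sketched with a single toy feature (all numbers are illustrative; the study also used support vector regression and much richer feature pools):

```python
def fit_linear(xs, ys):
    """Ordinary least squares for a single predictor: y ≈ a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Toy data: one "connectivity" feature vs. a behavioral score.
conn = [0.1, 0.3, 0.5, 0.7, 0.9]
score = [40.0, 52.0, 55.0, 70.0, 78.0]
a, b = fit_linear(conn, score)
preds = [a + b * x for x in conn]
print(round(pearson_r(preds, score), 2))  # high r: the feature is predictive
```

In the study this correlation was computed on held-out predictions (e.g., r = 0.54 for the combined low- and high-order model), not on the training data as in this toy example.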
Affiliation(s)
- Xiaoyun Liang
- Neurorehabilitation and Recovery, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Victorian Infant Brain Studies (VIBeS) Group, Murdoch Children's Research Institute, Melbourne, VIC 3052, Australia
- Chia-Lin Koh
- Neurorehabilitation and Recovery, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Department of Occupational Therapy, Social Work and Social Policy, School of Allied Health, Human Services and Sport, La Trobe University, Melbourne, VIC 3086, Australia
- Department of Occupational Therapy, College of Medicine, National Cheng Kung University, Tainan 701, Taiwan
- Chun-Hung Yeh
- Imaging Division, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Institute for Radiological Research, Chang Gung University and Chang Gung Memorial Hospital, Taoyuan 33302, Taiwan
- Department of Psychiatry, Chang Gung Memorial Hospital, Linkou Medical Center, Taoyuan 33305, Taiwan
- Peter Goodin
- Neurorehabilitation and Recovery, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Gemma Lamp
- Neurorehabilitation and Recovery, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Department of Psychology and Counselling, School of Psychology and Public Health, La Trobe University, Melbourne, VIC 3086, Australia
- Alan Connelly
- Imaging Division, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Leeanne M. Carey
- Neurorehabilitation and Recovery, Florey Institute of Neuroscience and Mental Health, Melbourne, VIC 3084, Australia
- Department of Occupational Therapy, Social Work and Social Policy, School of Allied Health, Human Services and Sport, La Trobe University, Melbourne, VIC 3086, Australia
12
Iravani B, Peter MG, Arshamian A, Olsson MJ, Hummel T, Kitzler HH, Lundström JN. Acquired olfactory loss alters functional connectivity and morphology. Sci Rep 2021; 11:16422. [PMID: 34385571 PMCID: PMC8361122 DOI: 10.1038/s41598-021-95968-7] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2021] [Accepted: 08/02/2021] [Indexed: 11/10/2022] Open
Abstract
Removing function from a developed and functional sensory system is known to alter both cerebral morphology and functional connections. To date, the majority of studies assessing sensory-dependent plasticity have focused on effects of either early-onset or long-term sensory loss, and little is known about how recently acquired sensory loss affects the human brain. To determine how recent sensory loss affects cerebral morphology and functional connectivity, we assessed differences between individuals with acquired olfactory loss (duration 7-36 months) and matched healthy controls in grey matter volume, using multivariate pattern analyses, and in functional connectivity, using dynamic connectivity analyses, within and from the olfactory cortex. Our results demonstrate that acquired olfactory loss is associated with altered grey matter volume in, among other areas, the posterior piriform cortex, a core olfactory processing area, as well as the inferior frontal gyrus and angular gyrus. In addition, compared to controls, individuals with acquired anosmia displayed significantly stronger dynamic functional connectivity from the posterior piriform cortex to, among other areas, the angular gyrus, a known multisensory integration area. When assessing differences in dynamic functional connectivity from the angular gyrus, individuals with acquired anosmia had stronger connectivity from the angular gyrus to areas primarily responsible for basic visual processing. These results demonstrate that recently acquired sensory loss is associated with both altered cerebral morphology within core olfactory areas and increased dynamic functional connectivity from olfactory cortex to cerebral areas that process multisensory integration.
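Dynamic functional connectivity of the kind analyzed in this study is commonly computed as a sliding-window correlation between regional time series. The sketch below illustrates that generic approach on synthetic signals; the two-ROI setup, window length, and step size are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def sliding_window_connectivity(ts_a, ts_b, win=30, step=5):
    """Dynamic functional connectivity as the sliding-window Pearson
    correlation between two ROI time series. Window and step sizes
    (in samples/TRs) are arbitrary example values."""
    n = min(len(ts_a), len(ts_b))
    starts = range(0, n - win + 1, step)
    return np.array([np.corrcoef(ts_a[s:s + win], ts_b[s:s + win])[0, 1]
                     for s in starts])

# Synthetic ROI signals whose coupling switches off halfway through,
# so the connectivity estimate should vary over time.
rng = np.random.default_rng(1)
t = 200
shared = rng.normal(size=t)
roi_a = shared + 0.3 * rng.normal(size=t)
roi_b = (np.concatenate([shared[:100], rng.normal(size=100)])
         + 0.3 * rng.normal(size=t))

dfc = sliding_window_connectivity(roi_a, roi_b)
print(dfc.round(2))  # typically high early windows, near-zero late windows
```

A static connectivity analysis would average this entire trajectory into one number; the dynamic variant keeps the per-window values (or summary statistics of them) as features.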
Affiliation(s)
- Behzad Iravani
- Division of Psychology, Department of Clinical Neuroscience, Karolinska Institutet, Nobels väg 9, 171 77 Stockholm, Sweden
- Moa G Peter
- Division of Psychology, Department of Clinical Neuroscience, Karolinska Institutet, Nobels väg 9, 171 77 Stockholm, Sweden
- Artin Arshamian
- Division of Psychology, Department of Clinical Neuroscience, Karolinska Institutet, Nobels väg 9, 171 77 Stockholm, Sweden
- Mats J Olsson
- Division of Psychology, Department of Clinical Neuroscience, Karolinska Institutet, Nobels väg 9, 171 77 Stockholm, Sweden
- Thomas Hummel
- Department of Otorhinolaryngology, Smell and Taste Clinic, TU Dresden, Dresden, Germany
- Hagen H Kitzler
- Institute of Diagnostic and Interventional Neuroradiology, TU Dresden, Dresden, Germany
- Johan N Lundström
- Division of Psychology, Department of Clinical Neuroscience, Karolinska Institutet, Nobels väg 9, 171 77 Stockholm, Sweden; Monell Chemical Senses Center, Philadelphia, PA, USA; Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA; Stockholm University Brain Imaging Centre, Stockholm University, Stockholm, Sweden
13
Wu J, Li Q, Fu Q, Rose M, Jing L. Multisensory Information Facilitates the Categorization of Untrained Stimuli. Multisens Res 2021; 35:79-107. [PMID: 34388699 DOI: 10.1163/22134808-bja10061] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2020] [Accepted: 07/30/2021] [Indexed: 11/19/2022]
Abstract
Although it has been demonstrated that multisensory information can facilitate object recognition and object memory, it remains unclear whether such a facilitation effect exists in category learning. To address this issue, comparable car images and sounds were first selected via a discrimination task in Experiment 1. Those selected images and sounds were then used in a prototype category learning task in Experiments 2 and 3, in which participants were trained with auditory, visual, and audiovisual stimuli, and were tested with trained or untrained stimuli from the same categories, presented alone or accompanied by a congruent or incongruent stimulus in the other modality. In Experiment 2, when low-distortion stimuli (more similar to the prototypes) were trained, accuracy was higher for audiovisual trials than for visual trials, but did not differ significantly between audiovisual and auditory trials. During testing, accuracy was significantly higher for congruent trials than for unisensory or incongruent trials, and the congruency effect was larger for untrained high-distortion stimuli than for trained low-distortion stimuli. In Experiment 3, when high-distortion stimuli (less similar to the prototypes) were trained, accuracy was higher for audiovisual trials than for visual or auditory trials, and during testing the congruency effect was larger for trained high-distortion stimuli than for untrained low-distortion stimuli. These findings demonstrate that a higher degree of stimulus distortion results in a more robust multisensory effect, and that the categorization of both trained and untrained stimuli in one modality can be influenced by an accompanying stimulus in the other modality.
Affiliation(s)
- Jie Wu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, Chinese Academy of Sciences, Beijing 100101, China; NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Qitian Li
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, Chinese Academy of Sciences, Beijing 100101, China; NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Qiufang Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, Chinese Academy of Sciences, Beijing 100101, China
- Michael Rose
- NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Liping Jing
- Beijing Key Lab of Traffic Data Analysis and Mining, Beijing Jiaotong University, Beijing, China
14
McCormick K, Lacey S, Stilla R, Nygaard LC, Sathian K. Neural Basis of the Sound-Symbolic Crossmodal Correspondence Between Auditory Pseudowords and Visual Shapes. Multisens Res 2021; 35:29-78. [PMID: 34384048 PMCID: PMC9196751 DOI: 10.1163/22134808-bja10060] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2020] [Accepted: 07/17/2021] [Indexed: 11/19/2022]
Abstract
Sound symbolism refers to the association between the sounds of words and their meanings, often studied using the crossmodal correspondence between auditory pseudowords, e.g., 'takete' or 'maluma', and pointed or rounded visual shapes, respectively. In a functional magnetic resonance imaging study, participants were presented with pseudoword-shape pairs that were sound-symbolically congruent or incongruent. We found no significant congruency effects in the blood oxygenation level-dependent (BOLD) signal when participants were attending to visual shapes. During attention to auditory pseudowords, however, we observed greater BOLD activity for incongruent compared to congruent audiovisual pairs bilaterally in the intraparietal sulcus and supramarginal gyrus, and in the left middle frontal gyrus. We compared this activity to independent functional contrasts designed to test competing explanations of sound symbolism, but found no evidence for mediation via language, and only limited evidence for accounts based on multisensory integration and a general magnitude system. Instead, we suggest that the observed incongruency effects are likely to reflect phonological processing and/or multisensory attention. These findings advance our understanding of sound-to-meaning mapping in the brain.
Affiliation(s)
- Kelly McCormick
- Department of Psychology, Emory University, Atlanta, GA 30322, USA
- Simon Lacey
- Department of Neurology, Milton S. Hershey Medical Center, Penn State College of Medicine, Hershey, PA 17033-0859, USA
- Department of Neural and Behavioral Sciences, Milton S. Hershey Medical Center, Penn State College of Medicine, Hershey, PA 17033-0859, USA
- Randall Stilla
- Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Lynne C. Nygaard
- Department of Psychology, Emory University, Atlanta, GA 30322, USA
- K. Sathian
- Department of Neurology, Milton S. Hershey Medical Center, Penn State College of Medicine, Hershey, PA 17033-0859, USA
- Department of Neural and Behavioral Sciences, Milton S. Hershey Medical Center, Penn State College of Medicine, Hershey, PA 17033-0859, USA
- Department of Psychology, Milton S. Hershey Medical Center, Penn State College of Medicine, Hershey, PA 17033-0859, USA
15
Peter MG, Mårtensson G, Postma EM, Engström Nordin L, Westman E, Boesveldt S, Lundström JN. Seeing Beyond Your Nose? The Effects of Lifelong Olfactory Sensory Deprivation on Cerebral Audio-visual Integration. Neuroscience 2021; 472:1-10. [PMID: 34311017 DOI: 10.1016/j.neuroscience.2021.07.017] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/24/2021] [Revised: 07/06/2021] [Accepted: 07/16/2021] [Indexed: 11/28/2022]
Abstract
Lifelong auditory and visual sensory deprivation have been demonstrated to alter both perceptual acuity and the neural processing of the remaining senses. Recently, it was demonstrated that individuals with anosmia, i.e., complete olfactory sensory deprivation, displayed enhanced multisensory integration performance. Whether this ability is due to a reorganization of olfactory processing regions to focus on cross-modal multisensory information or due to enhanced processing within multisensory integration regions is not known. To dissociate these two outcomes, we investigated the neural processing of dynamic audio-visual stimuli in individuals with congenital anosmia and matched controls (both groups, n = 33) using functional magnetic resonance imaging. Specifically, we assessed whether the previously demonstrated multisensory enhancement is related to cross-modal processing of multisensory stimuli in olfactory-associated regions, the piriform and olfactory orbitofrontal cortices, or to enhanced multisensory processing in established multisensory integration regions, the superior temporal and intraparietal sulci. No significant group differences were found in the a priori hypothesized regions using region-of-interest analyses. However, exploratory whole-brain analysis suggested higher activation related to multisensory integration within the posterior superior temporal sulcus, in close proximity to the multisensory region of interest, in individuals with congenital anosmia. No group differences were demonstrated in olfactory-associated regions. Although these results fall outside our hypothesized regions, combined, they tentatively suggest that enhanced processing of audio-visual stimuli in individuals with congenital anosmia may be mediated by multisensory, and not primary sensory, cerebral regions.
Affiliation(s)
- Moa G Peter
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Gustav Mårtensson
- Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Stockholm, Sweden
- Elbrich M Postma
- Division of Human Nutrition and Health, Wageningen University, Wageningen, the Netherlands; Smell and Taste Centre, Hospital Gelderse Vallei, Ede, the Netherlands
- Love Engström Nordin
- Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Stockholm, Sweden; Department of Diagnostic Medical Physics, Karolinska University Hospital Solna, Stockholm, Sweden
- Eric Westman
- Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Stockholm, Sweden; Department of Neuroimaging, Centre for Neuroimaging Sciences, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Sanne Boesveldt
- Division of Human Nutrition and Health, Wageningen University, Wageningen, the Netherlands
- Johan N Lundström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Monell Chemical Senses Center, Philadelphia, PA, USA; Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA; Stockholm University Brain Imaging Centre, Stockholm University, Stockholm, Sweden
16
Porada DK, Regenbogen C, Freiherr J, Seubert J, Lundström JN. Trimodal processing of complex stimuli in inferior parietal cortex is modality-independent. Cortex 2021; 139:198-210. [PMID: 33878687 DOI: 10.1016/j.cortex.2021.03.008] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2020] [Revised: 11/29/2020] [Accepted: 03/09/2021] [Indexed: 11/26/2022]
Abstract
In humans, multisensory mechanisms facilitate object processing through integration of sensory signals that match in their temporal and spatial occurrence as well as their meaning. The generalizability of such integration processes across different sensory modalities is, however, to date not well understood. As such, it remains unknown whether there are cerebral areas that process object-related signals independently of the specific senses from which they arise, and whether these areas show different response profiles depending on the number of sensory channels that carry information. To address these questions, we presented participants with dynamic stimuli that simultaneously emitted object-related sensory information via one, two, or three channels (sight, sound, smell) in the MR scanner. By comparing neural activation patterns between various integration processes differing in type and number of stimulated senses, we showed that the left inferior frontal gyrus and areas within the left inferior parietal cortex were engaged independently of the number and type of sensory input streams. Activation in these areas was enhanced during bimodal stimulation, compared to the sum of unimodal activations, and increased even further during trimodal stimulation. Taken together, our findings demonstrate that activation of the inferior parietal cortex during processing and integration of meaningful multisensory stimuli is both modality-independent and modulated by the number of available sensory modalities. This suggests that the processing demand placed on the parietal cortex increases with the number of sensory input streams carrying meaningful information, likely due to the increasing complexity of such stimuli.
Affiliation(s)
- Danja K Porada
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Christina Regenbogen
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Department of Psychiatry, Psychotherapy and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany; JARA Institute Brain Structure Function Relationship, RWTH Aachen University, Aachen, Germany
- Jessica Freiherr
- Department of Psychiatry and Psychotherapy, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Janina Seubert
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Johan N Lundström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Monell Chemical Senses Center, Philadelphia, PA, USA; Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA; Stockholm University Brain Imaging Centre, Stockholm University, Stockholm, Sweden
17
Csonka M, Mardmomen N, Webster PJ, Brefczynski-Lewis JA, Frum C, Lewis JW. Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain. Cereb Cortex Commun 2021; 2:tgab002. [PMID: 33718874 PMCID: PMC7941256 DOI: 10.1093/texcom/tgab002] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2020] [Revised: 12/31/2020] [Accepted: 01/06/2021] [Indexed: 01/23/2023] Open
Abstract
Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, and in some clinical populations. The present meta-analysis sought to test current hypotheses regarding neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical "hubs") preferentially involved in multisensory processing along different stimulus category dimensions, including 1) living versus nonliving audio-visual events, 2) audio-visual events involving vocalizations versus actions by living sources, 3) emotionally valent events, and 4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
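The activation likelihood estimation (ALE) approach underlying this meta-analysis models each reported focus as a spatial Gaussian and combines experiments as a probabilistic union. The toy sketch below captures only that core idea; the grid size, kernel width, and the per-experiment max rule are simplified assumptions, not the published ALE algorithm (which additionally uses sample-size-dependent kernels and permutation-based thresholding).

```python
import numpy as np

def ale_map(experiments, grid_shape=(20, 20, 20), sigma=2.0):
    """Toy activation likelihood estimation (ALE) map.

    `experiments` is a list of (n_foci, 3) arrays of voxel coordinates,
    one array of reported foci per experiment. Each experiment yields a
    modeled activation (MA) map; the ALE value is the probability that
    at least one experiment activates the voxel.
    """
    grid = np.indices(grid_shape).reshape(3, -1).T  # all voxel coordinates
    union_complement = np.ones(grid.shape[0])
    for foci in experiments:
        # MA map: per-voxel max over Gaussian kernels centered on each
        # focus reported by this experiment.
        d2 = ((grid[:, None, :] - foci[None, :, :]) ** 2).sum(axis=2)
        ma = np.exp(-d2 / (2 * sigma ** 2)).max(axis=1)
        union_complement *= (1.0 - ma)
    return (1.0 - union_complement).reshape(grid_shape)

# Two toy "experiments" reporting foci near the same location, so the
# ALE map should peak where the experiments converge.
exp1 = np.array([[10, 10, 10], [5, 5, 5]])
exp2 = np.array([[10, 11, 10]])
ale = ale_map([exp1, exp2])
print(ale[10, 10, 10])  # high where both experiments contribute
```

Contrasts between category dimensions, as in the abstract, would then be computed by comparing ALE maps built from different subsets of experiments.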
Affiliation(s)
- Matt Csonka
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Nadia Mardmomen
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Paula J Webster
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Julie A Brefczynski-Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Chris Frum
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- James W Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
18
Mohl JT, Pearson JM, Groh JM. Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli. J Neurophysiol 2020; 124:715-727. [PMID: 32727263 DOI: 10.1152/jn.00046.2020] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/01/2023] Open
Abstract
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys (Macaca mulatta) and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of visual and auditory stimuli were widely separated, subjects made two saccades, whereas when the two stimuli were presented at the same location they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials, or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit "same vs. different" source judgments and the biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to nonhuman primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli.

NEW & NOTEWORTHY We developed a novel behavioral paradigm for the study of multisensory causal inference in both humans and monkeys and found that both species make causal judgments in the same Bayes-optimal fashion. To our knowledge, this is the first demonstration of behavioral causal inference in animals, and this cross-species comparison lays the groundwork for future experiments using neuronal recording techniques that are impractical or impossible in human subjects.
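The hierarchical causal inference model referenced above can be illustrated with the standard Bayesian formulation (in the style of Körding et al., 2007): compute the posterior probability that the visual and auditory measurements share a common cause, then average the fused and segregated location estimates by that posterior. All parameter values below are illustrative, not fitted values from this study.

```python
import numpy as np

def causal_inference_estimate(x_v, x_a, sigma_v=1.0, sigma_a=3.0,
                              sigma_p=10.0, p_common=0.5):
    """Model-averaged location estimates under Bayesian causal inference.

    x_v, x_a : noisy visual and auditory measurements (e.g., degrees),
               with a zero-mean Gaussian spatial prior of width sigma_p.
    Returns (p_c1, s_v_hat, s_a_hat): posterior probability of a common
    source and the model-averaged visual/auditory location estimates.
    """
    var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2

    # Likelihood of the measurement pair under a common cause (C = 1)...
    var_sum = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = (np.exp(-0.5 * ((x_v - x_a)**2 * var_p
                              + x_v**2 * var_a + x_a**2 * var_v) / var_sum)
               / (2 * np.pi * np.sqrt(var_sum)))

    # ...and under two independent causes (C = 2).
    like_c2 = (np.exp(-0.5 * x_v**2 / (var_v + var_p))
               / np.sqrt(2 * np.pi * (var_v + var_p))
               * np.exp(-0.5 * x_a**2 / (var_a + var_p))
               / np.sqrt(2 * np.pi * (var_a + var_p)))

    # Posterior over causal structure.
    p_c1 = like_c1 * p_common / (like_c1 * p_common
                                 + like_c2 * (1 - p_common))

    # Reliability-weighted fused estimate if one source,
    # per-modality estimates if two sources.
    s_c1 = (x_v / var_v + x_a / var_a) / (1/var_v + 1/var_a + 1/var_p)
    s_v_c2 = (x_v / var_v) / (1/var_v + 1/var_p)
    s_a_c2 = (x_a / var_a) / (1/var_a + 1/var_p)

    # Model averaging: weight each estimate by the posterior over causes.
    s_v_hat = p_c1 * s_c1 + (1 - p_c1) * s_v_c2
    s_a_hat = p_c1 * s_c1 + (1 - p_c1) * s_a_c2
    return p_c1, s_v_hat, s_a_hat

# Nearby stimuli: common cause likely, so estimates fuse (one saccade);
# widely separated stimuli: estimates stay segregated (two saccades).
print(causal_inference_estimate(x_v=1.0, x_a=2.0))
print(causal_inference_estimate(x_v=-10.0, x_a=10.0))
```

This reproduces the qualitative pattern in the abstract: high common-cause probability (single saccade) for coincident stimuli, low probability (two saccades) for widely separated ones, with mixed behavior at intermediate separations.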
Affiliation(s)
- Jeff T Mohl
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina; Department of Neurobiology, Duke University, Durham, North Carolina
- John M Pearson
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina; Department of Neurobiology, Duke University, Durham, North Carolina; Department of Psychology and Neuroscience, Duke University, Durham, North Carolina; Department of Biostatistics and Bioinformatics, Duke University Medical School, Durham, North Carolina
- Jennifer M Groh
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina; Center for Cognitive Neuroscience, Duke University, Durham, North Carolina; Department of Neurobiology, Duke University, Durham, North Carolina; Department of Psychology and Neuroscience, Duke University, Durham, North Carolina
19
Peter MG, Porada DK, Regenbogen C, Olsson MJ, Lundström JN. Sensory loss enhances multisensory integration performance. Cortex 2019; 120:116-130. [DOI: 10.1016/j.cortex.2019.06.003] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/29/2018] [Revised: 04/25/2019] [Accepted: 06/04/2019] [Indexed: 10/26/2022]
20
Leclerc MP, Kellermann T, Freiherr J, Clemens B, Habel U, Regenbogen C. Externalization Errors of Olfactory Source Monitoring in Healthy Controls-An fMRI Study. Chem Senses 2019; 44:593-606. [PMID: 31414135 DOI: 10.1093/chemse/bjz055] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
Using a combined approach of functional magnetic resonance imaging (fMRI) and noninvasive brain stimulation (transcranial direct current stimulation [tDCS]), the present study investigated source memory and its link to mental imagery in the olfactory and auditory domains. Source memory refers to knowledge of the origin of mental experiences, differentiating events that have actually occurred from memories of imagined events. Because of confusion between internally generated and externally perceived information, patients who are prone to hallucinations show decreased source memory accuracy; vivid mental imagery can lead to similar results in healthy controls. We tested source memory following cathodal tDCS stimulation using a mental imagery task that required participants to perceive or imagine a set of the same olfactory and auditory stimuli during fMRI. The supplementary motor area (SMA) is involved in mental imagery across different modalities and is potentially linked to source memory. We therefore attempted to modulate participants' SMA activation with tDCS before they entered the scanner, in order to influence source memory accuracy in healthy participants. Our results showed comparable source memory accuracy between the olfactory and auditory modalities, with no effects of stimulation. Finally, we found SMA subregions to be differentially involved in olfactory and auditory imagery, with dorsal SMA activation correlating with auditory source memory.
Affiliation(s)
- Marcel P Leclerc
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen, Pauwelsstr, Aachen, Germany; JARA-BRAIN, Pauwelsstr, Aachen, Germany
- Thilo Kellermann
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen, Pauwelsstr, Aachen, Germany; JARA-BRAIN, Pauwelsstr, Aachen, Germany
- Jessica Freiherr
- Diagnostic and Interventional Neuroradiology, RWTH Aachen University, Pauwelsstr, Aachen, Germany; Psychiatrische und Psychotherapeutische Klinik, Friedrich-Alexander-Universität Erlangen-Nürnberg, Schwabachanlage, Erlangen, Germany
- Benjamin Clemens
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen, Pauwelsstr, Aachen, Germany; JARA-BRAIN, Pauwelsstr, Aachen, Germany
- Ute Habel
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen, Pauwelsstr, Aachen, Germany; JARA-BRAIN, Pauwelsstr, Aachen, Germany
- Christina Regenbogen
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen, Pauwelsstr, Aachen, Germany; JARA-BRAIN, Pauwelsstr, Aachen, Germany; Department of Clinical Neuroscience, Karolinska Institutet, Tomtebodavägen 18A, 171 77 Stockholm, Sweden
21
Stickel S, Weismann P, Kellermann T, Regenbogen C, Habel U, Freiherr J, Chechko N. Audio-visual and olfactory-visual integration in healthy participants and subjects with autism spectrum disorder. Hum Brain Mapp 2019; 40:4470-4486. [PMID: 31301203 PMCID: PMC6865810 DOI: 10.1002/hbm.24715] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2019] [Revised: 05/23/2019] [Accepted: 07/01/2019] [Indexed: 01/22/2023] Open
Abstract
The human capacity to integrate sensory signals has been investigated with respect to different sensory modalities, but a common denominator of the neural network underlying the integration of sensory cues has yet to be identified. Additionally, brain imaging data from patients with autism spectrum disorder (ASD) do not fully cover disparities in neuronal sensory processing. In this fMRI study, we compared the neural networks underlying both olfactory-visual and auditory-visual integration in patients with ASD and a group of matched healthy participants. The aim was to disentangle sensory-specific networks so as to derive a potential (amodal) common source of multisensory integration (MSI), and to investigate differences in brain networks associated with sensory processing in individuals with ASD. In both groups, similar neural networks were found to be involved in the olfactory-visual and auditory-visual integration processes, including the primary visual cortex, the inferior parietal sulcus (IPS), and the medial and inferior frontal cortices. Amygdala activation was observed specifically during olfactory-visual integration, whereas superior temporal activation was seen during auditory-visual integration. A dynamic causal modeling analysis revealed a nonlinear top-down modulation by the IPS of the connection between the respective primary sensory regions in both experimental conditions and in both groups. Thus, we demonstrate that MSI has shared neural sources across olfactory-visual and audio-visual stimulation in patients and controls. The enhanced recruitment of the IPS to modulate changes between areas is relevant to sensory perception. Our results also indicate that, with respect to MSI processing, adults with ASD do not differ significantly from their healthy counterparts.
Affiliation(s)
- Susanne Stickel
- Department of Psychiatry, Psychotherapy and Psychosomatics, Faculty of Medicine, RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine: JARA-Institute Brain Structure Function Relationship (INM 10), Research Center Jülich, Jülich, Germany
- Pauline Weismann
- Department of Psychiatry and Psychotherapy, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Thilo Kellermann
- Department of Psychiatry, Psychotherapy and Psychosomatics, Faculty of Medicine, RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine: JARA-Institute Brain Structure Function Relationship (INM 10), Research Center Jülich, Jülich, Germany
- Christina Regenbogen
- Department of Psychiatry, Psychotherapy and Psychosomatics, Faculty of Medicine, RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine: JARA-Institute Brain Structure Function Relationship (INM 10), Research Center Jülich, Jülich, Germany
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Ute Habel
- Department of Psychiatry, Psychotherapy and Psychosomatics, Faculty of Medicine, RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine: JARA-Institute Brain Structure Function Relationship (INM 10), Research Center Jülich, Jülich, Germany
- Jessica Freiherr
- Department of Psychiatry and Psychotherapy, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
- Sensory Analytics, Fraunhofer Institute for Process Engineering and Packaging IVV, Freising, Germany
- Natalya Chechko
- Department of Psychiatry, Psychotherapy and Psychosomatics, Faculty of Medicine, RWTH Aachen, Aachen, Germany
- Institute of Neuroscience and Medicine: JARA-Institute Brain Structure Function Relationship (INM 10), Research Center Jülich, Jülich, Germany
| |
22
Multisensory Enhancement of Odor Object Processing in Primary Olfactory Cortex. Neuroscience 2019; 418:254-265. [DOI: 10.1016/j.neuroscience.2019.08.040]
23
Visual input shapes the auditory frequency responses in the inferior colliculus of mouse. Hear Res 2019; 381:107777. [PMID: 31430633 DOI: 10.1016/j.heares.2019.107777]
Abstract
The integration of visual and auditory information is important for humans and animals to build an accurate and coherent perception of the external world. Although some principles of audiovisual integration have been established, little insight has been gained into its functional purpose. In this study, we investigated the functional influence of dynamic visual input on auditory frequency processing by recording single-unit activity in the central nucleus of the inferior colliculus (ICc). Results showed that the auditory responses of ICc neurons to sound frequencies could be enhanced or suppressed by visual stimuli, even though the same visual stimuli induced no neural responses when presented alone. For each ICc neuron, the most effective visual stimuli were located at the same azimuth as the auditory stimuli and preceded them by ∼20 ms. Additionally, visual stimuli could steepen or flatten the frequency tuning curves (FTCs) of ICc neurons through varied visual effects at each responsive frequency. The degree of modulation of auditory FTCs depended on the minimum thresholds (MTs) of ICc neurons: as MTs increased, the degree of modulation decreased. Because the distribution of MTs was non-homogeneous, with a minimum at 10 kHz, visual modulation of auditory FTCs was frequency-specific: the closer a neuron's characteristic frequency (CF) was to 10 kHz, the greater the modulation. Thus, visual modulation of auditory frequency responses in the ICc depends not only on the visual stimulus but also on the auditory characteristics of ICc neurons. These results suggest a moment-to-moment visual modulation of auditory frequency responses that increases auditory frequency sensitivity to audiovisual stimuli in real time. Furthermore, in the long term, such modulation could serve to instruct auditory adaptive plasticity to maintain necessary and accurate auditory detection and perceptual behavior.
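The gain-like modulation described above can be illustrated with a minimal numerical sketch (entirely hypothetical values and function names; this is not the study's analysis code): each point of a frequency tuning curve is scaled by a visual-context gain that peaks near the 10 kHz characteristic frequency, so enhancement is strongest at CF.

```python
import numpy as np

def modulate_ftc(frequencies_khz, baseline_rate, gain_fn):
    """Apply a frequency-dependent visual gain to an auditory
    frequency tuning curve (FTC). gain_fn maps frequency (kHz) to a
    multiplicative factor: >1 enhances, <1 suppresses the response."""
    gains = np.array([gain_fn(f) for f in frequencies_khz])
    return baseline_rate * gains

# Hypothetical FTC peaking at a 10 kHz characteristic frequency.
freqs = np.linspace(2.0, 18.0, 9)                 # kHz
ftc = np.exp(-((freqs - 10.0) ** 2) / 18.0)       # normalized firing rate

# Hypothetical visual modulation, strongest near the 10 kHz CF,
# mimicking the frequency-specific effect described in the abstract.
modulated = modulate_ftc(
    freqs, ftc, lambda f: 1.0 + 0.3 * np.exp(-((f - 10.0) ** 2) / 18.0)
)
```

The choice of a multiplicative gain is only one way to capture enhancement/suppression; the abstract itself does not specify a functional form.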
24
Regenbogen C, Seubert J, Johansson E, Finkelmeyer A, Andersson P, Lundström JN. The intraparietal sulcus governs multisensory integration of audiovisual information based on task difficulty. Hum Brain Mapp 2017; 39:1313-1326. [PMID: 29235185 DOI: 10.1002/hbm.23918]
Abstract
Object recognition benefits maximally from multimodal sensory input when stimulus presentation is noisy or degraded. Whether this advantage can be attributed specifically to the extent of overlap in object-related information, or rather to object-unspecific enhancement due to the mere presence of additional sensory stimulation, remains unclear. Further, the cortical processing differences driving increased multisensory integration (MSI) for degraded compared with clear information remain poorly understood. Here, two consecutive studies first compared the behavioral benefits of audio-visual overlap of object-related information, relative to conditions where one channel carried information and the other carried noise. A hierarchical drift diffusion model indicated performance enhancement when auditory and visual object-related information was simultaneously present for degraded stimuli. A subsequent fMRI study revealed visual dominance on a behavioral and neural level for clear stimuli, while degraded stimulus processing was mainly characterized by activation of a frontoparietal multisensory network, including the IPS. Connectivity analyses indicated that integration of degraded object-related information relied on IPS input, whereas clear stimuli were integrated through direct information exchange between visual and auditory sensory cortices. These results indicate that the inverse effectiveness observed for identification of degraded relative to clear objects in behavior and brain activation might be facilitated by selective recruitment of an executive cortical network that uses the IPS as a relay mediating crossmodal sensory information exchange.
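The hierarchical drift diffusion model mentioned above belongs to a family of evidence-accumulation models. A minimal (non-hierarchical) simulation sketch, with entirely hypothetical drift rates standing in for degraded unisensory versus congruent audio-visual conditions:

```python
import random

def simulate_ddm(drift, noise=1.0, threshold=1.0, dt=0.001, max_t=5.0, rng=None):
    """Simulate one drift-diffusion trial; return (correct, decision_time).
    Evidence accumulates at `drift` per second plus Gaussian noise until
    it crosses +threshold (correct) or -threshold (error)."""
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return (x >= threshold), t

rng = random.Random(0)
# Hypothetical drift rates: degraded unisensory vs. congruent audio-visual.
uni_rts = [simulate_ddm(0.8, rng=rng)[1] for _ in range(200)]
multi_rts = [simulate_ddm(1.6, rng=rng)[1] for _ in range(200)]
```

Congruent multisensory input is modeled here simply as a higher drift rate, which yields shorter mean decision times; the study itself fit a hierarchical Bayesian variant to behavioral data rather than simulating fixed parameters.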
Affiliation(s)
- Christina Regenbogen
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Department of Psychiatry, Psychotherapy and Psychosomatics, Medical School, RWTH Aachen University, Germany; JARA - BRAIN Institute 1: Structure-Function Relationship: Decoding the Human Brain at Systemic Levels, Forschungszentrum Jülich, Germany
- Janina Seubert
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Aging Research Center, Department of Neurobiology, Care Sciences and Society, Karolinska Institutet and Stockholm University, Stockholm, Sweden
- Emilia Johansson
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Andreas Finkelmeyer
- Institute of Neuroscience, Newcastle University, Newcastle-upon-Tyne, United Kingdom
- Patrik Andersson
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Stockholm University Brain Imaging Centre, Stockholm University, Sweden
- Johan N Lundström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Monell Chemical Senses Center, Philadelphia, Pennsylvania; Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania