1
Paasonen J, Valjakka JS, Salo RA, Paasonen E, Tanila H, Michaeli S, Mangia S, Gröhn O. Whisker stimulation with different frequencies reveals non-uniform modulation of functional magnetic resonance imaging signal across sensory systems in awake rats. bioRxiv [Preprint] 2024:2024.11.13.623361. [PMID: 39605361 PMCID: PMC11601494 DOI: 10.1101/2024.11.13.623361]
Abstract
Primary sensory systems are classically considered separate units; however, there is growing evidence of notable interactions between them. We examined this cross-sensory interplay by applying a quiet and motion-tolerant zero echo time functional magnetic resonance imaging (fMRI) technique to elucidate the evoked brain-wide responses to whisker pad stimulation in awake and anesthetized rats. Specifically, we characterized the brain-wide responses in core and non-core regions to whisker pad stimulation by varying the stimulation frequency, and determined whether isoflurane-medetomidine anesthesia, traditionally used in preclinical imaging, confounds investigations of sensory integration. We demonstrated that unilateral whisker pad stimulation elicited robust activity not only along the whisker-mediated tactile system, but also in auditory, visual, high-order, and cerebellar regions, indicative of brain-wide cross-sensory and associative activity. By inspecting the response profiles to different stimulation frequencies and the temporal signal characteristics, we observed that the non-core regions responded to stimulation in a very different way than the primary sensory system, likely reflecting different encoding modes among primary sensory, cross-sensory, and integrative processing. Lastly, while the activity evoked in low-order sensory structures could be reliably detected under anesthesia, the activity in high-order processing regions and the complex differences between primary, cross-sensory, and associative systems were visible only in the awake state. We conclude that our study reveals novel aspects of the cross-sensory interplay of the whisker-mediated tactile system, and importantly, that these would be difficult to observe in anesthetized rats.
Affiliation(s)
- Jaakko Paasonen
- A. I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
- Juha S. Valjakka
- A. I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
- Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, USA
- Raimo A. Salo
- A. I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
- Ekaterina Paasonen
- A. I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
- NeuroCenter, Kuopio University Hospital, Kuopio, Finland
- Heikki Tanila
- A. I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
- Shalom Michaeli
- Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, USA
- Silvia Mangia
- Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, USA
- Olli Gröhn
- A. I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
2
Knyazev GG, Savostyanov AN, Bocharov AV, Saprigyn AE. Representational similarity analysis of self- versus other-processing: Effect of trait aggressiveness. Aggress Behav 2024;50:e22125. [PMID: 38268387 DOI: 10.1002/ab.22125]
Abstract
In this study, using the self/other adjective judgment task, we aimed to explore how people perceive themselves in comparison to various other people, including friends, strangers, and those they dislike. Next, using representational similarity analysis, we sought to elucidate how these perceptual similarities and differences are represented in brain activity and how aggressiveness is related to these representations. Behavioral ratings show that, on average, people tend to consider themselves more like their friends than neutral strangers, and least like people they dislike. This pattern of similarity is positively correlated with neural representation in social and cognitive circuits of the brain and negatively correlated with neural representation in emotional centers that may represent emotional arousal associated with various social objects. Aggressiveness seems to predispose a person to a pattern of behavior that is the opposite of the average pattern, that is, a tendency to think of oneself as less like one's friends and more like one's enemies. This corresponds to an increase in the similarity of the behavioral representation with the representation in the emotional centers and a decrease in its similarity with the representation in the social and cognitive centers. This can be seen as evidence that in individuals prone to aggression, behavior in the social environment may depend to a greater extent on the representation of social objects in the emotional rather than social and cognitive brain circuits.
Affiliation(s)
- Gennady G Knyazev
- Laboratory of Differential Psychophysiology, Institute of Neurosciences and Medicine, Novosibirsk, Russia
- Alexander N Savostyanov
- Laboratory of Differential Psychophysiology, Institute of Neurosciences and Medicine, Novosibirsk, Russia
- Laboratory of Psychological Genetics, Institute of Cytology and Genetics SB RAS, Novosibirsk, Russia
- Andrey V Bocharov
- Laboratory of Differential Psychophysiology, Institute of Neurosciences and Medicine, Novosibirsk, Russia
- Alexander E Saprigyn
- Laboratory of Differential Psychophysiology, Institute of Neurosciences and Medicine, Novosibirsk, Russia
3
Seydell-Greenwald A, Wang X, Newport EL, Bi Y, Striem-Amit E. Spoken language processing activates the primary visual cortex. PLoS One 2023;18:e0289671. [PMID: 37566582 PMCID: PMC10420367 DOI: 10.1371/journal.pone.0289671]
Abstract
Primary visual cortex (V1) is generally thought of as a low-level sensory area that primarily processes basic visual features. Although there is evidence for multisensory effects on its activity, these are typically found for the processing of simple sounds and their properties, for example spatially or temporally congruent simple sounds. However, in congenitally blind individuals, V1 is involved in language processing, with no evidence of major changes in anatomical connectivity that could explain this seemingly drastic functional change. This is at odds with current accounts of neural plasticity, which emphasize the role of connectivity and conserved function in determining a neural tissue's role even after atypical early experiences. To reconcile what appears to be unprecedented functional reorganization with known accounts of plasticity limitations, we tested whether V1's multisensory roles include responses to spoken language in sighted individuals. Using fMRI, we found that V1 in normally sighted individuals was indeed activated by comprehensible spoken sentences compared to an incomprehensible reversed-speech control condition, and more strongly so in the left than in the right hemisphere. Activation in V1 for language was also significant and comparable for abstract and concrete words, suggesting it was not driven by visual imagery. Lastly, this activation did not stem from increased attention to the auditory onset of words, nor was it correlated with attentional arousal ratings, making general attention accounts an unlikely explanation. Together, these findings suggest that V1 responds to spoken language even in sighted individuals, reflecting the binding of multisensory high-level signals, potentially to predict visual input. This capability might be the basis for the strong V1 language activation observed in people born blind, reaffirming the notion that plasticity is guided by pre-existing connectivity and abilities in the typically developed brain.
Affiliation(s)
- Anna Seydell-Greenwald
- Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington, DC, United States of America
- Xiaoying Wang
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Elissa L. Newport
- Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington, DC, United States of America
- Yanchao Bi
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Ella Striem-Amit
- Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington, DC, United States of America
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, United States of America
4
Bailey KM, Giordano BL, Kaas AL, Smith FW. Decoding sounds depicting hand-object interactions in primary somatosensory cortex. Cereb Cortex 2022;33:3621-3635. [PMID: 36045002 DOI: 10.1093/cercor/bhac296]
Abstract
Neurons, even in the earliest sensory regions of cortex, are subject to a great deal of contextual influence from connections both within and across modalities. Recent work has shown that primary sensory areas can respond to and, in some cases, discriminate stimuli that are not of their target modality: for example, primary somatosensory cortex (SI) discriminates visual images of graspable objects. In the present work, we investigated whether SI would discriminate sounds depicting hand-object interactions (e.g. bouncing a ball). In a rapid event-related functional magnetic resonance imaging experiment, participants listened attentively to sounds from three categories: hand-object interactions, and the control categories of pure tones and animal vocalizations, while performing a one-back repetition detection task. Multivoxel pattern analysis revealed significant decoding of hand-object interaction sounds within SI, but not of either control category. Crucially, in hand-sensitive voxels defined from an independent tactile localizer, decoding accuracies were significantly higher for hand-object interactions than for pure tones in left SI. Our findings indicate that simply hearing sounds depicting familiar hand-object interactions elicits distinct patterns of activity in SI, despite the complete absence of tactile stimulation. These results highlight the rich contextual information that can be transmitted across sensory modalities, even to primary sensory areas.
Affiliation(s)
- Kerri M Bailey
- School of Psychology, University of East Anglia, Norwich NR4 7TJ, United Kingdom
- Bruno L Giordano
- Institut des Neurosciences de la Timone, CNRS UMR 7289, Aix-Marseille Université, Marseille, France
- Amanda L Kaas
- Department of Cognitive Neuroscience, Maastricht University, Maastricht 6229 EV, The Netherlands
- Fraser W Smith
- School of Psychology, University of East Anglia, Norwich NR4 7TJ, United Kingdom
5
Katayama R, Yoshida W, Ishii S. Confidence modulates the decodability of scene prediction during partially-observable maze exploration in humans. Commun Biol 2022;5:367. [PMID: 35440615 PMCID: PMC9018866 DOI: 10.1038/s42003-022-03314-y]
Abstract
Prediction ability often involves some degree of uncertainty, a key determinant of confidence. Here, we sought to assess whether predictions are decodable in partially-observable environments where one's state is uncertain, and whether this information is sensitive to the confidence produced by such uncertainty. We used functional magnetic resonance imaging-based, partially-observable maze navigation tasks in which subjects predicted upcoming scenes and reported their confidence regarding these predictions. Using multi-voxel pattern analysis, we successfully decoded both scene predictions and subjective confidence from activity in localized parietal and prefrontal regions. We also assessed subjects' confidence in their beliefs about where they were in the maze. Importantly, prediction decodability varied according to subjective scene confidence in the superior parietal lobule, and according to state confidence estimated by the behavioral model in the inferior parietal lobule. These results demonstrate that prediction in uncertain environments depends on a prefrontal-parietal network within which prediction and confidence interact.
Affiliation(s)
- Risa Katayama
- Graduate School of Informatics, Kyoto University, Kyoto, Kyoto, 606-8501, Japan
- Wako Yoshida
- Nuffield Department of Clinical Neuroscience, University of Oxford, Oxford, OX3 9DU, UK
- Department of Neural Computation for Decision-making, Advanced Telecommunications Research Institute International, Soraku-gun, Kyoto, 619-0288, Japan
- Shin Ishii
- Graduate School of Informatics, Kyoto University, Kyoto, Kyoto, 606-8501, Japan
- Neural Information Analysis Laboratories, Advanced Telecommunications Research Institute International, Soraku-gun, Kyoto, 619-0288, Japan
- International Research Center for Neurointelligence, The University of Tokyo, Bunkyo-ku, Tokyo, 113-0033, Japan
6
Neural interactions in occipitotemporal cortex during basic human movement perception by dynamic causal modeling. Brain Imaging Behav 2021;15:231-243. [PMID: 32141031 DOI: 10.1007/s11682-019-00250-0]
Abstract
Action recognition is an essential component of our daily life. The occipitotemporal cortex (OTC) is an important area in human movement perception. Previous studies have revealed that three vital regions in OTC, the extrastriate body area (EBA), human middle temporal complex (hMT+), and posterior superior temporal sulcus (pSTS), play an important role in motion perception. The aim of the current study was to explore the neural interactions between these three regions during basic human movement perception. Functional magnetic resonance imaging data were acquired while participants viewed dynamic videos depicting basic human movements. Using dynamic causal modeling, a model space consisting of 576 models was constructed and evaluated to select the optimal model given the data. The information of the visual movement was found to enter the system through hMT+. We speculated that hMT+ is the region sensitive to the presence of motion and that it subsequently influences and is influenced by the other two regions. Our results also revealed the manner in which the three regions interact during action recognition. Furthermore, we found significantly enhanced modulatory connectivity from hMT+ to both EBA and pSTS, as well as from EBA to both hMT+ and pSTS. We inferred that there may be multiple routes for human action perception: one route, responsible for processing motion signals, runs from hMT+ to pSTS, while the other may project information to pSTS via a form-processing route. In addition, pSTS may integrate and mediate visual signals and possibly convey them to distributed areas to support higher-order cognitive tasks.