1. Adenosine and Cortical Plasticity. Neuroscientist 2024:10738584241236773. PMID: 38497585. DOI: 10.1177/10738584241236773.
Abstract
Brain plasticity is the ability of the nervous system to change its structure and functioning in response to experience. These changes occur mainly at synaptic connections, and this capacity is therefore termed synaptic plasticity. During postnatal development, environmental influences trigger changes in synaptic plasticity that play a crucial role in the formation and refinement of brain circuits and their functions in adulthood. One of the greatest challenges of present-day neuroscience is to explain how synaptic connections change and how cortical maps are formed and modified to generate the most suitable adaptive behavior in response to different external stimuli. Adenosine is emerging as a key player in these plastic changes across different brain areas. Here, we review current knowledge of the mechanisms responsible for the induction and duration of synaptic plasticity at different stages of postnatal brain development, in which adenosine, probably released by astrocytes, directly participates in the induction of long-term synaptic plasticity and in the control of the duration of plasticity windows at different cortical synapses. In addition, we comment on the role of the different adenosine receptors in brain diseases and on the potential therapeutic effects of acting via adenosine receptors.
2. Heterosynaptic plasticity of the visuo-auditory projection requires cholecystokinin released from entorhinal cortex afferents. eLife 2024; 13:e83356. PMID: 38436304. PMCID: PMC10954309. DOI: 10.7554/elife.83356.
Abstract
The entorhinal cortex is involved in establishing enduring visuo-auditory associative memory in the neocortex. Here we explored the mechanisms underlying this synaptic plasticity related to projections from the visual and entorhinal cortices to the auditory cortex in mice using optogenetics of dual pathways. High-frequency laser stimulation (HFS laser) of the visuo-auditory projection did not induce long-term potentiation. However, after pairing with sound stimulus, the visuo-auditory inputs were potentiated following either infusion of cholecystokinin (CCK) or HFS laser of the entorhino-auditory CCK-expressing projection. Combining retrograde tracing and RNAscope in situ hybridization, we show that Cck expression is higher in entorhinal cortex neurons projecting to the auditory cortex than in those originating from the visual cortex. In the presence of CCK, potentiation in the neocortex occurred when the presynaptic input arrived 200 ms before postsynaptic firing, even after just five trials of pairing. Behaviorally, inactivation of the CCK+ projection from the entorhinal cortex to the auditory cortex blocked the formation of visuo-auditory associative memory. Our results indicate that neocortical visuo-auditory association is formed through heterosynaptic plasticity, which depends on release of CCK in the neocortex mostly from entorhinal afferents.
3. Crossmodal Associations and Working Memory in the Brain. Adv Exp Med Biol 2024; 1437:91-100. PMID: 38270855. DOI: 10.1007/978-981-99-7611-9_6.
Abstract
Crossmodal associations between stimuli from different sensory modalities can emerge in non-synesthetic people and be stored in working memory to guide goal-directed behavior. This chapter reviews a plethora of studies in this field to summarize where, when, and how crossmodal associations and working memory are processed. In brain regions traditionally considered unimodal primary sensory areas, neural activity can be influenced by crossmodal sensory signals at a temporally very early stage of information processing, a phenomenon that cannot be attributed to feedback projections from higher-level associative areas. Neural processing then proceeds in associative cortical areas, including the posterior parietal cortex and prefrontal cortex. Neural oscillations in multiple frequency bands may reflect brain activity during crossmodal associations, and neural synchrony is likely related to the underlying neural mechanisms. Primary sensory areas and associative areas coordinate through neural synchrony to form crossmodal associations and to guide working memory performance.
4. The multisensory mind: a systematic review of multisensory integration processing in Anorexia and Bulimia Nervosa. J Eat Disord 2023; 11:204. PMID: 37974266. PMCID: PMC10655389. DOI: 10.1186/s40337-023-00930-9.
Abstract
Individuals with Anorexia Nervosa and Bulimia Nervosa present alterations in the way they experience their bodies. Body experience results from a multisensory integration process in which information from different sensory domains and spatial reference frames is combined into a coherent percept. Given the critical role of the body in the onset and maintenance of both Anorexia Nervosa and Bulimia Nervosa, we conducted a systematic review to examine multisensory integration abilities of individuals affected by these two conditions and investigate whether they exhibit impairments in crossmodal integration. We searched for studies evaluating crossmodal integration in individuals with a current diagnosis of Anorexia Nervosa and Bulimia Nervosa as compared to healthy individuals from both behavioral and neurobiological perspectives. A search of PubMed, PsycINFO, and Web of Sciences databases was performed to extract relevant articles. Of the 2348 studies retrieved, 911 were unique articles. After the screening, 13 articles were included. Studies revealed multisensory integration abnormalities in patients affected by Anorexia Nervosa; only one included individuals with Bulimia Nervosa and observed less severe impairments compared to healthy controls. Overall, results seemed to support the presence of multisensory deficits in Anorexia Nervosa, especially when integrating interoceptive and exteroceptive information. We proposed the Predictive Coding framework for understanding our findings and suggested future lines of investigation.
5. Performing a vibrotactile discrimination task modulates finger representations in primary somatosensory cortex. J Neurophysiol 2023; 130:1015-1027. PMID: 37671429. PMCID: PMC10649835. DOI: 10.1152/jn.00428.2022.
Abstract
It is well established that vibrotactile stimuli are represented in somatotopic maps. However, less is known about whether these somatotopic representations are modulated by task demands, perhaps even in the absence of tactile input. Here, we used a vibrotactile discrimination task as a tool to investigate these questions in further detail. Participants were required to actively perceive and process tactile stimuli, in comparison to a no-task control condition in which identical stimuli were passively perceived (no-memory condition). Importantly, both vibrotactile stimuli were applied to either the right index or little finger, allowing us to investigate whether cognitive task demands shape finger representations in primary somatosensory cortex (S1). Using multivoxel pattern analysis and representational similarity analysis, we found that S1 finger representations were more distinct during the memory than the no-memory condition. Interestingly, this effect was observed not only while tactile stimuli were presented but also during the delay period (i.e., in the absence of tactile stimulation). Our findings imply that when individuals are required to focus on tactile stimuli, retain them in memory, and engage in active processing of distinctive stimulus features, this exerts a modulatory effect on the finger representations present in S1. NEW & NOTEWORTHY Using multivoxel pattern analysis, we found that discrimination task demands shape finger representations in the contralateral primary somatosensory cortex (S1), and that somatotopic representations are modulated by task demands not only during tactile stimulation but also, to a certain extent, in the absence of tactile input.
6. A bio-inspired visuotactile neuron for multisensory integration. Nat Commun 2023; 14:5729. PMID: 37714853. PMCID: PMC10504285. DOI: 10.1038/s41467-023-40686-z.
Abstract
Multisensory integration is a salient feature of the brain that enables better and faster responses than unisensory processing, especially when the unisensory cues are weak. Specialized neurons that receive convergent input from two or more sensory modalities are responsible for such multisensory integration. Solid-state devices that can emulate the response of these multisensory neurons can advance neuromorphic computing and bridge the gap between artificial and natural intelligence. Here, we introduce an artificial visuotactile neuron, based on the integration of a photosensitive monolayer MoS2 memtransistor and a triboelectric tactile sensor, that closely captures the three essential features of multisensory integration, namely super-additive response, the inverse effectiveness effect, and temporal congruency. We have also realized a circuit that can encode visuotactile information into digital spiking events, with the probability of spiking determined by the strength of the visual and tactile cues. We believe that our comprehensive demonstration of a bio-inspired multisensory visuotactile neuron and spike-encoding circuitry will advance the field of neuromorphic computing, which has thus far focused primarily on unisensory intelligence and information processing.
7. Seeing our hand or a tool during visually-guided actions: Different effects on the somatosensory and visual cortices. Neuropsychologia 2023; 185:108582. PMID: 37121267. DOI: 10.1016/j.neuropsychologia.2023.108582.
Abstract
The processing of proprioceptive information in the context of a conflict between visual and somatosensory feedback deteriorates motor performance. Previous studies have shown that seeing one's hand increases the weighting assigned to arm somatosensory inputs. In this light, we hypothesized that the sensory conflict when tracing the contour of a shape with mirror-reversed vision would be greater for participants who traced with a stylus seen in their hand (Hand group, n = 17) than for participants who traced with the tip of a rod without seeing their hand (Tool group, n = 15). Based on this hypothesis, we predicted that tracing performance with mirror vision would be more deteriorated for the Hand group than for the Tool group, and that somatosensory information would be more strongly gated in the Hand group to reduce the sensory conflict. The participants of both groups followed the outline of a shape in two visual conditions. Direct vision: the participants saw the hand or a portion of a light 40-cm rod directly. Mirror vision: the hand or the rod was seen through a mirror. We measured tracing performance using a digitizing tablet and cortical activity with electroencephalography. Behavioral analyses revealed that the tracing performance of both groups was similarly impaired by mirror vision. However, contrasting the spectral content of cortical oscillatory activity between the Mirror and Direct conditions, we observed that tracing with mirror vision resulted in significantly larger alpha (8-12 Hz) and beta (15-25 Hz) power in the somatosensory cortex for participants of the Hand group. Somatosensory alpha and beta power did not significantly differ between the Mirror and Direct vision conditions for the Tool group. For both groups, tracing with mirror vision altered the activity of the visual cortex: decreased alpha power for the Hand group, and decreased alpha and beta power for the Tool group.
Overall, these results suggest that seeing the hand enhanced the sensory conflict when tracing with mirror vision and that the increase of alpha and beta power in the somatosensory cortex served to reduce the weight assigned to somatosensory information. The increased activity of the visual cortex observed for both groups in the mirror vision condition suggests greater visual processing with increased task difficulty. Finally, the fact that the participants of the Tool group did not show better tracing performance than those of the Hand group suggests that the tracing deterioration resulted from a sensorimotor conflict (as opposed to a visuo-proprioceptive conflict).
8. S1 represents multisensory contexts and somatotopic locations within and outside the bounds of the cortical homunculus. Cell Rep 2023; 42:112312. PMID: 37002922. PMCID: PMC10544688. DOI: 10.1016/j.celrep.2023.112312.
Abstract
Recent literature suggests that tactile events are represented in the primary somatosensory cortex (S1) beyond its long-established topography; in addition, the extent to which S1 is modulated by vision remains unclear. To better characterize S1, human electrophysiological data were recorded during touches to the forearm or finger. Conditions included visually observed physical touches, physical touches without vision, and visual touches without physical contact. Two major findings emerge from this dataset. First, vision strongly modulates S1 area 1, but only if there is a physical element to the touch, suggesting that passive touch observation is insufficient to elicit neural responses. Second, despite recording in a putative arm area of S1, neural activity represents both arm and finger stimuli during physical touches. Arm touches are encoded more strongly and specifically, supporting the idea that S1 encodes tactile events primarily through its topographic organization but also more generally, encompassing other areas of the body.
9. The neural representations underlying asymmetric cross-modal prediction of words. Hum Brain Mapp 2023; 44:2418-2435. PMID: 36715307. PMCID: PMC10028649. DOI: 10.1002/hbm.26219.
Abstract
Cross-modal prediction serves a crucial adaptive role in the multisensory world, yet the neural mechanisms underlying this prediction are poorly understood. The present study addressed this important question by combining a novel audiovisual sequence memory task, functional magnetic resonance imaging (fMRI), and multivariate neural representational analyses. Our behavioral results revealed a reliable asymmetric cross-modal predictive effect, with a stronger prediction from visual to auditory (VA) modality than auditory to visual (AV) modality. Mirroring the behavioral pattern, we found the superior parietal lobe (SPL) showed higher pattern similarity for VA than AV pairs, and the strength of the predictive coding in the SPL was positively correlated with the behavioral predictive effect in the VA condition. Representational connectivity analyses further revealed that the SPL mediated the neural pathway from the visual to the auditory cortex in the VA condition but was not involved in the auditory to visual cortex pathway in the AV condition. Direct neural pathways within the unimodal regions were found for the visual-to-visual and auditory-to-auditory predictions. Together, these results provide novel insights into the neural mechanisms underlying cross-modal sequence prediction.
10. Transforming musculoskeletal anatomy learning with haptic surface painting. Anat Sci Educ 2023. PMID: 36748362. DOI: 10.1002/ase.2262.
Abstract
Anatomical body painting has traditionally been utilized to support learner engagement and understanding of surface anatomy. Learners apply two-dimensional representations of surface markings directly on to the skin, based on the identification of key landmarks. Esthetically satisfying representations of musculature and viscera can also be created. However, established body painting approaches do not typically address three-dimensional spatial anatomical concepts. Haptic Surface Painting (HSP) is a novel activity, distinct from traditional body painting, and aims to develop learner spatial awareness. The HSP process is underpinned by previous work describing how a Haptico-visual observation and drawing method can support spatial, holistic, and collaborative anatomy learning. In HSP, superficial and underlying musculoskeletal and vascular structures are located haptically by palpation. Transparent colors are then immediately applied to the skin using purposive and cross-contour drawing techniques to produce corresponding visual representations of learner observation and cognition. Undergraduate students at a United Kingdom medical school (n = 7) participated in remote HSP workshops and focus groups. A phenomenological study of learner perspectives identified four themes from semantic qualitative analysis of transcripts: Three-dimensional haptico-visual exploration relating to learner spatial awareness of their own anatomy; cognitive freedom and accessibility provided by a flexible and empowering learning process; altered perspectives of anatomical detail, relationships, and clinical relevance; and delivery and context, relating to curricular integration, session format, and educator guidance. This work expands the pedagogic repertoire of anatomical body painting and has implications for anatomy educators seeking to integrate innovative, engaging, and effective learning approaches for transforming student learning.
11. Hierarchical unimodal processing within the primary somatosensory cortex during a bimodal detection task. Proc Natl Acad Sci U S A 2022; 119:e2213847119. PMID: 36534792. PMCID: PMC9907144. DOI: 10.1073/pnas.2213847119.
Abstract
Do sensory cortices process more than one sensory modality? To answer this question, scientists have generated a wide variety of studies at distinct space-time scales in different animal models, often reaching contradictory conclusions. Some conclude that multimodal processing occurs in early sensory cortices, while others conclude that it occurs in areas central to the sensory cortices. Here, we sought to determine whether sensory neurons process and encode the physical properties of stimuli of different modalities (tactile and acoustic). For this, we designed a bimodal detection task in which the senses of touch and hearing compete from trial to trial. Two rhesus monkeys performed this novel task while neural activity was recorded in areas 3b and 1 of the primary somatosensory cortex (S1). We analyzed the neurons' coding properties and variability, organizing them by the position of their receptive fields relative to the stimulation zone. Our results indicate that neurons of areas 3b and 1 are unimodal, encoding only the tactile modality in both firing rate and variability. Moreover, we found that neurons in area 3b carried more information about the periodic stimulus structure than those in area 1, possessed lower response and coding latencies, and had a lower intrinsic time scale. In sum, these differences reveal a hidden processing-based hierarchy. Finally, using a powerful nonlinear dimensionality-reduction algorithm, we show that the activity of areas 3b and 1 can be separated, establishing a clear division in the functionality of these two subareas of S1.
12. Excitatory Crossmodal Input to a Widespread Population of Primary Sensory Cortical Neurons. Neurosci Bull 2022; 38:1139-1152. PMID: 35429324. PMCID: PMC9554107. DOI: 10.1007/s12264-022-00855-4.
Abstract
Crossmodal information processing in sensory cortices has been reported in sparsely distributed neurons under normal conditions and can undergo experience- or activity-induced plasticity. Given the potential role in brain function as indicated by previous reports, crossmodal connectivity in the sensory cortex needs to be further explored. Using perforated whole-cell recording in anesthetized adult rats, we found that almost all neurons recorded in the primary somatosensory, auditory, and visual cortices exhibited significant membrane-potential responses to crossmodal stimulation, as recorded when brain activity states were pharmacologically down-regulated in light anesthesia. These crossmodal cortical responses were excitatory and subthreshold, and further seemed to be relayed primarily by the sensory thalamus, but not the sensory cortex, of the stimulated modality. Our experiments indicate a sensory cortical presence of widespread excitatory crossmodal inputs, which might play roles in brain functions involving crossmodal information processing or plasticity.
13. Mugs and Plants: Object Semantic Knowledge Alters Perceptual Processing With Behavioral Ramifications. Psychol Sci 2022; 33:1695-1707. PMID: 36044640. DOI: 10.1177/09567976221097497.
Abstract
Neural processing of objects with action associations recruits dorsal visual regions more than the neural processing of objects without such associations. We hypothesized that because the dorsal and ventral visual pathways have differing proportions of magno- and parvocellular input, there should be behavioral differences in perceptual tasks between manipulable and nonmanipulable objects. This hypothesis was tested in college-age adults across five experiments (Ns = 26, 26, 30, 25, and 25) using a gap-detection task, suited to the spatial resolution of parvocellular processing, and an object-flicker-discrimination task, suited to the temporal resolution of magnocellular processing. Directly predicted from the cellular composition of each pathway, a strong nonmanipulable-object advantage was observed in gap detection, and a small manipulable-object advantage was observed in flicker discrimination. Additionally, these effects were modulated by reducing object recognition through inversion and by suppressing magnocellular processing using red light. These results establish perceptual differences between objects dependent on semantic knowledge.
14. Comparative Functional Connectivity of Core Brain Regions between Implicit and Explicit Memory Tasks Underlying Negative Emotion in General Anxiety Disorder. Clin Psychopharmacol Neurosci 2022; 20:279-291. PMID: 35466099. PMCID: PMC9048018. DOI: 10.9758/cpn.2022.20.2.279.
Abstract
Objective: To investigate differential patterns of functional connectivity (FC) of core brain regions between implicit and explicit verbal memory tasks under a negatively evoked emotional condition, and correlations of FC strength with clinical symptom severity, in patients with generalized anxiety disorder (GAD). Methods: Thirteen patients with GAD and 13 healthy controls underwent functional magnetic resonance imaging during memory tasks with negative emotion words. Results: Symptom severity in GAD was associated with abnormalities of task-based FC with core brain regions, and FC patterns discriminated well between implicit and explicit memory processing in GAD. Distinctive FC in the implicit memory task included positive connections of the precentral gyrus (PrG) to the inferior frontal gyrus and inferior parietal gyrus (IPG), respectively, in the encoding period, as well as a positive connection of the amygdala (Amg) to the globus pallidus and a negative connection of the Amg to the cerebellum in the retrieval period. Distinctive FC in the explicit memory task included a positive connection of the PrG to the inferior temporal gyrus (ITG) in the encoding period and a positive connection of the anterior cingulate gyrus to the superior frontal gyrus in the retrieval period. In particular, GAD-7 scores correlated positively with FC of PrG-IPG (r2 = 0.324, p = 0.042) in implicit memory encoding and with FC of PrG-ITG (r2 = 0.378, p = 0.025) in explicit memory encoding. Conclusion: This study clarified differential patterns of brain activation and relevant FC between implicit and explicit verbal memory tasks under negative emotional conditions in GAD. These findings will be helpful for understanding the distinct brain functional mechanisms associated with clinical symptom severity in GAD.
15. Neural Encoding of Active Multi-Sensing Enhances Perceptual Decision-Making via a Synergistic Cross-Modal Interaction. J Neurosci 2022; 42:2344-2355. PMID: 35091504. PMCID: PMC8936614. DOI: 10.1523/jneurosci.0861-21.2022.
Abstract
Most perceptual decisions rely on the active acquisition of evidence from the environment involving stimulation from multiple senses. However, our understanding of the neural mechanisms underlying this process is limited. Crucially, it remains elusive how different sensory representations interact in the formation of perceptual decisions. To answer these questions, we used an active sensing paradigm coupled with neuroimaging, multivariate analysis, and computational modeling to probe how the human brain processes multisensory information to make perceptual judgments. Participants of both sexes actively sensed to discriminate two texture stimuli using visual (V) or haptic (H) information or the two sensory cues together (VH). Crucially, information acquisition was under the participants' control, who could choose where to sample information from and for how long on each trial. To understand the neural underpinnings of this process, we first characterized where and when active sensory experience (movement patterns) is encoded in human brain activity (EEG) in the three sensory conditions. Then, to offer a neurocomputational account of active multisensory decision formation, we used these neural representations of active sensing to inform a drift diffusion model of decision-making behavior. This revealed a multisensory enhancement of the neural representation of active sensing, which led to faster and more accurate multisensory decisions. We then dissected the interactions between the V, H, and VH representations using a novel information-theoretic methodology. 
Ultimately, we identified a synergistic neural interaction between the two unisensory (V, H) representations over contralateral somatosensory and motor locations that predicted multisensory (VH) decision-making performance.
SIGNIFICANCE STATEMENT: In real-world settings, perceptual decisions are made during active behaviors, such as crossing the road on a rainy night, and include information from different senses (e.g., car lights, slippery ground). Critically, it remains largely unknown how sensory evidence is combined and translated into perceptual decisions in such active scenarios. Here we address this knowledge gap. First, we show that the simultaneous exploration of information across senses (multi-sensing) enhances the neural encoding of active sensing movements. Second, the neural representation of active sensing modulates the evidence available for decision; and importantly, multi-sensing yields faster evidence accumulation. Finally, we identify a cross-modal interaction in the human brain that correlates with multisensory performance, constituting a putative neural mechanism for forging active multisensory perception.
16. Auditory noise improves balance control by cross-modal stochastic resonance. Heliyon 2021; 7:e08299. PMID: 34765798. PMCID: PMC8571705. DOI: 10.1016/j.heliyon.2021.e08299.
Abstract
It is known that enhanced somatosensory function leads to improved balance, and somatosensory function can be enhanced by an appropriate level of mechanical, visual, or auditory noise. In this study, we tested the potential benefit of auditory noise for balance control. We first assessed static balance by measuring, 10 times per participant, the duration of standing on the toes of one leg with eyes closed. For the 18 healthy adult participants, the median standing times ranged from 2.1 to 45.6 s, and the median of the distribution was 9.9 s. Based on this, the participants were divided into two groups: lower (below 10 s, n = 9) and higher (above 10 s, n = 9) balance groups. We then investigated the effect on balance control of auditory white noise emitted at the detection threshold. Each individual performed 20 trials. The auditory noise was applied in half the trials, while the remaining trials were conducted without noise. The order of the noise and no-noise trials was quasi-random. In the lower-balance group, the median standing time significantly increased during the noise trials (10.3 s) compared with the no-noise controls (5.2 s). In contrast, noise had no significant effect in the higher-balance group, presumably because of a ceiling effect. These findings suggest that static balance in lower-balance participants can be improved by applying a weak noise through cross-modal stochastic resonance.
|
17
|
Multisensory-Guided Associative Learning Enhances Multisensory Representation in Primary Auditory Cortex. Cereb Cortex 2021; 32:1040-1054. [PMID: 34378017 DOI: 10.1093/cercor/bhab264] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2021] [Revised: 07/13/2021] [Accepted: 07/15/2021] [Indexed: 11/12/2022] Open
Abstract
Sensory cortices, classically considered to represent modality-specific sensory information, are also found to engage in multisensory processing. However, how sensory processing in sensory cortices is cross-modally modulated remains an open question. Specifically, we understand little of cross-modal representation in sensory cortices during perceptual tasks and how perceptual learning modifies this process. Here, we recorded neural responses in primary auditory cortex (A1) both while freely moving rats discriminated stimuli in Go/No-Go tasks and while they were anesthetized. Our data show that cross-modal representation in auditory cortex varies with task context. In a task in which an audiovisual cue was the target associated with water reward, a significantly higher proportion of auditory neurons showed a visually evoked response. The vast majority of auditory neurons that processed auditory-visual interactions exhibited significant multisensory enhancement. However, when the rats performed tasks with unisensory cues as the target, cross-modal inhibition, rather than enhancement, predominated. In addition, multisensory associative learning appeared to leave a trace of plastic change in A1, as a larger proportion of A1 neurons showed multisensory enhancement under anesthesia. These findings indicate that multisensory processing in primary sensory cortices is not static, and that requiring cross-modal interaction in a task can substantially enhance multisensory processing in sensory cortices.
|
18
|
The causal role of auditory cortex in auditory working memory. eLife 2021; 10:64457. [PMID: 33913809 PMCID: PMC8169109 DOI: 10.7554/elife.64457] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2020] [Accepted: 04/28/2021] [Indexed: 01/18/2023] Open
Abstract
Working memory (WM), the ability to actively hold information in memory over a delay period of seconds, is a fundamental constituent of cognition. Delay-period activity in sensory cortices has been observed in WM tasks, but whether and when this activity plays a functional role in memory maintenance remains unclear. Here, we investigated the causal role of auditory cortex (AC) in memory maintenance in mice performing an auditory WM task. Electrophysiological recordings revealed that AC neurons were active not only during the presentation of the auditory stimulus but also early in the delay period. Furthermore, optogenetic suppression of neural activity in AC during the stimulus epoch and early delay period impaired WM performance, whereas suppression later in the delay period did not. Thus, AC is essential for information encoding and maintenance in the auditory WM task, especially during the early delay period.
|
19
|
Observation of Patients' 3D Printed Anatomical Features and 3D Visualisation Technologies Improve Spatial Awareness for Surgical Planning and in-Theatre Performance. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2021; 1334:23-37. [PMID: 34476743 DOI: 10.1007/978-3-030-76951-2_2] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Improved spatial awareness is vital in anatomy education as well as in many areas of medical practice. Many healthcare professionals struggle with the extrapolation of 2D data to its locus within the 3D volume of the anatomy. In this chapter, we outline the use of touch as an important sensory modality in the observation of 3D forms, including anatomical parts, with the specific neuroscientific underpinnings in this regard being described. We explore how improved spatial awareness is directly linked to improved spatial skill. The reader is offered two practical exercises that lead to improved spatial awareness for application in exploring external 3D anatomy volume as well as internal 3D anatomy volume. These exercises are derived from the Haptico-visual observation and drawing (HVOD) method. The resulting cognitive improvement in spatial awareness that these exercises engender can be of benefit to students in their study of anatomy and for application by healthcare professionals in many aspects of their medical practice. The use of autostereoscopic visualisation technology (AS3D) to view the anatomy from DICOM data, in combination with the haptic exploration of a 3D print (3Dp) of the same stereoscopic on-screen image, is recommended as a practice for improved understanding of any anatomical part or feature. We describe a surgical innovation that relies on the haptic perception of patients' 3D printed (3Dp) anatomical features from patient DICOM data, for improved surgical planning and in-theatre surgical performance. Throughout the chapter, underlying neuroscientific correlates to haptic and visual observation, memory, working memory, and cognitive load are provided.
|
20
|
Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study. Front Robot AI 2020; 7:540565. [PMID: 33501309 PMCID: PMC7805622 DOI: 10.3389/frobt.2020.540565] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2020] [Accepted: 12/02/2020] [Indexed: 12/03/2022] Open
Abstract
The quality of crossmodal perception hinges on two factors: the accuracy of the independent unimodal perception and the ability to integrate information from different sensory systems. In humans, the ability for cognitively demanding crossmodal perception diminishes from young to old age. Here, we propose a new approach to investigate the degree to which these different factors contribute to crossmodal processing and its age-related decline, by replicating a medical study on visuo-tactile crossmodal pattern discrimination using state-of-the-art tactile sensing technology and artificial neural networks (ANNs). We implemented two ANN models to specifically examine the relevance of early integration of sensory information in the crossmodal processing stream, a mechanism proposed for efficient processing in the human brain. Applying an adaptive staircase procedure, we approached comparable unimodal classification performance for both modalities in the human participants as well as the ANNs. This allowed us to compare crossmodal performance between and within the systems, independent of the underlying unimodal processes. Our data show that the unimodal classification accuracies of the tactile sensing technology are comparable to those of humans. For crossmodal discrimination by the ANNs, integrating high-level unimodal features at earlier stages of the crossmodal processing stream yields higher accuracies than late integration of independent unimodal classifications. In comparison to humans, the ANNs show higher accuracies than older participants in both the unimodal and the crossmodal condition, but lower accuracies than younger participants in the crossmodal task. Taken together, we show that state-of-the-art tactile sensing technology can perform a complex tactile recognition task at levels comparable to humans. For crossmodal processing, human-inspired early sensory integration seems to improve the performance of artificial neural networks. Still, younger participants seem to employ more efficient crossmodal integration mechanisms than those modeled in the proposed ANNs. Our work demonstrates how collaborative research in neuroscience and embodied artificial neurocognitive models can help derive models that inform the design of future neurocomputational architectures.
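The adaptive staircase procedure mentioned in this abstract can be sketched as a generic 1-up/2-down rule (an illustration only; the study's actual staircase rule, step sizes, and observer are not specified here, and the observer model below is invented):

```python
import random

def staircase_1up_2down(p_correct, start=10.0, step=1.0, n_trials=200, seed=1):
    """Minimal 1-up/2-down adaptive staircase: the stimulus level drops after
    two consecutive correct responses and rises after each error, converging
    near the 70.7%-correct point of the observer's psychometric function.
    `p_correct(level)` is a hypothetical observer model."""
    rng = random.Random(seed)
    level, streak, history = start, 0, []
    for _ in range(n_trials):
        history.append(level)
        if rng.random() < p_correct(level):
            streak += 1
            if streak == 2:                      # two correct in a row: harder
                level, streak = max(level - step, 0.0), 0
        else:                                    # any error: easier
            level, streak = level + step, 0
    return history

# Toy observer whose accuracy rises linearly from chance (0.5) to 1.0.
observer = lambda level: 0.5 + 0.5 * min(level, 20.0) / 20.0
levels = staircase_1up_2down(observer)
```

Running both humans and ANNs through such a procedure is what lets unimodal difficulty be equated before crossmodal performance is compared.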
|
21
|
Effects of Visual Attentional Load on the Tactile Sensory Memory Indexed by Somatosensory Mismatch Negativity. Front Neuroinform 2020; 14:575078. [PMID: 33324187 PMCID: PMC7724049 DOI: 10.3389/fninf.2020.575078] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2020] [Accepted: 10/22/2020] [Indexed: 11/13/2022] Open
Abstract
Auditory sensory memory indexed by mismatch negativity has been studied extensively, but far less attention has been directed to tactile sensory memory. To investigate whether tactile sensory memory is affected by attention, we recorded somatosensory mismatch negativity (sMMN) from 24 healthy adults in two experiments designed to distinguish sustained from non-sustained attention. Using the roving somatosensory oddball paradigm, we analyzed the dynamic changes in the amplitude and latency of the sMMN and found a clear sMMN component over the central region in the 100–300 ms interval. The sMMN amplitude, which indexes the early detection of tactile stimuli against the sensory memory trace, was larger in the tactile attentional task. Additionally, sMMN latency increased with increasing visual attentional load, indicating a decay of tactile sensory memory. Our results indicate that the more attentional resources are allocated to tactile sensation, the more favorable the conditions are for the generation of tactile sensory memory.
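The mismatch-negativity logic underlying the sMMN can be sketched as a deviant-minus-standard difference wave with a peak-latency readout (a schematic computation on synthetic single-channel data; the array shapes and the Gaussian deviant below are assumptions, not the study's pipeline):

```python
import numpy as np

def mismatch_wave(deviant_trials, standard_trials):
    """Difference wave: mean deviant ERP minus mean standard ERP
    (arrays of shape trials x time samples)."""
    return deviant_trials.mean(axis=0) - standard_trials.mean(axis=0)

def smmn_peak_latency(diff_wave, times, window=(0.100, 0.300)):
    """Latency (s) of the most negative deflection inside the
    100-300 ms window in which the study reports the sMMN."""
    masked = np.where((times >= window[0]) & (times <= window[1]),
                      diff_wave, np.inf)
    return times[np.argmin(masked)]

# Synthetic data: deviant trials carry a negativity peaking at 200 ms.
times = np.linspace(0.0, 0.5, 501)
standard = np.zeros((50, times.size))
deviant = -np.exp(-((times - 0.2) ** 2) / (2 * 0.02 ** 2)) * np.ones((50, 1))
latency = smmn_peak_latency(mismatch_wave(deviant, standard), times)
```

A latency shift of this peak under higher visual load is the quantity the study interprets as decay of the tactile memory trace.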
|
22
|
Non-informative vision improves spatial tactile discrimination on the shoulder but does not influence detection sensitivity. Exp Brain Res 2020; 238:2865-2875. [PMID: 33051694 PMCID: PMC7644450 DOI: 10.1007/s00221-020-05944-2] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2020] [Accepted: 10/06/2020] [Indexed: 12/20/2022]
Abstract
Vision of the body has been reported to improve tactile acuity even when vision is not informative about the actual tactile stimulation. However, it is currently unclear whether this effect is limited to body parts such as the hand, forearm, or foot that can normally be viewed, or whether it also generalizes to body locations, such as the shoulder, that are rarely in our own view. In this study, subjects consecutively performed a detection threshold task and a numerosity judgment task with tactile stimuli on the shoulder. Meanwhile, they watched either a real-time video showing their shoulder or simply a fixation cross as a control condition. We show that non-informative vision improves tactile numerosity judgment, which likely involves tactile acuity, but not tactile sensitivity. Furthermore, the vision-modulated improvement in tactile accuracy seems to be due to an enhanced ability to discriminate the number of adjacent active electrodes. These results are consistent with the view that bimodal visuotactile neurons sharpen tactile receptive fields in an early somatosensory map, probably via top-down modulation of lateral inhibition.
|
23
|
Cortical Processing of Multimodal Sensory Learning in Human Neonates. Cereb Cortex 2020; 31:1827-1836. [PMID: 33207366 PMCID: PMC7869081 DOI: 10.1093/cercor/bhaa340] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2020] [Revised: 10/15/2020] [Accepted: 10/15/2020] [Indexed: 12/13/2022] Open
Abstract
Following birth, infants must immediately process and rapidly adapt to the array of unknown sensory experiences associated with their new ex-utero environment. However, although it is known that unimodal stimuli induce activity in the corresponding primary sensory cortices of the newborn brain, it is unclear how multimodal stimuli are processed and integrated across modalities. The latter is essential for learning and understanding environmental contingencies through encoding relationships between sensory experiences; and ultimately likely subserves development of life-long skills such as speech and language. Here, for the first time, we map the intracerebral processing that underlies auditory-sensorimotor classical conditioning in a group of 13 neonates (median gestational age at birth: 38 weeks + 4 days, range: 32 weeks + 2 days to 41 weeks + 6 days; median postmenstrual age at scan: 40 weeks + 5 days, range: 38 weeks + 3 days to 42 weeks + 1 day) with blood-oxygen-level-dependent (BOLD) functional magnetic resonance imaging (MRI) and magnetic resonance (MR) compatible robotics. We demonstrate that classical conditioning can induce crossmodal changes within putative unimodal sensory cortex even in the absence of its archetypal substrate. Our results also suggest that multimodal learning is associated with network-wide activity within the conditioned neural system. These findings suggest that in early life, external multimodal sensory stimulation and integration shapes activity in the developing cortex and may influence its associated functional network architecture.
|
24
|
Event-related functional MRI of awake behaving pigeons at 7T. Nat Commun 2020; 11:4715. [PMID: 32948772 PMCID: PMC7501281 DOI: 10.1038/s41467-020-18437-1] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2020] [Accepted: 08/20/2020] [Indexed: 11/08/2022] Open
Abstract
Animal fMRI is a powerful method for understanding the neural mechanisms of cognition, but it remains a major challenge to scan actively participating small animals under low-stress conditions. Here, we present an event-related functional MRI platform in awake pigeons using single-shot RARE fMRI to investigate the neural fundaments of visually guided decision making. We established a head-fixated Go/NoGo paradigm, which the animals quickly learned under low-stress conditions. The animals were motivated by water reward, and behavior was assessed by logging mandibulations during the fMRI experiment, with close to zero motion artifacts over hundreds of repeats. To achieve optimal results, we characterized the species-specific hemodynamic response function. As a proof of principle, we ran a color discrimination task and discovered differential neural networks for the Go, NoGo, and response-execution phases. Our findings open the door to visualizing the neural fundaments of perceptual and cognitive functions in birds, a vertebrate class of which some clades are cognitively on par with primates.
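Why a species-specific hemodynamic response function matters can be made concrete with the standard human double-gamma HRF model (the default human parameters shown below are the common SPM-style values and are illustrative only; the pigeon HRF characterized in the paper differs from these defaults):

```python
import math

def double_gamma_hrf(t, peak=6.0, undershoot=16.0, ratio=6.0):
    """Canonical double-gamma HRF: a positive gamma peaking around 6 s
    minus a scaled undershoot gamma around 16 s (human default shapes)."""
    def gamma_pdf(t, shape):
        # Gamma density with unit scale; zero for non-positive times.
        if t <= 0:
            return 0.0
        return t ** (shape - 1) * math.exp(-t) / math.gamma(shape)
    return gamma_pdf(t, peak) - gamma_pdf(t, undershoot) / ratio

# Sample the model once per second over 30 s post-stimulus.
hrf = [double_gamma_hrf(t) for t in range(31)]
```

Fitting event-related responses with the wrong peak and undershoot timing misestimates activation amplitude, which is why the authors re-estimated these parameters for the pigeon before running the color discrimination analysis.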
|
25
|
Focused Multisensory Anatomy Observation and Drawing for Enhancing Social Learning and Three-Dimensional Spatial Understanding. ANATOMICAL SCIENCES EDUCATION 2020; 13:488-503. [PMID: 31705741 DOI: 10.1002/ase.1929] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/03/2019] [Revised: 10/08/2019] [Accepted: 11/03/2019] [Indexed: 06/10/2023]
Abstract
The concept that multisensory observation and drawing can be effective for enhancing anatomy learning is supported by pedagogic research and theory, and theories of drawing. A haptico-visual observation and drawing (HVOD) process has been previously introduced to support understanding of the three-dimensional (3D) spatial form of anatomical structures. The HVOD process involves exploration of 3D anatomy with the combined use of touch and sight, and the simultaneous act of making graphite marks on paper which correspond to the anatomy under observation. Findings from a previous study suggest that HVOD can increase perceptual understanding of anatomy through memorization and recall of the 3D form of observed structures. Here, additional pedagogic and cognitive underpinnings are presented to further demonstrate how and why HVOD can be effective for anatomy learning. Delivery of a HVOD workshop is described as a detailed guide for instructors, and themes arising from a phenomenological study of educator experiences of the HVOD process are presented. Findings indicate that HVOD can provide an engaging approach for the spatial exploration of anatomy within a supportive social learning environment, but also requires modification for effective curricular integration. Consequently, based on the most effective research-informed, theoretical, and logistical elements of art-based approaches in anatomy learning, including the framework provided by the observe-reflect-draw-edit-repeat (ORDER) method, an optimized "ORDER Touch" observation and drawing process has been developed. This is with the aim of providing a widely accessible resource for supporting social learning and 3D spatial understanding of anatomy, in addition to improving specific anatomical knowledge.
|
26
|
A Simple and Compact MR-Compatible Electromagnetic Vibrotactile Stimulator. Front Neurosci 2020; 13:1403. [PMID: 32009884 PMCID: PMC6978794 DOI: 10.3389/fnins.2019.01403] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2019] [Accepted: 12/12/2019] [Indexed: 12/26/2022] Open
Abstract
We have developed a low-cost electromagnetic vibrotactile stimulator that uses the magnetic field of an MR scanner as a permanent magnet to power a vibrating motor. A simple variable-current power supply is controlled by software through a USB data acquisition controller. In our study, the function of the novel stimulator was verified in a vibration frequency discrimination working memory task, in which various ranges of frequencies and amplitudes were delivered inside the MRI scanner. Furthermore, our functional MRI study revealed activation of the primary and secondary somatosensory cortices during the perception of tactile stimulation. The newly designed electromagnetic vibrotactile stimulator is therefore capable of generating tactile stimuli at various frequencies and represents a powerful and useful tool for studying somatosensory functions with functional MRI.
|
27
|
Touch improves visual discrimination of object features in capuchin monkeys (Sapajus spp.). Behav Processes 2020; 172:104044. [PMID: 31954810 DOI: 10.1016/j.beproc.2020.104044] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/10/2019] [Revised: 01/03/2020] [Accepted: 01/13/2020] [Indexed: 11/25/2022]
Abstract
Primates perceive many object features through vision and touch. To date, little is known about how the synergy of these two sensory modalities contributes to enhanced object recognition. Here, we investigated in capuchin monkeys (N = 12) whether manipulating objects and retaining tactile information enhanced visual recognition of geometric object properties at different scales. Capuchins were trained to visually select the rewarded one of two objects differing in size, shape (larger scale), or surface structure (smaller scale). Objects were explored under two experimental conditions: the Sight condition prevented capuchins from touching the chosen object; the Sight and Touch condition allowed them to touch the selected object. Our results indicated that tactile information increased the capuchins' learning speed for visual discrimination of object features. Moreover, the capuchins' learning speed was higher for both size and shape discrimination than for surface discrimination, regardless of the availability of tactile input. Overall, our data demonstrate that acquiring tactile information about object features was advantageous for the capuchins and allowed them to reach high levels of visual accuracy faster. This suggests that information from touch potentiates object recognition in the visual modality.
|
28
|
Neural Correlates Underlying the Precision of Visual Working Memory. Neuroscience 2020; 425:301-311. [PMID: 31812661 DOI: 10.1016/j.neuroscience.2019.11.037] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2019] [Revised: 11/22/2019] [Accepted: 11/25/2019] [Indexed: 01/24/2023]
Abstract
The neural mechanisms associated with the limited capacity of working memory (WM) have long been studied, but it is still unclear which neural regions are associated with the precision of visual WM. Here, an orientation recall task for estimating the trial-wise precision of visual WM was performed and then repeated two weeks later in an fMRI scanner. Results showed that activity in frontal and parietal regions during WM maintenance scaled with WM load, but not with the precision of WM (i.e., recall error in radians). Conversely, activity in the lateral occipital complex (LOC) during WM maintenance was not affected by memory load but instead correlated with WM precision on a trial-by-trial basis. Moreover, activity in LOC also correlated with each participant's WM precision in a separate behavioral experiment. Interestingly, a region within the prefrontal cortex, the inferior frontal junction (IFJ), exhibited greater functional connectivity with LOC as WM load increased. Together, our findings provide unique evidence that the LOC supports visual WM precision, while communication between the IFJ and LOC varies with WM load demands. These results suggest the intriguing possibility that distinct neural mechanisms are associated with the general content (load) and detailed information (precision) of WM.
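Recall error in radians and a trial-wise precision summary can be computed as in the sketch below (one common convention from delayed-estimation studies; the paper's exact definition may differ, and a full 2-pi circular space is assumed here for simplicity):

```python
import math

def recall_error(reported, target):
    """Signed angular recall error in radians, wrapped to (-pi, pi]."""
    err = (reported - target) % (2 * math.pi)
    return err - 2 * math.pi if err > math.pi else err

def precision(errors):
    """Precision as the reciprocal of the circular standard deviation,
    a common summary statistic for delayed-estimation recall errors."""
    c = sum(math.cos(e) for e in errors) / len(errors)
    s = sum(math.sin(e) for e in errors) / len(errors)
    r = math.hypot(c, s)                        # mean resultant length
    return 1.0 / math.sqrt(-2.0 * math.log(r))  # 1 / circular SD
```

Wrapping the error first matters: a report of -3 rad against a target of +3 rad is only ~0.28 rad off around the circle, not 6 rad, and precision computed on unwrapped errors would be badly underestimated.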
|
29
|
The Causal Role of the Prefrontal Cortex and Somatosensory Cortex in Tactile Working Memory. Cereb Cortex 2019; 28:3468-3477. [PMID: 28968894 DOI: 10.1093/cercor/bhx213] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/16/2017] [Indexed: 12/31/2022] Open
Abstract
In the present study, we searched for causal evidence linking activity in the bilateral primary somatosensory cortex (SI), posterior parietal cortex (PPC), and prefrontal cortex (PFC) with behavioral performance in vibrotactile working memory. Participants performed a vibrotactile delayed matching-to-sample task while single-pulse transcranial magnetic stimulation (sp-TMS) was applied over these cortical areas at 100, 200, 300, 600, 1600, and 1900 ms after the onset of vibrotactile stimulation (200 ms duration). In our experiments, sp-TMS over the contralateral SI in the early delay (100 and 200 ms) deteriorated the accuracy of task performance, and sp-TMS over the ipsilateral SI in the late delay (1600 and 1900 ms) induced similar deterioration. Furthermore, the deteriorating effects caused by sp-TMS over the contralateral DLPFC at the late maintenance stage (1600 ms) correlated with those caused by sp-TMS over the ipsilateral SI, indicating that information retained in the ipsilateral SI during the late delay may be associated with the DLPFC. Taken together, these results suggest that both the contralateral and ipsilateral SI are involved in tactile WM, and that the contralateral DLPFC bridges the contralateral and ipsilateral SI for goal-directed action.
|
30
|
Using neural response properties to draw the distinction between modal and amodal representations. PHILOSOPHICAL PSYCHOLOGY 2019. [DOI: 10.1080/09515089.2018.1563677] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
|
31
|
Abstract
The details of auditory responses at the subthreshold level in the rodent primary somatosensory cortex, the barrel cortex, have not been studied extensively, although several phenomenological reports have been published. Multisensory features may act as neuronal representations of links between inputs from one sensory modality and other sensory modalities. Here, we examined basic multisensory postsynaptic responses in the rodent barrel cortex using in vivo whole-cell recordings of neurons. We observed robust responses to acoustic stimuli in most barrel cortex neurons. Acoustically evoked responses were mediated by hearing and reached approximately 60% of the postsynaptic response amplitude elicited by strong somatosensory stimuli. Compared to tactile stimuli, auditory stimuli evoked postsynaptic potentials with a longer latency and longer duration. Specifically, auditory stimuli appeared to trigger "up states" in barrel cortex neurons, episodes associated with membrane depolarization and increased synaptic activity. Taken together, our data suggest that barrel cortex neurons have multisensory properties, with distinct synaptic mechanisms underlying tactile and non-tactile responses.
|
32
|
Cortical dynamics underpinning the self-other distinction of touch: A TMS-EEG study. Neuroimage 2018; 178:475-484. [DOI: 10.1016/j.neuroimage.2018.05.078] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2018] [Revised: 05/09/2018] [Accepted: 05/31/2018] [Indexed: 01/10/2023] Open
|
33
|
Neural Correlates of Feedback Processing in Visuo-Tactile Crossmodal Paired-Associate Learning. Front Hum Neurosci 2018; 12:266. [PMID: 30018542 PMCID: PMC6037861 DOI: 10.3389/fnhum.2018.00266] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2018] [Accepted: 06/08/2018] [Indexed: 11/13/2022] Open
Abstract
Previous studies have examined the neural correlates for crossmodal paired-associate (PA) memory and the temporal dynamics of its formation. However, the neural dynamics for feedback processing of crossmodal PA learning remain unclear. To examine this process, we recorded event-related scalp electrical potentials for PA learning of unimodal visual-visual pairs and crossmodal visual-tactile pairs when participants performed unimodal and crossmodal tasks. We examined event-related potentials (ERPs) after the onset of feedback in the tasks for three effects: feedback type (positive feedback vs. negative feedback), learning (as the learning progressed) and the task modality (crossmodal vs. unimodal). The results were as follows: (1) feedback type: the amplitude of P300 decreased with incorrect trials and the P400/N400 complex was only present in incorrect trials; (2) learning: progressive positive voltage shifts in frontal recording sites and negative voltage shifts in central and posterior recording sites were identified as learning proceeded; and (3) task modality: compared with the unimodal PA learning task, positive voltage shifts in frontal sites and negative voltage shifts in posterior sites were found in the crossmodal PA learning task. To sum up, these results shed light on cortical excitability related to feedback processing of crossmodal PA learning.
|
34
|
Bottom-up and top-down modulation of multisensory integration. Curr Opin Neurobiol 2018; 52:115-122. [PMID: 29778970 DOI: 10.1016/j.conb.2018.05.002] [Citation(s) in RCA: 53] [Impact Index Per Article: 8.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2018] [Accepted: 05/03/2018] [Indexed: 11/23/2022]
Abstract
Sensory perception in the real world requires proper integration of inputs from different modalities. The process of multisensory integration is not uniform: it varies from individual to individual and changes with the animal's behavioral state. What factors affect multisensory integration? How does the mammalian brain reconstruct a multisensory world in different states? Here, we summarize recent findings on bottom-up and top-down factors that can modulate sensory processing and multisensory integration. We discuss cortical circuits responsible for the modulation of multisensory processing, based on recent rodent studies. We suggest that multisensory information is not a simple, fixed signal in the brain; multisensory processing is dynamically modulated in the mammalian brain and leads to a unique and subjective experience of perception.
|
35
|
Dorsolateral prefrontal cortex bridges bilateral primary somatosensory cortices during cross-modal working memory. Behav Brain Res 2018; 350:116-121. [PMID: 29727709 DOI: 10.1016/j.bbr.2018.04.053] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2017] [Revised: 04/29/2018] [Accepted: 04/30/2018] [Indexed: 10/17/2022]
Abstract
Neural activity in the dorsolateral prefrontal cortex (DLPFC) has been suggested to integrate information from distinct sensory areas. However, how the DLPFC interacts with the bilateral primary somatosensory cortices (SIs) in tactile-visual cross-modal working memory has not yet been established. In the present study, we applied single-pulse transcranial magnetic stimulation (sp-TMS) over the contralateral DLPFC and the bilateral SIs of human participants at various time points while they performed a tactile-visual delayed matching-to-sample task with a 2-second delay. sp-TMS over the contralateral DLPFC or the contralateral SI at either the sensory encoding stage [i.e., 100 ms after the onset of a vibrotactile sample stimulus (200 ms duration)] or the early maintenance stage (i.e., 300 ms after the onset) significantly impaired the accuracy of task performance; sp-TMS over the contralateral DLPFC or the ipsilateral SI at the late maintenance stage (1600 and 1900 ms) also significantly disrupted performance. Furthermore, at 300 ms after the onset of the vibrotactile sample stimulus, there was a significant correlation between the deteriorating effects of sp-TMS over the contralateral SI and over the contralateral DLPFC. These results imply that the DLPFC and the bilateral SIs play causal roles at distinct stages of cross-modal working memory, with the contralateral DLPFC communicating with the contralateral SI in the early delay and cooperating with the ipsilateral SI in the late delay.
|
36
|
|
37
|
Functional MRI Responses to Passive, Active, and Observed Touch in Somatosensory and Insular Cortices of the Macaque Monkey. J Neurosci 2018. [PMID: 29540550 DOI: 10.1523/jneurosci.1587-17.2018] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/17/2023] Open
Abstract
Neurophysiological data obtained in primates suggests that merely observing others' actions can modulate activity in the observer's motor cortices. In humans, it has been suggested that these multimodal vicarious responses extend well beyond the motor cortices, including somatosensory and insular brain regions, which seem to yield vicarious responses when witnessing others' actions, sensations, or emotions (Gazzola and Keysers, 2009). Despite the wealth of data with respect to shared action responses in the monkey motor system, whether the somatosensory and insular cortices also yield vicarious responses during observation of touch remains largely unknown. Using independent tactile and motor fMRI localizers, we first mapped the hand representations of two male monkeys' primary (SI) and secondary (SII) somatosensory cortices. In two subsequent visual experiments, we examined fMRI brain responses to (1) observing a conspecific's hand being touched or (2) observing a human hand grasping or mere touching an object or another human hand. Whereas functionally defined "tactile SI" and "tactile SII" showed little involvement in representing observed touch, vicarious responses for touch were found in parietal area PFG, consistent with recent observations in humans (Chan and Baker, 2015). Interestingly, a more anterior portion of SII, and posterior insular cortex, both of which responded when monkeys performed active grasping movements, also yielded visual responses during different instances of touch observation.SIGNIFICANCE STATEMENT Common coding of one's own and others' actions, sensations, and emotions seems to be widespread in the brain. Although it is currently unclear to what extent human somatosensory cortices yield vicarious responses when observing touch, even less is known about the presence of similar vicarious responses in monkey somatosensory cortex. 
We therefore localized monkey somatosensory hand representations using fMRI and investigated whether these regions yield vicarious responses while observing various instances of touch. Whereas "tactile SI and SII" showed no responses during touch observation, a more anterior portion of SII, in addition to area PFG and the posterior insular cortex, all of which responded during monkeys' own grasping movements, yielded vicarious responses during observed touch.
Collapse
|
38
|
Overlapping frontoparietal networks for tactile and visual parametric working memory representations. Neuroimage 2018; 166:325-334. [DOI: 10.1016/j.neuroimage.2017.10.059] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2017] [Revised: 10/17/2017] [Accepted: 10/26/2017] [Indexed: 11/24/2022] Open
|
39
|
Visually-Driven Maps in Area 3b. J Neurosci 2018; 38:1295-1310. [PMID: 29301873 DOI: 10.1523/jneurosci.0491-17.2017] [Citation(s) in RCA: 30] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2017] [Revised: 11/28/2017] [Accepted: 12/01/2017] [Indexed: 01/22/2023] Open
Abstract
Sensory perception relies on the precise neuronal encoding of modality-specific environmental features in primary sensory cortices. Some studies have reported the penetration of signals from other modalities even into early sensory areas. So far, no comprehensive account of maps induced by "foreign sources" exists. We addressed this question using surface-based topographic mapping techniques applied to ultra-high resolution fMRI neuroimaging data, measured in female participants. We show that fine-grained finger maps in human primary somatosensory cortex, area 3b, are somatotopically activated not only during tactile mechanical stimulation, but also when viewing the same fingers being touched. Visually-induced maps were weak in amplitude, but overlapped with the stronger tactile maps tangential to the cortical sheet when finger touches were observed in both first- and third-person perspectives. However, visually-induced maps did not overlap tactile maps when the observed fingers were only approached by an object but not actually touched. Our data provide evidence that "foreign source maps" in early sensory cortices are present in the healthy human brain, that their arrangement is precise, and that their induction is feature-selective. The computations required to generate such specific responses suggest that counterflow (feedback) processing may be much more spatially specific than has often been assumed.
SIGNIFICANCE STATEMENT Using ultra-high field fMRI, we provide empirical evidence that viewing touches activates topographically aligned single finger maps in human primary somatosensory cortical area 3b. This shows that "foreign source maps" in early sensory cortices are topographic, precise, and feature-selective in healthy human participants with intact sensory pathways.
Collapse
|
40
|
Abstract
The sense of taste is a key component of the sensory machinery, enabling both the evaluation of the safety and the formation of associations regarding the nutritional value of ingestible substances. Indicative of the salience of the modality, taste conditioning can be achieved in rodents upon a single pairing of a tastant with a chemical stimulus inducing malaise. This robust associative learning paradigm has been heavily linked with activity within the insular cortex (IC), among other regions such as the amygdala and medial prefrontal cortex. A number of studies have demonstrated taste memory formation to be dependent on protein synthesis at the IC and to correlate with the induction of signaling cascades involved in synaptic plasticity. Taste learning has been shown to require the differential involvement of dopaminergic, GABAergic, glutamatergic, and muscarinic neurotransmission across an extended taste learning circuit. The subsequent activation of downstream protein kinases (ERK, CaMKII), transcription factors (CREB, Elk-1), and immediate early genes (c-fos, Arc) has been implicated in the regulation of the different phases of taste learning. This review discusses the relevant neurotransmission, molecular signaling pathways, and genetic markers involved in novel and aversive taste learning, with a particular focus on the IC. Imaging and other studies in humans have implicated the IC in the pathophysiology of a number of cognitive disorders. We conclude that the IC participates in circuit-wide computations that modulate the interoception and encoding of sensory information, as well as the formation of subjective internal representations that control the expression of motivated behaviors.
Collapse
|
41
|
Content-Specific Codes of Parametric Vibrotactile Working Memory in Humans. J Neurosci 2017; 37:9771-9777. [PMID: 28893928 DOI: 10.1523/jneurosci.1167-17.2017] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2017] [Revised: 08/28/2017] [Accepted: 08/30/2017] [Indexed: 01/03/2023] Open
Abstract
Working memory (WM) studies have been essential for understanding how the brain flexibly handles mentally represented information in the absence of sensory stimulation. A seminal finding in monkey research is that neurons in the prefrontal cortex (PFC) retain stimulus-specific information when vibrotactile frequencies are memorized. A direct mapping between monkey studies and human research is still controversial. Although oscillatory signatures, in terms of frequency-dependent parametric beta-band modulation, have been observed recently in human EEG studies, the content specificity of these representations in terms of multivariate pattern analysis has not yet been shown. Here, we used fMRI in combination with multivariate classification techniques to determine which brain regions retain information during WM. In a retro-cue delayed-match-to-sample task, human subjects memorized the frequency of vibrotactile stimulation over a 12 s delay phase. Using an assumption-free whole-brain searchlight approach, we tested with support vector regression which brain regions exhibited multivariate parametric WM codes of the maintained frequencies during the WM delay. Interestingly, our analysis revealed an overlap with regions previously identified in monkeys, composed of bilateral premotor cortices, the supplementary motor area, and the right inferior frontal gyrus as part of the PFC. Therefore, our results establish a link between the WM codes found in monkeys and those in humans and emphasize the importance of the PFC for information maintenance during WM also in humans.
SIGNIFICANCE STATEMENT Working memory (WM) research in monkeys has identified a network of regions, including prefrontal regions, that code stimulus-specific information when vibrotactile frequencies are memorized. Here, we performed an fMRI study during which human subjects had to memorize vibratory frequencies in a task paralleling previous monkey research.
Using an assumption-free, whole-brain searchlight decoding approach, we identified for the first time regions in the human brain that exhibit multivariate patterns of activity to code the vibratory frequency parametrically during WM. Our results parallel previous monkey findings and show that the supplementary motor area, premotor, and the right prefrontal cortex are involved in vibrotactile WM coding in humans.
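The delay-phase decoding described above can be sketched in miniature. The sketch below is a simplified stand-in, not the study's pipeline: synthetic "voxel" patterns replace fMRI data, ordinary least squares replaces the support vector regression used in the study, and all dimensions, ranges, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one searchlight: 60 trials of a 20-"voxel" delay-phase
# pattern whose amplitude scales parametrically with the memorized
# vibration frequency.
n_trials, n_voxels = 60, 20
freqs = rng.uniform(16, 28, n_trials)        # memorized frequencies (Hz)
sensitivity = rng.normal(0, 1, n_voxels)     # per-voxel frequency tuning
patterns = np.outer(freqs, sensitivity) + rng.normal(0, 4, (n_trials, n_voxels))

# Fit a linear decoder on the first 40 trials, predict the held-out 20.
X_train = np.column_stack([patterns[:40], np.ones(40)])   # add intercept
beta, *_ = np.linalg.lstsq(X_train, freqs[:40], rcond=None)
X_test = np.column_stack([patterns[40:], np.ones(20)])
predicted = X_test @ beta

# A parametric WM code is present if predictions track the true frequencies
# on held-out trials.
r = np.corrcoef(predicted, freqs[40:])[0, 1]
print(round(r, 2))
```

In the actual analysis this fit-and-test step is repeated for every searchlight sphere across the brain, and regions whose held-out predictions track the memorized frequency are taken to carry a parametric WM code.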
Collapse
|
42
|
Neural correlates of visuo-tactile crossmodal paired-associate learning and memory in humans. Neuroscience 2017; 362:181-195. [PMID: 28843996 DOI: 10.1016/j.neuroscience.2017.08.035] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/24/2017] [Revised: 08/16/2017] [Accepted: 08/17/2017] [Indexed: 11/16/2022]
Abstract
Studies have indicated that a cortical sensory system is capable of processing information from different sensory modalities. However, it remains unclear when and how a cortical system integrates and retains information across sensory modalities during learning. Here we investigated the neural dynamics underlying crossmodal associations and memory by recording event-related potentials (ERPs) when human participants performed visuo-tactile (crossmodal) and visuo-visual (unimodal) paired-associate (PA) learning tasks. In a trial of the tasks, the participants were required to explore and learn the relationship (paired or non-paired) between two successive stimuli. EEG recordings revealed dynamic ERP changes during participants' learning of paired associations. Specifically, (1) the frontal N400 component showed learning-related changes in both unimodal and crossmodal tasks but did not show any significant difference between these two tasks, while the central P400 displayed both learning changes and task differences; (2) a late posterior negative slow wave (LPN) showed the learning effect only in the crossmodal task; (3) alpha-band oscillations appeared to be involved in crossmodal working memory. Additional behavioral experiments suggested that these ERP components were not relevant to the participants' familiarity with stimuli per se. Further, when the delay between the first and second stimulus in the crossmodal task was shortened (from 1300 ms to 400 ms or 200 ms), participants' task performance declined accordingly. Taken together, these results provide insights into the cortical plasticity (induced by PA learning) of neural networks involved in crossmodal associations in working memory.
Collapse
|
43
|
Multisensory Bayesian Inference Depends on Synapse Maturation during Training: Theoretical Analysis and Neural Modeling Implementation. Neural Comput 2017; 29:735-782. [DOI: 10.1162/neco_a_00935] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Recent theoretical and experimental studies suggest that in multisensory conditions, the brain performs a near-optimal Bayesian estimate of external events, giving more weight to the more reliable stimuli. However, the neural mechanisms responsible for this behavior, and its progressive maturation in a multisensory environment, are still insufficiently understood. The aim of this letter is to analyze this problem with a neural network model of audiovisual integration, based on probabilistic population coding—the idea that a population of neurons can encode probability functions to perform Bayesian inference. The model consists of two chains of unisensory neurons (auditory and visual) topologically organized. They receive the corresponding input through a plastic receptive field and reciprocally exchange plastic cross-modal synapses, which encode the spatial co-occurrence of visual-auditory inputs. A third chain of multisensory neurons performs a simple sum of auditory and visual excitations. The work includes a theoretical part and a computer simulation study. We show how a simple rule for synapse learning (consisting of Hebbian reinforcement and a decay term) can be used during training to shrink the receptive fields and encode the unisensory likelihood functions. Hence, after training, each unisensory area realizes a maximum likelihood estimate of stimulus position (auditory or visual). In cross-modal conditions, the same learning rule can encode information on prior probability into the cross-modal synapses. Computer simulations confirm the theoretical results and show that the proposed network can realize a maximum likelihood estimate of auditory (or visual) positions in unimodal conditions and a Bayesian estimate, with moderate deviations from optimality, in cross-modal conditions. 
Furthermore, the model explains the ventriloquism illusion and, looking at the activity in the multimodal neurons, explains the automatic reweighting of auditory and visual inputs on a trial-by-trial basis, according to the reliability of the individual cues.
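The learning rule described above (Hebbian reinforcement plus a decay term) can be illustrated with a minimal sketch of receptive-field shrinkage. Everything here is an assumption for illustration: a single neuron, a one-dimensional space of 21 positions, arbitrary learning rates, and a hard weight cap standing in for the model's saturation.

```python
import numpy as np

# One neuron with a broad initial receptive field over 21 spatial positions;
# a narrow input is repeatedly presented at the field's centre (position 10).
# Rule: dw_i = eta * pre_i * post - gamma * w_i (Hebb + decay).
n_pos = 21
positions = np.arange(n_pos)
w = np.exp(-((positions - 10) ** 2) / (2 * 6.0 ** 2))   # broad initial field

def rms_width(weights):
    """Root-mean-square spread of the receptive field over position."""
    mu = (weights * positions).sum() / weights.sum()
    return float(np.sqrt((weights * (positions - mu) ** 2).sum() / weights.sum()))

eta, gamma = 0.05, 0.02
width_before = rms_width(w)
pre = np.exp(-((positions - 10) ** 2) / (2 * 2.0 ** 2))  # narrow, repeated input
for _ in range(500):
    post = float(w @ pre)                 # neuron's response to the input
    w += eta * pre * post - gamma * w     # Hebbian reinforcement + decay
    w = np.clip(w, 0.0, 5.0)              # keep weights bounded

width_after = rms_width(w)
print(round(width_before, 2), round(width_after, 2))
```

Weights at positions the input never drives decay away, while driven weights are reinforced, so the field narrows toward the input's true spread, which is the mechanism the letter uses to encode unisensory likelihood functions.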
Collapse
|
44
|
Multisensory Interactions Influence Neuronal Spike Train Dynamics in the Posterior Parietal Cortex. PLoS One 2016; 11:e0166786. [PMID: 28033334 PMCID: PMC5199055 DOI: 10.1371/journal.pone.0166786] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2016] [Accepted: 11/03/2016] [Indexed: 12/11/2022] Open
Abstract
Although significant progress has been made in understanding multisensory interactions at the behavioral level, their underlying neural mechanisms remain relatively poorly understood in cortical areas, particularly during the control of action. In recent experiments where animals reached to and actively maintained their arm position at multiple spatial locations while receiving either proprioceptive or visual-proprioceptive position feedback, multisensory interactions were shown to be associated with reduced spiking (i.e. subadditivity) as well as reduced intra-trial and across-trial spiking variability in the superior parietal lobule (SPL). To further explore the nature of such interaction-induced changes in spiking variability we quantified the spike train dynamics of 231 of these neurons. Neurons were classified as Poisson, bursty, refractory, or oscillatory (in the 13–30 Hz “beta-band”) based on their spike train power spectra and autocorrelograms. No neurons were classified as Poisson-like in either the proprioceptive or visual-proprioceptive conditions. Instead, oscillatory spiking was most commonly observed with many neurons exhibiting these oscillations under only one set of feedback conditions. The results suggest that the SPL may belong to a putative beta-synchronized network for arm position maintenance and that position estimation may be subserved by different subsets of neurons within this network depending on available sensory information. In addition, the nature of the observed spiking variability suggests that models of multisensory interactions in the SPL should account for both Poisson-like and non-Poisson variability.
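The spike-train classes above were assigned from power spectra and autocorrelograms; a much simpler stand-in for separating the same classes is the coefficient of variation (CV) of inter-spike intervals, which is near 1 for Poisson firing, below 1 for refractory trains, and above 1 for bursty trains. The synthetic trains and their parameters below are illustrative assumptions, not the study's data or method.

```python
import random
import statistics

random.seed(3)

def isi_cv(isis):
    """Coefficient of variation of inter-spike intervals: roughly 1 for
    Poisson firing, below 1 for refractory trains, above 1 for bursty ones."""
    return statistics.pstdev(isis) / statistics.mean(isis)

# Three synthetic trains, each specified by its inter-spike intervals (ms).
poisson = [random.expovariate(1 / 50) for _ in range(2000)]          # ~20 Hz
refractory = [15 + random.expovariate(1 / 35) for _ in range(2000)]  # dead time
bursty = [random.expovariate(1 / 5) if random.random() < 0.7         # in-burst
          else random.expovariate(1 / 150)                           # between bursts
          for _ in range(2000)]

for name, isis in (("poisson", poisson), ("refractory", refractory),
                   ("bursty", bursty)):
    print(name, round(isi_cv(isis), 2))
```

Detecting the oscillatory (beta-band) class additionally requires a spectral peak at 13-30 Hz, which an ISI-only statistic cannot reveal; that is why the study examined spike-train power spectra as well.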
Collapse
|
45
|
Associative learning changes cross-modal representations in the gustatory cortex. eLife 2016; 5. [PMID: 27572258 PMCID: PMC5026467 DOI: 10.7554/elife.16420] [Citation(s) in RCA: 55] [Impact Index Per Article: 6.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/26/2016] [Accepted: 08/16/2016] [Indexed: 01/03/2023] Open
Abstract
A growing body of literature has demonstrated that primary sensory cortices are not exclusively unimodal, but can respond to stimuli of different sensory modalities. However, several questions concerning the neural representation of cross-modal stimuli remain open. Indeed, it is poorly understood if cross-modal stimuli evoke unique or overlapping representations in a primary sensory cortex and whether learning can modulate these representations. Here we recorded single unit responses to auditory, visual, somatosensory, and olfactory stimuli in the gustatory cortex (GC) of alert rats before and after associative learning. We found that, in untrained rats, the majority of GC neurons were modulated by a single modality. Upon learning, both prevalence of cross-modal responsive neurons and their breadth of tuning increased, leading to a greater overlap of representations. Altogether, our results show that the gustatory cortex represents cross-modal stimuli according to their sensory identity, and that learning changes the overlap of cross-modal representations. DOI:http://dx.doi.org/10.7554/eLife.16420.001
Imagine that you are waiting for a cappuccino at your favorite café. You hear the sound of the steamer, and shortly afterwards the barista calls your name and announces that your cappuccino is ready. As they hand it to you, you see the foam sprinkled with cocoa and the aroma of the cappuccino reaches your nose. You can almost taste it. When you finally take your first sip, the taste is hardly a surprise; it is just as your eyes and nose predicted. How does the brain deal with such a rich and multisensory experience? How does it learn to associate the sight and smell of a cappuccino with its taste? Specialized regions of the brain called associative areas were traditionally thought to perform this task. These areas receive inputs from every sensory system and can link information from these different sources together.
According to this view, the job of each individual sensory system is to pass along information relevant to one particular sense. More recent results, however, challenge this strict division of labor and suggest that individual sensory systems may be able to combine information from multiple senses. Thus the sights, sounds and odors associated with our cappuccino may also activate the area of the brain in charge of processing taste: the gustatory cortex. To investigate this possibility, Vincis and Fontanini set out to determine whether neurons in the gustatory cortex of rats can process stimuli belonging to senses other than taste. As predicted, neurons in the gustatory cortex did change their firing rates in response to odors, touch, sounds and light. However, more of the gustatory neurons responded to odors and touch than to sounds and light. In addition, of the four stimuli, the rats most easily learned to associate odors and touch with a sugary solution. This is consistent with the fact that rodents rely more upon their whiskers and their sense of smell to find food than they do their eyes and ears. Finally, learning to associate a stimulus other than taste with a sugary solution increased the number of neurons in the gustatory cortex that subsequently responded to other senses and changed their response properties. Further studies are now required to answer three questions. Why can some senses more effectively influence the activity of the gustatory cortex than others? Can gustatory neurons distinguish between different stimuli of the same type – different odors, for example? What are the neural pathways that convey multisensory information to the gustatory cortex? Answering these questions will help us to better understand how sensory systems link information from multiple senses. DOI:http://dx.doi.org/10.7554/eLife.16420.002
Collapse
|
46
|
A cross-modal feedback scheme for control of prosthetic grasp strength. J Rehabil Assist Technol Eng 2016; 3:2055668316663121. [PMID: 31186905 PMCID: PMC6453087 DOI: 10.1177/2055668316663121] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2015] [Accepted: 07/07/2016] [Indexed: 11/16/2022] Open
Abstract
Introduction: Given the lack of haptic feedback inherent in prosthetic devices, a natural and adaptable feedback scheme must be implemented. While multimodal feedback has proven successful in aiding dexterous performance, it can be mentally taxing on the individual. Conversely, cross-modal schemes relying on sensory substitution have proven to be equally effective in aiding task performance without cognitively burdening the user to the same degree.
Objectives: This experiment investigated the effectiveness of the cross-modal feedback scheme by using audio feedback to represent prosthetic grasping strength during dynamic control of a prosthetic hand.
Methods: A total of five individuals participated in two sets of experiments (four subjects in the first, one subject in the second). Participants were asked to control the grasping strength exerted by a prosthetic hand while using real-time audio feedback in order to reach up to three different levels of force within a trial set.
Results: The cross-modal feedback scheme successfully provided users with the robust ability to modulate grasping strength in real-time using only audio feedback.
Conclusion: Audio feedback effectively conveys haptic information to the user of a prosthetic hand. Retention of the training knowledge is evident and can be generalized to perform new (i.e. untrained) tasks.
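A minimal sketch of such a cross-modal substitution scheme, mapping grip force onto an audible pitch, might look as follows. The force range, pitch range, target levels, and tolerance below are illustrative assumptions, not values reported by the study.

```python
# Illustrative sketch of a cross-modal (force -> pitch) substitution scheme.
F_MAX = 40.0                       # assumed maximum measurable grip force (N)
TONE_LO, TONE_HI = 220.0, 880.0    # assumed pitch range for audio feedback (Hz)

def force_to_pitch(force_n):
    """Map grip force linearly onto the audible pitch range."""
    frac = min(max(force_n / F_MAX, 0.0), 1.0)   # clamp to [0, 1]
    return TONE_LO + frac * (TONE_HI - TONE_LO)

def target_reached(force_n, target_n, tolerance_n=2.0):
    """User's goal in each trial: hold grasp strength near a target level."""
    return abs(force_n - target_n) <= tolerance_n

# Example: three hypothetical target force levels, as in the trial sets above.
for target in (10.0, 20.0, 30.0):
    print(target, round(force_to_pitch(target), 1))
```

The user closes the loop: hearing the current pitch, they adjust their grasp command until the tone sits at the pitch they have learned to associate with the target force.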
Collapse
|
47
|
A gustocentric perspective to understanding primary sensory cortices. Curr Opin Neurobiol 2016; 40:118-124. [PMID: 27455038 DOI: 10.1016/j.conb.2016.06.008] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2016] [Revised: 06/08/2016] [Accepted: 06/09/2016] [Indexed: 12/27/2022]
Abstract
Most of the general principles used to explain sensory cortical function have been inferred from experiments performed on neocortical, primary sensory areas. Attempts to apply a neocortical view to the study of the gustatory cortex (GC) have provided only a limited understanding of this area. Failures to conform GC to classical neocortical principles have been implicitly interpreted as a demonstration of GC's uniqueness. Here we propose to take the opposite perspective, dismissing GC's uniqueness and using principles extracted from its study as a lens for looking at neocortical sensory function. In this review, we describe three significant findings related to gustatory cortical function and advocate their relevance for understanding neocortical sensory areas.
Collapse
|
48
|
Species-dependent role of crossmodal connectivity among the primary sensory cortices. Hear Res 2016; 343:83-91. [PMID: 27292113 DOI: 10.1016/j.heares.2016.05.014] [Citation(s) in RCA: 29] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/04/2016] [Revised: 05/24/2016] [Accepted: 05/27/2016] [Indexed: 11/19/2022]
Abstract
When a major sense is lost, crossmodal plasticity substitutes functional processing from the remaining, intact senses. Recent studies of deafness-induced crossmodal plasticity in different subregions of auditory cortex indicate that the phenomenon is largely based on the "unmasking" of existing inputs. However, there is not yet a consensus on the sources or effects of crossmodal inputs to primary sensory cortical areas. In the present review, a rigorous re-examination of the experimental literature indicates that connections between different primary sensory cortices consistently occur in rodents, while primary-to-primary projections are absent or inconsistent in non-rodents such as cats and monkeys. These observations suggest that crossmodal plasticity involving primary sensory areas is likely to exhibit species-specific distinctions.
Collapse
|
49
|
The Dynamic Multisensory Engram: Neural Circuitry Underlying Crossmodal Object Recognition in Rats Changes with the Nature of Object Experience. J Neurosci 2016; 36:1273-89. [PMID: 26818515 DOI: 10.1523/jneurosci.3043-15.2016] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/04/2023] Open
Abstract
Rats, humans, and monkeys demonstrate robust crossmodal object recognition (CMOR), identifying objects across sensory modalities. We have shown that rats' performance of a spontaneous tactile-to-visual CMOR task requires functional integration of perirhinal (PRh) and posterior parietal (PPC) cortices, which seemingly provide visual and tactile object feature processing, respectively. However, research with primates has suggested that PRh is sufficient for multisensory object representation. We tested this hypothesis in rats using a modification of the CMOR task in which multimodal preexposure to the to-be-remembered objects significantly facilitates performance. In the original CMOR task, with no preexposure, reversible lesions of PRh or PPC produced patterns of impairment consistent with modality-specific contributions. Conversely, in the CMOR task with preexposure, PPC lesions had no effect, whereas PRh involvement was robust, proving necessary for phases of the task that did not require PRh activity when rats did not have preexposure; this pattern was supported by results from c-fos imaging. We suggest that multimodal preexposure alters the circuitry responsible for object recognition, in this case obviating the need for PPC contributions and expanding PRh involvement, consistent with the polymodal nature of PRh connections and results from primates indicating a key role for PRh in multisensory object representation. These findings have significant implications for our understanding of multisensory information processing, suggesting that the nature of an individual's past experience with an object strongly determines the brain circuitry involved in representing that object's multisensory features in memory.
SIGNIFICANCE STATEMENT The ability to integrate information from multiple sensory modalities is crucial to the survival of organisms living in complex environments.
Appropriate responses to behaviorally relevant objects are informed by integration of multisensory object features. We used crossmodal object recognition tasks in rats to study the neurobiological basis of multisensory object representation. When rats had no prior exposure to the to-be-remembered objects, the spontaneous ability to recognize objects across sensory modalities relied on functional interaction between multiple cortical regions. However, prior multisensory exploration of the task-relevant objects remapped cortical contributions, negating the involvement of one region and significantly expanding the role of another. This finding emphasizes the dynamic nature of cortical representation of objects in relation to past experience.
Collapse
|
50
|
A Neural Parametric Code for Storing Information of More than One Sensory Modality in Working Memory. Neuron 2016; 89:54-62. [DOI: 10.1016/j.neuron.2015.11.026] [Citation(s) in RCA: 50] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2015] [Revised: 10/20/2015] [Accepted: 11/11/2015] [Indexed: 10/22/2022]
|