1
Multisensory Integration Underlies the Distinct Representation of Odor-Taste Mixtures in the Gustatory Cortex of Behaving Rats. J Neurosci 2024; 44:e0071242024. PMID: 38548337; PMCID: PMC11097261; DOI: 10.1523/jneurosci.0071-24.2024.
Abstract
The perception of food relies on the integration of olfactory and gustatory signals originating from the mouth. This multisensory process generates robust associations between odors and tastes, significantly influencing the perceptual judgment of flavors. However, the specific neural substrates underlying this integrative process remain unclear. Previous electrophysiological studies identified the gustatory cortex as a site of convergent olfactory and gustatory signals, but whether neurons represent multimodal odor-taste mixtures as distinct from their unimodal odor and taste components is unknown. To investigate this, we recorded single-unit activity in the gustatory cortex of behaving female rats during the intraoral delivery of individual odors, individual tastes, and odor-taste mixtures. Our results demonstrate that chemoselective neurons in the gustatory cortex are broadly responsive to intraoral chemosensory stimuli, exhibiting time-varying multiphasic changes in activity. In a subset of these chemoselective neurons, odor-taste mixtures elicit nonlinear cross-modal responses that distinguish them from their olfactory and gustatory components. These findings provide novel insights into multimodal chemosensory processing by the gustatory cortex, highlighting the distinct representation of unimodal and multimodal intraoral chemosensory signals. Overall, our findings suggest that olfactory and gustatory signals interact nonlinearly in the gustatory cortex to enhance the identity coding of both unimodal and multimodal chemosensory stimuli.
2
A primary sensory cortical interareal feedforward inhibitory circuit for tacto-visual integration. Nat Commun 2024; 15:3081. PMID: 38594279; PMCID: PMC11003985; DOI: 10.1038/s41467-024-47459-2.
Abstract
Tactile sensation and vision are often both utilized for the exploration of objects that are within reach, though it is not known whether or how these two distinct sensory systems combine such information. Here in mice, we used a combination of stereo photogrammetry for 3D reconstruction of the whisker array, brain-wide anatomical tracing and functional connectivity analysis to explore the possibility of tacto-visual convergence in sensory space and within the circuitry of the primary visual cortex (VISp). Strikingly, we find that stimulation of the contralateral whisker array suppresses visually evoked activity in a tacto-visual sub-region of VISp whose visual space representation closely overlaps with the whisker search space. This suppression is mediated by local fast-spiking interneurons that receive a direct cortico-cortical input predominantly from layer 6 neurons located in the posterior primary somatosensory barrel cortex (SSp-bfd). These data demonstrate functional convergence within and between two primary sensory cortical areas for multisensory object detection and recognition.
3
Mapping of facial and vocal processing in common marmosets with ultra-high field fMRI. Commun Biol 2024; 7:317. PMID: 38480875; PMCID: PMC10937914; DOI: 10.1038/s42003-024-06002-1.
Abstract
Primate communication relies on multimodal cues, such as vision and audition, to facilitate the exchange of intentions, enable social interactions, avoid predators, and foster group cohesion during daily activities. Understanding the integration of facial and vocal signals is pivotal to comprehend social interaction. In this study, we acquire whole-brain ultra-high field (9.4 T) fMRI data from awake marmosets (Callithrix jacchus) to explore brain responses to unimodal and combined facial and vocal stimuli. Our findings reveal that the multisensory condition not only intensifies activations in the occipito-temporal face patches and auditory voice patches but also engages a more extensive network that includes additional parietal, prefrontal and cingulate areas, compared to the summed responses of the unimodal conditions. By uncovering the neural network underlying multisensory audiovisual integration in marmosets, this study highlights the efficiency and adaptability of the marmoset brain in processing facial and vocal social signals, providing significant insights into primate social communication.
4
Heterosynaptic plasticity of the visuo-auditory projection requires cholecystokinin released from entorhinal cortex afferents. eLife 2024; 13:e83356. PMID: 38436304; PMCID: PMC10954309; DOI: 10.7554/elife.83356.
Abstract
The entorhinal cortex is involved in establishing enduring visuo-auditory associative memory in the neocortex. Here we explored the mechanisms underlying this synaptic plasticity related to projections from the visual and entorhinal cortices to the auditory cortex in mice using optogenetics of dual pathways. High-frequency laser stimulation (HFS laser) of the visuo-auditory projection did not induce long-term potentiation. However, after pairing with sound stimulus, the visuo-auditory inputs were potentiated following either infusion of cholecystokinin (CCK) or HFS laser of the entorhino-auditory CCK-expressing projection. Combining retrograde tracing and RNAscope in situ hybridization, we show that Cck expression is higher in entorhinal cortex neurons projecting to the auditory cortex than in those originating from the visual cortex. In the presence of CCK, potentiation in the neocortex occurred when the presynaptic input arrived 200 ms before postsynaptic firing, even after just five trials of pairing. Behaviorally, inactivation of the CCK+ projection from the entorhinal cortex to the auditory cortex blocked the formation of visuo-auditory associative memory. Our results indicate that neocortical visuo-auditory association is formed through heterosynaptic plasticity, which depends on release of CCK in the neocortex mostly from entorhinal afferents.
5
Neuronal Population Encoding of Identity in Primate Prefrontal Cortex. J Neurosci 2024; 44:e0703232023. PMID: 37963766; PMCID: PMC10860606; DOI: 10.1523/jneurosci.0703-23.2023.
Abstract
The ventrolateral prefrontal cortex (VLPFC) shows robust activation during the perception of faces and voices. However, little is known about what categorical features of social stimuli drive neural activity in this region. Since perception of identity and expression are critical social functions, we examined whether neural responses to naturalistic stimuli were driven by these two categorical features in the prefrontal cortex. We recorded single neurons in the VLPFC, while two male rhesus macaques (Macaca mulatta) viewed short audiovisual videos of unfamiliar conspecifics making expressions of aggressive, affiliative, and neutral valence. Of the 285 neurons responsive to the audiovisual stimuli, 111 neurons had a main effect (two-way ANOVA) of identity, expression, or their interaction in their stimulus-related firing rates; however, decoding of expression and identity using single-unit firing rates rendered poor accuracy. Interestingly, when decoding from pseudo-populations of recorded neurons, the accuracy for both expression and identity increased with population size, suggesting that the population transmitted information relevant to both variables. Principal components analysis of mean population activity across time revealed that population responses to the same identity followed similar trajectories in the response space, facilitating segregation from other identities. Our results suggest that identity is a critical feature of social stimuli that dictates the structure of population activity in the VLPFC, during the perception of vocalizations and their corresponding facial expressions. These findings enhance our understanding of the role of the VLPFC in social behavior.
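The population-size effect this abstract describes, where single-unit decoding is poor but pseudo-population accuracy grows with ensemble size, can be illustrated with a toy decoder. This is a sketch on synthetic firing rates; the neuron counts, tuning strengths, and noise levels are assumptions, not the study's data or analysis pipeline.

```python
# Toy illustration: decoding a binary "identity" from pseudo-populations of
# weakly tuned neurons. Accuracy rises with population size even though each
# neuron alone is barely informative. All parameters are hypothetical.
import random


def decode_accuracy(n_neurons, n_trials=200, seed=1):
    """Accuracy of a nearest-centroid readout discriminating two identities
    from a pseudo-population of n_neurons weakly tuned units."""
    rng = random.Random(seed)
    # Each neuron carries a small mean firing-rate difference between identities.
    deltas = [rng.uniform(-0.2, 0.4) for _ in range(n_neurons)]
    correct = 0
    for _ in range(n_trials):
        identity = rng.choice([0, 1])
        # Simulate one trial: baseline 5 Hz, identity-dependent shift, noise.
        rates = [5.0 + (d if identity == 1 else 0.0) + rng.gauss(0, 1.0)
                 for d in deltas]
        # Nearest-centroid readout against the two class means.
        dist0 = sum((r - 5.0) ** 2 for r in rates)
        dist1 = sum((r - 5.0 - d) ** 2 for r, d in zip(rates, deltas))
        guess = 0 if dist0 < dist1 else 1
        correct += (guess == identity)
    return correct / n_trials


for n in (1, 10, 100):
    print(n, decode_accuracy(n))
```

With a single neuron the readout hovers near chance; with a hundred neurons the small per-neuron differences accumulate and accuracy climbs well above it, mirroring the qualitative result reported above.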
6
Neural Integration of Audiovisual Sensory Inputs in Macaque Amygdala and Adjacent Regions. Neurosci Bull 2023; 39:1749-1761. PMID: 36920645; PMCID: PMC10661144; DOI: 10.1007/s12264-023-01043-8.
Abstract
Integrating multisensory inputs to generate accurate perception and guide behavior is among the most critical functions of the brain. Subcortical regions such as the amygdala are involved in sensory processing including vision and audition, yet their roles in multisensory integration remain unclear. In this study, we systematically investigated the function of neurons in the amygdala and adjacent regions in integrating audiovisual sensory inputs using a semi-chronic multi-electrode array and multiple combinations of audiovisual stimuli. From a sample of 332 neurons, we showed the diverse response patterns to audiovisual stimuli and the neural characteristics of bimodal over unimodal modulation, which could be classified into four types with differentiated regional origins. Using the hierarchical clustering method, neurons were further clustered into five groups and associated with different integrating functions and sub-regions. Finally, regions distinguishing congruent and incongruent bimodal sensory inputs were identified. Overall, visual processing dominates audiovisual integration in the amygdala and adjacent regions. Our findings shed new light on the neural mechanisms of multisensory integration in the primate brain.
7
Neural signatures of natural behavior in socializing macaques. bioRxiv [Preprint] 2023:2023.07.05.547833. PMID: 37461580; PMCID: PMC10349985; DOI: 10.1101/2023.07.05.547833.
Abstract
Our understanding of the neurobiology of primate behavior largely derives from artificial tasks in highly controlled laboratory settings, overlooking most natural behaviors primate brains evolved to produce. In particular, how primates navigate the multidimensional social relationships that structure daily life and shape survival and reproductive success remains largely unexplored at the single neuron level. Here, we combine ethological analysis with new wireless recording technologies to uncover neural signatures of natural behavior in unrestrained, socially interacting pairs of rhesus macaques within a larger colony. Population decoding of single neuron activity in prefrontal and temporal cortex unveiled robust encoding of 24 species-typical behaviors, which was strongly modulated by the presence and identity of surrounding monkeys. Male-female partners demonstrated near-perfect reciprocity in grooming, a key behavioral mechanism supporting friendships and alliances, and neural activity maintained a running account of these social investments. When confronted with an aggressive intruder, behavioral and neural population responses reflected empathy and were buffered by the presence of a partner. Surprisingly, neural signatures in prefrontal and temporal cortex were largely indistinguishable and irreducible to visual and motor contingencies. By employing an ethological approach to the study of primate neurobiology, we reveal a highly distributed neurophysiological record of social dynamics, a potential computational foundation supporting communal life in primate societies, including our own.
8
Multisensory integration in the mammalian brain: diversity and flexibility in health and disease. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220338. PMID: 37545309; PMCID: PMC10404930; DOI: 10.1098/rstb.2022.0338.
Abstract
Multisensory integration (MSI) occurs in a variety of brain areas, spanning cortical and subcortical regions. Traditionally, the sensory cortices have been considered to process sensory information in a modality-specific manner. The sensory cortices, however, send this information to other cortical and subcortical areas, including the higher association cortices and the other sensory cortices, where inputs from multiple modalities converge and are integrated to generate a meaningful percept. This integration process is neither simple nor fixed because these brain areas interact with each other via complicated circuits, which can be modulated by numerous internal and external conditions. As a result, dynamic MSI makes multisensory decisions flexible and adaptive in behaving animals. Impairments in MSI occur in many psychiatric disorders, which may result in an altered perception of multisensory stimuli and an abnormal reaction to them. This review discusses the diversity and flexibility of MSI in mammals, including humans, primates and rodents, as well as the brain areas involved. It further explains how such flexibility influences perceptual experiences in behaving animals in both health and disease. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
9
Multisensory interactions of face and vocal information during perception and memory in ventrolateral prefrontal cortex. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220343. PMID: 37545305; PMCID: PMC10404928; DOI: 10.1098/rstb.2022.0343.
Abstract
The ventral frontal lobe is a critical node in the circuit that underlies communication, a multisensory process where sensory features of faces and vocalizations come together. The neural basis of face and vocal integration is a topic of great importance since the integration of multiple sensory signals is essential for the decisions that govern our social interactions. Investigations have shown that the macaque ventrolateral prefrontal cortex (VLPFC), a proposed homologue of the human inferior frontal gyrus, is involved in the processing, integration and remembering of audiovisual signals. Single neurons in VLPFC encode and integrate species-specific faces and corresponding vocalizations. During working memory, VLPFC neurons maintain face and vocal information online and exhibit selective activity for face and vocal stimuli. Population analyses indicate that identity, a critical feature of social stimuli, is encoded by VLPFC neurons and dictates the structure of dynamic population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These studies suggest that VLPFC may play a primary role in integrating face and vocal stimuli with contextual information, in order to support decision making during social communication. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
10
Redundant representations are required to disambiguate simultaneously presented complex stimuli. PLoS Comput Biol 2023; 19:e1011327. PMID: 37556470; PMCID: PMC10442167; DOI: 10.1371/journal.pcbi.1011327.
Abstract
A pedestrian crossing a street during rush hour often looks and listens for potential danger. When they hear several different horns, they localize the cars that are honking and decide whether or not they need to modify their motor plan. How does the pedestrian use this auditory information to pick out the corresponding cars in visual space? The integration of distributed representations like these is called the assignment problem, and it must be solved to integrate distinct representations across but also within sensory modalities. Here, we identify and analyze a solution to the assignment problem: the representation of one or more common stimulus features in pairs of relevant brain regions-for example, estimates of the spatial position of cars are represented in both the visual and auditory systems. We characterize how the reliability of this solution depends on different features of the stimulus set (e.g., the size of the set and the complexity of the stimuli) and the details of the split representations (e.g., the precision of each stimulus representation and the amount of overlapping information). Next, we implement this solution in a biologically plausible receptive field code and show how constraints on the number of neurons and spikes used by the code force the brain to navigate a tradeoff between local and catastrophic errors. We show that, when many spikes and neurons are available, representing stimuli from a single sensory modality can be done more reliably across multiple brain regions, despite the risk of assignment errors. Finally, we show that a feedforward neural network can learn the optimal solution to the assignment problem, even when it receives inputs in two distinct representational formats. We also discuss relevant results on assignment errors from the human working memory literature and show that several key predictions of our theory already have support.
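The split-representation solution this abstract analyzes, where a shared feature (here, spatial position) carried by both modalities lets a reader re-pair stimuli across representations, can be sketched as a toy model. This is an illustration only; the stimulus positions, noise level, and greedy matcher below are assumptions, not the paper's receptive-field code.

```python
# Toy model of the assignment problem: two "regions" hold independent noisy
# estimates of the same stimulus positions. A downstream readout re-pairs them
# by nearest position; assignment errors rise as the stimuli crowd together.
import random


def assignment_errors(positions, noise_sd, n_trials=2000, seed=0):
    """Fraction of trials in which greedy nearest-position matching
    mis-pairs at least one stimulus across the two representations."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_trials):
        # Each region encodes every stimulus position with independent noise.
        visual = [p + rng.gauss(0, noise_sd) for p in positions]
        auditory = [p + rng.gauss(0, noise_sd) for p in positions]
        # Greedy matching: pair each visual estimate with the closest
        # still-unused auditory estimate.
        unused = list(range(len(positions)))
        match = {}
        for i, v in enumerate(visual):
            j = min(unused, key=lambda k: abs(v - auditory[k]))
            unused.remove(j)
            match[i] = j
        # The correct assignment pairs index i with index i.
        if any(i != j for i, j in match.items()):
            errors += 1
    return errors / n_trials


# Widely spaced stimuli are re-paired reliably; crowding causes local errors.
print(assignment_errors([0.0, 10.0, 20.0], noise_sd=1.0))
print(assignment_errors([0.0, 2.0, 4.0], noise_sd=1.0))
```

The qualitative behavior matches the abstract's framing: reliability of the shared-feature solution depends on how many stimuli there are and how precise each split representation is relative to their separation.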
11
Snapping Out of Autopilot: Overriding Habits in Real Time and the Role of Ventrolateral Prefrontal Cortex. Perspect Psychol Sci 2023; 18:482-490. PMID: 36137178; PMCID: PMC10023494; DOI: 10.1177/17456916221120033.
Abstract
Habits allow environmental and interoceptive cues to trigger behavior in an automatized fashion, making them liable to deployment in inappropriate or outdated contexts. Over the long term, repeated failure of a once-adaptive habit to satisfy current goals produces extinction learning that suppresses the habit's execution. Less attention has been afforded to the mechanisms underlying real-time habit suppression: the capacity to stop the execution of a cued habit that is goal conflicting. Here, I first posit a model by which goal-relevant stimuli can (a) bring unfolding habits and their projected outcomes into awareness, (b) prompt evaluation of the habit outcome with respect to current goals, and (c) trigger cessation of the habit response if it is determined to be goal conflicting. Second, I propose a modified stop-signal task to test this model of goal-directed stopping of habit execution. Finally, I marshal evidence indicating that the ventrolateral prefrontal cortex, situated at the nexus of salience detection, action-plan assessment, and motor inhibition networks, is uniquely positioned to coordinate the overriding of habitual behaviors in real time. In sum, this perspective presents a testable model and candidate neurobiological substrate for our capacity to "snap out of autopilot" and override goal-conflicting habits in real time.
12
Differences in brain functional networks for audiovisual integration during reading between children and adults. Ann N Y Acad Sci 2023; 1520:127-139. PMID: 36478220; DOI: 10.1111/nyas.14943.
Abstract
Building robust letter-to-sound correspondences is a prerequisite for developing reading capacity. However, the neural mechanisms underlying the development of audiovisual integration for reading are largely unknown. This study used functional magnetic resonance imaging in a lexical decision task to investigate functional brain networks that support audiovisual integration during reading in developing child readers (10-12 years old) and skilled adult readers (20-28 years old). The results revealed enhanced connectivity in a prefrontal-superior temporal network (including the right medial frontal gyrus, right superior frontal gyrus, and left superior temporal gyrus) in adults relative to children, reflecting the development of attentional modulation of audiovisual integration involved in reading processing. Furthermore, the connectivity strength of this brain network was correlated with reading accuracy. Collectively, this study, for the first time, elucidates the differences in brain networks of audiovisual integration for reading between children and adults, promoting the understanding of the neurodevelopment of multisensory integration in high-level human cognition.
13
Multisensory integration of orally-sourced gustatory and olfactory inputs to the posterior piriform cortex in awake rats. J Physiol 2023; 601:151-169. PMID: 36385245; PMCID: PMC9869978; DOI: 10.1113/jp283873.
Abstract
Flavour refers to the sensory experience of food, which is a combination of sensory inputs sourced from multiple modalities during consumption, including taste and odour. Previous work has demonstrated that orally-sourced taste and odour cues interact to determine perceptual judgements of flavour stimuli, although the underlying cellular- and circuit-level neural mechanisms remain unknown. We recently identified a region of the piriform olfactory cortex in rats that responds to both taste and odour stimuli. Here, we investigated how converging taste and odour inputs to this area interact to affect single-neuron responsiveness and ensemble coding of flavour identity. To accomplish this, we recorded spiking activity from ensembles of single neurons in the posterior piriform cortex (pPC) in awake, tasting rats while delivering taste solutions, odour solutions and taste + odour mixtures directly into the oral cavity. Our results show that taste and odour inputs evoke highly selective, temporally-overlapping responses in multisensory pPC neurons. Comparing responses to mixtures and their unisensory components revealed that taste and odour inputs interact in a non-linear manner to produce unique response patterns. Taste input enhances trial-by-trial decoding of odour identity from small ensembles of simultaneously recorded neurons. Together, these results demonstrate that taste and odour inputs to pPC interact in complex, non-linear ways to form amodal flavour representations that enhance identity coding. KEY POINTS: Experience of food involves taste and smell, although how information from these different senses is combined by the brain to create our sense of flavour remains unknown. We recorded from small groups of neurons in the olfactory cortex of awake rats while they consumed taste solutions, odour solutions and taste + odour mixtures. Taste and smell solutions evoke highly selective responses. When presented in a mixture, taste and smell inputs interacted to alter responses, resulting in activation of unique sets of neurons that could not be predicted by the component responses. Synergistic interactions increase discriminability of odour representations. The olfactory cortex uses taste and smell to create new information representing multisensory flavour identity.
14
Multisensory integration in neurons of the medial pulvinar of macaque monkey. Cereb Cortex 2022; 33:4202-4215. PMID: 36068947; PMCID: PMC10110443; DOI: 10.1093/cercor/bhac337.
Abstract
The pulvinar is a heterogeneous thalamic nucleus, which is well developed in primates. One of its subdivisions, the medial pulvinar, is connected to many cortical areas, including the visual, auditory, and somatosensory cortices, as well as multisensory and premotor areas. However, except for the visual modality, little is known about its sensory functions. One hypothesis is that, as a region of convergence of information from different sensory modalities, the medial pulvinar plays a role in multisensory integration. To test this hypothesis, two macaque monkeys were trained on a fixation task and the responses of single units to visual, auditory, and auditory-visual stimuli were examined. Analysis revealed auditory, visual, and multisensory neurons in the medial pulvinar. It also revealed multisensory integration in this structure, mainly suppressive (the audiovisual response is less than the strongest unisensory response) and subadditive (the audiovisual response is less than the sum of the auditory and the visual responses). These findings suggest that the medial pulvinar is involved in multisensory integration.
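The suppressive and subadditive criteria defined in this abstract can be written out directly as a small classification rule. This is a minimal sketch of those definitions; the firing-rate values in the example are hypothetical, and the additive/superadditive labels extend the same logic in the standard way.

```python
# Classify a neuron's audiovisual response against its unisensory responses,
# following the definitions above: suppressive if below the strongest
# unisensory response, subadditive if below the sum of the two.

def classify_integration(auditory, visual, audiovisual):
    """Label multisensory integration from mean evoked responses (e.g. Hz)."""
    strongest = max(auditory, visual)
    linear_sum = auditory + visual
    if audiovisual < strongest:
        return "suppressive"    # AV response below the best unisensory response
    if audiovisual < linear_sum:
        return "subadditive"    # below the sum of the unisensory responses
    if audiovisual == linear_sum:
        return "additive"
    return "superadditive"      # above the linear sum


print(classify_integration(auditory=12.0, visual=8.0, audiovisual=9.0))   # suppressive
print(classify_integration(auditory=12.0, visual=8.0, audiovisual=15.0))  # subadditive
```

Note that a suppressive response is necessarily also subadditive; the rule above reports the stronger label first, matching how the abstract distinguishes the two regimes.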
15
Representation of expression and identity by ventral prefrontal neurons. Neuroscience 2022; 496:243-260. PMID: 35654293; PMCID: PMC10363293; DOI: 10.1016/j.neuroscience.2022.05.033.
Abstract
Evidence has suggested that the ventrolateral prefrontal cortex (VLPFC) processes social stimuli, including faces and vocalizations, which are essential for communication. Features embedded within audiovisual stimuli, including emotional expression and caller identity, provide abundant information about an individual's intention, emotional state, motivation, and social status, which are important to encode in a social exchange. However, it is unknown to what extent the VLPFC encodes such features. To investigate the role of VLPFC during social communication, we recorded single-unit activity while rhesus macaques (Macaca mulatta) performed a nonmatch-to-sample task using species-specific face-vocalization stimuli that differed in emotional expression or caller identity. 75% of recorded cells were task-related, and of these >70% were responsive during the nonmatch period. A larger proportion of nonmatch cells encoded the stimulus rather than the context of the trial type. A subset of responsive neurons was most commonly modulated by the identity of the nonmatch stimulus, less often by the emotional expression, or by both features within the face-vocalization stimuli presented during the nonmatch period. Neurons encoding identity were found across a broader region of VLPFC than expression-related cells, which were confined to the anterolateral portion of the recording chamber. These findings suggest that, within a working memory paradigm, VLPFC processes features of face and vocal stimuli, such as emotional expression and identity, in addition to task and contextual information. Thus, stimulus and contextual information may be integrated by VLPFC during social communication.
16
Faces and Voices Processing in Human and Primate Brains: Rhythmic and Multimodal Mechanisms Underlying the Evolution and Development of Speech. Front Psychol 2022; 13:829083. PMID: 35432052; PMCID: PMC9007199; DOI: 10.3389/fpsyg.2022.829083.
Abstract
While influential works since the 1970s have widely assumed that imitation is an innate skill in both human and non-human primate neonates, recent empirical studies and meta-analyses have challenged this view, indicating other forms of reward-based learning as relevant factors in the development of social behavior. The translation of visual input into matching motor output that underlies imitation abilities instead seems to develop along with social interactions and sensorimotor experience during infancy and childhood. Recently, a new visual stream has been identified in both human and non-human primate brains, updating the dual visual stream model. This third pathway is thought to be specialized for dynamic aspects of social perception such as eye gaze and facial expression and, crucially, for audio-visual integration of speech. Here, we review empirical studies addressing an understudied but crucial aspect of speech and communication, namely the processing of visual orofacial cues (i.e., the perception of a speaker's lips and tongue movements) and its integration with vocal auditory cues. Throughout this review, we offer new insights from our understanding of speech as the product of the evolution and development of a rhythmic and multimodal organization of sensorimotor brain networks, supporting volitional motor control of the upper vocal tract and audio-visual face-voice integration.
17
The cortical connectome of primate lateral prefrontal cortex. Neuron 2022; 110:312-327.e7. PMID: 34739817; PMCID: PMC8776613; DOI: 10.1016/j.neuron.2021.10.018.
Abstract
The lateral prefrontal cortex (LPFC) of primates plays an important role in executive control, but how it interacts with the rest of the cortex remains unclear. To address this, we densely mapped the cortical connectome of LPFC, using electrical microstimulation combined with functional MRI (EM-fMRI). We found isomorphic mappings between LPFC and five major processing domains composing most of the cerebral cortex except early sensory and motor areas. An LPFC grid of ∼200 stimulation sites topographically mapped to separate grids of activation sites in the five domains, coarsely resembling how the visual cortex maps the retina. The temporal and parietal maps largely overlapped in LPFC, suggesting topographically organized convergence of the ventral and dorsal streams, and the other maps overlapped at least partially. Thus, the LPFC contains overlapping, millimeter-scale maps that mirror the organization of major cortical processing domains, supporting LPFC's role in coordinating activity within and across these domains.
|
18
|
Visual modulation of firing and spectrotemporal receptive fields in mouse auditory cortex. CURRENT RESEARCH IN NEUROBIOLOGY 2022; 3:100040. [DOI: 10.1016/j.crneur.2022.100040] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/08/2022] [Revised: 04/26/2022] [Accepted: 05/06/2022] [Indexed: 10/18/2022] Open
|
19
|
Alterations of Regional Homogeneity in Children With Congenital Sensorineural Hearing Loss: A Resting-State fMRI Study. Front Neurosci 2021; 15:678910. [PMID: 34690668 PMCID: PMC8526795 DOI: 10.3389/fnins.2021.678910] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2021] [Accepted: 09/07/2021] [Indexed: 11/30/2022] Open
Abstract
Background: Brain functional alterations have been observed in children with congenital sensorineural hearing loss (CSNHL). The purpose of this study was to assess the alterations of regional homogeneity in children with CSNHL. Methods: Forty-five children with CSNHL and 20 healthy controls were enrolled into this study. Brain resting-state functional MRI (rs-fMRI) for regional homogeneity, including the Kendall coefficient of concordance (KCC-ReHo) and the coherence-based parameter (Cohe-ReHo), was analyzed and compared between the two groups, i.e., the CSNHL group and the healthy control group. Results: Compared to the healthy controls, children with CSNHL showed increased Cohe-ReHo values in the left calcarine cortex and decreased values in the bilateral ventrolateral prefrontal cortex (VLPFC) and right dorsolateral prefrontal cortex (DLPFC). Children with CSNHL also had increased KCC-ReHo values in the left calcarine cortex, cuneus, precentral gyrus, and right superior parietal lobule (SPL) and decreased values in the left VLPFC and right DLPFC. Correlations were detected between the ReHo values and age of the children with CSNHL. There were positive correlations between ReHo values in the precuneus/prefrontal cortex and age (p < 0.05). There were negative correlations between ReHo values in the bilateral temporal lobes, fusiform gyrus, parahippocampal gyrus, and precentral gyrus and age (p < 0.05). Conclusion: Children with CSNHL had ReHo alterations in the auditory, visual, motor, and other related brain cortices as compared to the healthy controls with normal hearing. There were significant correlations between ReHo values and age in brain regions involved in information integration and processing. Our study showed promising data using rs-fMRI ReHo parameters to assess brain functional alterations in children with CSNHL.
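The KCC-ReHo metric in the abstract above is Kendall's coefficient of concordance computed over the time series of a voxel and its neighbors. As a minimal sketch (not the authors' pipeline; the tie-free ranking and cluster size are illustrative assumptions), the statistic can be computed as:

```python
import numpy as np

def kendalls_w(ts):
    """Kendall's coefficient of concordance (KCC) for a voxel cluster.

    ts : array of shape (k, n) -- k voxel time series, n time points.
    Returns W in [0, 1]; W = 1 means all voxels rank the time points
    identically (perfect regional homogeneity).
    Note: this simple double-argsort ranking assumes no tied values,
    which is reasonable for continuous BOLD data.
    """
    k, n = ts.shape
    # Rank each voxel's time series independently over time (ranks 1..n)
    ranks = ts.argsort(axis=1).argsort(axis=1) + 1.0
    # Sum of ranks across the k voxels at each time point
    R = ranks.sum(axis=0)
    # Deviation of rank sums from their mean, then the standard formula
    S = ((R - R.mean()) ** 2).sum()
    return 12.0 * S / (k ** 2 * (n ** 3 - n))
```

In a typical ReHo analysis this would be evaluated for every voxel over its 27-voxel neighborhood (the voxel plus its 26 nearest neighbors) and the resulting map standardized before group comparison.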
|
20
|
Abstract
Social interactions occur in group settings and are mediated by communication signals that are exchanged between individuals, often using vocalizations. The neural representation of group social communication remains largely unexplored. We conducted simultaneous wireless electrophysiological recordings from the frontal cortices of groups of Egyptian fruit bats engaged in both spontaneous and task-induced vocal interactions. We found that the activity of single neurons distinguished between vocalizations produced by self and by others, as well as among specific individuals. Coordinated neural activity among group members exhibited stable bidirectional interbrain correlation patterns specific to spontaneous communicative interactions. Tracking social and spatial arrangements within a group revealed a relationship between social preferences and intra- and interbrain activity patterns. Combined, these findings reveal a dedicated neural repertoire for group social communication within and across the brains of freely communicating groups of bats.
|
21
|
Abstract
Working memory (WM) is the ability to maintain and manipulate information in the conscious mind over a timescale of seconds. This ability is thought to be maintained through the persistent discharges of neurons in a network of brain areas centered on the prefrontal cortex, as evidenced by neurophysiological recordings in nonhuman primates, though both the localization and the neural basis of WM has been a matter of debate in recent years. Neural correlates of WM are evident in species other than primates, including rodents and corvids. A specialized network of excitatory and inhibitory neurons, aided by neuromodulatory influences of dopamine, is critical for the maintenance of neuronal activity. Limitations in WM capacity and duration, as well as its enhancement during development, can be attributed to properties of neural activity and circuits. Changes in these factors can be observed through training-induced improvements and in pathological impairments. WM thus provides a prototypical cognitive function whose properties can be tied to the spiking activity of brain neurons. © 2021 American Physiological Society. Compr Physiol 11:1-41, 2021.
|
22
|
Neuronal activity in the monkey prefrontal cortex during a duration discrimination task with visual and auditory cues. Sci Rep 2021; 11:17520. [PMID: 34471190 PMCID: PMC8410858 DOI: 10.1038/s41598-021-97094-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2021] [Accepted: 08/20/2021] [Indexed: 11/27/2022] Open
Abstract
To investigate neuronal processing involved in the integration of auditory and visual signals for time perception, we examined neuronal activity in the prefrontal cortex (PFC) of macaque monkeys during a duration discrimination task with auditory and visual cues. In the task, two cues were consecutively presented for different durations between 0.2 and 1.8 s. Each cue was either auditory or visual and was followed by a delay period. After the second delay, subjects indicated whether the first or the second cue was longer. Cue- and delay-responsive neurons were found in PFC. Cue-responsive neurons mostly responded to either the auditory or the visual cue, and to either the first or the second cue. The neurons responsive to the first delay showed activity that changed depending on the first cue duration and were mostly sensitive to cue modality. The neurons responsive to the second delay exhibited activity that represented whether the first or the second cue had been presented longer. Nearly half of this activity representing order-based duration was sensitive to cue modality. These results suggest that temporal information from visual and auditory signals is processed separately in PFC in the early stage of duration discrimination and integrated for the final decision.
|
23
|
Neuroplasticity and Crossmodal Connectivity in the Normal, Healthy Brain. PSYCHOLOGY & NEUROSCIENCE 2021; 14:298-334. [PMID: 36937077 PMCID: PMC10019101 DOI: 10.1037/pne0000258] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Objective Neuroplasticity enables the brain to establish new crossmodal connections or reorganize old connections which are essential to perceiving a multisensorial world. The intent of this review is to identify and summarize the current developments in neuroplasticity and crossmodal connectivity, and deepen understanding of how crossmodal connectivity develops in the normal, healthy brain, highlighting novel perspectives about the principles that guide this connectivity. Methods To the above end, a narrative review is carried out. The data documented in prior relevant studies in neuroscience, psychology and other related fields available in a wide range of prominent electronic databases are critically assessed, synthesized, interpreted with qualitative rather than quantitative elements, and linked together to form new propositions and hypotheses about neuroplasticity and crossmodal connectivity. Results Three major themes are identified. First, it appears that neuroplasticity operates by following eight fundamental principles and crossmodal integration operates by following three principles. Second, two different forms of crossmodal connectivity, namely direct crossmodal connectivity and indirect crossmodal connectivity, are suggested to operate in both unisensory and multisensory perception. Third, three principles possibly guide the development of crossmodal connectivity into adulthood. These are labeled as the principle of innate crossmodality, the principle of evolution-driven 'neuromodular' reorganization and the principle of multimodal experience. These principles are combined to develop a three-factor interaction model of crossmodal connectivity. Conclusions The hypothesized principles and the proposed model together advance understanding of neuroplasticity, the nature of crossmodal connectivity, and how such connectivity develops in the normal, healthy brain.
|
24
|
Neuroanatomical correlates of self-awareness of highly practiced visuomotor skills. Brain Struct Funct 2021; 226:2295-2306. [PMID: 34228220 DOI: 10.1007/s00429-021-02328-2] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2020] [Accepted: 06/22/2021] [Indexed: 12/27/2022]
Abstract
Metacognition is the ability to introspect and control ongoing cognitive processes. Despite the extensive investigation of the brain architectures supporting metacognition for perception and memory, little is known about the neural basis of metacognitive capacity for motor function, a vital aspect of human behavior. Here, using functional and structural magnetic resonance imaging (MRI), we examined the brain substrates underlying self-awareness of handwriting, a highly practiced visuomotor skill. Results showed that experienced adult writers generally overestimated their handwriting quality, and such overestimation was more pronounced in men relative to women. Individual variations in self-awareness of handwriting quality were positively correlated with gray matter volume in the left fusiform gyrus, right middle frontal gyrus and right precuneus. The left fusiform gyrus and right middle frontal gyrus are thought to represent domain-specific brain mechanisms for handwriting self-awareness, while the right precuneus that has been reported in other domains likely represents a domain-general brain mechanism for metacognition. Furthermore, the activity of these structurally related regions in a handwriting task was not correlated with self-awareness of handwriting, suggesting the correlation with metacognition was independent of task performance. Together, this study reveals that metacognition for practiced motor skills relies on both domain-general and domain-specific brain systems, extending our understanding about the neural basis of human metacognition.
|
25
|
Disentangling the influences of multiple thalamic nuclei on prefrontal cortex and cognitive control. Neurosci Biobehav Rev 2021; 128:487-510. [PMID: 34216654 DOI: 10.1016/j.neubiorev.2021.06.042] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Revised: 04/13/2021] [Accepted: 06/09/2021] [Indexed: 10/21/2022]
Abstract
The prefrontal cortex (PFC) has a complex relationship with the thalamus, involving many nuclei which occupy predominantly medial zones along its anterior-to-posterior extent. Thalamocortical neurons in most of these nuclei are modulated by the affective and cognitive signals which funnel through the basal ganglia. We review how PFC-connected thalamic nuclei likely contribute to all aspects of cognitive control: from the processing of information on internal states and goals, facilitating its interactions with mnemonic information and learned values of stimuli and actions, to their influence on high-level cognitive processes, attentional allocation and goal-directed behavior. This includes contributions to transformations such as rule-to-choice (parvocellular mediodorsal nucleus), value-to-choice (magnocellular mediodorsal nucleus), mnemonic-to-choice (anteromedial nucleus) and sensory-to-choice (medial pulvinar). Common mechanisms appear to be thalamic modulation of cortical gain and cortico-cortical functional connectivity. The anatomy also implies a unique role for medial PFC in modulating processing in thalamocortical circuits involving other orbital and lateral PFC regions. We further discuss how cortico-basal ganglia circuits may provide a mechanism through which PFC controls cortico-cortical functional connectivity.
|
26
|
Audiovisual integration in macaque face patch neurons. Curr Biol 2021; 31:1826-1835.e3. [PMID: 33636119 PMCID: PMC8521527 DOI: 10.1016/j.cub.2021.01.102] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2020] [Revised: 12/29/2020] [Accepted: 01/28/2021] [Indexed: 12/03/2022]
Abstract
Primate social communication depends on the perceptual integration of visual and auditory cues, reflected in the multimodal mixing of sensory signals in certain cortical areas. The macaque cortical face patch network, identified through visual, face-selective responses measured with fMRI, is assumed to contribute to visual social interactions. However, whether face patch neurons are also influenced by acoustic information, such as the auditory component of a natural vocalization, remains unknown. Here, we recorded single-unit activity in the anterior fundus (AF) face patch, in the superior temporal sulcus, and anterior medial (AM) face patch, on the undersurface of the temporal lobe, in macaques presented with audiovisual, visual-only, and auditory-only renditions of natural movies of macaques vocalizing. The results revealed that 76% of neurons in face patch AF were significantly influenced by the auditory component of the movie, most often through enhancement of visual responses but sometimes in response to the auditory stimulus alone. By contrast, few neurons in face patch AM exhibited significant auditory responses or modulation. Control experiments in AF used an animated macaque avatar to demonstrate, first, that the structural elements of the face were often essential for audiovisual modulation and, second, that the temporal modulation of the acoustic stimulus was more important than its frequency spectrum. Together, these results identify a striking contrast between two face patches and specifically identify AF as playing a potential role in the integration of audiovisual cues during natural modes of social communication.
|
27
|
Sensory feedback-dependent coding of arm position in local field potentials of the posterior parietal cortex. Sci Rep 2021; 11:9060. [PMID: 33907213 PMCID: PMC8079385 DOI: 10.1038/s41598-021-88278-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/13/2020] [Accepted: 04/06/2021] [Indexed: 11/19/2022] Open
Abstract
Although multisensory integration is crucial for sensorimotor function, it is unclear how visual and proprioceptive sensory cues are combined in the brain during motor behaviors. Here we characterized the effects of multisensory interactions on local field potential (LFP) activity obtained from the superior parietal lobule (SPL) as non-human primates performed a reaching task with either unimodal (proprioceptive) or bimodal (visual-proprioceptive) sensory feedback. Based on previous analyses of spiking activity, we hypothesized that evoked LFP responses would be tuned to arm location but would be suppressed on bimodal trials, relative to unimodal trials. We also expected to see a substantial number of recording sites with enhanced beta band spectral power for only one set of feedback conditions (e.g. unimodal or bimodal), as was previously observed for spiking activity. We found that evoked activity and beta band power were tuned to arm location at many individual sites, though this tuning often differed between unimodal and bimodal trials. Across the population, both evoked and beta activity were consistent with feedback-dependent tuning to arm location, while beta band activity also showed evidence of response suppression on bimodal trials. The results suggest that multisensory interactions can alter the tuning and gain of arm position-related LFP activity in the SPL.
|
28
|
Choice-dependent cross-modal interaction in the medial prefrontal cortex of rats. Mol Brain 2021; 14:13. [PMID: 33446258 PMCID: PMC7809823 DOI: 10.1186/s13041-021-00732-7] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2020] [Accepted: 01/08/2021] [Indexed: 11/25/2022] Open
Abstract
Cross-modal interaction (CMI) can significantly influence perceptual and decision-making processes in many circumstances. However, it remains poorly understood what integrative strategies are employed by the brain to deal with different task contexts. To explore this, we examined neural activities of the medial prefrontal cortex (mPFC) of rats performing cue-guided two-alternative forced-choice tasks. In a task requiring rats to discriminate stimuli based on an auditory cue, the simultaneous presentation of an uninformative visual cue substantially strengthened mPFC neurons' capability of auditory discrimination, mainly through enhancing the response to the preferred cue. It also increased the number of neurons revealing a cue preference. If the task was changed slightly so that a visual cue, like the auditory cue, denoted a specific behavioral direction, mPFC neurons frequently showed a different CMI pattern, with an effect of cross-modal enhancement best evoked in information-congruent multisensory trials. In a free-choice task, however, the majority of neurons failed to show a cross-modal enhancement effect or cue preference. These results indicate that CMI at the neuronal level is context-dependent in a way that differs from what has been shown in previous studies.
|
29
|
Phase-amplitude coupling profiles differ in frontal and auditory cortices of bats. Eur J Neurosci 2020; 55:3483-3501. [PMID: 32979875 DOI: 10.1111/ejn.14986] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/07/2020] [Revised: 09/15/2020] [Accepted: 09/16/2020] [Indexed: 11/29/2022]
Abstract
Neural oscillations are at the core of important computations in the mammalian brain. Interactions between oscillatory activities in different frequency bands, such as delta (1-4 Hz), theta (4-8 Hz) or gamma (>30 Hz), are a powerful mechanism for binding fundamentally distinct spatiotemporal scales of neural processing. Phase-amplitude coupling (PAC) is one such plausible and well-described interaction, but much is yet to be uncovered regarding how PAC dynamics contribute to sensory representations. In particular, although PAC appears to have a major role in audition, the characteristics of coupling profiles in sensory and integration (i.e. frontal) cortical areas remain obscure. Here, we address this question by studying PAC dynamics in the frontal-auditory field (FAF; an auditory area in the bat frontal cortex) and the auditory cortex (AC) of the bat Carollia perspicillata. By means of simultaneous electrophysiological recordings in frontal and auditory cortices examining local-field potentials (LFPs), we show that the amplitude of gamma-band activity couples with the phase of low-frequency LFPs in both structures. Our results demonstrate that the coupling in FAF occurs most prominently in delta/high-gamma frequencies (1-4/75-100 Hz), whereas in the AC the coupling is strongest in the delta-theta/low-gamma (2-8/25-55 Hz) range. We argue that distinct PAC profiles may represent different mechanisms for neuronal processing in frontal and auditory cortices, and might complement oscillatory interactions for sensory processing in the frontal-auditory cortex network.
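The abstract above quantifies phase-amplitude coupling between low-frequency LFP phase and gamma amplitude. As an illustrative sketch (not the authors' code), a widely used PAC measure is the Tort-style modulation index, which bins the high-frequency amplitude envelope by low-frequency phase and measures the divergence of that distribution from uniformity; the bin count and the assumption that phase and envelope were extracted upstream (e.g., by band-pass filtering and a Hilbert transform) are ours:

```python
import numpy as np

def modulation_index(phase, amp, n_bins=18):
    """Tort-style phase-amplitude coupling modulation index.

    phase : instantaneous phase of the low-frequency band, radians in (-pi, pi]
    amp   : instantaneous amplitude envelope of the high-frequency band
    Returns MI in [0, 1]; 0 means the amplitude is uniform across phase
    (no coupling), larger values mean stronger coupling.
    """
    # Assign each sample to one of n_bins equal-width phase bins
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    which = np.digitize(phase, edges[1:-1])
    # Mean amplitude per phase bin, normalized to a probability distribution
    mean_amp = np.array([amp[which == b].mean() for b in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    # KL divergence from the uniform distribution, scaled by log(n_bins)
    return float(np.sum(p * np.log(p * n_bins)) / np.log(n_bins))
```

For delta/high-gamma coupling of the kind reported in the bat FAF, `phase` would come from the 1-4 Hz band and `amp` from the 75-100 Hz envelope of the same LFP channel.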
|
30
|
Approaches to Understanding Multisensory Dysfunction in Autism Spectrum Disorder. Autism Res 2020; 13:1430-1449. [PMID: 32869933 PMCID: PMC7721996 DOI: 10.1002/aur.2375] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/06/2020] [Revised: 07/20/2020] [Accepted: 07/28/2020] [Indexed: 12/14/2022]
Abstract
Abnormal sensory responses are a DSM-5 symptom of autism spectrum disorder (ASD), and research findings demonstrate altered sensory processing in ASD. Beyond difficulties with processing information within single sensory domains, including both hypersensitivity and hyposensitivity, difficulties in multisensory processing are becoming a core issue of focus in ASD. These difficulties may be targeted by treatment approaches such as "sensory integration," which is frequently applied in autism treatment but not yet based on clear evidence. Recently, psychophysical data have emerged to demonstrate multisensory deficits in some children with ASD. Unlike deficits in social communication, which are best understood in humans, sensory and multisensory changes offer a tractable marker of circuit dysfunction that is more easily translated into animal model systems to probe the underlying neurobiological mechanisms. Paralleling experimental paradigms that were previously applied in humans and larger mammals, we and others have demonstrated that multisensory function can also be examined behaviorally in rodents. Here, we review the sensory and multisensory difficulties commonly found in ASD, examining laboratory findings that relate these findings across species. Next, we discuss the known neurobiology of multisensory integration, drawing largely on experimental work in larger mammals, and extensions of these paradigms into rodents. Finally, we describe emerging investigations into multisensory processing in genetic mouse models related to autism risk. By detailing findings from humans to mice, we highlight the advantage of multisensory paradigms that can be easily translated across species, as well as the potential for rodent experimental systems to reveal opportunities for novel treatments. LAY SUMMARY: Sensory and multisensory deficits are commonly found in ASD and may result in cascading effects that impact social communication. By using similar experiments to those in humans, we discuss how studies in animal models may allow an understanding of the brain mechanisms that underlie difficulties in multisensory integration, with the ultimate goal of developing new treatments. Autism Res 2020, 13: 1430-1449. © 2020 International Society for Autism Research, Wiley Periodicals, Inc.
|
31
|
Differential Rapid Plasticity in Auditory and Visual Responses in the Primarily Multisensory Orbitofrontal Cortex. eNeuro 2020; 7:ENEURO.0061-20.2020. [PMID: 32424057 PMCID: PMC7294472 DOI: 10.1523/eneuro.0061-20.2020] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2020] [Accepted: 03/26/2020] [Indexed: 01/17/2023] Open
Abstract
Given the connectivity of the orbitofrontal cortex (OFC) with the sensory areas and areas involved in goal execution, it is likely that the OFC, along with its function in reward processing, also has a role to play in perception-based multisensory decision-making. To understand the mechanisms involved in multisensory decision-making, it is important to first know how different sensory stimuli are encoded in single neurons of the mouse OFC. To rule out effects of behavioral state, memory, and other factors, we studied the responses of the anesthetized mouse OFC to auditory, visual, and audiovisual/multisensory stimuli, multisensory associations, and sensory-driven input organization to the OFC. Almost all OFC single neurons were found to be multisensory in nature, with sublinear to supralinear integration of the component unisensory stimuli. With a novel multisensory oddball stimulus set, we show that the OFC receives both unisensory and multisensory inputs, further corroborated by retrograde tracers showing labeling in secondary auditory and visual cortices, which we find to also have similar multisensory integration and responses. With long audiovisual pairing/association, we show rapid plasticity in OFC single neurons, with a strong visual bias, leading to a strong depression of auditory responses and effective enhancement of visual responses. Such rapid multisensory association-driven plasticity is absent in the auditory and visual cortices, suggesting its emergence in the OFC. Based on the above results, we propose a hypothetical local circuit model in the OFC that integrates auditory and visual information and participates in computing stimulus value in dynamic multisensory environments.
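The abstract above classifies audiovisual responses as sublinear to supralinear relative to the unisensory components. As a minimal illustration (not the authors' analysis; the function name and example rates are ours), two indices commonly used in the multisensory literature make this classification concrete:

```python
def msi_indices(r_a, r_v, r_av):
    """Standard indices for classifying a multisensory response.

    r_a, r_v, r_av : mean evoked responses (e.g., spikes/s) to the
    auditory, visual, and combined audiovisual stimuli.
    additivity  : r_av / (r_a + r_v); > 1 supralinear, ~1 additive
                  (linear), < 1 sublinear.
    enhancement : percent change of r_av relative to the best
                  unisensory response (Meredith & Stein convention).
    """
    best = max(r_a, r_v)
    return {
        "additivity": r_av / (r_a + r_v),
        "enhancement": 100.0 * (r_av - best) / best,
    }
```

For example, a neuron firing at 10 and 6 spikes/s to auditory and visual stimuli but 20 spikes/s to their combination would be supralinear (additivity 1.25) with 100% multisensory enhancement.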
|
32
|
Circuits, Networks, and Neuropsychiatric Disease: Transitioning From Anatomy to Imaging. Biol Psychiatry 2020; 87:318-327. [PMID: 31870495 DOI: 10.1016/j.biopsych.2019.10.024] [Citation(s) in RCA: 41] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/18/2019] [Revised: 10/24/2019] [Accepted: 10/24/2019] [Indexed: 12/14/2022]
Abstract
Since the development of cellular and myelin stains, anatomy has formed the foundation for understanding circuitry in the human brain. However, recent functional and structural studies using magnetic resonance imaging have taken the lead in this endeavor. These innovative and noninvasive approaches have the advantage of studying connectivity patterns under different conditions directly in the human brain. They demonstrate dynamic and structural changes within and across networks linked to normal function and to a wide range of psychiatric illnesses. However, these indirect methods are unable to link networks to the hardwiring that underlies them. In contrast, anatomic invasive experimental studies can. Following a brief review of prefrontal cortical, anterior cingulate, and striatal connections and the different methodologies used, this article discusses how data from anatomic studies can help inform how hardwired connections are linked to the functional and structural networks identified in imaging studies.
|
33
|
The Influence of Subclinical Neck Pain on Neurophysiological and Behavioral Measures of Multisensory Integration. Brain Sci 2019; 9:brainsci9120362. [PMID: 31818030 PMCID: PMC6955897 DOI: 10.3390/brainsci9120362] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2019] [Revised: 12/04/2019] [Accepted: 12/05/2019] [Indexed: 02/02/2023] Open
Abstract
Multisensory integration (MSI) is necessary for the efficient execution of many everyday tasks. Alterations in sensorimotor integration (SMI) have been observed in individuals with subclinical neck pain (SCNP). Altered audiovisual MSI has previously been demonstrated in this population using performance measures, such as reaction time. However, neurophysiological techniques have not been combined with performance measures in the SCNP population to determine differences in neural processing that may contribute to these behavioral characteristics. Electroencephalography (EEG) event-related potentials (ERPs) have been successfully used in recent MSI studies to show differences in neural processing between different clinical populations. This study combined behavioral and ERP measures to characterize MSI differences between healthy and SCNP groups. EEG was recorded as 24 participants performed 8 blocks of a simple reaction time (RT) MSI task, with each block consisting of 34 auditory (A), visual (V), and audiovisual (AV) trials. Participants responded to the stimuli by pressing a response key. Both groups responded fastest to the AV condition. The healthy group demonstrated significantly faster RTs for the AV and V conditions. There were significant group differences in neural activity from 100-140 ms post-stimulus onset, with the control group demonstrating greater MSI. Differences in brain activity and RT between individuals with SCNP and a control group indicate neurophysiological alterations in how individuals with SCNP process audiovisual stimuli. This suggests that SCNP alters MSI. This study presents novel EEG findings that demonstrate MSI differences in a group of individuals with SCNP.
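Reaction-time gains like those in the abstract above (fastest responses in the AV condition) are conventionally tested against Miller's race-model inequality, which bounds the multisensory RT distribution by the sum of the unisensory distributions; violations indicate genuine integration rather than statistical facilitation. A sketch under our own assumptions (empirical CDFs evaluated at AV quantiles; this is not the study's analysis code):

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av,
                         quantiles=np.linspace(0.05, 0.95, 19)):
    """Maximum violation of Miller's race-model inequality.

    rt_a, rt_v, rt_av : arrays of simple reaction times (seconds) for
    auditory, visual, and audiovisual trials.
    Tests G_AV(t) <= G_A(t) + G_V(t) at the AV quantiles; a return
    value > 0 indicates the inequality is violated (integration beyond
    what two independent racing channels can produce).
    """
    ts = np.quantile(rt_av, quantiles)

    def cdf(x, t):
        # Empirical cumulative distribution P(RT <= t)
        return np.searchsorted(np.sort(x), t, side="right") / len(x)

    g_av = np.array([cdf(rt_av, t) for t in ts])
    bound = np.array([cdf(rt_a, t) + cdf(rt_v, t) for t in ts])
    return float(np.max(g_av - bound))
```

Under a pure race (AV RT = minimum of two independent unisensory RTs) the returned value stays at or below zero up to sampling noise, so positive values in patient or control data argue for coactivation.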
|
34
|
Leveraging Nonhuman Primate Multisensory Neurons and Circuits in Assessing Consciousness Theory. J Neurosci 2019; 39:7485-7500. [PMID: 31358654 PMCID: PMC6750944 DOI: 10.1523/jneurosci.0934-19.2019] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/24/2019] [Revised: 06/27/2019] [Accepted: 07/19/2019] [Indexed: 01/03/2023] Open
Abstract
Both the global neuronal workspace (GNW) and integrated information theory (IIT) posit that highly complex and interconnected networks engender perceptual awareness. GNW specifies that activity recruiting frontoparietal networks will elicit a subjective experience, whereas IIT is more concerned with the functional architecture of networks than with activity within it. Here, we argue that according to IIT mathematics, circuits converging on integrative versus convergent yet non-integrative neurons should support a greater degree of consciousness. We test this hypothesis by analyzing a dataset of neuronal responses collected simultaneously from primary somatosensory cortex (S1) and ventral premotor cortex (vPM) in nonhuman primates presented with auditory, tactile, and audio-tactile stimuli as they are progressively anesthetized with propofol. We first describe the multisensory (audio-tactile) characteristics of S1 and vPM neurons (mean and dispersion tendencies, as well as noise-correlations), and functionally label these neurons as convergent or integrative according to their spiking responses. Then, we characterize how these different pools of neurons behave as a function of consciousness. At odds with the IIT mathematics, results suggest that convergent neurons more readily exhibit properties of consciousness (neural complexity and noise correlation) and are more impacted during the loss of consciousness than integrative neurons. Last, we provide support for the GNW by showing that neural ignition (i.e., same trial coactivation of S1 and vPM) was more frequent in conscious than unconscious states. Overall, we contrast GNW and IIT within the same single-unit activity dataset, and support the GNW.SIGNIFICANCE STATEMENT A number of prominent theories of consciousness exist, and a number of these share strong commonalities, such as the central role they ascribe to integration. 
Despite the important and far-reaching consequences that a better understanding of consciousness promises to bring, for instance in diagnosing disorders of consciousness (e.g., coma, vegetative state, locked-in syndrome), these theories are seldom tested via invasive techniques (with high signal-to-noise ratios), and never directly confronted within a single dataset. Here, we first derive concrete and testable predictions from the global neuronal workspace and integrated information theories of consciousness. Then, we put these to the test by functionally labeling specific neurons as either convergent or integrative nodes, and examining the responses of these neurons during anesthetic-induced loss of consciousness.
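The spike-count noise correlations this study uses to characterize S1 and vPM neurons have a standard definition: the Pearson correlation of trial-by-trial spike counts between two simultaneously recorded neurons for a fixed stimulus. A minimal, dependency-free sketch (function and variable names are illustrative, not from the paper's pipeline):

```python
import math

def noise_correlation(counts_a, counts_b):
    """Pearson correlation of trial-by-trial spike counts for two neurons,
    measured across repeated presentations of the SAME stimulus (the usual
    definition of spike-count noise correlation)."""
    n = len(counts_a)
    mean_a = sum(counts_a) / n
    mean_b = sum(counts_b) / n
    cov = sum((a - mean_a) * (b - mean_b)
              for a, b in zip(counts_a, counts_b)) / n
    sd_a = math.sqrt(sum((a - mean_a) ** 2 for a in counts_a) / n)
    sd_b = math.sqrt(sum((b - mean_b) ** 2 for b in counts_b) / n)
    return cov / (sd_a * sd_b)
```

Each input would hold one neuron's spike counts over repeated trials of the same audio-tactile stimulus; values near zero indicate independent trial-to-trial variability, while larger magnitudes indicate shared variability.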
|
35
|
Abstract
Multisensory integration (MSI) is a fundamental emergent property of the mammalian brain. During MSI, perceptual information encoded in patterned activity is processed in multimodal association cortex. The systems-level neuronal dynamics that coordinate MSI, however, are unknown. Here, we demonstrate intrinsic hub-like network activity in the association cortex that regulates MSI. We engineered calcium reporter mouse lines based on the fluorescence resonance energy transfer sensor yellow cameleon (YC2.60) expressed in excitatory or inhibitory neurons. In medial and parietal association cortex, we observed spontaneous slow waves that self-organized into hubs defined by long-range excitatory and local inhibitory circuits. Unlike directional source/sink-like flows in sensory areas, medial/parietal excitatory and inhibitory hubs had net-zero balanced inputs. Remarkably, multisensory stimulation triggered rapid phase-locking mainly of excitatory hub activity persisting for seconds after the stimulus offset. Therefore, association cortex tends to form balanced excitatory networks that configure slow-wave phase-locking for MSI.
|
36
|
Large-scale temporo-parieto-frontal networks for motor and cognitive motor functions in the primate brain. Cortex 2019; 118:19-37. [DOI: 10.1016/j.cortex.2018.09.024] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2018] [Revised: 09/21/2018] [Accepted: 09/28/2018] [Indexed: 10/28/2022]
|
37
|
Multisensory Neurons in the Primate Amygdala. J Neurosci 2019; 39:3663-3675. [PMID: 30858163 DOI: 10.1523/jneurosci.2903-18.2019] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/11/2018] [Revised: 02/12/2019] [Accepted: 02/13/2019] [Indexed: 11/21/2022] Open
Abstract
Animals identify, interpret, and respond to complex, natural signals that are often multisensory. The ability to integrate signals across sensory modalities depends on the convergence of sensory inputs at the level of single neurons. Neurons in the amygdala are expected to be multisensory because they respond to complex, natural stimuli, and the amygdala receives inputs from multiple sensory areas. We recorded activity from the amygdala of 2 male monkeys (Macaca mulatta) in response to visual, tactile, and auditory stimuli. Although the stimuli were devoid of inherent emotional or social significance and were not paired with rewards or punishments, the majority of neurons that responded to these stimuli were multisensory. Selectivity for sensory modality was stronger and emerged earlier than selectivity for individual items within a sensory modality. Modality and item selectivity were expressed via three main spike-train metrics: (1) response magnitude, (2) response polarity, and (3) response duration. None of these metrics was unique to a particular sensory modality; rather, each neuron responded with distinct combinations of spike-train metrics to discriminate sensory modalities and items within a modality. The relative proportion of multisensory neurons was similar across the nuclei of the amygdala. The convergence of inputs from multiple sensory modalities at the level of single neurons in the amygdala lies at the foundation of multisensory integration. The integration of visual, auditory, and tactile inputs in the amygdala may serve social communication by binding together social signals carried by facial expressions, vocalizations, and social grooming. SIGNIFICANCE STATEMENT: Our brain continuously decodes information detected by multiple sensory systems. The emotional and social significance of the incoming signals is likely extracted by the amygdala, which receives input from all sensory domains.
Here we show that a large portion of neurons in the amygdala respond to stimuli from two or more sensory modalities. The convergence of visual, tactile, and auditory signals at the level of individual neurons in the amygdala establishes a foundation for multisensory integration within this structure. The ability to integrate signals across sensory modalities is critical for social communication and other high-level cognitive functions.
|
38
|
Human olfactory-auditory integration requires phase synchrony between sensory cortices. Nat Commun 2019; 10:1168. [PMID: 30858379 PMCID: PMC6411726 DOI: 10.1038/s41467-019-09091-3] [Citation(s) in RCA: 27] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2018] [Accepted: 02/21/2019] [Indexed: 12/22/2022] Open
Abstract
Multisensory integration is particularly important in the human olfactory system, which is highly dependent on non-olfactory cues, yet its underlying neural mechanisms are not well understood. In this study, we use intracranial electroencephalography techniques to record neural activity in auditory and olfactory cortices during an auditory-olfactory matching task. Spoken cues evoke phase locking between low frequency oscillations in auditory and olfactory cortices prior to odor arrival. This phase synchrony occurs only when the participant's later response is correct. Furthermore, the phase of low frequency oscillations in both auditory and olfactory cortical areas couples to the amplitude of high-frequency oscillations in olfactory cortex during correct trials. These findings suggest that phase synchrony is a fundamental mechanism for integrating cross-modal odor processing and highlight an important role for primary olfactory cortical areas in multisensory integration with the olfactory system.
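The low-frequency phase synchrony reported in this study is conventionally quantified with a phase-locking value (PLV): the magnitude of the average unit phasor of the phase difference between two signals, which equals 1 when the phase difference is constant and approaches 0 when the phases are unrelated. A minimal sketch operating on pre-extracted instantaneous phases (e.g., from a Hilbert transform of band-passed recordings); names here are illustrative, not the paper's code:

```python
import cmath

def phase_locking_value(phases_x, phases_y):
    """PLV between two instantaneous-phase time series (radians).

    Each sample contributes a unit phasor exp(i * (phase_x - phase_y));
    the PLV is the magnitude of their mean: 1 for a constant phase lag,
    near 0 when the phase difference is uniformly distributed.
    """
    phasors = [cmath.exp(1j * (px - py))
               for px, py in zip(phases_x, phases_y)]
    return abs(sum(phasors) / len(phasors))
```

In a setting like this study's, one would compute the PLV between the low-frequency phases of auditory and olfactory cortical channels, separately for correct and incorrect trials.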
|
39
|
Origin and evolution of human speech: Emergence from a trimodal auditory, visual and vocal network. Prog Brain Res 2019; 250:345-371. [PMID: 31703907 DOI: 10.1016/bs.pbr.2019.01.005] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
Abstract
In recent years, there have been important additions to the classical model of speech processing as originally depicted by the Broca-Wernicke model consisting of an anterior, productive region and a posterior, perceptive region, both connected via the arcuate fasciculus. The modern view implies a separation into a dorsal and a ventral pathway conveying different kinds of linguistic information, which parallels the organization of the visual system. Furthermore, this organization is highly conserved in evolution and can be seen as the neural scaffolding from which the speech networks originated. In this chapter we emphasize that the speech networks are embedded in a multimodal system encompassing audio-vocal and visuo-vocal connections, which can be referred to an ancestral audio-visuo-motor pathway present in nonhuman primates. Likewise, we propose a trimodal repertoire for speech processing and acquisition involving auditory, visual and motor representations of the basic elements of speech: phoneme, observation of mouth movements, and articulatory processes. Finally, we discuss this proposal in the context of a scenario for early speech acquisition in infants and in human evolution.
|
40
|
Scaling up visual attention and visual working memory to the real world. Psychol Learn Motiv 2019. [DOI: 10.1016/bs.plm.2019.03.001] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/16/2023]
|
41
|
Abstract
Perceiving social and emotional information from faces is a critical primate skill. For this purpose, primates evolved dedicated cortical architecture, especially in occipitotemporal areas, utilizing face-selective cells. Face-selective neurons are also present in the orbitofrontal cortex (OFC) but are less well understood; these neurons are the object of our study. We examined 179 face-selective cells in the lateral sulcus of the OFC by characterizing their responses to a rich set of photographs of conspecific faces varying in age, gender, and facial expression. Principal component analysis and unsupervised cluster analysis of stimulus space both revealed that face cells encode face dimensions for social categories and emotions. Strongly represented categories were facial expressions (grin and threat versus lip smack), juvenile monkeys, and female monkeys. Cluster analyses of a control population of nearby cells lacking face selectivity did not categorize face stimuli in a meaningful way, suggesting that only face-selective cells directly support face categorization in OFC. Time course analyses of face cell activity from stimulus onset showed that faces were discriminated from nonfaces early, followed by within-face categorization for social and emotional content (i.e., age and facial expression). Face cells showed no response to acoustic stimuli such as vocalizations and were poorly modulated by vocalizations added to faces. Neuronal responses remained stable when paired with positive or negative reinforcement, implying that face cells encode social information but not the learned reward value associated with faces. Overall, our results shed light on a substantial role of the OFC in characterizing facial information bearing on social and emotional behavior.
|
42
|
Cross-decoding supramodal information in the human brain. Brain Struct Funct 2018; 223:4087-4098. [PMID: 30143866 DOI: 10.1007/s00429-018-1740-z] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2018] [Accepted: 08/21/2018] [Indexed: 10/28/2022]
Abstract
Perceptual decision making is the cognitive process wherein the brain classifies stimuli into abstract categories for more efficient downstream processing. A system that, during categorization, can process information regardless of the information's original sensory modality (i.e., a supramodal system) would have a substantial advantage over a system with dedicated processes for specific sensory modalities. While many studies have probed decision processes through the lens of one sensory modality, it remains unclear whether there are such supramodal brain areas that can flexibly process task-relevant information regardless of the original "format" of the information. To investigate supramodality, one must ensure that supramodal information exists somewhere within the functional architecture by rendering information from multiple sensory systems necessary but insufficient for categorization. To this aim, we tasked participants with categorizing auditory and tactile frequency-modulated sweeps according to learned, supramodal categories in a delayed match-to-category paradigm while we measured their blood-oxygen-level dependent signal with functional MRI. To detect supramodal information, we implemented a set of cross-modality pattern classification analyses, which demonstrated that the left caudate nucleus encodes category-level information but not stimulus-specific information (such as spatial directions and stimulus modalities), while the right inferior frontal gyrus, showing the opposite pattern, encodes stimulus-specific information but not category-level information. Given our paradigm, these results reveal abstract representations in the brain that are independent of motor, semantic, and sensory-specific processing, instead reflecting supramodal, categorical information, which points to the caudate nucleus as a locus of cognitive processes involved in complex behavior.
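The core logic of a cross-modality pattern classification analysis is to train a decoder on activation patterns evoked by one modality and test it on patterns from the other: above-chance transfer implies a shared, supramodal category code. This can be sketched with a toy nearest-centroid classifier (all data, names, and labels below are illustrative, not the study's pipeline, which used fMRI voxel patterns):

```python
def fit_centroids(patterns, labels):
    """Compute the mean activation pattern for each category label."""
    sums, counts = {}, {}
    for pattern, label in zip(patterns, labels):
        acc = sums.setdefault(label, [0.0] * len(pattern))
        for i, value in enumerate(pattern):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, pattern):
    """Assign the label of the nearest centroid (squared Euclidean)."""
    def sq_dist(label):
        return sum((c - p) ** 2 for c, p in zip(centroids[label], pattern))
    return min(centroids, key=sq_dist)

# Train on "auditory" trials, test on "tactile" trials of the same
# learned categories: successful transfer indicates supramodal coding.
auditory_patterns = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
auditory_labels = ["up-sweep", "up-sweep", "down-sweep", "down-sweep"]
tactile_patterns = [[0.8, 0.3], [0.2, 1.1]]

centroids = fit_centroids(auditory_patterns, auditory_labels)
cross_modal_preds = [predict(centroids, p) for p in tactile_patterns]
```

The same train-on-one-modality, test-on-the-other scheme distinguishes a genuinely supramodal region (transfer succeeds) from one encoding modality-specific information (transfer fails).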
|
43
|
Modality-Independent Coding of Scene Categories in Prefrontal Cortex. J Neurosci 2018; 38:5969-5981. [PMID: 29858483 DOI: 10.1523/jneurosci.0272-18.2018] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2018] [Revised: 05/03/2018] [Accepted: 05/26/2018] [Indexed: 11/21/2022] Open
Abstract
Natural environments convey information through multiple sensory modalities, all of which contribute to people's percepts. Although it has been shown that visual or auditory content of scene categories can be decoded from brain activity, it remains unclear how humans represent scene information beyond a specific sensory modality domain. To address this question, we investigated how categories of scene images and sounds are represented in several brain regions. A group of healthy human subjects (both sexes) participated in the present study, where their brain activity was measured with fMRI while viewing images or listening to sounds of different real-world environments. We found that both visual and auditory scene categories can be decoded not only from modality-specific areas, but also from several brain regions in the temporal, parietal, and prefrontal cortex (PFC). Intriguingly, only in the PFC, but not in any other regions, categories of scene images and sounds appear to be represented in similar activation patterns, suggesting that scene representations in PFC are modality-independent. Furthermore, the error patterns of neural decoders indicate that category-specific neural activity patterns in the middle and superior frontal gyri are tightly linked to categorization behavior. Our findings demonstrate that complex scene information is represented at an abstract level in the PFC, regardless of the sensory modality of the stimulus. SIGNIFICANCE STATEMENT: Our experience in daily life includes multiple sensory inputs, such as images, sounds, or scents from the surroundings, which all contribute to our understanding of the environment. Here, for the first time, we investigated where and how in the brain information about the natural environment from multiple senses is merged to form modality-independent representations of scene categories.
We show direct decoding of scene categories across sensory modalities from patterns of neural activity in the prefrontal cortex (PFC). We also conclusively tie these neural representations to human categorization behavior by comparing patterns of errors between a neural decoder and behavior. Our findings suggest that PFC is a central hub for integrating sensory information and computing modality-independent representations of scene categories.
|
44
|
Functional specialization of areas along the anterior-posterior axis of the primate prefrontal cortex. Cereb Cortex 2018; 27:3683-3697. [PMID: 27371761 DOI: 10.1093/cercor/bhw190] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Functional specialization of areas along the anterior-posterior axis of the lateral prefrontal cortex has been speculated, but little evidence exists for distinct neurophysiological properties across prefrontal subregions. To address this issue, we divided the lateral prefrontal cortex into a posterior-dorsal, a mid-dorsal, an anterior-dorsal, a posterior-ventral, and an anterior-ventral region. Selectivity for spatial locations, shapes, and colors was evaluated in six monkeys never trained in working memory tasks, while they viewed the stimuli passively. Recordings from over two thousand neurons revealed systematic differences between anterior and posterior regions. In the dorsal prefrontal cortex, anterior regions exhibited the largest receptive fields, longest response latencies, and lowest amount of information for stimuli. In the ventral prefrontal cortex, posterior regions were characterized by a low percentage of neurons responsive to any stimuli we used, consistent with high specialization for stimulus features. Additionally, spatial information was more prominent in dorsal regions and color information in ventral regions. Our results provide neurophysiological evidence for a rostral-caudal gradient of stimulus selectivity through the prefrontal cortex, suggesting that posterior areas are selective for stimuli even when these are not relevant for execution of a task, and that anterior areas are likely engaged in more abstract operations.
|
45
|
A Brain for Speech. Evolutionary Continuity in Primate and Human Auditory-Vocal Processing. Front Neurosci 2018; 12:174. [PMID: 29636657 PMCID: PMC5880940 DOI: 10.3389/fnins.2018.00174] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2017] [Accepted: 03/05/2018] [Indexed: 12/27/2022] Open
Abstract
In this review article, I propose a continuous evolution from the auditory-vocal apparatus and its mechanisms of neural control in non-human primates, to the peripheral organs and the neural control of human speech. Although there is an overall conservatism both in peripheral systems and in central neural circuits, a few changes were critical for the expansion of vocal plasticity and the elaboration of proto-speech in early humans. Two of the most relevant changes were the acquisition of direct cortical control of the vocal fold musculature and the consolidation of an auditory-vocal articulatory circuit, encompassing auditory areas in the temporoparietal junction and prefrontal and motor areas in the frontal cortex. This articulatory loop, also referred to as the phonological loop, enhanced vocal working memory capacity, enabling early humans to learn increasingly complex utterances. The auditory-vocal circuit became progressively coupled to multimodal systems conveying information about objects and events, which gradually led to the acquisition of modern speech. Gestural communication accompanies the development of vocal communication since very early in human evolution, and although both systems co-evolved tightly in the beginning, at some point speech became the main channel of communication.
|
46
|
Do the Different Sensory Areas Within the Cat Anterior Ectosylvian Sulcal Cortex Collectively Represent a Network Multisensory Hub? Multisens Res 2018; 31:793-823. [PMID: 31157160 PMCID: PMC6542292 DOI: 10.1163/22134808-20181316] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
Current theory holds that the numerous functional areas of the cerebral cortex are organized and function as a network. Using connectional databases and computational approaches, the cerebral network has been demonstrated to exhibit a hierarchical structure composed of areas, clusters and, ultimately, hubs. Hubs are highly connected, higher-order regions that also facilitate communication between different sensory modalities. One computationally identified network hub is the visual area of the Anterior Ectosylvian Sulcal cortex (AESc) of the cat. The Anterior Ectosylvian Visual area (AEV) is but one component of the AESc, which also includes the auditory (Field of the Anterior Ectosylvian Sulcus - FAES) and somatosensory (Fourth somatosensory representation - SIV) areas. To better understand the nature of cortical network hubs, the present report reviews the biological features of the AESc. Within the AESc, each area has extensive external cortical connections as well as connections among one another. Each of these core representations is separated by a transition zone characterized by bimodal neurons that share sensory properties of both adjoining core areas. Finally, core and transition zones are underlain by a continuous sheet of layer 5 neurons that project to common output structures. Altogether, these shared properties suggest that the collective AESc region represents a multiple sensory/multisensory cortical network hub. Ultimately, such an interconnected, composite structure adds complexity and biological detail to the understanding of cortical network hubs and their function in cortical processing.
|
47
|
Rostro-caudal Connectional Heterogeneity of the Dorsal Part of the Macaque Prefrontal Area 46. Cereb Cortex 2017; 29:485-504. [DOI: 10.1093/cercor/bhx332] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/31/2017] [Accepted: 11/20/2017] [Indexed: 11/13/2022] Open
|
48
|
Comparison of visual receptive fields in the dorsolateral prefrontal cortex and ventral intraparietal area in macaques. Eur J Neurosci 2017; 46:2702-2712. [DOI: 10.1111/ejn.13740] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2017] [Revised: 10/03/2017] [Accepted: 10/05/2017] [Indexed: 11/28/2022]
|
49
|
A Neural Signature of Divisive Normalization at the Level of Multisensory Integration in Primate Cortex. Neuron 2017; 95:399-411.e8. [PMID: 28728025 DOI: 10.1016/j.neuron.2017.06.043] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2017] [Revised: 06/19/2017] [Accepted: 06/26/2017] [Indexed: 10/19/2022]
Abstract
Studies of multisensory integration by single neurons have traditionally emphasized empirical principles that describe nonlinear interactions between inputs from two sensory modalities. We previously proposed that many of these empirical principles could be explained by a divisive normalization mechanism operating in brain regions where multisensory integration occurs. This normalization model makes a critical diagnostic prediction: a non-preferred sensory input from one modality, which activates the neuron on its own, should suppress the response to a preferred input from another modality. We tested this prediction by recording from neurons in macaque area MSTd that integrate visual and vestibular cues regarding self-motion. We show that many MSTd neurons exhibit the diagnostic form of cross-modal suppression, whereas unisensory neurons in area MT do not. The normalization model also fits population responses better than a model based on subtractive inhibition. These findings provide strong support for a divisive normalization mechanism in multisensory integration.
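The diagnostic prediction tested here falls directly out of the divisive form of the normalization model: a weak, non-preferred input that is excitatory on its own adds little to the recorded neuron's numerator but inflates the shared normalization pool in the denominator, so the bimodal response drops below the preferred-cue response. A toy sketch of this logic (parameter names and values are illustrative, not fitted values from the paper):

```python
def normalized_response(visual_drive, vestibular_drive,
                        w_visual=1.0, w_vestibular=0.1,
                        alpha=1.0, pool_gain=0.5):
    """Divisive normalization of a model multisensory neuron that
    prefers the visual cue.

    Numerator: this neuron's weighted bimodal drive.
    Denominator: semi-saturation constant (alpha) plus the pooled
    population drive, to which BOTH modalities contribute strongly.
    """
    numerator = w_visual * visual_drive + w_vestibular * vestibular_drive
    denominator = alpha + pool_gain * (visual_drive + vestibular_drive)
    return numerator / denominator

r_visual_only = normalized_response(10.0, 0.0)      # preferred cue alone
r_vestibular_only = normalized_response(0.0, 10.0)  # excitatory on its own
r_bimodal = normalized_response(10.0, 10.0)         # add non-preferred cue
# Cross-modal suppression: r_bimodal < r_visual_only, even though the
# vestibular cue alone drives the neuron above baseline.
```

With these illustrative parameters the bimodal response (1.0) falls below the visual-only response (~1.67) while the vestibular-only response stays positive, reproducing in miniature the suppression signature the study reports in MSTd but not in unisensory MT.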
|
50
|
Neural Coding for Action Execution and Action Observation in the Prefrontal Cortex and Its Role in the Organization of Socially Driven Behavior. Front Neurosci 2017; 11:492. [PMID: 28936159 PMCID: PMC5594103 DOI: 10.3389/fnins.2017.00492] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2017] [Accepted: 08/22/2017] [Indexed: 11/13/2022] Open
Abstract
The lateral prefrontal cortex (LPF) plays a fundamental role in planning, organizing, and optimizing behavioral performance. Neuroanatomical and neurophysiological studies have suggested that in this cortical sector, information processing becomes more abstract when moving from caudal to rostral and that such processing involves parietal and premotor areas. We review studies that have shown that the LPF, in addition to its involvement in implementing rules and setting behavioral goals, activates during the execution of forelimb movements even in the absence of a learned relationship between an instruction and its associated motor output. Thus, we propose that the prefrontal cortex is involved in exploiting contextual information for planning and guiding behavioral responses, also in natural situations. Among contextual cues, those provided by others' actions are particularly relevant for social interactions. Functional studies of macaques have demonstrated that the LPF is activated by the observation of biological stimuli, in particular those related to goal-directed actions. We review these studies and discuss the idea that the prefrontal cortex codes high-order representations of observed actions rather than simple visual descriptions of them. Based on evidence that the same sector of the LPF contains both neurons coding own action goals and neurons coding others' goals, we propose that this sector is involved in the selection of own actions appropriate for reacting in a particular social context and for the creation of new action sequences in imitative learning.
|