1
A multimodal interface for speech perception: the role of the left superior temporal sulcus in social cognition and autism. Cereb Cortex 2024; 34:84-93. PMID: 38696598; DOI: 10.1093/cercor/bhae066.
Abstract
Multimodal integration is crucial for human interaction, and in particular for social communication, which relies on integrating information from various sensory modalities. Recently, a third visual pathway specialized in social perception was proposed, in which the right superior temporal sulcus (STS) plays a key role in processing socially relevant cues and high-level social perception. Importantly, it has also recently been proposed that the left STS contributes to audiovisual integration in speech processing. In this article, we propose that the brain areas along the right STS that support multimodal integration for social perception and cognition can be considered homologs of those in the left, language-dominant hemisphere, which sustain multimodal integration of speech and of the semantic concepts fundamental for social communication. Emphasizing the significance of the left STS in multimodal integration and in associated processes such as multimodal attention to socially relevant stimuli, we underscore its potential relevance for understanding neurodevelopmental conditions characterized by challenges in social communication, such as autism spectrum disorder (ASD). Further research into this left lateral processing stream holds the promise of enhancing our understanding of social communication in both typical development and ASD, which may lead to more effective interventions that improve the quality of life of individuals with atypical neurodevelopment.
2
Mapping of facial and vocal processing in common marmosets with ultra-high field fMRI. Commun Biol 2024; 7:317. PMID: 38480875; PMCID: PMC10937914; DOI: 10.1038/s42003-024-06002-1.
Abstract
Primate communication relies on multimodal cues, such as vision and audition, to facilitate the exchange of intentions, enable social interactions, avoid predators, and foster group cohesion during daily activities. Understanding the integration of facial and vocal signals is pivotal for comprehending social interaction. In this study, we acquired whole-brain ultra-high field (9.4 T) fMRI data from awake marmosets (Callithrix jacchus) to explore brain responses to unimodal and combined facial and vocal stimuli. Our findings reveal that the multisensory condition not only intensifies activations in the occipito-temporal face patches and auditory voice patches but also engages a more extensive network, including additional parietal, prefrontal, and cingulate areas, compared with the summed responses of the unimodal conditions. By uncovering the neural network underlying multisensory audiovisual integration in marmosets, this study highlights the efficiency and adaptability of the marmoset brain in processing facial and vocal social signals, providing significant insights into primate social communication.
3
Neuronal Population Encoding of Identity in Primate Prefrontal Cortex. J Neurosci 2024; 44:e0703232023. PMID: 37963766; PMCID: PMC10860606; DOI: 10.1523/jneurosci.0703-23.2023.
Abstract
The ventrolateral prefrontal cortex (VLPFC) shows robust activation during the perception of faces and voices. However, little is known about which categorical features of social stimuli drive neural activity in this region. Because perception of identity and expression are critical social functions, we examined whether neural responses to naturalistic stimuli were driven by these two categorical features in the prefrontal cortex. We recorded single neurons in the VLPFC while two male rhesus macaques (Macaca mulatta) viewed short audiovisual videos of unfamiliar conspecifics making expressions of aggressive, affiliative, and neutral valence. Of the 285 neurons responsive to the audiovisual stimuli, 111 had a main effect (two-way ANOVA) of identity, expression, or their interaction on their stimulus-related firing rates; however, decoding of expression and identity from single-unit firing rates yielded poor accuracy. Interestingly, when decoding from pseudo-populations of recorded neurons, accuracy for both expression and identity increased with population size, suggesting that the population transmitted information relevant to both variables. Principal components analysis of mean population activity across time revealed that population responses to the same identity followed similar trajectories in the response space, facilitating segregation from other identities. Our results suggest that identity is a critical feature of social stimuli that dictates the structure of population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These findings enhance our understanding of the role of the VLPFC in social behavior.
4
Neural Integration of Audiovisual Sensory Inputs in Macaque Amygdala and Adjacent Regions. Neurosci Bull 2023; 39:1749-1761. PMID: 36920645; PMCID: PMC10661144; DOI: 10.1007/s12264-023-01043-8.
Abstract
Integrating multisensory inputs to generate accurate perception and guide behavior is among the most critical functions of the brain. Subcortical regions such as the amygdala are involved in sensory processing, including vision and audition, yet their roles in multisensory integration remain unclear. In this study, we systematically investigated how neurons in the amygdala and adjacent regions integrate audiovisual sensory inputs, using a semi-chronic multi-electrode array and multiple combinations of audiovisual stimuli. From a sample of 332 neurons, we characterized diverse response patterns to audiovisual stimuli and the neural signatures of bimodal over unimodal modulation, which could be classified into four types with distinct regional origins. Using hierarchical clustering, neurons were further grouped into five clusters associated with different integrative functions and sub-regions. Finally, we identified regions that distinguish congruent from incongruent bimodal sensory inputs. Overall, visual processing dominates audiovisual integration in the amygdala and adjacent regions. Our findings shed new light on the neural mechanisms of multisensory integration in the primate brain.
5
Representation of conspecific vocalizations in amygdala of awake marmosets. Natl Sci Rev 2023; 10:nwad194. PMID: 37818111; PMCID: PMC10561708; DOI: 10.1093/nsr/nwad194.
Abstract
Human speech and animal vocalizations are important for social communication and animal survival. Neurons in the auditory pathway respond to a range of sounds, from elementary sound features to complex acoustic patterns. In some species, responses relevant to social communication are highly specific to individual conspecific calls, encompassing both the sound pattern itself and the biological information embedded within it. We conducted single-unit recordings in the amygdala of awake marmosets and presented calls used in marmoset communication, calls of other species, and calls from specific marmoset individuals. We found that some neurons (47/262) in the amygdala distinguished 'Phee' calls from vocalizations of other animals and from other types of marmoset vocalizations. Interestingly, a subset of Phee-responsive neurons (22/47) was also selective for one of three Phees from two different 'caller' marmosets. Our findings suggest that, while the amygdala has traditionally been considered a key structure of the limbic system, it also represents a critical stage of socially relevant auditory perceptual processing.
6
Semantic cognition versus numerical cognition: a topographical perspective. Trends Cogn Sci 2023; 27:993-995. PMID: 37634952; DOI: 10.1016/j.tics.2023.08.004.
Abstract
Semantic cognition and numerical cognition are dissociable faculties with separable neural mechanisms. However, recent advances in the cortical topography of the temporal and parietal lobes have revealed a common organisational principle for the neural representations of semantics and numbers. We discuss their convergence and divergence through the prism of topography.
7
Abstract
An increasingly popular animal model for studying the neural basis of social behavior, cognition, and communication is the common marmoset (Callithrix jacchus). Interest in this New World primate across neuroscience is now being driven by its broadly prosocial repertoire, high volubility, and rapid development, as well as its amenability to naturalistic testing paradigms and to freely moving neural recording and imaging technologies. Together, these characteristics set marmosets up to be a powerful model of the primate social brain in the years to come. Here, we focus on vocal communication because it is the area that has both made the most progress and best illustrates the prodigious potential of this species. We review the current state of the field with a focus on the various brain areas and networks involved in vocal perception and production, comparing findings from marmosets to those from other animals, including humans.
8
Multisensory perception constrains the formation of object categories: a review of evidence from sensory-driven and predictive processes on categorical decisions. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220342. PMID: 37545304; PMCID: PMC10404931; DOI: 10.1098/rstb.2022.0342.
Abstract
Although object categorization is a fundamental cognitive ability, it is also a complex process going beyond the perception and organization of sensory stimulation. Here we review existing evidence about how the human brain acquires and organizes multisensory inputs into object representations that may lead to conceptual knowledge in memory. We first focus on evidence for two processes in object perception: multisensory integration of redundant information (e.g. seeing and feeling a shape) and crossmodal, statistical learning of complementary information (e.g. the 'moo' sound of a cow and its visual shape). For both processes, the importance attributed to each sensory input in constructing a multisensory representation of an object depends on the working range of the specific sensory modality, the relative reliability or distinctiveness of the encoded information, and top-down predictions. Moreover, apart from sensory-driven influences on perception, the acquisition of featural information across modalities can affect semantic memory and, in turn, influence category decisions. In sum, we argue that both multisensory processes independently constrain the formation of object categories across the lifespan, possibly through early and late integration mechanisms, respectively, to allow us to efficiently achieve the everyday, but remarkable, ability of recognizing objects. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
9
Mouse frontal cortex mediates additive multisensory decisions. Neuron 2023; 111:2432-2447.e13. PMID: 37295419; PMCID: PMC10957398; DOI: 10.1016/j.neuron.2023.05.008.
Abstract
The brain can combine auditory and visual information to localize objects. However, the cortical substrates underlying audiovisual integration remain uncertain. Here, we show that mouse frontal cortex combines auditory and visual evidence; that this combination is additive, mirroring behavior; and that it evolves with learning. We trained mice in an audiovisual localization task. Inactivating frontal cortex impaired responses to either sensory modality, while inactivating visual or parietal cortex affected only visual stimuli. Recordings from >14,000 neurons indicated that after task learning, activity in the anterior part of frontal area MOs (secondary motor cortex) additively encodes visual and auditory signals, consistent with the mice's behavioral strategy. An accumulator model applied to these sensory representations reproduced the observed choices and reaction times. These results suggest that frontal cortex adapts through learning to combine evidence across sensory cortices, providing a signal that is transformed into a binary decision by a downstream accumulator.
10
Auditory cortical connectivity in humans. Cereb Cortex 2023; 33:6207-6227. PMID: 36573464; PMCID: PMC10422925; DOI: 10.1093/cercor/bhac496.
Abstract
To understand auditory cortical processing, the effective connectivity between 15 auditory cortical regions and 360 cortical regions was measured in 171 Human Connectome Project participants, and complemented with functional connectivity and diffusion tractography. 1. A hierarchy of auditory cortical processing was identified from Core regions (including A1) to Belt regions LBelt, MBelt, and 52; then to PBelt; and then to HCP A4. 2. A4 has connectivity to anterior temporal lobe TA2, and to HCP A5, which connects to dorsal-bank superior temporal sulcus (STS) regions STGa, STSda, and STSdp. These STS regions also receive visual inputs about moving faces and objects, which are combined with auditory information to help implement multimodal object identification, such as who is speaking, and what is being said. Consistent with this being a "what" ventral auditory stream, these STS regions then have effective connectivity to TPOJ1, STV, PSL, TGv, TGd, and PGi, which are language-related semantic regions connecting to Broca's area, especially BA45. 3. A4 and A5 also have effective connectivity to MT and MST, which connect to superior parietal regions forming a dorsal auditory "where" stream involved in actions in space. Connections of PBelt, A4, and A5 with BA44 may form a language-related dorsal stream.
11
Functional network properties of the auditory cortex. Hear Res 2023; 433:108768. PMID: 37075536; DOI: 10.1016/j.heares.2023.108768.
Abstract
The auditory system transforms auditory stimuli from the external environment into perceptual auditory objects. Recent studies have focused on the contribution of the auditory cortex to this transformation. Other studies have yielded important insights into the contributions of neural activity in the auditory cortex to cognition and decision-making. Despite this important work, however, the relationship between auditory-cortex activity and behavior/perception has not been fully elucidated. Two of the more important gaps in our understanding are (1) the specific and differential contributions of different fields of the auditory cortex to auditory perception and behavior and (2) the way networks of auditory neurons impact and facilitate auditory information processing. Here, we focus on recent work from non-human primate models of hearing, review work related to these gaps, and put forth challenges to further our understanding of how single-unit and network activity in different cortical fields contribute to behavior and perception.
12
F = ma. Is the macaque brain Newtonian? Cogn Neuropsychol 2023; 39:376-408. PMID: 37045793; DOI: 10.1080/02643294.2023.2191843.
Abstract
Intuitive physics, the ability to anticipate how physical events involving objects with mass unfold in time and space, is a central component of intelligent systems. It is a promising tool for gaining insight into mechanisms that generalize across species, because both humans and non-human primates are subject to the same physical constraints when engaging with the environment. Physical reasoning abilities are widely present within the animal kingdom, but monkeys, with acute 3D vision and a high level of dexterity, appreciate and manipulate the physical world in much the same way humans do.
13
An integrative view of human hippocampal function: Differences with other species and capacity considerations. Hippocampus 2023; 33:616-634. PMID: 36965048; DOI: 10.1002/hipo.23527.
Abstract
We describe an integrative model that encodes associations between related concepts in the human hippocampal formation, constituting the skeleton of episodic memories. The model, based on partially overlapping assemblies of "concept cells," contrasts markedly with the well-established notion of pattern separation, which relies on conjunctive, context-dependent single-neuron responses rather than the invariant, context-independent responses found in the human hippocampus. We argue that the model of partially overlapping assemblies is better suited to cope with memory capacity limitations; that the different types of neurons and functions found in this area reflect a flexible and temporary use of the extraordinary machinery of the hippocampus to deal with the task at hand; and that only information that is relevant and frequently revisited is consolidated into long-term hippocampal representations using partially overlapping assemblies. Finally, we propose that concept cells are uniquely human and may constitute the neuronal underpinnings of cognitive abilities that are much further developed in humans than in other species.
14
Human amygdala compared to orbitofrontal cortex connectivity, and emotion. Prog Neurobiol 2023; 220:102385. PMID: 36442728; DOI: 10.1016/j.pneurobio.2022.102385.
Abstract
The amygdala and orbitofrontal cortex have been implicated in emotion. To understand these regions better in humans, their effective connectivity with 360 cortical regions was measured in 171 participants from the Human Connectome Project and complemented with functional connectivity and diffusion tractography. The human amygdala receives effective connectivity from few cortical regions compared with the orbitofrontal cortex: primarily from auditory cortex A5 and the related superior temporal gyrus and temporal pole regions; the piriform (olfactory) cortex; the lateral orbitofrontal cortex 47m; somatosensory cortex; the hippocampus, entorhinal cortex, perirhinal cortex, and parahippocampal TF; and from the cholinergic nucleus basalis. The amygdala has effective connectivity to the hippocampus, entorhinal and perirhinal cortex, the temporal pole, and the lateral orbitofrontal cortex. The orbitofrontal cortex has effective connectivity from gustatory, olfactory, and temporal visual, auditory, and temporal-pole cortex, and to the pregenual anterior and posterior cingulate cortex, the hippocampal system, and the prefrontal cortex, providing for rewards and punishers to be used in reported emotions, and for memory and navigation to goals. Given the paucity of amygdalo-neocortical connectivity in humans, it is proposed that the human amygdala is involved primarily in autonomic and conditioned responses via brainstem connectivity, rather than in reported (declarative) emotion.
15
Socially meaningful visual context either enhances or inhibits vocalisation processing in the macaque brain. Nat Commun 2022; 13:4886. PMID: 35985995; PMCID: PMC9391382; DOI: 10.1038/s41467-022-32512-9.
Abstract
Social interactions rely on the interpretation of semantic and emotional information, often from multiple sensory modalities. Nonhuman primates send and receive auditory and visual communicative signals. However, the neural mechanisms underlying the association of visual and auditory information based on their common social meaning are unknown. Using heart rate estimates and functional neuroimaging, we show that in the lateral and superior temporal sulcus of the macaque monkey, neural responses are enhanced by species-specific vocalisations paired with a matching visual context, or by vocalisations that follow, in time, visual information, but are inhibited when vocalisations are incongruent with the visual context. For example, responses to affiliative vocalisations are enhanced when paired with affiliative contexts but inhibited when paired with aggressive or escape contexts. Overall, we propose that the identified neural network represents social meaning irrespective of sensory modality.
16
Representation of expression and identity by ventral prefrontal neurons. Neuroscience 2022; 496:243-260. PMID: 35654293; PMCID: PMC10363293; DOI: 10.1016/j.neuroscience.2022.05.033.
Abstract
Evidence suggests that the ventrolateral prefrontal cortex (VLPFC) processes social stimuli, including faces and vocalizations, which are essential for communication. Features embedded within audiovisual stimuli, including emotional expression and caller identity, provide abundant information about an individual's intention, emotional state, motivation, and social status, which are important to encode in a social exchange. However, it is unknown to what extent the VLPFC encodes such features. To investigate the role of the VLPFC during social communication, we recorded single-unit activity while rhesus macaques (Macaca mulatta) performed a nonmatch-to-sample task using species-specific face-vocalization stimuli that differed in emotional expression or caller identity. Seventy-five percent of recorded cells were task-related, and of these >70% were responsive during the nonmatch period. A larger proportion of nonmatch cells encoded the stimulus rather than the context of the trial type. A subset of responsive neurons was modulated most commonly by the identity of the nonmatch stimulus, less often by its emotional expression, and sometimes by both features of the face-vocalization stimuli presented during the nonmatch period. Identity-encoding neurons were found across a broader region of the VLPFC than expression-related cells, which were confined to the anterolateral portion of the recording chamber. These findings suggest that, within a working memory paradigm, the VLPFC processes features of face and vocal stimuli, such as emotional expression and identity, in addition to task and contextual information. Thus, stimulus and contextual information may be integrated by the VLPFC during social communication.
17
Structural Brain Asymmetries for Language: A Comparative Approach across Primates. Symmetry (Basel) 2022. DOI: 10.3390/sym14050876.
Abstract
Humans are the only species that can speak. Nonhuman primates, however, share some ‘domain-general’ cognitive properties that are essential to language processes. Whether these shared cognitive properties are the result of continuous evolution (homologies) or of convergent evolution (analogies) remains difficult to demonstrate. Comparing their underlying structure, the brain, to determine its similarity or divergence across species is therefore critical to assessing which of the two hypotheses is the more likely. Key areas associated with language processes are the Planum Temporale, Broca’s Area, the Arcuate Fasciculus, the Cingulate Sulcus, the Insula, the Superior Temporal Sulcus, the Inferior Parietal lobe, and the Central Sulcus. These structures share a fundamental feature: they are functionally and structurally specialised to one hemisphere. Interestingly, several nonhuman primate species, such as chimpanzees and baboons, show human-like structural brain asymmetries in areas homologous to key language regions. The question then arises: for what function did these asymmetries arise in non-linguistic primates, if not for language per se? In an attempt to provide some answers, we review the literature on the lateralisation of the gestural communication system, which may represent the missing behavioural link to brain asymmetries in homologues of language areas in our common ancestor.
18
Faces and Voices Processing in Human and Primate Brains: Rhythmic and Multimodal Mechanisms Underlying the Evolution and Development of Speech. Front Psychol 2022; 13:829083. PMID: 35432052; PMCID: PMC9007199; DOI: 10.3389/fpsyg.2022.829083.
Abstract
While influential works since the 1970s have widely assumed that imitation is an innate skill in both human and non-human primate neonates, recent empirical studies and meta-analyses have challenged this view, indicating other forms of reward-based learning as relevant factors in the development of social behavior. The translation of visual input into matching motor output that underlies imitation abilities instead seems to develop along with social interactions and sensorimotor experience during infancy and childhood. Recently, a new visual stream has been identified in both human and non-human primate brains, updating the dual visual stream model. This third pathway is thought to be specialized for dynamic aspects of social perception, such as eye gaze and facial expression, and crucially for audio-visual integration of speech. Here, we review empirical studies addressing an understudied but crucial aspect of speech and communication, namely the processing of visual orofacial cues (i.e., the perception of a speaker's lip and tongue movements) and its integration with vocal auditory cues. Throughout this review, we offer new insights from our understanding of speech as the product of the evolution and development of a rhythmic and multimodal organization of sensorimotor brain networks, supporting volitional motor control of the upper vocal tract and audio-visual voice-face integration.
19
Parallel functional subnetworks embedded in the macaque face patch system. Sci Adv 2022; 8:eabm2054. PMID: 35263138; PMCID: PMC8906740; DOI: 10.1126/sciadv.abm2054.
Abstract
During normal vision, our eyes provide the brain with a continuous stream of useful information about the world. How visually specialized areas of the cortex, such as face-selective patches, operate under natural modes of behavior is poorly understood. Here we report that, during the free viewing of movies, cohorts of face-selective neurons in the macaque cortex fractionate into distributed and parallel subnetworks that carry distinct information. We classified neurons into functional groups on the basis of their movie-driven coupling with functional magnetic resonance imaging time courses across the brain. Neurons from each group were distributed across multiple face patches but intermixed locally with other groups at each recording site. These findings challenge prevailing views about functional segregation in the cortex and underscore the importance of naturalistic paradigms for cognitive neuroscience.
20
Prefrontal cortex interactions with the amygdala in primates. Neuropsychopharmacology 2022; 47:163-179. PMID: 34446829; PMCID: PMC8616954; DOI: 10.1038/s41386-021-01128-w.
Abstract
This review addresses functional interactions between the primate prefrontal cortex (PFC) and the amygdala, with emphasis on their contributions to behavior and cognition. The interplay between these two telencephalic structures contributes to adaptive behavior and to the evolutionary success of all primate species. In our species, dysfunction in this circuitry creates vulnerabilities to psychopathologies. Here, we describe amygdala-PFC contributions to behaviors that have direct relevance to Darwinian fitness: learned approach and avoidance, foraging, predator defense, and social signaling, which have in common the need for flexibility and sensitivity to specific and rapidly changing contexts. Examples include the prediction of positive outcomes, such as food availability, food desirability, and various social rewards, or of negative outcomes, such as threats of harm from predators or conspecifics. To promote fitness optimally, these stimulus-outcome associations need to be rapidly updated when an associative contingency changes or when the value of a predicted outcome changes. We review evidence from nonhuman primates implicating the PFC, the amygdala, and their functional interactions in these processes, with links to experimental work and clinical findings in humans where possible.
21
Abstract
Face perception is a socially important but complex process with many stages and many facets. Substantial evidence from many sources indicates that it involves a large extent of the temporal lobe, from the ventral occipitotemporal cortex and superior temporal sulci to anterior temporal regions. While early human neuroimaging work suggested a core face network consisting of the occipital face area, fusiform face area, and posterior superior temporal sulcus, studies in both humans and monkeys show a system of face patches stretching from posterior to anterior in both the superior temporal sulcus and inferotemporal cortex. Sophisticated techniques such as fMRI adaptation have shown that these face-activated regions exhibit responses with many of the attributes of human face processing. Lesions of some of these regions in humans lead to variants of prosopagnosia, the inability to recognize the identity of a face. Lesion, imaging, and electrophysiologic data all suggest a segregation between identity and expression processing, though some evidence suggests this may be better characterized as a distinction between static and dynamic facial information.
22
Abstract
It is now widely accepted that the bulk of animal communication is conducted via several modalities, e.g. acoustic and visual, either simultaneously or sequentially. This is a laudable multimodal turn relative to traditional accounts of temporal aspects of animal communication, which have focused on a single modality at a time. However, the fields currently contributing to the study of multimodal communication are highly varied and still largely disconnected, given their sole focus on a particular level of description or their particular concern with human or non-human animals. Here, we provide an integrative overview of converging findings that show how multimodal processes occurring at neural, bodily, and social interactional levels each contribute uniquely to the complex rhythms that characterize communication in human and non-human animals. Though we address findings for each of these levels independently, we conclude that the most important challenge in this field is to identify how processes at these different levels connect. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
23
Abstract
To understand ecologically meaningful social behaviors and their neural substrates in humans and other animals, researchers have used a variety of social stimuli in the laboratory with the goal of extracting specific processes that occur in real-life scenarios. However, certain stimuli may not be sufficiently effective at evoking typical social behaviors and neural responses. Here, we review empirical research employing different types of social stimuli by classifying them into five levels of naturalism. We describe the advantages and limitations of each level and provide selected example studies. We emphasize the important trade-off between experimental control and ecological validity across the five levels of naturalism. Taking advantage of newly emerging tools, such as real-time videos, virtual avatars, and wireless neural sampling techniques, researchers are now more able than ever to adopt social stimuli at a higher level of naturalism to better capture the dynamics and contingency of real-life social interaction.
24
Face and voice perception: Monkey see, monkey hear. Curr Biol 2021; 31:R435-R437. [PMID: 33974868] [DOI: 10.1016/j.cub.2021.02.060] [Indexed: 10/21/2022]
Abstract
Primate brains contain specialized areas for perceiving social cues. New research shows that only some of these areas integrate visual faces with auditory voices.