1. Sensorimotor regulation of facial expression - an untouched frontier. Neurosci Biobehav Rev 2024; 162:105684. PMID: 38710425. DOI: 10.1016/j.neubiorev.2024.105684.
Abstract
Facial expression is a critical form of nonverbal social communication which promotes emotional exchange and affiliation among humans. Facial expressions are generated via precise contraction of the facial muscles, guided by sensory feedback. While the neural pathways underlying facial motor control are well characterized in humans and primates, it remains unknown how tactile and proprioceptive information reaches these pathways to guide facial muscle contraction. Thus, despite the importance of facial expressions for social functioning, little is known about how they are generated as a unique sensorimotor behavior. In this review, we highlight current knowledge about sensory feedback from the face and how it is distinct from other body regions. We describe connectivity between the facial sensory and motor brain systems, and call attention to the other brain systems which influence facial expression behavior, including vision, gustation, emotion, and interoception. Finally, we petition for more research on the sensory basis of facial expressions, asserting that incomplete understanding of sensorimotor mechanisms is a barrier to addressing atypical facial expressivity in clinical populations.
2. Neuronal Population Encoding of Identity in Primate Prefrontal Cortex. J Neurosci 2024; 44:e0703232023. PMID: 37963766. PMCID: PMC10860606. DOI: 10.1523/jneurosci.0703-23.2023.
Abstract
The ventrolateral prefrontal cortex (VLPFC) shows robust activation during the perception of faces and voices. However, little is known about which categorical features of social stimuli drive neural activity in this region. Since the perception of identity and expression are critical social functions, we examined whether neural responses to naturalistic stimuli were driven by these two categorical features in the prefrontal cortex. We recorded single neurons in the VLPFC while two male rhesus macaques (Macaca mulatta) viewed short audiovisual videos of unfamiliar conspecifics making expressions of aggressive, affiliative, and neutral valence. Of the 285 neurons responsive to the audiovisual stimuli, 111 had a main effect (two-way ANOVA) of identity, expression, or their interaction in their stimulus-related firing rates; however, decoding of expression and identity from single-unit firing rates yielded poor accuracy. Interestingly, when decoding from pseudo-populations of recorded neurons, the accuracy for both expression and identity increased with population size, suggesting that the population transmitted information relevant to both variables. Principal components analysis of mean population activity across time revealed that population responses to the same identity followed similar trajectories in the response space, facilitating segregation from other identities. Our results suggest that identity is a critical feature of social stimuli that dictates the structure of population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These findings enhance our understanding of the role of the VLPFC in social behavior.
3. Multisensory interactions of face and vocal information during perception and memory in ventrolateral prefrontal cortex. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220343. PMID: 37545305. PMCID: PMC10404928. DOI: 10.1098/rstb.2022.0343.
Abstract
The ventral frontal lobe is a critical node in the circuit that underlies communication, a multisensory process where sensory features of faces and vocalizations come together. The neural basis of face and vocal integration is a topic of great importance since the integration of multiple sensory signals is essential for the decisions that govern our social interactions. Investigations have shown that the macaque ventrolateral prefrontal cortex (VLPFC), a proposed homologue of the human inferior frontal gyrus, is involved in the processing, integration and remembering of audiovisual signals. Single neurons in VLPFC encode and integrate species-specific faces and corresponding vocalizations. During working memory, VLPFC neurons maintain face and vocal information online and exhibit selective activity for face and vocal stimuli. Population analyses indicate that identity, a critical feature of social stimuli, is encoded by VLPFC neurons and dictates the structure of dynamic population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These studies suggest that VLPFC may play a primary role in integrating face and vocal stimuli with contextual information, in order to support decision making during social communication. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
4. Representation of expression and identity by ventral prefrontal neurons. Neuroscience 2022; 496:243-260. PMID: 35654293. PMCID: PMC10363293. DOI: 10.1016/j.neuroscience.2022.05.033.
Abstract
Evidence has suggested that the ventrolateral prefrontal cortex (VLPFC) processes social stimuli, including faces and vocalizations, which are essential for communication. Features embedded within audiovisual stimuli, including emotional expression and caller identity, provide abundant information about an individual's intention, emotional state, motivation, and social status, which are important to encode in a social exchange. However, it is unknown to what extent the VLPFC encodes such features. To investigate the role of VLPFC during social communication, we recorded single-unit activity while rhesus macaques (Macaca mulatta) performed a nonmatch-to-sample task using species-specific face-vocalization stimuli that differed in emotional expression or caller identity. Seventy-five percent of recorded cells were task-related, and of these, more than 70% were responsive during the nonmatch period. A larger proportion of nonmatch cells encoded the stimulus rather than the context of the trial type. Responsive neurons were most commonly modulated by the identity of the nonmatch stimulus, less often by its emotional expression, and in some cases by both features of the face-vocalization stimuli presented during the nonmatch period. Neurons encoding identity were found across a broader region of VLPFC than expression-related cells, which were confined to the anterolateral portion of the recording chamber. These findings suggest that, within a working memory paradigm, VLPFC processes features of face and vocal stimuli, such as emotional expression and identity, in addition to task and contextual information. Thus, stimulus and contextual information may be integrated by VLPFC during social communication.
5. The cortical and subcortical correlates of face pareidolia in the macaque brain. Soc Cogn Affect Neurosci 2022; 17:965-976. PMID: 35445247. PMCID: PMC9629476. DOI: 10.1093/scan/nsac031.
Abstract
Face detection is a foundational social skill for primates. This vital function is thought to be supported by specialized neural mechanisms; however, although several face-selective regions have been identified in both humans and nonhuman primates, there is no consensus about which region(s) are involved in face detection. Here, we used naturally occurring errors of face detection (i.e. objects with illusory facial features referred to as examples of 'face pareidolia') to identify regions of the macaque brain implicated in face detection. Using whole-brain functional magnetic resonance imaging to test awake rhesus macaques, we discovered that a subset of face-selective patches in the inferior temporal cortex, on the lower lateral edge of the superior temporal sulcus, and the amygdala respond more to objects with illusory facial features than matched non-face objects. Multivariate analyses of the data revealed differences in the representation of illusory faces across the functionally defined regions of interest. These differences suggest that the cortical and subcortical face-selective regions contribute uniquely to the detection of facial features. We conclude that face detection is supported by a multiplexed system in the primate brain.
6. Visual response of ventrolateral prefrontal neurons and their behavior-related modulation. Sci Rep 2021; 11:10118. PMID: 33980932. PMCID: PMC8115110. DOI: 10.1038/s41598-021-89500-0.
Abstract
The ventral part of the lateral prefrontal cortex (VLPF) of the monkey receives strong visual input, mainly from inferotemporal cortex. It has been shown that VLPF neurons can show visual responses during paradigms requiring the association of arbitrary visual cues with behavioral reactions. Further studies showed that there are also VLPF neurons responding to the presentation of specific visual stimuli, such as objects and faces. However, it is largely unknown whether VLPF neurons respond to, and differentiate between, stimuli belonging to different categories in the absence of a specific requirement to actively categorize these stimuli or to exploit them for choosing a given behavior. The first aim of the present study is to evaluate and map the responses of neurons of a large sector of VLPF to a wide set of visual stimuli when monkeys simply observe them. Recent studies showed that visual responses to objects are also present in VLPF neurons coding action execution, when the objects are the target of the action. Thus, the second aim of the present study is to compare the visual responses of VLPF neurons when the same objects are simply observed or when they become the target of a grasping action. Our results indicate that: (1) part of the visually responsive VLPF neurons respond specifically to one stimulus or to a small set of stimuli, but there is no indication of a “passive” categorical coding; (2) VLPF neuronal visual responses to objects are often modulated by the task conditions in which the object is observed, with the strongest response when the object is the target of an action. These data indicate that VLPF performs an early passive description of several types of visual stimuli, which can then be used for organizing and planning behavior. This could explain the modulation of the visual response both in associative learning and in natural behavior.
7. The macaque face patch system: a turtle’s underbelly for the brain. Nat Rev Neurosci 2020; 21:695-716. DOI: 10.1038/s41583-020-00393-w.
8. Damage to Orbitofrontal Areas 12 and 13, but Not Area 14, Results in Blunted Attention and Arousal to Socioemotional Stimuli in Rhesus Macaques. Front Behav Neurosci 2020; 14:150. PMID: 33093825. PMCID: PMC7506161. DOI: 10.3389/fnbeh.2020.00150.
Abstract
An earlier study in monkeys indicated that lesions to the mid-portion of the ventral orbitofrontal cortex (OFC), including Walker’s areas 11 and 13 (OFC11/13), altered the spontaneous scanning of still pictures of primate faces (neutral and emotional) and the modulation of arousal. Yet, these conclusions were limited by several shortcomings, including the lesion approach, use of static rather than dynamic stimuli, and manual data analyses. To confirm and extend these earlier findings, we compared attention and arousal to social and nonsocial scenes in three groups of rhesus macaques with restricted lesions to one of three OFC areas (OFC12, OFC13, or OFC14) and a sham-operated control group using eye-tracking to capture scanning patterns, focal attention and pupil size. Animals with damage to the lateral OFC areas (OFC12 and OFC13) showed decreased attention specifically to the eyes of negative (threatening) social stimuli and increased arousal (increased pupil diameter) to positive social scenes. In contrast, animals with damage to the ventromedial OFC area (OFC14) displayed no differences in attention or arousal in the presence of social stimuli compared to controls. These findings support the notion that areas of the lateral OFC are critical for directing attention and modulating arousal to emotional social cues. Together with the existence of face-selective neurons in these lateral OFC areas, the data suggest that the lateral OFC may set the stage for multidimensional information processing related to faces and emotion and may be involved in social judgments.
9.
Abstract
Perceiving social and emotional information from faces is a critical primate skill. For this purpose, primates evolved dedicated cortical architecture, especially in occipitotemporal areas, utilizing face-selective cells. Face-selective neurons are also present in the orbitofrontal cortex (OFC) but are less well understood, and they are the object of this study. We examined 179 face-selective cells in the lateral sulcus of the OFC by characterizing their responses to a rich set of photographs of conspecific faces varying in age, gender, and facial expression. Principal component analysis and unsupervised cluster analysis of the stimulus space both revealed that face cells encode face dimensions for social categories and emotions. Strongly represented categories were facial expressions (grin and threat versus lip smack), juvenile monkeys, and female monkeys. Cluster analyses of a control population of nearby cells lacking face selectivity did not categorize face stimuli in a meaningful way, suggesting that only face-selective cells directly support face categorization in OFC. Time course analyses of face cell activity from stimulus onset showed that faces were discriminated from nonfaces early, followed by within-face categorization of social and emotional content (i.e., juvenile and facial expression). Face cells showed no response to acoustic stimuli such as vocalizations and were poorly modulated by vocalizations added to faces. Neuronal responses remained stable when faces were paired with positive or negative reinforcement, implying that face cells encode social information but not learned reward value associated with faces. Overall, our results shed light on a substantial role of the OFC in the characterization of facial information bearing on social and emotional behavior.
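The analysis pipeline named in this abstract (dimensionality reduction of the stimulus space followed by unsupervised clustering) can be sketched as below. This is a generic toy, not the authors' code: the response matrix, category structure, and all parameters are synthetic assumptions.

```python
# Rough sketch of PCA + unsupervised clustering of a stimulus space.
# Synthetic data: stimuli of the same category evoke similar population
# responses, so clusters recovered from responses align with categories.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_cells, per_cat = 50, 10
categories = ["threat", "grin", "lipsmack"]  # illustrative labels only

# One response "center" per category, plus per-stimulus noise.
centers = rng.normal(0.0, 3.0, (len(categories), n_cells))
responses = np.vstack([
    c + rng.normal(0.0, 1.0, (per_cat, n_cells)) for c in centers
])
true_labels = np.repeat(np.arange(len(categories)), per_cat)

# Reduce the stimulus space, then cluster stimuli by population response.
pcs = PCA(n_components=3).fit_transform(responses)
pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)

# With well-separated categories, each cluster should be category-pure
# (up to an arbitrary relabeling of cluster indices).
for k in range(3):
    members = true_labels[pred == k]
    print(f"cluster {k}: categories present = {set(members.tolist())}")
```

The abstract's control analysis corresponds to repeating this on non-face-selective cells, where the clusters would no longer align with the stimulus categories.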
10.
Abstract
In humans and monkeys, face perception activates a distributed cortical network that includes extrastriate, limbic, and prefrontal regions. Within face-responsive regions, emotional faces evoke stronger responses than neutral faces ("valence effect"). We used fMRI and Dynamic Causal Modeling (DCM) to test the hypothesis that emotional faces differentially alter the functional coupling among face-responsive regions. Three monkeys viewed conspecific faces with neutral, threatening, fearful, and appeasing expressions. Using Bayesian model selection, various models of neural interactions between the posterior (TEO) and anterior (TE) portions of inferior temporal (IT) cortex, the amygdala, the orbitofrontal (OFC), and ventrolateral prefrontal cortex (VLPFC) were tested. The valence effect was mediated by feedback connections from the amygdala to TE and TEO, and feedback connections from VLPFC to the amygdala and TE. Emotional faces were associated with differential effective connectivity: Fearful faces evoked stronger modulations in the connections from the amygdala to TE and TEO; threatening faces evoked weaker modulations in the connections from the amygdala and VLPFC to TE; and appeasing faces evoked weaker modulations in the connection from VLPFC to the amygdala. Our results suggest dynamic alterations in neural coupling during the perception of behaviorally relevant facial expressions that are vital for social communication.
11. Neural Coding for Action Execution and Action Observation in the Prefrontal Cortex and Its Role in the Organization of Socially Driven Behavior. Front Neurosci 2017; 11:492. PMID: 28936159. PMCID: PMC5594103. DOI: 10.3389/fnins.2017.00492.
Abstract
The lateral prefrontal cortex (LPF) plays a fundamental role in planning, organizing, and optimizing behavioral performance. Neuroanatomical and neurophysiological studies have suggested that in this cortical sector, information processing becomes more abstract when moving from caudal to rostral and that such processing involves parietal and premotor areas. We review studies showing that the LPF, in addition to its involvement in implementing rules and setting behavioral goals, is active during the execution of forelimb movements even in the absence of a learned relationship between an instruction and its associated motor output. Thus, we propose that the prefrontal cortex is involved in exploiting contextual information for planning and guiding behavioral responses, even in natural situations. Among contextual cues, those provided by others' actions are particularly relevant for social interactions. Functional studies of macaques have demonstrated that the LPF is activated by the observation of biological stimuli, in particular those related to goal-directed actions. We review these studies and discuss the idea that the prefrontal cortex codes high-order representations of observed actions rather than simple visual descriptions of them. Based on evidence that the same sector of the LPF contains both neurons coding own action goals and neurons coding others' goals, we propose that this sector is involved in selecting one's own actions appropriate for reacting in a particular social context and in creating new action sequences during imitative learning.
12. Neuronal Encoding of Self and Others' Head Rotation in the Macaque Dorsal Prefrontal Cortex. Sci Rep 2017; 7:8571. PMID: 28819117. PMCID: PMC5561028. DOI: 10.1038/s41598-017-08936-5.
Abstract
Following gaze is a crucial skill in primates for understanding where and at what others are looking, and it often requires head rotation. The neural basis underlying head rotation is thought to overlap with the parieto-frontal attention/gaze-shift network. Here, we show that a set of neurons in the monkey’s Brodmann area 9/46dr (BA 9/46dr), which is involved in orienting processes and joint attention, becomes active during self head rotation and that the activity of these neurons cannot be accounted for by saccade-related activity (head-rotation neurons). Another set of BA 9/46dr neurons encodes head rotation performed by an observed agent facing the monkey (visually triggered neurons). Among these latter neurons, almost half exhibit the intriguing property of encoding both execution and observation of head rotation (mirror-like neurons). Finally, by means of neuronal tracing techniques, we showed that BA 9/46dr takes part in two distinct networks: a dorso/mesial network, playing a role in spatial head/gaze orientation, and a ventrolateral network, likely involved in processing social stimuli and mirroring others’ heads. The overall results of this study provide a new, comprehensive picture of the role of BA 9/46dr in encoding self and others’ head rotation, likely playing a role in head-following behaviors.
13. Two different mirror neuron networks: The sensorimotor (hand) and limbic (face) pathways. Neuroscience 2017; 358:300-315. PMID: 28687313. DOI: 10.1016/j.neuroscience.2017.06.052.
Abstract
The vast majority of functional studies investigating mirror neurons (MNs) have explored their properties in relation to hand actions, and very few have investigated how MNs respond to mouth actions or communicative gestures. Since hand and mouth MNs were recorded in two partially overlapping sectors of the ventral precentral cortex of the macaque monkey, there is a general assumption that they share the same neuroanatomical network, with the parietal cortex as the main source of visual information. In the current review, we challenge this perspective and describe the connectivity pattern of the mouth MN sector. The mouth MN F5/opercular region is connected with premotor and parietal areas mostly related to the somatosensory and motor representation of the face/mouth, and with area PrCO, involved in processing gustatory and somatosensory intraoral input. Unlike hand MNs, mouth MNs do not receive their visual input from parietal regions. Such information, related to face/communicative behaviors, could come from the ventrolateral prefrontal cortex. Further strong connections derive from limbic structures involved in encoding emotional facial expressions and motivational/reward processing. These brain structures include the anterior cingulate cortex, the anterior and mid-dorsal insula, the orbitofrontal cortex, and the basolateral amygdala. The mirror mechanism is therefore composed of and supported by at least two different anatomical pathways: one is concerned with sensorimotor transformation in relation to reaching and hand grasping within the traditional parietal-premotor circuits; the second is linked to mouth/face motor control and is connected with limbic structures involved in communication/emotions and reward processing.
14. Action observation activates neurons of the monkey ventrolateral prefrontal cortex. Sci Rep 2017; 7:44378. PMID: 28290511. PMCID: PMC5349536. DOI: 10.1038/srep44378.
Abstract
Prefrontal cortex is crucial for exploiting contextual information in the planning and guidance of behavioral responses. Among contextual cues, those provided by others’ behavior are particularly important in primates for selecting appropriate reactions and suppressing inappropriate ones. These latter functions rely deeply on the ability to understand others’ actions. However, it is largely unknown whether prefrontal neurons are activated by action observation. To address this issue, we recorded the activity of ventrolateral prefrontal (VLPF) neurons of macaque monkeys during the observation of videos depicting biological movements performed by a monkey or a human agent, as well as object motion. Our results show that a population of VLPF neurons responds to the observation of biological movements, in particular those representing goal-directed actions. Many of these neurons also show a preference for the agent performing the action. The neural response is also present when part of the observed movement is obscured, suggesting that these VLPF neurons code a high-order representation of the observed action rather than a simple visual description of it.
15. Neural circuits in auditory and audiovisual memory. Brain Res 2016; 1640:278-288. PMID: 26656069. PMCID: PMC4868791. DOI: 10.1016/j.brainres.2015.11.042.
Abstract
Working memory is the ability to employ recently seen or heard stimuli and apply them to a changing cognitive context. Although much is known about language processing and visual working memory, the neurobiological basis of auditory working memory is less clear. Historically, part of the problem has been the difficulty of obtaining a robust animal model in which to study auditory short-term memory. In recent years, neurophysiological and lesion studies have indicated a cortical network involving both temporal and frontal cortices. Studies specifically targeting the role of the prefrontal cortex (PFC) in auditory working memory have suggested that dorsal and ventral prefrontal regions perform different roles during the processing of auditory mnemonic information, with the dorsolateral PFC performing similar functions for both auditory and visual working memory. In contrast, the ventrolateral PFC (VLPFC), which contains cells that respond robustly to auditory stimuli and that process both face and vocal stimuli, may be an essential locus for both auditory and audiovisual working memory. These findings suggest a critical role for the VLPFC in processing, integrating, and retaining communication information. This article is part of a Special Issue entitled SI: Auditory working memory.
16. Functional differences in face processing between the amygdala and ventrolateral prefrontal cortex in monkeys. Neuroscience 2015. DOI: 10.1016/j.neuroscience.2015.07.047.
17. Inactivation of Primate Prefrontal Cortex Impairs Auditory and Audiovisual Working Memory. J Neurosci 2015; 35:9666-9675. PMID: 26134649. PMCID: PMC4571503. DOI: 10.1523/jneurosci.1218-15.2015.
Abstract
The prefrontal cortex is associated with cognitive functions that include planning, reasoning, decision-making, working memory, and communication. Neurophysiology and neuropsychology studies have established that the dorsolateral prefrontal cortex is essential in spatial working memory, while the ventral frontal lobe processes language and communication signals. Single-unit recordings in nonhuman primates have shown that ventrolateral prefrontal (VLPFC) neurons integrate face and vocal information and are active during audiovisual working memory. However, whether VLPFC is essential in remembering face and voice information is unknown. We therefore trained nonhuman primates in an audiovisual working memory paradigm using naturalistic face-vocalization movies as memoranda. We inactivated VLPFC with reversible cortical cooling and examined performance when faces, vocalizations, or both faces and vocalizations had to be remembered. We found that VLPFC inactivation impaired subjects' performance in audiovisual and auditory-alone versions of the task. In contrast, VLPFC inactivation did not disrupt visual working memory. Our studies demonstrate the importance of VLPFC in auditory and audiovisual working memory for social stimuli but suggest a different role for VLPFC in unimodal visual processing. SIGNIFICANCE STATEMENT The ventral frontal lobe, or inferior frontal gyrus, plays an important role in audiovisual communication in the human brain. Studies with nonhuman primates have found that neurons within ventrolateral prefrontal cortex (VLPFC) encode both faces and vocalizations and that VLPFC is active when animals need to remember these social stimuli. In the present study, we temporarily inactivated VLPFC by cooling the cortex while nonhuman primates performed a working memory task. This impaired the ability of subjects to remember a face and vocalization pair or just the vocalization alone. Our work highlights the importance of the primate VLPFC in the processing of faces and vocalizations in a manner that is similar to the inferior frontal gyrus in the human brain.
18.
Abstract
Increasing evidence has shown that oxytocin (OT), a mammalian hormone, modifies the way social stimuli are perceived and the way they affect behavior. Thus, OT may serve as a treatment for psychiatric disorders, many of which are characterized by dysfunctional social behavior. To explore the neural mechanisms mediating the effects of OT in macaque monkeys, we investigated whether OT would modulate functional magnetic resonance imaging (fMRI) responses in face-responsive regions (faces vs. blank screen) evoked by the perception of various facial expressions (neutral, fearful, aggressive, and appeasing). In the placebo condition, we found significantly increased activation for emotional (mainly fearful and appeasing) faces compared with neutral faces across the face-responsive regions. OT selectively, and differentially, altered fMRI responses to emotional expressions, significantly reducing responses to both fearful and aggressive faces in face-responsive regions while leaving responses to appeasing as well as neutral faces unchanged. We also found that OT administration selectively reduced functional coupling between the amygdala and areas in the occipital and inferior temporal cortex during the viewing of fearful and aggressive faces, but not during the viewing of neutral or appeasing faces. Taken together, our results indicate homologies between monkeys and humans in the neural circuits mediating the effects of OT. Thus, the monkey may be an ideal animal model to explore the development of OT-based pharmacological strategies for treating patients with dysfunctional social behavior.
19. Structure and function of the middle temporal visual area (MT) in the marmoset: Comparisons with the macaque monkey. Neurosci Res 2015; 93:62-71. DOI: 10.1016/j.neures.2014.09.012.
|
20
|
Dynamic faces speed up the onset of auditory cortical spiking responses during vocal detection. Proc Natl Acad Sci U S A 2013; 110:E4668-77. [PMID: 24218574 DOI: 10.1073/pnas.1312518110] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
How low-level sensory areas help mediate the detection and discrimination advantages of integrating faces and voices is the subject of intense debate. To gain insights, we investigated the role of the auditory cortex in face/voice integration in macaque monkeys performing a vocal-detection task. Behaviorally, subjects were slower to detect vocalizations as the signal-to-noise ratio decreased, but seeing mouth movements associated with vocalizations sped up detection. Paralleling this behavioral relationship, as the signal-to-noise ratio decreased, the onsets of spiking responses were delayed and their magnitudes decreased. However, when mouth motion accompanied the vocalization, these responses were uniformly faster. Conversely, and at odds with previous assumptions regarding the neural basis of face/voice integration, changes in the magnitude of neural responses were not related consistently to audiovisual behavior. Taken together, our data reveal that facilitation of spike latency is a means by which the auditory cortex partially mediates the reaction time benefits of combining faces and voices.
|
21
|
Motor excitability during visual perception of known and unknown spoken languages. BRAIN AND LANGUAGE 2013; 126:1-7. [PMID: 23644583 PMCID: PMC3682190 DOI: 10.1016/j.bandl.2013.03.002] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 07/16/2012] [Revised: 02/15/2013] [Accepted: 03/19/2013] [Indexed: 06/02/2023]
Abstract
It is possible to comprehend speech and discriminate languages by viewing a speaker's articulatory movements. Transcranial magnetic stimulation studies have shown that viewing speech enhances excitability in the articulatory motor cortex. Here, we investigated the specificity of this enhanced motor excitability in native and non-native speakers of English. Both groups were able to discriminate between speech movements related to a known (i.e., English) and unknown (i.e., Hebrew) language. The motor excitability was higher during observation of a known language than an unknown language or non-speech mouth movements, suggesting that motor resonance is enhanced specifically during observation of mouth movements that convey linguistic information. Surprisingly, however, the excitability was equally high during observation of a static face. Moreover, the motor excitability did not differ between native and non-native speakers. These findings suggest that the articulatory motor cortex processes several kinds of visual cues during speech communication.
|
22
|
The ventral visual pathway: an expanded neural framework for the processing of object quality. Trends Cogn Sci 2012; 17:26-49. [PMID: 23265839 DOI: 10.1016/j.tics.2012.10.011] [Citation(s) in RCA: 656] [Impact Index Per Article: 54.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2012] [Revised: 10/24/2012] [Accepted: 10/29/2012] [Indexed: 01/01/2023]
Abstract
Since the original characterization of the ventral visual pathway, our knowledge of its neuroanatomy, functional properties, and extrinsic targets has grown considerably. Here we synthesize this recent evidence and propose that the ventral pathway is best understood as a recurrent occipitotemporal network containing neural representations of object quality both utilized and constrained by at least six distinct cortical and subcortical systems. Each system serves its own specialized behavioral, cognitive, or affective function, collectively providing the raison d'être for the ventral visual pathway. This expanded framework contrasts with the depiction of the ventral visual pathway as a largely serial staged hierarchy culminating in singular object representations and more parsimoniously incorporates attentional, contextual, and feedback effects.
|
23
|
Integration of faces and vocalizations in ventral prefrontal cortex: implications for the evolution of audiovisual speech. Proc Natl Acad Sci U S A 2012; 109 Suppl 1:10717-24. [PMID: 22723356 DOI: 10.1073/pnas.1204335109] [Citation(s) in RCA: 64] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
The integration of facial gestures and vocal signals is an essential process in human communication and relies on an interconnected circuit of brain regions, including language regions in the inferior frontal gyrus (IFG). Studies have determined that ventral prefrontal cortical regions in macaques [e.g., the ventrolateral prefrontal cortex (VLPFC)] share similar cytoarchitectonic features as cortical areas in the human IFG, suggesting structural homology. Anterograde and retrograde tracing studies show that macaque VLPFC receives afferents from the superior and inferior temporal gyrus, which provide complex auditory and visual information, respectively. Moreover, physiological studies have shown that single neurons in VLPFC integrate species-specific face and vocal stimuli. Although bimodal responses may be found across a wide region of prefrontal cortex, vocalization responsive cells, which also respond to faces, are mainly found in anterior VLPFC. This suggests that VLPFC may be specialized to process and integrate social communication information, just as the IFG is specialized to process and integrate speech and gestures in the human brain.
|
24
|
Timing of audiovisual inputs to the prefrontal cortex and multisensory integration. Neuroscience 2012; 214:36-48. [PMID: 22516006 DOI: 10.1016/j.neuroscience.2012.03.025] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2010] [Revised: 03/14/2012] [Accepted: 03/15/2012] [Indexed: 11/30/2022]
Abstract
A number of studies have demonstrated that the relative timing of audiovisual stimuli is especially important for multisensory integration of speech signals, although the neuronal mechanisms underlying this complex behavior are unknown. Temporal coincidence and congruency are thought to underlie the successful merging of two intermodal stimuli into a coherent perceptual representation. It has been previously shown that single neurons in the non-human primate prefrontal cortex integrate face and vocalization information. However, these multisensory responses and the degree to which they depend on temporal coincidence have yet to be determined. In this study we analyzed the response latency of ventrolateral prefrontal (VLPFC) neurons to face, vocalization, and combined face-vocalization stimuli, as well as an offset (asynchronous) version of the face-vocalization stimulus. Our results indicate that for most prefrontal multisensory neurons, the response latency to the vocalization was the shortest, followed by the combined face-vocalization stimuli; the face stimulus had the longest onset response latency. When tested with a dynamic face-vocalization stimulus that had been temporally offset (asynchronous), one-third of multisensory cells in VLPFC demonstrated a change in response compared with the response to the natural, synchronous face-vocalization movie. Our results indicate that prefrontal neurons are sensitive to the temporal properties of audiovisual stimuli. A disruption in the temporal synchrony of an audiovisual signal, which results in a change in the firing of communication-related prefrontal neurons, could underlie the loss of intelligibility that occurs with asynchronous speech stimuli.
|