1. Bachmann HP, Japee S, Merriam EP, Liu TT. Emotion and anxiety interact to bias spatial attention. Emotion 2024; 24:1109-1124. PMID: 38127536; PMCID: PMC11116080; DOI: 10.1037/emo0001322.
Abstract
Emotional expressions are an evolutionarily conserved means of social communication essential for social interactions. It is important to understand how anxious individuals perceive their social environments, including emotional expressions, especially given the rising prevalence of anxiety during the COVID-19 pandemic. Anxiety is often associated with an attentional bias toward threat-related stimuli, such as angry faces. Yet the mechanisms by which anxiety enhances or impairs two key components of spatial attention (attentional capture and attentional disengagement) to emotional expressions remain unclear. Moreover, positive valence is often ignored in studies of threat-related attention and anxiety, despite the high frequency of happy faces in everyday social interaction. Here, we investigated the relationship between anxiety, emotional valence, and spatial attention in 574 participants across two preregistered studies (data collected in 2021 and 2022; Experiment 1: n = 154, 54.5% male, mean age = 43.5 years; Experiment 2: n = 420, 58% male, mean age = 36.46 years). In the visual search experiment, happy faces captured attention more quickly than angry faces; in the spatial cueing experiment, disengagement was delayed from both angry and happy faces relative to neutral faces. We also show that anxiety has a distinct impact on both attentional capture by and disengagement from emotional faces. Together, our findings highlight the role of positively valenced stimuli in attracting and holding attention and suggest that anxiety is a critical factor in modulating spatial attention to emotional stimuli.
Affiliation(s)
- Helena P. Bachmann
- Computational Neuroimaging and Perception Group, Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD, USA
- Shruti Japee
- Section on Learning and Plasticity, Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD, USA
- Elisha P. Merriam
- Computational Neuroimaging and Perception Group, Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD, USA
- Tina T. Liu
- Computational Neuroimaging and Perception Group, Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD, USA
2. Japee S. On the Role of Sensorimotor Experience in Facial Expression Perception. J Cogn Neurosci 2024:1-13. PMID: 38527075; DOI: 10.1162/jocn_a_02148.
Abstract
Humans recognize the facial expressions of others rapidly and effortlessly. Although much is known about how we perceive expressions, the role of facial experience in shaping this remarkable ability remains unclear. Is our perception of expressions linked to how we ourselves make facial expressions? Are we better at recognizing others' facial expressions if we are experts at making the same expressions ourselves? And if we could not make facial expressions at all, would it impact our ability to recognize others' facial expressions? The current article aims to examine these questions by explicating the link between facial experience and facial expression recognition. It includes a comprehensive appraisal of the related literature and examines three main theories that posit a connection between making and recognizing facial expressions. First, recent studies in individuals with Moebius syndrome support the role of facial ability (i.e., the ability to move one's face to make facial expressions) in facial expression recognition. Second, motor simulation theory suggests that humans recognize others' facial expressions by covertly mimicking the observed expression (without overt motor action) and that this facial mimicry helps us identify and feel the associated emotion. Finally, the facial feedback hypothesis provides a framework for enhanced emotional experience via proprioceptive feedback from facial muscles when mimicking a viewed facial expression. Evidence for and against these theories is presented, as well as some considerations and outstanding questions for future research studies investigating the role of facial experience in facial expression perception.
Affiliation(s)
- Shruti Japee
- National Institute of Mental Health, Bethesda, MD
3. Taubert J, Japee S. Real Face Value: The Processing of Naturalistic Facial Expressions in the Macaque Inferior Temporal Cortex. J Cogn Neurosci 2024:1-17. PMID: 38261366; DOI: 10.1162/jocn_a_02108.
Abstract
For primates, expressions of fear are thought to be powerful social signals. In laboratory settings, faces with fearful expressions have reliably evoked valence effects in inferior temporal cortex. However, because macaques use so-called "fear grins" in a variety of different contexts, the deeper question is whether the macaque inferior temporal cortex is tuned to the prototypical fear grin or to conspecifics signaling fear. In this study, we combined neuroimaging with the results of a behavioral task to investigate how macaques encode a wide variety of fearful facial expressions. In Experiment 1, we identified two sets of macaque face stimuli using different approaches: we selected faces based on the emotional context (i.e., calm vs. fearful), and we selected faces based on the engagement of action units (i.e., neutral vs. fear grins). We also included human faces in Experiment 1. Then, using fMRI, we found that the faces selected based on context elicited a larger valence effect in the inferior temporal cortex than faces selected based on visual appearance. Furthermore, human facial expressions elicited only weak valence effects. These observations were further supported by the results of a two-alternative forced-choice task (Experiment 2), suggesting that fear grins vary in their perceived pleasantness. Collectively, these findings indicate that the macaque inferior temporal cortex is more involved in social intelligence than commonly assumed, encoding emergent properties in naturalistic face stimuli that transcend basic visual features. These results demand a rethinking of theories surrounding the function and operationalization of primate inferior temporal cortex.
Affiliation(s)
- Jessica Taubert
- The National Institute of Mental Health, Bethesda, MD
- The University of Queensland
- Shruti Japee
- The National Institute of Mental Health, Bethesda, MD
4. Japee S, Jordan J, Licht J, Lokey S, Chen G, Snow J, Jabs EW, Webb BD, Engle EC, Manoli I, Baker C, Ungerleider LG. Inability to move one's face dampens facial expression perception. Cortex 2023; 169:35-49. PMID: 37852041; PMCID: PMC10836030; DOI: 10.1016/j.cortex.2023.08.014.
Abstract
Humans rely heavily on facial expressions for social communication to convey their thoughts and emotions and to understand them in others. One prominent but controversial view is that humans learn to recognize the significance of facial expressions by mimicking the expressions of others. This view predicts that an inability to make facial expressions (e.g., facial paralysis) would result in reduced perceptual sensitivity to others' facial expressions. To test this hypothesis, we developed a diverse battery of sensitive emotion recognition tasks to characterize expression perception in individuals with Moebius Syndrome (MBS), a congenital neurological disorder that causes facial palsy. Using computer-based detection tasks we systematically assessed expression perception thresholds for static and dynamic face and body expressions. We found that while MBS individuals were able to perform challenging perceptual control tasks and body expression tasks, they were less efficient at extracting emotion from facial expressions, compared to matched controls. Exploratory analyses of fMRI data from a small group of MBS participants suggested potentially reduced engagement of the amygdala in MBS participants during expression processing relative to matched controls. Collectively, these results suggest a role for facial mimicry and consequent facial feedback and motor experience in the perception of others' facial expressions.
Affiliation(s)
- Shruti Japee
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Jessica Jordan
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Judith Licht
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Savannah Lokey
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
- Gang Chen
- Scientific and Statistical Computing Core, NIMH, NIH, Bethesda, MD, USA
- Joseph Snow
- Office of the Clinical Director, NIMH, NIH, Bethesda, MD, USA
- Ethylin Wang Jabs
- Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Bryn D Webb
- Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, USA; Department of Pediatrics, Division of Genetics and Metabolism, University of Wisconsin-Madison, Madison, WI, USA
- Elizabeth C Engle
- Departments of Neurology and Ophthalmology, Boston Children's Hospital and Harvard Medical School, Boston, MA, USA; Howard Hughes Medical Institute, Chevy Chase, MD, USA
- Irini Manoli
- Medical Genomics and Metabolic Genetics, NHGRI, NIH, Bethesda, MD, USA
- Chris Baker
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, USA
5. Liu T, Fu J, Chai Y, Japee S, Chen G, Ungerleider L, Merriam E. Contributed Session I: Layer-specific, retinotopically-diffuse modulation in human visual cortex by emotional faces. J Vis 2023; 23:10. PMID: 37733568; DOI: 10.1167/jov.23.11.10.
Abstract
Emotionally expressive faces evoke enhanced neural responses in multiple brain regions, a phenomenon thought to depend critically on the amygdala. This emotion-related modulation is evident even in primary visual cortex (V1), providing a potential neural substrate by which emotionally salient stimuli can affect perception. How does emotional valence information, computed in the amygdala, reach V1? Here we use high-resolution functional MRI to investigate the layer profile and retinotopic distribution of neural activity specific to emotional facial expressions. Across three experiments, human participants viewed centrally presented face stimuli varying in emotional expression and performed a gender judgment task. We found that facial valence sensitivity was evident only in superficial cortical layers and was not restricted to the retinotopic location of the stimuli, consistent with diffuse feedback-like projections from the amygdala. Together, our results provide a feedback mechanism by which the amygdala directly modulates activity at the earliest stage of visual processing.
6. Long H, Peluso N, Baker CI, Japee S, Taubert J. A database of heterogeneous faces for studying naturalistic expressions. Sci Rep 2023; 13:5383. PMID: 37012369; PMCID: PMC10070342; DOI: 10.1038/s41598-023-32659-5.
Abstract
Facial expressions are thought to be complex visual signals, critical for communication between social agents. Most prior work aimed at understanding how facial expressions are recognized has relied on stimulus databases featuring posed facial expressions, designed to represent putative emotional categories (such as 'happy' and 'angry'). Here we use an alternative selection strategy to develop the Wild Faces Database (WFD), a set of one thousand images capturing a diverse range of ambient facial behaviors from outside of the laboratory. We characterized the perceived emotional content in these images using a standard categorization task in which participants were asked to classify the apparent facial expression in each image. In addition, participants were asked to indicate the intensity and genuineness of each expression. While modal scores indicate that the WFD captures a range of different emotional expressions, in comparing the WFD to images taken from other, more conventional databases, we found that participants responded more variably and less specifically to the wild-type faces, perhaps indicating that natural expressions are more multiplexed than a categorical model would predict. We argue that this variability can be employed to explore latent dimensions in our mental representation of facial expressions. Further, images in the WFD were rated as less intense and more genuine than images taken from other databases, suggesting a greater degree of authenticity among WFD images. A strong positive correlation between intensity and genuineness scores demonstrated that even the high-arousal states captured in the WFD were perceived as authentic. Collectively, these findings highlight the potential utility of the WFD as a new resource for bridging the gap between the laboratory and the real world in studies of expression recognition.
Affiliation(s)
- Houqiu Long
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Natalie Peluso
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Chris I Baker
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Jessica Taubert
- The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
7. Zhang H, Ding X, Liu N, Nolan R, Ungerleider LG, Japee S. Equivalent processing of facial expression and identity by macaque visual system and task-optimized neural network. Neuroimage 2023; 273:120067. PMID: 36997134; PMCID: PMC10165955; DOI: 10.1016/j.neuroimage.2023.120067.
Abstract
Both the primate visual system and artificial deep neural network (DNN) models show an extraordinary ability to simultaneously classify facial expression and identity. However, the neural computations underlying the two systems are unclear. Here, we developed a multi-task DNN model that optimally classified both monkey facial expressions and identities. By comparing the fMRI neural representations of the macaque visual cortex with the best-performing DNN model, we found that both systems: 1) share initial stages for processing low-level face features, which segregate into separate branches at later stages for processing facial expression and identity, respectively, and 2) gain more specificity for the processing of either facial expression or identity as one progresses along each branch towards higher stages. Correspondence analysis between the DNN and monkey visual areas revealed that the amygdala and anterior fundus face patch (AF) matched well with later layers of the DNN's facial expression branch, while the anterior medial face patch (AM) matched well with later layers of the DNN's facial identity branch. Our results highlight the anatomical and functional similarities between the macaque visual system and the DNN model, suggesting a common mechanism between the two systems.
Affiliation(s)
- Hui Zhang
- School of Engineering Medicine, Beihang University; Key Laboratory of Biomechanics and Mechanobiology (Beihang University), Ministry of Education; Key Laboratory of Big Data-Based Precision Medicine, Ministry of Industry and Information Technology of the People's Republic of China, Beijing 100191, China; Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, Maryland 20892, USA
- Xuetong Ding
- School of Engineering Medicine, Beihang University; Key Laboratory of Biomechanics and Mechanobiology (Beihang University), Ministry of Education; Key Laboratory of Big Data-Based Precision Medicine, Ministry of Industry and Information Technology of the People's Republic of China, Beijing 100191, China
- Ning Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China; Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei 230088, China; Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, Maryland 20892, USA
- Rachel Nolan
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, Maryland 20892, USA
- Shruti Japee
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, Maryland 20892, USA
8. Bachmann HP, Japee S, Merriam EP, Liu TT. The relationship between emotional valence, anxiety, and attentional bias. J Vis 2022. DOI: 10.1167/jov.22.14.3721.
Affiliation(s)
- Helena P. Bachmann
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Elisha P. Merriam
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Tina T. Liu
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
9. Andrews S, Japee S, Ritchie B. Hometown context and childhood activities predict face recognition performance. J Vis 2022. DOI: 10.1167/jov.22.14.3717.
10. Taubert J, Japee S, Patterson A, Wild H, Goyal S, Yu D, Ungerleider LG. A broadly tuned network for affective body language in the macaque brain. Sci Adv 2022; 8:eadd6865. PMID: 36427322; PMCID: PMC9699662; DOI: 10.1126/sciadv.add6865.
Abstract
Body language is a powerful tool that we use to communicate how we feel, but it is unclear whether other primates also communicate in this way. Here, we use functional magnetic resonance imaging to show that the body-selective patches in macaques are activated by affective body language. Unexpectedly, we found these regions to be tolerant of naturalistic variation in posture as well as species; the bodies of macaques, humans, and domestic cats all evoked a stronger response when they conveyed fear than when they conveyed no affect. Multivariate analyses confirmed that the neural representation of fear-related body expressions was species-invariant. Collectively, these findings demonstrate that, like humans, macaques have body-selective brain regions in the ventral visual pathway for processing affective body language. These data also indicate that representations of body stimuli in these regions are built on the basis of emergent properties, such as socio-affective meaning, and not just putative image properties.
Affiliation(s)
- Jessica Taubert
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
- School of Psychology, The University of Queensland, Brisbane, QLD 4072, Australia
- Shruti Japee
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
- Amanda Patterson
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
- Hannah Wild
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
- Shivani Goyal
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
- David Yu
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
- Leslie G. Ungerleider
- Section on Neurocircuitry, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, USA
11. Liu TT, Fu JZ, Chai Y, Japee S, Chen G, Ungerleider LG, Merriam EP. Layer-specific, retinotopically-diffuse modulation in human visual cortex in response to viewing emotionally expressive faces. Nat Commun 2022; 13:6302. PMID: 36273204; PMCID: PMC9588045; DOI: 10.1038/s41467-022-33580-7.
Abstract
Viewing faces that are perceived as emotionally expressive evokes enhanced neural responses in multiple brain regions, a phenomenon thought to depend critically on the amygdala. This emotion-related modulation is evident even in primary visual cortex (V1), providing a potential neural substrate by which emotionally salient stimuli can affect perception. How does emotional valence information, computed in the amygdala, reach V1? Here we use high-resolution functional MRI to investigate the layer profile and retinotopic distribution of neural activity specific to emotional facial expressions. Across three experiments, human participants viewed centrally presented face stimuli varying in emotional expression and performed a gender judgment task. We found that facial valence sensitivity was evident only in superficial cortical layers and was not restricted to the retinotopic location of the stimuli, consistent with diffuse feedback-like projections from the amygdala. Together, our results provide a feedback mechanism by which the amygdala directly modulates activity at the earliest stage of visual processing.
Affiliation(s)
- Tina T. Liu
- Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
- Jason Z Fu
- Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
- Yuhui Chai
- Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
- Gang Chen
- Scientific and Statistical Computing Core, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
- Leslie G. Ungerleider
- Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
- Elisha P. Merriam
- Laboratory of Brain and Cognition, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA
12
|
Chung JY, Gibbons A, Atlas L, Ballard E, Ernst M, Japee S, Farmer C, Shaw J, Pereira F. COVID-19 and Mental Health: Predicted Mental Health Status is Associated with Clinical Symptoms and Pandemic-Related Psychological and Behavioral Responses. medRxiv 2021. [PMID: 34671781 DOI: 10.1101/2021.10.12.21264902] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
Abstract
Background: The COVID-19 pandemic led to dramatic threats to health and social life. The study objectives were to develop a prediction model leveraging a subsample of known patients and controls, and to evaluate the relationship of predicted mental health status to clinical outcome measures and pandemic-related psychological and behavioral responses during lockdown (spring/summer 2020).
Methods: Online cohort study conducted by the National Institute of Mental Health Intramural Research Program in a convenience sample of English-speaking adults (enrolled 4/4-5/16/20; n = 1,992). Enrollment measures: demographics, clinical history, functional status, psychiatric and family history, alcohol/drug use. Outcome measures (at enrollment and every 2 weeks for 6 months): distress, loneliness, mental health symptoms, and a COVID-19 survey. NIMH IRP patient/control survey responses informed assignment of Patient Probability Scores (PPS) for all participants. Regression models analyzed the relationship between PPS and outcome measures.
Outcomes: Participants had a mean age of 46.0 (±14.7) years and were predominantly female (82.4%) and white (88.9%). PPS correlated with distress, loneliness, depression, and mental health factors, and was associated with negative psychological responses to COVID-19. Worry about mental health (OR 1.46) exceeded worry about physical health (OR 1.13). PPS was not associated with adherence to social distancing guidelines, but was associated with stress related to social distancing and worries about infection of self/others.
Interpretation: Mental health status (PPS) was associated with concurrent clinical ratings and COVID-specific negative responses. A focus on mental health during the pandemic is warranted, especially among those with mental health vulnerabilities. We will include PPS when conducting longitudinal analyses of mental health trajectories and of risk and resilience factors that may account for differing clinical outcomes.
Funding: NIMH (ZIAMH002922); NCCIH (ZIAAT000030).
13. Taubert J, Japee S. Using FACS to trace the neural specializations underlying the recognition of facial expressions: A commentary on Waller et al. (2020). Neurosci Biobehav Rev 2020; 120:75-77. PMID: 33227326; DOI: 10.1016/j.neubiorev.2020.10.016.
Abstract
In their recent review, Waller et al. (2020) discuss how the Facial Action Coding System (FACS) can be used to study the evolution of facial behaviors. This is a timely and thought-provoking review that highlights the numerous ways in which FACS could be used to compare the mechanisms responsible for the production of facial behaviors across species. We propose that FACS could also be used to study the recognition of facial behaviors in nonhuman subjects, where one of the key challenges is finding suitable stimuli that convey different emotions. By using FACS-rated images in awake neuroimaging experiments, researchers could accurately identify the brain mechanisms responsible for recognizing expressions across mammalian species. This approach would reveal neural homologs and deepen our understanding of how nonverbal social communication has evolved.
Affiliation(s)
- Jessica Taubert
- The Laboratory of Brain and Cognition, The National Institute of Mental Health, United States
- Shruti Japee
- The Laboratory of Brain and Cognition, The National Institute of Mental Health, United States
14. Wild H, Goyal S, Japee S, Ungerleider L, Taubert J. Cross-species characterization of facial expression and head orientation processing. J Vis 2020. DOI: 10.1167/jov.20.11.1273.
Affiliation(s)
- Hannah Wild
- Laboratory of Brain and Cognition, NIMH, NIH
15. Liu T, Fu J, Japee S, Chai Y, Ungerleider L, Merriam E. Layer-specific modulation of visual responses in human visual cortex by emotional faces. J Vis 2020. DOI: 10.1167/jov.20.11.587.
Affiliation(s)
- Tina Liu
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Jason Fu
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Yuhui Chai
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Leslie Ungerleider
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
- Elisha Merriam
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD
16. Goyal S, Wild H, Herald S, Duchaine B, Ungerleider LG, Japee S. Processing of Facial Expressions in Developmental Prosopagnosia. J Vis 2020. DOI: 10.1167/jov.20.11.1166.
Affiliation(s)
- Hannah Wild
- Laboratory of Brain and Cognition, NIMH, NIH
- Sarah Herald
- Department of Psychological and Brain Sciences, Dartmouth College
- Brad Duchaine
- Department of Psychological and Brain Sciences, Dartmouth College
17
Zhang H, Japee S, Stacy A, Flessert M, Ungerleider LG. Anterior superior temporal sulcus is specialized for non-rigid facial motion in both monkeys and humans. Neuroimage 2020; 218:116878. [PMID: 32360168 PMCID: PMC7478875 DOI: 10.1016/j.neuroimage.2020.116878] [Citation(s) in RCA: 15]
Abstract
Facial motion plays a fundamental role in the recognition of facial expressions in primates, but the neural substrates underlying this special type of biological motion are not well understood. Here, we used fMRI to investigate the extent to which the specialization for facial motion is represented in the visual system and compared the neural mechanisms for the processing of non-rigid facial motion in macaque monkeys and humans. We defined the areas specialized for facial motion as those significantly more activated when subjects perceived the motion caused by dynamic faces (dynamic faces > static faces) than when they perceived the motion caused by dynamic non-face objects (dynamic objects > static objects). We found that, in monkeys, significant activations evoked by facial motion were in the fundus of anterior superior temporal sulcus (STS), which overlapped the anterior fundus face patch. In humans, facial motion activated three separate foci in the right STS: posterior, middle, and anterior STS, with the anterior STS location showing the most selectivity for facial motion compared with other facial motion areas. In both monkeys and humans, facial motion shows a gradient preference as one progresses anteriorly along the STS. Taken together, our results indicate that monkeys and humans share similar neural substrates within the anterior temporal lobe specialized for the processing of non-rigid facial motion.
Affiliation(s)
- Hui Zhang
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, 20892, USA; Beijing Advanced Innovation Center for Big Data-Based Precision Medicine, Beijing Advanced Innovation Center for Big Data and Brain Computing, Beihang University, Beijing, 100191, China.
- Shruti Japee
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, 20892, USA
- Andrea Stacy
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, 20892, USA
- Molly Flessert
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD, 20892, USA
18
Vernet M, Quentin R, Japee S, Ungerleider LG. From visual awareness to consciousness without sensory input: The role of spontaneous brain activity. Cogn Neuropsychol 2020; 37:216-219. [PMID: 32093525 DOI: 10.1080/02643294.2020.1731442] [Citation(s) in RCA: 1]
Affiliation(s)
- Marine Vernet
- IMPACT team, Lyon Neuroscience Research Center (CRNL), CNRS UMR 5292, INSERM UMRS 1028, University Claude Bernard Lyon 1, Lyon, France; Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
- Romain Quentin
- Human Cortical Physiology and Neurorehabilitation Section, NINDS/NIH, Bethesda, MD, USA
- Shruti Japee
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
- Leslie G Ungerleider
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
19
Zhang X, Mlynaryk N, Ahmed S, Japee S, Ungerleider LG. The role of inferior frontal junction in controlling the spatially global effect of feature-based attention in human visual areas. PLoS Biol 2018; 16:e2005399. [PMID: 29939981 PMCID: PMC6034892 DOI: 10.1371/journal.pbio.2005399] [Citation(s) in RCA: 24]
Abstract
Feature-based attention has a spatially global effect, i.e., responses to stimuli that share features with an attended stimulus are enhanced not only at the attended location but throughout the visual field. However, how feature-based attention modulates cortical neural responses at unattended locations remains unclear. Here we used functional magnetic resonance imaging (fMRI) to examine this issue as human participants performed motion- (Experiment 1) and color- (Experiment 2) based attention tasks. Results indicated that, in both experiments, the respective visual processing areas (middle temporal area [MT+] for motion and V4 for color) as well as early visual, parietal, and prefrontal areas all showed the classic feature-based attention effect, with neural responses to the unattended stimulus significantly elevated when it shared the same feature with the attended stimulus. Effective connectivity analysis using dynamic causal modeling (DCM) showed that this spatially global effect in the respective visual processing areas (MT+ for motion and V4 for color), intraparietal sulcus (IPS), frontal eye field (FEF), medial frontal gyrus (mFG), and primary visual cortex (V1) was derived by feedback from the inferior frontal junction (IFJ). Complementary effective connectivity analysis using Granger causality modeling (GCM) confirmed that, in both experiments, the node with the highest outflow and netflow degree was IFJ, which was thus considered to be the source of the network. These results indicate a source for the spatially global effect of feature-based attention in the human prefrontal cortex. Attentional selection is the mechanism by which relevant sensory information is processed preferentially. Feature-based attention plays a key role in identifying an attentional target in a complex scene, because we often know the features of the target but not its exact location. 
The ability to quickly select the target is mainly attributed to enhancement of responses to stimuli that share features with an attended stimulus, not only at the attended location but throughout the whole visual field. However, little is known regarding how feature-based attention modulates brain responses at unattended locations. Here we used fMRI and advanced connectivity analyses to examine human subjects as they performed either motion- or color-based attention tasks. Our results indicated that the visual processing areas for motion and color showed the feature-based attention effect. Effective connectivity analysis showed that this feature-based attention effect was derived by feedback from the inferior frontal junction, an area of the posterior lateral prefrontal cortex involved in many different cognitive processes, including spatial attention and working memory. Further modeling confirmed that the inferior frontal junction showed connectivity features supporting its role as the source of the network. Our results support the hypothesis that the inferior frontal junction plays a key role in the spatially global effect of feature-based attention.
Affiliation(s)
- Xilin Zhang
- School of Psychology, South China Normal University, Guangzhou, Guangdong, China
- Guangdong Provincial Key Laboratory of Mental Health and Cognitive Science, South China Normal University, Guangzhou, Guangdong, China
- Nicole Mlynaryk
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Sara Ahmed
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Leslie G. Ungerleider
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
20
Zhang X, Mlynaryk N, Japee S, Ungerleider LG. Attentional selection of multiple objects in the human visual system. Neuroimage 2017; 163:231-243. [PMID: 28951352 DOI: 10.1016/j.neuroimage.2017.09.050] [Citation(s) in RCA: 11]
Abstract
Classic theories of object-based attention assume a single object of selection but real-world tasks, such as driving a car, often require attending to multiple objects simultaneously. However, whether object-based attention can operate on more than one object at a time remains unexplored. Here, we used functional magnetic resonance imaging (fMRI) to address this question as human participants performed object-based attention tasks that required simultaneous attention to two objects differing in either their features or locations. Simultaneous attention to two objects differing in features (face and house) did not show significantly different responses in the fusiform face area (FFA) or parahippocampal place area (PPA), respectively, compared to attending a single object (face or house), but did enhance the response in the inferior frontal gyrus (IFG). Simultaneous attention to two circular arcs differing in locations did not show significantly different responses in the primary visual cortex (V1) compared to attending a single circular arc, but did enhance the response in the intraparietal sulcus (IPS). These results suggest that object-based attention can simultaneously select at least two objects differing in their features or locations, processes mediated by the frontal and parietal cortex, respectively.
Affiliation(s)
- Xilin Zhang
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA.
- Nicole Mlynaryk
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Leslie G Ungerleider
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
21
Pitcher D, Japee S, Rauth L, Ungerleider L. The superior temporal sulcus is causally connected to the amygdala: A combined TBS-fMRI study. J Vis 2017. [DOI: 10.1167/17.10.258] [Citation(s) in RCA: 0]
22
Vernet M, Japee S, Lokey S, Ahmed S, Zachariou V, Ungerleider LG. Endogenous visuospatial attention increases visual awareness independent of visual discrimination sensitivity. Neuropsychologia 2017; 128:297-304. [PMID: 28807647 DOI: 10.1016/j.neuropsychologia.2017.08.015] [Citation(s) in RCA: 7]
Abstract
Visuospatial attention often improves task performance by increasing signal gain at attended locations and decreasing noise at unattended locations. Attention is also believed to be the mechanism that allows information to enter awareness. In this experiment, we assessed whether orienting endogenous visuospatial attention with cues differentially affects visual discrimination sensitivity (an objective task performance) and visual awareness (the subjective feeling of perceiving) during the same discrimination task. Gabor patch targets were presented laterally, either at low contrast (contrast stimuli) or at high contrast embedded in noise (noise stimuli). Participants reported their orientation either in a 3-alternative choice task (clockwise, counterclockwise, unknown) that allowed for both objective and subjective reports, or in a 2-alternative choice task (clockwise, counterclockwise) that provided a control for objective reports. Signal detection theory models were fit to the experimental data: estimated perceptual sensitivity reflected objective performance; decision criteria, or subjective biases, were a proxy for visual awareness. Attention increased sensitivity (i.e., improved objective performance) for the contrast, but not for the noise stimuli. Indeed, with the latter, attention did not further enhance the already high target signal or reduce the already low uncertainty on its position. Interestingly, for both contrast and noise stimuli, attention resulted in more liberal criteria, i.e., awareness increased. The noise condition is thus an experimental configuration where people think they see the targets they attend to better, even if they do not. This could be explained by an internal representation of their attentional state, which influences awareness independent of objective visual signals.
Affiliation(s)
- Marine Vernet
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA.
- Shruti Japee
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
- Savannah Lokey
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
- Sara Ahmed
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
- Valentinos Zachariou
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
- Leslie G Ungerleider
- Section on Neurocircuitry, Laboratory of Brain and Cognition, NIMH/NIH, Bethesda, MD, USA
23
Abstract
The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects’ attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention. Using a combination of psychophysics and functional MRI, this study reveals that emotional attention interacts with normalization processes depending on emotional valence (positive or negative faces), best explained by feedback modulation from the dorsolateral prefrontal cortex. Attentional selection is the mechanism by which the subset of incoming information is preferentially processed at the expense of distractors. The normalization model of attention suggests that attention-triggered modulatory effects on sensory responses in the visual cortex depend on two factors: the stimulus size and the attention field size. However, little is known regarding whether emotional attention shapes perception by means of the normalization framework. 
To test this hypothesis, we manipulated the attention field by emotional valence—negative faces versus positive faces—while holding the stimulus size constant in a spatial cueing task. We observed that attention increased response gain for negative faces, with the largest cueing effects occurring at high contrasts and little to no effect at low and mid-contrasts; however, attention increased contrast gain for positive faces, with the largest cueing effects occurring at mid-contrasts and little to no effect at low and high contrasts. A complementary neuroimaging experiment confirmed that subjects' attention fields were narrowed for negative faces and broadened for positive faces. Across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response-gain and contrast-gain changes and with narrowed and broadened attention fields in the primary visual cortex. Mechanistically, we found that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex to the primary visual cortex. Our findings provide evidence for a normalization framework for emotional attention and for the critical role of feedback from the prefrontal cortex to the early visual cortex in this normalization.
Affiliation(s)
- Xilin Zhang
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Zaid Safiullah
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Nicole Mlynaryk
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
- Leslie G. Ungerleider
- Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland, United States of America
24
Zhang X, Mlynaryk N, Japee S, Ungerleider L. Multiple Objects of Attentional Selection in Human Visual Cortex. J Vis 2016. [DOI: 10.1167/16.12.603] [Citation(s) in RCA: 0]
25
Flessert M, Zhang H, Japee S, Ungerleider L. Comparing the specialization for facial motion in macaques and humans. J Vis 2016. [DOI: 10.1167/16.12.722] [Citation(s) in RCA: 0]
26
Lokey S, Japee S, Baker C, Ungerleider L. Emotion processing deficits in Moebius Syndrome. J Vis 2016. [DOI: 10.1167/16.12.1256] [Citation(s) in RCA: 0]
27
Zhang H, Japee S, Nolan R, Chu C, Liu N, Ungerleider LG. Face-selective regions differ in their ability to classify facial expressions. Neuroimage 2016; 130:77-90. [PMID: 26826513 PMCID: PMC4808360 DOI: 10.1016/j.neuroimage.2016.01.045] [Citation(s) in RCA: 26]
Abstract
Recognition of facial expressions is crucial for effective social interactions. Yet, the extent to which the various face-selective regions in the human brain classify different facial expressions remains unclear. We used functional magnetic resonance imaging (fMRI) and support vector machine pattern classification analysis to determine how well face-selective brain regions are able to decode different categories of facial expression. Subjects participated in a slow event-related fMRI experiment in which they were shown 32 face pictures, portraying four different expressions (neutral, fearful, angry, and happy) and belonging to eight different identities. Our results showed that only the amygdala and the posterior superior temporal sulcus (STS) were able to accurately discriminate between these expressions, albeit in different ways: the amygdala discriminated fearful faces from non-fearful faces, whereas STS discriminated neutral from emotional (fearful, angry, and happy) faces. In contrast to these findings on the classification of emotional expression, only the fusiform face area (FFA) and anterior inferior temporal cortex (aIT) could discriminate among the various facial identities. Further, the amygdala and STS were better than FFA and aIT at classifying expression, while FFA and aIT were better than the amygdala and STS at classifying identity. Taken together, our findings indicate that the decoding of facial emotion and facial identity occurs in different neural substrates: the amygdala and STS for the former and FFA and aIT for the latter.
Affiliation(s)
- Hui Zhang
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD 20892, USA.
- Shruti Japee
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD 20892, USA
- Rachel Nolan
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD 20892, USA
- Carlton Chu
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD 20892, USA
- Ning Liu
- Laboratory of Brain and Cognition, NIMH, NIH, Bethesda, MD 20892, USA
28
Japee S, Holiday K, Satyshur MD, Mukai I, Ungerleider LG. A role of right middle frontal gyrus in reorienting of attention: a case study. Front Syst Neurosci 2015; 9:23. [PMID: 25784862 PMCID: PMC4347607 DOI: 10.3389/fnsys.2015.00023] [Citation(s) in RCA: 290]
Abstract
The right middle frontal gyrus (MFG) has been proposed to be a site of convergence of the dorsal and ventral attention networks, by serving as a circuit-breaker to interrupt ongoing endogenous attentional processes in the dorsal network and reorient attention to an exogenous stimulus. Here, we probed the contribution of the right MFG to both endogenous and exogenous attention by comparing performance on an orientation discrimination task of a patient with a right MFG resection and a group of healthy controls. On endogenously cued trials, participants were shown a central cue that predicted with 90% accuracy the location of a subsequent peri-threshold Gabor patch stimulus. On exogenously cued trials, a cue appeared briefly at one of two peripheral locations, followed by a variable inter-stimulus interval (ISI; range 0–700 ms) and a Gabor patch in the same or opposite location as the cue. Behavioral data showed that for endogenous, and short ISI exogenous trials, valid cues facilitated responses compared to invalid cues, for both the patient and controls. However, at long ISIs, the patient exhibited difficulty in reverting to top-down attentional control, once the facilitatory effect of the exogenous cue had dissipated. When explicitly cued during long ISIs to attend to both stimulus locations, the patient was able to engage successfully in top-down control. This result indicates that the right MFG may play an important role in reorienting attention from exogenous to endogenous attentional control. Resting state fMRI data revealed that the right superior parietal lobule and right orbitofrontal cortex showed significantly higher correlations with a left MFG seed region (a region tightly coupled with the right MFG in controls) in the patient relative to controls. We hypothesize that this paradoxical increase in cortical coupling represents a compensatory mechanism in the patient to offset the loss of function of the resected tissue in right prefrontal cortex.
Affiliation(s)
- Shruti Japee
- Lab of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Kelsey Holiday
- Lab of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- Ikuko Mukai
- Laureate Institute for Brain Research, Tulsa, OK, USA
- Leslie G Ungerleider
- Lab of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
29
Doty TJ, Japee S, Ingvar M, Ungerleider LG. Fearful face detection sensitivity in healthy adults correlates with anxiety-related traits. Emotion 2013; 13:183-8. [PMID: 23398584 DOI: 10.1037/a0031373] [Citation(s) in RCA: 33]
Abstract
Threatening faces have a privileged status in the brain, which can be reflected in a processing advantage. However, this effect varies among individuals, even healthy adults. For example, one recent study showed that fearful face detection sensitivity correlated with trait anxiety in healthy adults (S. Japee, L. Crocker, F. Carver, L. Pessoa, & L. G. Ungerleider, 2009. Individual differences in valence modulation of face-selective M170 response. Emotion, 9, 59-69). Here, we expanded on those findings by investigating whether intersubject variability in fearful face detection is also associated with state anxiety, as well as more broadly with other traits related to anxiety. To measure fearful face detection sensitivity, we used a masked face paradigm where the target face was presented for only 33 ms and was immediately followed by a neutral face mask. Subjects then rated their confidence in detecting either fear or no fear in the target face. Fearful face detection sensitivity was calculated for each subject using signal detection theory. Replicating previous results, we found a significant positive correlation between trait anxiety and fearful face detection sensitivity. However, this behavioral advantage did not correlate with state anxiety. We also found that fearful face detection sensitivity correlated with other personality measures, including neuroticism and harm avoidance. Our data suggest that fearful face detection sensitivity varies parametrically across the healthy population, is associated broadly with personality traits related to anxiety, but remains largely unaffected by situational fluctuations in anxiety. These results underscore the important contribution of anxiety-related personality traits to threat processing in healthy adults.
Affiliation(s)
- Tracy J Doty
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA.
30
Japee S, Crocker L, Carver F, Pessoa L, Ungerleider LG. Individual differences in valence modulation of face-selective M170 response. Emotion 2009; 9:59-69. [PMID: 19186917 DOI: 10.1037/a0014487] [Citation(s) in RCA: 30]
Abstract
Magnetoencephalography was used to examine the effect of individual differences on the temporal dynamics of emotional face processing by grouping subjects based on their ability to detect masked valence-laden stimuli. Receiver operating characteristic curves and a nonparametric sensitivity measure were used to categorize subjects into those that could and could not reliably detect briefly presented fearful faces that were backward-masked by neutral faces. Results showed that, in a cluster of face-responsive sensors, the strength of the M170 response was modulated by valence only when subjects could reliably detect the masked fearful faces. Source localization of the M170 peak using synthetic aperture magnetometry identified sources in face processing areas such as right middle occipital gyrus and left fusiform gyrus that showed the valence effect for those target durations at which subjects were sensitive to the fearful stimulus. Subjects who were better able to detect fearful faces also showed higher trait anxiety levels. These results suggest that individual differences between subjects, such as trait anxiety levels and sensitivity to fearful stimuli, may be an important factor to consider when studying emotion processing.
Affiliation(s)
- Shruti Japee
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892-1366, USA.
31
Olsen RK, Kippenhan JS, Japee S, Kohn P, Mervis CB, Saad ZS, Morris CA, Meyer-Lindenberg A, Berman KF. Retinotopically defined primary visual cortex in Williams syndrome. Brain 2009; 132:635-44. [PMID: 19255058 DOI: 10.1093/brain/awn362] [Citation(s) in RCA: 10]
Abstract
Williams syndrome, caused by a hemizygous microdeletion on chromosome 7q11.23, is characterized by severe impairment in visuospatial construction. To examine potential contributions of early visual processing to this cognitive problem, we functionally mapped the size and neuroanatomical variability of primary visual cortex (V1) in high-functioning adults with Williams syndrome and age- and IQ-matched control participants from the general population by using fMRI-based retinotopic mapping and cortical surface models generated from high-resolution structural MRI. Visual stimulation, consisting of rotating hemicircles and expanding rings, was used to retinotopically define early visual processing areas. V1 boundaries based on computed phase and field sign maps were used to calculate the functional area of V1. Neuroanatomical variability was assessed by computing overlap maps of V1 location for each group on standardized cortical surfaces, and non-parametric permutation test methods were used for statistical inference. V1 did not differ in size between groups, although its anatomical boundaries were more variable in the group with Williams syndrome. V1 overlap maps showed that the average centres of gravity for the two groups were similarly located near the fundus of the calcarine fissure, approximately 25 mm away from the most posterior aspect of the occipital lobe. In summary, our functional definition of V1 size and location indicates that recruitment of primary visual cortex is grossly normal in Williams syndrome, consistent with the notion that neural abnormalities underlying visuospatial construction arise at later stages in the visual processing hierarchy.
Affiliation(s)
- Rosanna K Olsen
- Section on Integrative Neuroimaging, Clinical Brain Disorders Branch, National Institute of Mental Health, NIH, DHHS, Bethesda, MD 20892-1365, USA
32
Abstract
The goals of the present study were twofold. First, we wished to investigate the neural correlates of aware and unaware emotional face perception after characterizing each subject's behavioral performance via signal detection theory methods. Second, we wished to investigate the extent to which amygdala responses to fearful faces depend on the physical characteristics of the stimulus independently of the percept. We show that amygdala responses depend on visual awareness. Under conditions in which subjects were not aware of fearful faces flashed for 33 ms, no differential activation was observed in the amygdala. On the other hand, differential activation was observed for 67 ms fearful targets that the subjects could reliably detect. When trials were divided into hits, misses, correct rejections, and false alarms, we show that target visibility is an important factor in determining amygdala responses to fearful faces. Taken together, our results further challenge the view that amygdala responses occur automatically.
Affiliation(s)
- Luiz Pessoa
- Department of Psychology, Brown University, 89 Waterman Street, Providence, RI 02912, USA.
33
Abstract
A commonly held view is that emotional stimuli are processed independently of awareness. Here, the authors parametrically varied the duration of a fearful face target stimulus that was backward masked by a neutral face. The authors evaluated awareness by characterizing behavioral performance using receiver operating characteristic curves from signal detection theory. Their main finding was that no universal objective awareness threshold exists for fear perception. Although several subjects displayed a behavioral pattern consistent with previous reports (i.e., targets masked at 33 ms), a considerable percentage of their subjects (64%) were capable of reliably detecting 33-ms targets. Their findings suggest that considerable information is available even in briefly presented stimuli (possibly as short as 17 ms) to support masked fear detection.
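Both of the masked-face abstracts above evaluate awareness with signal detection theory, dividing trials into hits, misses, false alarms, and correct rejections. As a minimal sketch of the standard sensitivity computation (this is not code from either paper; the function name and the log-linear correction are illustrative choices), d′ can be derived from those four trial counts:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction (+0.5 to counts, +1 to totals) keeps
    # rates strictly inside (0, 1), avoiding infinite z-scores
    # when a subject scores 0% or 100% in a cell.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)
```

A d′ near zero indicates chance-level detection (the pattern expected for 33-ms masked targets in the first abstract), while reliably detectable 67-ms targets would yield clearly positive d′ values.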
Affiliation(s)
- Luiz Pessoa
- Department of Psychology, Brown University, Providence, RI 02912, USA.
34
Verchinski B, Meyer-Lindenberg A, Japee S, Kohn P, Egan M, Bigelow L, Callicott J, Bertolino A, Mattay V, Berman K, Weinberger D. Gender differences in gray matter density: A study of structural MRI images using voxel-based morphometry. Neuroimage 2000. [DOI: 10.1016/s1053-8119(00)91160-1]
35
Kohn P, Meyer-Lindenberg A, Japee S, Berman K. A method for automatically determining the Talairach origin of MR scans. Neuroimage 2000. [DOI: 10.1016/s1053-8119(00)91485-x]