1
Hill AT, Ford TC, Bailey NW, Lum JAG, Bigelow FJ, Oberman LM, Enticott PG. EEG during dynamic facial emotion processing reveals neural activity patterns associated with autistic traits in children. Cereb Cortex 2025; 35:bhaf020. PMID: 39927786; PMCID: PMC11808805; DOI: 10.1093/cercor/bhaf020.
Abstract
Altered brain connectivity and atypical neural oscillations have been observed in autism, yet their relationship with autistic traits in nonclinical populations remains underexplored. Here, we employ electroencephalography to examine functional connectivity, oscillatory power, and broadband aperiodic activity during a dynamic facial emotion processing task in 101 typically developing children aged 4 to 12 years. We investigate associations between these electrophysiological measures of brain dynamics and autistic traits as assessed by the Social Responsiveness Scale, 2nd Edition (SRS-2). Our results revealed that increased facial emotion processing-related connectivity across theta (4 to 7 Hz) and beta (13 to 30 Hz) frequencies correlated positively with higher SRS-2 scores, predominantly in right-lateralized (theta) and bilateral (beta) cortical networks. Additionally, a steeper 1/f-like aperiodic slope (spectral exponent) across fronto-central electrodes was associated with higher SRS-2 scores. Greater aperiodic-adjusted theta and alpha oscillatory power further correlated with both higher SRS-2 scores and steeper aperiodic slopes. These findings underscore important links between facial emotion processing-related brain dynamics and autistic traits in typically developing children. Future work could extend these findings to assess these electroencephalography-derived markers as potential mechanisms underlying behavioral difficulties in autism.
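The aperiodic (1/f-like) slope reported above is typically estimated by modeling the power spectrum in log-log space. As a rough illustration of the idea only (not the authors' pipeline, which would more likely use a specparam/FOOOF-style model that also separates out oscillatory peaks), the spectral exponent can be approximated by a linear fit; the function and parameter names below are hypothetical:

```python
import numpy as np

def aperiodic_slope(freqs, psd, fmin=1.0, fmax=40.0):
    """Approximate the 1/f-like aperiodic exponent of a power spectrum.

    Fits a line to log10(power) vs. log10(frequency) over [fmin, fmax];
    the negative of the fitted slope is the spectral exponent, so a
    steeper spectrum yields a larger exponent.
    """
    freqs = np.asarray(freqs, dtype=float)
    psd = np.asarray(psd, dtype=float)
    mask = (freqs >= fmin) & (freqs <= fmax)
    slope, offset = np.polyfit(np.log10(freqs[mask]),
                               np.log10(psd[mask]), 1)
    return -slope, offset

# Synthetic 1/f^2 spectrum: the recovered exponent should be ~2.
f = np.linspace(1, 40, 200)
psd = 1.0 / f**2
exponent, _ = aperiodic_slope(f, psd)
```

A plain line fit like this is biased wherever strong oscillatory peaks (e.g., alpha) sit on top of the aperiodic component, which is why peak-aware models are preferred in practice.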
Affiliation(s)
- Aron T Hill
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Burwood, VIC 3125, Australia
- Talitha C Ford
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Burwood, VIC 3125, Australia
- Centre for Mental Health and Brain Sciences, Swinburne University of Technology, Hawthorn, VIC 3122, Australia
- Neil W Bailey
- School of Medicine and Psychology, The Australian National University, Canberra, ACT 2601, Australia
- Monarch Research Institute, Monarch Mental Health Group, Sydney, New South Wales 2000, Australia
- Jarrad A G Lum
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Burwood, VIC 3125, Australia
- Felicity J Bigelow
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Burwood, VIC 3125, Australia
- Lindsay M Oberman
- Noninvasive Neuromodulation Unit, Experimental Therapeutics and Pathophysiology Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, United States
- Peter G Enticott
- Cognitive Neuroscience Unit, School of Psychology, Deakin University, Burwood, VIC 3125, Australia
2
Representational structure of fMRI/EEG responses to dynamic facial expressions. Neuroimage 2022; 263:119631. PMID: 36113736; DOI: 10.1016/j.neuroimage.2022.119631.
Abstract
Face perception provides an excellent example of how the brain processes nuanced visual differences and transforms them into behaviourally useful representations of identities and emotional expressions. While a body of literature has looked into the spatial and temporal neural processing of facial expressions, few studies have used a dimensionally varying set of stimuli containing subtle perceptual changes. In the current study, we used 48 short videos varying dimensionally in their intensity and category (happy, angry, surprised) of expression. We measured both fMRI and EEG responses to these video clips and compared the neural response patterns to the predictions of models based on image features and models derived from behavioural ratings of the stimuli. In fMRI, the inferior frontal gyrus face area (IFG-FA) carried information related only to the intensity of the expression, independent of image-based models. The superior temporal sulcus (STS), inferior temporal (IT) and lateral occipital (LO) areas contained information about both expression category and intensity. In the EEG, the coding of expression category and low-level image features were most pronounced at around 400 ms. The expression intensity model did not, however, correlate significantly at any EEG timepoint. Our results show a specific role for IFG-FA in the coding of expressions and suggest that it contains image and category invariant representations of expression intensity.
3
Event-Related Potentials during Verbal Recognition of Naturalistic Neutral-to-Emotional Dynamic Facial Expressions. Appl Sci (Basel) 2022. DOI: 10.3390/app12157782.
Abstract
Event-related potentials (ERPs) during facial emotion recognition have been studied for more than twenty years, and interest in naturalistic stimuli is growing. This study therefore examined ERPs during recognition of dynamic neutral-to-emotional facial expressions, which are more ecologically valid than static faces. We recorded ERPs from 112 participants who watched 144 dynamic morphs depicting a gradual change from a neutral expression to a basic emotional expression (anger, disgust, fear, happiness, sadness or surprise) and labelled those emotions verbally. We observed typical ERP components, including the N170, P2, EPN and LPP. Participants with lower accuracy exhibited a larger posterior P2, and participants with faster correct responses exhibited larger P2 and LPP amplitudes. A classification analysis based on the amplitudes of the posterior P2 and LPP predicted which participants recognised emotions quickly with 76% accuracy. These results extend previous findings on the electroencephalographic correlates of facial emotion recognition.
4
Sadeghi S, Schmidt SNL, Mier D, Hass J. Effective Connectivity of the Human Mirror Neuron System During Social Cognition. Soc Cogn Affect Neurosci 2022; 17:732-743. PMID: 35086135; PMCID: PMC9340111; DOI: 10.1093/scan/nsab138.
Abstract
The human mirror neuron system (MNS) can be considered the neural basis of social cognition. Identifying the global network structure of this system can provide significant progress in the field. In this study, we use dynamic causal modeling (DCM) to determine, for the first time, the effective connectivity between central regions of the MNS during different social cognition tasks. Sixty-seven healthy participants underwent fMRI scanning while performing social cognition tasks, including imitation, empathy and theory of mind. The superior temporal sulcus (STS), inferior parietal lobule (IPL) and Brodmann area 44 (BA44) formed the regions of interest for DCM. By varying the connectivity patterns, 540 models were built and fitted for each participant. Group-level analysis, Bayesian model selection and Bayesian model averaging identified the optimal family and model for all experimental tasks. For all social-cognitive processes, effective connectivity ran from STS to IPL and from STS to BA44. For imitation, additional mutual connections occurred between STS and BA44, as well as between BA44 and IPL. The results suggest inverse models in which the motor regions BA44 and IPL receive sensory information from the STS; for imitation, by contrast, a sensorimotor loop with an exchange of motor-to-sensory and sensory-to-motor information seems to exist.
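Bayesian model selection over a large model space such as the 540 DCMs described here rests on comparing approximations to each model's log-evidence. As a simplified, hypothetical sketch (a fixed-effects comparison with a uniform model prior, not the group-level random-effects procedure used in the study), posterior model probabilities follow directly from free-energy values:

```python
import numpy as np

def model_posteriors(log_evidences):
    """Fixed-effects Bayesian model comparison.

    Turns per-model log-evidence approximations (e.g., DCM free-energy
    values) into posterior model probabilities, assuming a uniform
    prior over models: p(m|y) ∝ exp(F_m).
    """
    F = np.asarray(log_evidences, dtype=float)
    F = F - F.max()          # shift for numerical stability
    p = np.exp(F)
    return p / p.sum()

# Three hypothetical candidate connectivity models; the model with the
# highest free energy dominates the posterior.
post = model_posteriors([-1200.0, -1195.0, -1210.0])
```

Note that a log-evidence difference of just 5 nats already concentrates more than 99% of the posterior mass on the winning model, which is why families of models are often compared rather than single models.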
Affiliation(s)
- Sadjad Sadeghi
- Department of Theoretical Neuroscience, Central Institute of Mental Health, Mannheim 68159, Germany
- Department of Physics and Astronomy, Heidelberg University, Heidelberg 69120, Germany
- Joachim Hass
- Correspondence should be addressed to Joachim Hass, Faculty of Applied Psychology, SRH University of Applied Sciences, Maria-Probst-Strasse 3A, Heidelberg 69123, Germany.
5
Matt S, Dzhelyova M, Maillard L, Lighezzolo-Alnot J, Rossion B, Caharel S. The rapid and automatic categorization of facial expression changes in highly variable natural images. Cortex 2021; 144:168-184. PMID: 34666300; DOI: 10.1016/j.cortex.2021.08.005.
Abstract
Emotional expressions are quickly and automatically read from human faces under natural viewing conditions. Yet, categorization of facial expressions is typically measured in experimental contexts with homogeneous sets of face stimuli. Here we evaluated how the 6 basic facial emotions (Fear, Disgust, Happiness, Anger, Surprise or Sadness) can be rapidly and automatically categorized with faces varying in head orientation, lighting condition, identity, gender, age, ethnic origin and background context. High-density electroencephalography was recorded in 17 participants viewing 50-s sequences of variable natural images of neutral-expression faces alternating at a 6 Hz rate. Every fifth stimulus (1.2 Hz) was a variable natural image of one of the six basic expressions. Despite the wide physical variability across images, a significant F/5 = 1.2 Hz response and its harmonics (e.g., 2F/5 = 2.4 Hz, etc.) were observed for all expression changes at the group level and in every individual participant. Facial expression categorization responses were found mainly over occipito-temporal sites, with distinct hemispheric lateralization and cortical topographies for the different expressions. Specifically, a stronger response was found for Sadness categorization, especially over the left hemisphere, as compared to Fear and Happiness, together with a right-hemispheric dominance for categorization of Fearful faces. Importantly, these differences were specific to upright faces, ruling out the contribution of low-level visual cues. Overall, these observations point to robust, rapid and automatic facial expression categorization processes in the human brain.
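The F/5 = 1.2 Hz response and its harmonics described above are read directly off the EEG amplitude spectrum. Below is a minimal sketch of this frequency-tagging logic on a synthetic signal; the function and parameters are hypothetical, and real analyses additionally involve epoching, artifact correction and subtraction of neighbouring-bin noise:

```python
import numpy as np

def tagged_amplitudes(signal, sfreq, base, n_harmonics=3):
    """Single-sided amplitude spectrum values at `base` Hz and its harmonics.

    Periodic oddball responses (e.g., expression changes at 1.2 Hz
    embedded in a 6 Hz face stream) concentrate at exact integer
    multiples of the oddball frequency.
    """
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / n  # amplitude scaling
    freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)
    return np.array([spectrum[np.argmin(np.abs(freqs - k * base))]
                     for k in range(1, n_harmonics + 1)])

# Synthetic 20-s recording at 250 Hz containing a pure 1.2 Hz "oddball"
# response; 20 s gives a 0.05 Hz resolution, so 1.2 Hz falls on an exact bin.
t = np.arange(0, 20, 1 / 250)
sig = np.sin(2 * np.pi * 1.2 * t)
amps = tagged_amplitudes(sig, 250, 1.2)  # amplitudes at 1.2, 2.4 and 3.6 Hz
```

Choosing the recording length so the tagging frequency lands exactly on an FFT bin avoids spectral leakage, which is one practical reason such paradigms use long stimulation sequences.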
Affiliation(s)
- Stéphanie Matt
- Université de Lorraine, 2LPN, Nancy, France; Université de Lorraine, Laboratoire INTERPSY, Nancy, France
- Milena Dzhelyova
- Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium
- Louis Maillard
- Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Bruno Rossion
- Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium; Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Stéphanie Caharel
- Université de Lorraine, 2LPN, Nancy, France; Institut Universitaire de France, Paris, France
6
Marshall CR, Hardy CJD, Russell LL, Bond RL, Sivasathiaseelan H, Greaves C, Moore KM, Agustus JL, van Leeuwen JEP, Wastling SJ, Rohrer JD, Kilner JM, Warren JD. The functional neuroanatomy of emotion processing in frontotemporal dementias. Brain 2019; 142:2873-2887. PMID: 31321407; PMCID: PMC7959336; DOI: 10.1093/brain/awz204.
Abstract
Impaired processing of emotional signals is a core feature of frontotemporal dementia syndromes, but the underlying neural mechanisms have proved challenging to characterize and measure. Progress in this field may depend on detecting functional changes in the working brain, and disentangling components of emotion processing that include sensory decoding, emotion categorization and emotional contagion. We addressed this using functional MRI of naturalistic, dynamic facial emotion processing with concurrent indices of autonomic arousal, in a cohort of patients representing all major frontotemporal dementia syndromes relative to healthy age-matched individuals. Seventeen patients with behavioural variant frontotemporal dementia [four female; mean (standard deviation) age 64.8 (6.8) years], 12 with semantic variant primary progressive aphasia [four female; 66.9 (7.0) years], nine with non-fluent variant primary progressive aphasia [five female; 67.4 (8.1) years] and 22 healthy controls [12 female; 68.6 (6.8) years] passively viewed videos of universal facial expressions during functional MRI acquisition, with simultaneous heart rate and pupillometric recordings; emotion identification accuracy was assessed in a post-scan behavioural task. Relative to healthy controls, patient groups showed significant impairments (analysis of variance models, all P < 0.05) of facial emotion identification (all syndromes) and cardiac (all syndromes) and pupillary (non-fluent variant only) reactivity. Group-level functional neuroanatomical changes were assessed using statistical parametric mapping, thresholded at P < 0.05 after correction for multiple comparisons over the whole brain or within pre-specified regions of interest. In response to viewing facial expressions, all participant groups showed comparable activation of primary visual cortex while patient groups showed differential hypo-activation of fusiform and posterior temporo-occipital junctional cortices. 
Bi-hemispheric, syndrome-specific activations predicting facial emotion identification performance were identified (behavioural variant, anterior insula and caudate; semantic variant, anterior temporal cortex; non-fluent variant, frontal operculum). The semantic and non-fluent variant groups additionally showed complex profiles of central parasympathetic and sympathetic autonomic involvement that overlapped signatures of emotional visual and categorization processing and extended (in the non-fluent group) to brainstem effector pathways. These findings open a window on the functional cerebral mechanisms underpinning complex socio-emotional phenotypes of frontotemporal dementia, with implications for novel physiological biomarker development.
Affiliation(s)
- Charles R Marshall
- Preventive Neurology Unit, Wolfson Institute of Preventive Medicine, Queen Mary University of London, London, UK; Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK; Sobell Department of Motor Neuroscience and Movement Disorders, UCL Institute of Neurology, London, UK
- Christopher J D Hardy
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Lucy L Russell
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Rebecca L Bond
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Harri Sivasathiaseelan
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Caroline Greaves
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Katrina M Moore
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Jennifer L Agustus
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Janneke E P van Leeuwen
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- Stephen J Wastling
- Department of Neuroradiology, National Hospital for Neurology and Neurosurgery, London, UK
- Jonathan D Rohrer
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
- James M Kilner
- Sobell Department of Motor Neuroscience and Movement Disorders, UCL Institute of Neurology, London, UK
- Jason D Warren
- Dementia Research Centre, Department of Neurodegenerative Disease, UCL Institute of Neurology, London, UK
7
Williams JHG, Huggins CF, Zupan B, Willis M, Van Rheenen TE, Sato W, Palermo R, Ortner C, Krippl M, Kret M, Dickson JM, Li CSR, Lowe L. A sensorimotor control framework for understanding emotional communication and regulation. Neurosci Biobehav Rev 2020; 112:503-518. PMID: 32070695; PMCID: PMC7505116; DOI: 10.1016/j.neubiorev.2020.02.014.
Abstract
Our research team was asked to consider the relationship of the neuroscience of sensorimotor control to the language of emotions and feelings. Actions are the principal means for the communication of emotions and feelings in both humans and other animals, and the allostatic mechanisms controlling action also apply to the regulation of emotional states by the self and others. We consider how motor control of hierarchically organised, feedback-based, goal-directed action has evolved in humans, within a context of consciousness, appraisal and cultural learning, to serve emotions and feelings. In our linguistic analysis, we found that many emotion and feelings words could be assigned to stages in the sensorimotor learning process, but the assignment was often arbitrary. The embodied nature of emotional communication means that action words are frequently used, but that the meanings or senses of the word depend on its contextual use, just as the relationship of an action to an emotion is also contextually dependent.
Affiliation(s)
- Justin H G Williams
- University of Aberdeen, Institute of Medical Sciences, Foresterhill, AB25 2ZD, Scotland, United Kingdom
- Charlotte F Huggins
- University of Aberdeen, Institute of Medical Sciences, Foresterhill, AB25 2ZD, Scotland, United Kingdom
- Barbra Zupan
- Central Queensland University, School of Health, Medical and Applied Sciences, Bruce Highway, Rockhampton, QLD 4702, Australia
- Megan Willis
- Australian Catholic University, School of Psychology, ARC Centre for Excellence in Cognition and its Disorders, Sydney, NSW 2060, Australia
- Tamsyn E Van Rheenen
- University of Melbourne, Melbourne Neuropsychiatry Centre, Department of Psychiatry, 161 Barry Street, Carlton, VIC 3053, Australia
- Wataru Sato
- Kyoto University, Kokoro Research Centre, 46 Yoshidashimoadachicho, Sakyo Ward, Kyoto, 606-8501, Japan
- Romina Palermo
- University of Western Australia, School of Psychological Science, Perth, WA 6009, Australia
- Catherine Ortner
- Thompson Rivers University, Department of Psychology, 805 TRU Way, Kamloops, BC V2C 0C8, Canada
- Martin Krippl
- Otto von Guericke University Magdeburg, Faculty of Natural Sciences, Department of Psychology, Universitätsplatz 2, Magdeburg, 39106, Germany
- Mariska Kret
- Leiden University, Cognitive Psychology, Pieter de la Court, Wassenaarseweg 52, Leiden, 2333 AK, the Netherlands
- Joanne M Dickson
- Edith Cowan University, Psychology Department, School of Arts and Humanities, 270 Joondalup Dr, Joondalup, WA 6027, Australia
- Chiang-Shan R Li
- Yale University, Connecticut Mental Health Centre, S112, 34 Park Street, New Haven, CT 06519-1109, USA
- Leroy Lowe
- Neuroqualia, Room 229A, Forrester Hall, 36 Arthur Street, Truro, Nova Scotia, B2N 1X5, Canada
8
Behavioral and electrophysiological evidence for enhanced sensitivity to subtle variations of pain expressions of same-race than other-race faces. Neuropsychologia 2019; 129:302-309. DOI: 10.1016/j.neuropsychologia.2019.04.008.
9
Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Empathy in Facial Mimicry of Fear and Disgust: Simultaneous EMG-fMRI Recordings During Observation of Static and Dynamic Facial Expressions. Front Psychol 2019; 10:701. PMID: 30971997; PMCID: PMC6445885; DOI: 10.3389/fpsyg.2019.00701.
Abstract
Real-life faces are dynamic by nature, particularly when expressing emotion. Increasing evidence suggests that the perception of dynamic displays enhances facial mimicry and induces activation in widespread brain structures considered to be part of the mirror neuron system, a neuronal network linked to empathy. The present study is the first to investigate the relations among facial muscle responses, brain activity, and empathy traits while participants observed static and dynamic (video) facial expressions of fear and disgust. During display presentation, the blood-oxygen-level-dependent (BOLD) signal and muscle reactions of the corrugator supercilii and levator labii were recorded simultaneously from 46 healthy individuals (21 females). Both fear and disgust faces elicited activity in the corrugator supercilii muscle, while perception of disgust additionally produced activity in the levator labii muscle, supporting a specific pattern of facial mimicry for these emotions. Moreover, individuals with higher empathy traits showed greater activity in the corrugator supercilii and levator labii muscles than individuals with lower empathy traits; however, these responses did not differ between static and dynamic modes. Conversely, neuroimaging data revealed activation of motion- and emotion-related brain structures in response to dynamic rather than static stimuli among high-empathy individuals. In line with this, there was a correlation between electromyography (EMG) responses and brain activity, suggesting that the mirror neuron system, the anterior insula and the amygdala might constitute the neural correlates of automatic facial mimicry for fear and disgust. These results indicate that the dynamic quality of (emotional) stimuli facilitates emotion-related processing of facial expressions, especially among those with high trait empathy.
Affiliation(s)
- Krystyna Rymarczyk
- Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences (PAS), Warsaw, Poland
- Kamila Jankowiak-Siuda
- Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences (PAS), Warsaw, Poland
10
De Stefani E, Nicolini Y, Belluardo M, Ferrari PF. Congenital facial palsy and emotion processing: The case of Moebius syndrome. Genes Brain Behav 2019; 18:e12548. PMID: 30604920; DOI: 10.1111/gbb.12548.
Abstract
According to the Darwinian perspective, facial expressions of emotions evolved to communicate emotional states quickly and serve adaptive functions that promote social interactions. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, the inability to form facial expressions would affect the experience of emotional understanding. In this review, we aimed to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions. This makes them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous/mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during developmental age.
Affiliation(s)
- Elisa De Stefani
- Department of Medicine and Surgery, University of Parma, Parma, Italy
- Ylenia Nicolini
- Department of Medicine and Surgery, University of Parma, Parma, Italy
- Mauro Belluardo
- Department of Medicine and Surgery, University of Parma, Parma, Italy
- Pier Francesco Ferrari
- Department of Medicine and Surgery, University of Parma, Parma, Italy; Institut des Sciences Cognitives Marc Jeannerod, CNRS, Université de Lyon, Lyon, France
11
Liu P, Bai X, Pérez-Edgar KE. Integrating high-density ERP and fMRI measures of face-elicited brain activity in 9-12-year-old children: An ERP source localization study. Neuroimage 2018; 184:599-608. PMID: 30268845; DOI: 10.1016/j.neuroimage.2018.09.070.
Abstract
Social information processing is a critical mechanism underlying children's socio-emotional development. Central to this process are patterns of activation associated with one of our most salient socioemotional cues, the face. In this study, we obtained fMRI activation and high-density ERP source data evoked by parallel face dot-probe tasks from 9-to-12-year-old children. We then integrated the two modalities of data to explore the neural spatial-temporal dynamics of children's face processing. Our results showed that the tomography of the ERP sources broadly corresponded with the fMRI activation evoked by the same facial stimuli. Further, we combined complementary information from fMRI and ERP by defining fMRI activation as functional ROIs and applying them to the ERP source data. Indices of ERP source activity were extracted from these ROIs at three a priori ERP peak latencies critical for face processing. We found distinct temporal patterns among the three time points across ROIs. The observed spatial-temporal profiles converge with a dual-system neural network model for face processing: a core system (including the occipito-temporal and parietal ROIs) supports the early visual analysis of facial features, and an extended system (including the paracentral, limbic, and prefrontal ROIs) processes the socio-emotional meaning gleaned and relayed by the core system. Our results for the first time illustrate the spatial validity of high-density source localization of ERP dot-probe data in children. By directly combining the two modalities of data, our findings provide a novel approach to understanding the spatial-temporal dynamics of face processing. This approach can be applied in future research to investigate different research questions in various study populations.
Affiliation(s)
- Pan Liu
- Department of Psychology, Child Study Center, The Pennsylvania State University, University Park, PA 16802, USA
- Xiaoxiao Bai
- Social, Life, and Engineering Sciences Imaging Center, The Pennsylvania State University, University Park, PA 16802, USA
- Koraly E Pérez-Edgar
- Department of Psychology, Child Study Center, The Pennsylvania State University, University Park, PA 16802, USA
12
Zinchenko O, Yaple ZA, Arsalidou M. Brain Responses to Dynamic Facial Expressions: A Normative Meta-Analysis. Front Hum Neurosci 2018; 12:227. PMID: 29922137; PMCID: PMC5996092; DOI: 10.3389/fnhum.2018.00227.
Abstract
Identifying facial expressions is crucial for social interactions. Functional neuroimaging studies show that a set of brain areas, such as the fusiform gyrus and amygdala, become active when viewing emotional facial expressions. The majority of functional magnetic resonance imaging (fMRI) studies investigating face perception employ static images of faces. However, studies using dynamic facial expressions (e.g., videos) are accumulating and suggest that dynamic presentation may be more sensitive and ecologically valid for investigating face processing. Using quantitative fMRI meta-analysis, the present study examined the concordance of brain regions associated with viewing dynamic facial expressions. We analyzed data from 216 participants across 14 studies, which reported coordinates for 28 experiments. Our analysis revealed concordant activation in the bilateral fusiform and middle temporal gyri, the left amygdala, the left declive of the cerebellum and the right inferior frontal gyrus. These regions are discussed in terms of their relation to models of face processing.
Affiliation(s)
- Oksana Zinchenko: Centre for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow, Russia
- Zachary A Yaple: Centre for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow, Russia; Department of Psychology, National University of Singapore, Singapore
- Marie Arsalidou: Department of Psychology, National Research University Higher School of Economics, Moscow, Russia; Department of Psychology, York University, Toronto, ON, Canada
13
Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions. Front Psychol 2018; 9:52. [PMID: 29467691] [PMCID: PMC5807922] [DOI: 10.3389/fpsyg.2018.00052]
Abstract
Facial mimicry (FM) is an automatic response that imitates the facial expressions of others. However, the neural correlates of this phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS), and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., inferior frontal gyrus, a classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions.
Affiliation(s)
- Krystyna Rymarczyk: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
- Kamila Jankowiak-Siuda: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
14
Foley E, Rippon G, Senior C. Modulation of Neural Oscillatory Activity during Dynamic Face Processing. J Cogn Neurosci 2017; 30:338-352. [PMID: 29160744] [DOI: 10.1162/jocn_a_01209]
Abstract
Various neuroimaging and neurophysiological methods have been used to examine neural activation patterns in response to faces. However, much of this research has relied on static images of faces, which do not allow a complete description of the temporal structure of face-specific neural activity. More recently, insights are emerging from fMRI studies about the neural substrates that underpin our perception of naturalistic dynamic face stimuli, but the temporal and spectral oscillatory activity associated with processing dynamic faces has yet to be fully characterized. Here, we used MEG and beamformer source localization to examine the spatiotemporal profile of neurophysiological oscillatory activity in response to dynamic faces. Source analysis revealed a number of regions in the distributed face network showing enhanced activation in response to dynamic relative to static faces, spatially coincident with regions previously identified with fMRI. Furthermore, our results demonstrate that perception of realistic dynamic facial stimuli activates a distributed neural network at varying time points, facilitated by modulations in low-frequency power within the alpha and beta frequency ranges (8-30 Hz). Naturalistic dynamic face stimuli may provide a better means of representing the complex nature of perceiving facial expressions in the real world, and neural oscillatory activity can provide additional insights into the associated neural processes.
15
Furl N, Lohse M, Pizzorni-Ferrarese F. Low-frequency oscillations employ a general coding of the spatio-temporal similarity of dynamic faces. Neuroimage 2017; 157:486-499. [PMID: 28619657] [PMCID: PMC6390175] [DOI: 10.1016/j.neuroimage.2017.06.023]
Abstract
Brain networks use neural oscillations as information transfer mechanisms. Although the face perception network in occipitotemporal cortex is well studied, the contribution of oscillations to face representation remains an open question. We tested for links between oscillatory responses that encode facial dimensions and the theoretical proposal that faces are encoded in similarity-based "face spaces". We quantified similarity-based encoding of dynamic faces in magnetoencephalographic sensor-level oscillatory power for identity, expression, and physical and perceptual similarity of facial form and motion. Our data show that evoked responses manifest physical and perceptual form similarity that distinguishes facial identities. Low-frequency induced oscillations (<20 Hz) manifested a more general similarity structure, which was not limited to identity and spanned physical and perceived form and motion. A supplementary fMRI-constrained source reconstruction implicated the fusiform gyrus and V5 in this similarity-based representation. These findings introduce a potential link between "face space" encoding and oscillatory network communication, which generates new hypotheses about the potential oscillation-mediated mechanisms that might encode facial dimensions.
Affiliation(s)
- Nicholas Furl: Department of Psychology, Royal Holloway, University of London, Surrey TW20 0EX, United Kingdom; Cognition and Brain Sciences Unit, Medical Research Council, Cambridge CB2 7EF, United Kingdom
- Michael Lohse: Cognition and Brain Sciences Unit, Medical Research Council, Cambridge CB2 7EF, United Kingdom; Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford OX1 3QX, United Kingdom
16
Sato W, Kochiyama T, Uono S, Yoshikawa S, Toichi M. Direction of Amygdala-Neocortex Interaction During Dynamic Facial Expression Processing. Cereb Cortex 2017; 27:1878-1890. [PMID: 26908633] [DOI: 10.1093/cercor/bhw036]
Abstract
Dynamic facial expressions of emotion strongly elicit multifaceted emotional, perceptual, cognitive, and motor responses. Neuroimaging studies have revealed that subcortical (e.g., amygdala) and neocortical (e.g., superior temporal sulcus and inferior frontal gyrus) brain regions, and their functional interaction, are involved in processing dynamic facial expressions. However, the direction of the functional interaction between the amygdala and the neocortex remains unknown. To investigate this issue, we re-analyzed functional magnetic resonance imaging (fMRI) data from 2 studies and magnetoencephalography (MEG) data from 1 study. First, a psychophysiological interaction analysis of the fMRI data confirmed the functional interaction between the amygdala and neocortical regions. Then, dynamic causal modeling analysis was used to compare models with forward, backward, or bidirectional effective connectivity between the amygdala and neocortical networks in the fMRI and MEG data. The results consistently supported the model of effective connectivity from the amygdala to the neocortex. A further analysis of the MEG data using increasing time windows demonstrated that this model was valid from 200 ms after stimulus onset. These data suggest that emotional processing in the amygdala rapidly modulates some neocortical processing, such as perception, recognition, and motor mimicry, when observing dynamic facial expressions of emotion.
Affiliation(s)
- Wataru Sato: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Graduate School of Medicine and
- Takanori Kochiyama: Brain Activity Imaging Center, Advanced Telecommunications Research Institute International, Soraku-gun, Kyoto 619-0288, Japan
- Shota Uono: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Graduate School of Medicine and
- Sakiko Yoshikawa: Kokoro Research Center, Kyoto University, Sakyo-ku, Kyoto 606-8501, Japan
- Motomi Toichi: Faculty of Human Health Science, Graduate School of Medicine, Kyoto University, Sakyo-ku, Kyoto 606-8507, Japan
17
Pawling R, Kirkham AJ, Hayes AE, Tipper SP. Incidental retrieval of prior emotion mimicry. Exp Brain Res 2017; 235:1173-1184. [PMID: 28188326] [PMCID: PMC5477702] [DOI: 10.1007/s00221-017-4882-y]
Abstract
When observing emotional expressions, similar sensorimotor states are activated in the observer, often resulting in physical mimicry. For example, when observing a smile, the zygomaticus muscles associated with smiling are activated in the observer, and when observing a frown, the corrugator brow muscles. We show that the consistency of an individual's facial emotion, whether they always frown or smile, can be encoded into memory. When the individuals are viewed at a later time expressing no emotion, muscle mimicry of the prior state can be detected, even when the emotion itself is task-irrelevant. The results support simulation accounts of memory, where prior embodiments of others' states during encoding are reactivated when re-encountering a person.
Affiliation(s)
- Ralph Pawling: School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool, UK
- Amy E Hayes: School of Sport, Health and Exercise Sciences, Bangor University, Bangor, UK
18
Sato W, Kochiyama T, Uono S, Matsuda K, Usui K, Usui N, Inoue Y, Toichi M. Rapid gamma oscillations in the inferior occipital gyrus in response to eyes. Sci Rep 2016; 6:36321. [PMID: 27805017] [PMCID: PMC5090864] [DOI: 10.1038/srep36321]
Abstract
Eyes are an indispensable communication medium for human social interactions. Although previous neuroscientific evidence suggests the activation of the inferior occipital gyrus (IOG) during eye processing, the temporal profile of this activation remains unclear. To investigate this issue, we analyzed intracranial electroencephalograms of the IOG during the presentation of eyes and mosaics, in either averted or straight directions. Time–frequency statistical parametric mapping analyses revealed greater gamma-band activation in the right IOG beginning at 114 ms in response to eyes relative to mosaics, irrespective of their averted or straight direction. These results suggest that gamma oscillations in the right IOG are involved in the early stages of eye processing, such as eye detection.
Affiliation(s)
- Wataru Sato: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Graduate School of Medicine, Kyoto University, 53 Shogoin-Kawaharacho, Sakyo, Kyoto 606-8507, Japan
- Takanori Kochiyama: Brain Activity Imaging Center, Advanced Telecommunications Research Institute International, 2-2-2 Hikaridai, Seika, Soraku, Kyoto 619-0288, Japan
- Shota Uono: Department of Neurodevelopmental Psychiatry, Habilitation and Rehabilitation, Graduate School of Medicine, Kyoto University, 53 Shogoin-Kawaharacho, Sakyo, Kyoto 606-8507, Japan
- Kazumi Matsuda: National Epilepsy Center, Shizuoka Institute of Epilepsy and Neurological Disorders, Urushiyama 886, Shizuoka 420-8688, Japan
- Keiko Usui: National Epilepsy Center, Shizuoka Institute of Epilepsy and Neurological Disorders, Urushiyama 886, Shizuoka 420-8688, Japan
- Naotaka Usui: National Epilepsy Center, Shizuoka Institute of Epilepsy and Neurological Disorders, Urushiyama 886, Shizuoka 420-8688, Japan
- Yushi Inoue: National Epilepsy Center, Shizuoka Institute of Epilepsy and Neurological Disorders, Urushiyama 886, Shizuoka 420-8688, Japan
- Motomi Toichi: Faculty of Human Health Science, Graduate School of Medicine, Kyoto University, 53 Shogoin-Kawaharacho, Sakyo, Kyoto 606-8507, Japan; The Organization for Promoting Developmental Disorder Research, 40 Shogoin-Sannocho, Sakyo, Kyoto 606-8392, Japan
19
Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry? PLoS One 2016; 11:e0158534. [PMID: 27390867] [PMCID: PMC4938565] [DOI: 10.1371/journal.pone.0158534]
Abstract
Facial mimicry is the spontaneous response to others' facial expressions by mirroring or matching the interaction partner. Recent evidence suggests that mimicry may not be merely an automatic reaction but may depend on many factors, including social context, the type of task in which the participant is engaged, or stimulus properties (dynamic vs. static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgment of emotional intensity. Electromyographic activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. The ratings of the emotional intensity of facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major muscle activity in response to dynamic happiness stimuli than static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing.
Affiliation(s)
- Krystyna Rymarczyk: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland; Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
- Kamila Jankowiak-Siuda: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
20
Wood A, Rychlowska M, Korb S, Niedenthal P. Fashioning the Face: Sensorimotor Simulation Contributes to Facial Expression Recognition. Trends Cogn Sci 2016; 20:227-240. [PMID: 26876363] [DOI: 10.1016/j.tics.2015.12.010]
Abstract
When we observe a facial expression of emotion, we often mimic it. This automatic mimicry reflects underlying sensorimotor simulation that supports accurate emotion recognition. Why this is so is becoming more obvious: emotions are patterns of expressive, behavioral, physiological, and subjective feeling responses. Activation of one component can therefore automatically activate other components. When people simulate a perceived facial expression, they partially activate the corresponding emotional state in themselves, which provides a basis for inferring the underlying emotion of the expresser. We integrate recent evidence in favor of a role for sensorimotor simulation in emotion recognition. We then connect this account to a domain-general understanding of how sensory information from multiple modalities is integrated to generate perceptual predictions in the brain.
Affiliation(s)
- Adrienne Wood: University of Wisconsin - Madison, 1202 West Johnson Street, Madison, WI 53706, USA
- Sebastian Korb: International School for Advanced Studies (SISSA), Via Bonomea 265, 34136 Trieste, Italy
- Paula Niedenthal: University of Wisconsin - Madison, 1202 West Johnson Street, Madison, WI 53706, USA