1.
Wang HT, Lyu JL, Chien SHL. Dynamic Emotion Recognition and Expression Imitation in Neurotypical Adults and Their Associations with Autistic Traits. Sensors (Basel). 2024;24:8133. PMID: 39771868; PMCID: PMC11679109; DOI: 10.3390/s24248133.
Abstract
Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by deficits in social interaction and communication. While many studies suggest that individuals with ASD struggle with emotion processing, the association between emotion processing and autistic traits in non-clinical populations remains unclear. We examined whether neurotypical adults' facial emotion recognition and expression imitation are associated with autistic traits. We recruited 32 neurotypical adults; each completed two computerized tasks, Dynamic Emotion Recognition and Expression Imitation, and two standardized measures: the Chinese version of the Autism-Spectrum Quotient (AQ) and the Twenty-Item Prosopagnosia Index (PI-20). For dynamic emotion recognition, happiness had the highest mean accuracy, followed by surprise, sadness, anger, fear, and disgust. For expression imitation, surprise and happiness were the easiest to imitate, followed by disgust, while accuracy for imitating sadness, anger, and fear was much lower. Importantly, individual AQ scores correlated negatively with emotion recognition accuracy and positively with PI-20 scores. The AQ imagination and communication sub-scores and the PI-20 correlated positively with imitation of surprise. In summary, we found a significant link between recognizing emotional expressions and the level of autistic traits in a non-clinical population, supporting the concept of the broader autism phenotype.
Affiliation(s)
- Hai-Ting Wang: Graduate Institute of Biological Science and Technology, China Medical University, Taichung 404, Taiwan
- Jia-Ling Lyu: Graduate Institute of Biomedical Sciences, China Medical University, Taichung 404, Taiwan
- Sarina Hui-Lin Chien: Graduate Institute of Biomedical Sciences, China Medical University, Taichung 404, Taiwan; Neuroscience and Brain Disease Center, China Medical University, Taichung 404, Taiwan
2.
Huang Y, Du J, Guo X, Li Y, Wang H, Xu J, Xu S, Wang Y, Zhang R, Xiao L, Su T, Tang Y. Insomnia and impacts on facial expression recognition accuracy, intensity and speed: A meta-analysis. J Psychiatr Res. 2023;160:248-257. PMID: 36870234; DOI: 10.1016/j.jpsychires.2023.02.001.
Abstract
Facial expressions provide nonverbal cues that are important for delivering and interpreting human emotions. Previous studies have shown that the ability to interpret facial emotions correctly can be partially impaired in sleep-deprived people. Because people with insomnia also suffer from sleep loss, we hypothesized that facial expression recognition might likewise be impaired in this population. Despite a growing body of research exploring insomnia's potential impacts on facial expression recognition, conflicting results have been reported, and no systematic review of this literature has been conducted. In this study, after screening 1100 records identified through database searches, six articles examining insomnia and facial expression recognition ability were included in a quantitative synthesis. The main outcomes were classification accuracy (ACC), reaction time (RT), and intensity rating, the three most studied facial expression processing variables. Subgroup analysis was performed to identify altered perception of the four emotions (happiness, sadness, fear, and anger) used to examine insomnia and emotion recognition. The pooled standardized mean differences (SMDs) and corresponding 95% confidence intervals (CIs) showed that facial expression recognition among people with insomnia was less accurate (SMD = -0.30; 95% CI: -0.46, -0.14) and slower (SMD = 0.67; 95% CI: 0.18, 1.15) than among good sleepers. Classification ACC for fearful expressions was lower in the insomnia group (SMD = -0.66; 95% CI: -1.02, -0.30). This meta-analysis was registered with PROSPERO.
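The pooled SMDs reported above come from a standard random-effects synthesis. As an illustration only (this is not the authors' analysis code, and the per-study effect sizes and variances below are made up), a DerSimonian-Laird pooling step can be sketched as:

```python
import numpy as np

def pool_smd(smd, var):
    """Pool standardized mean differences with a DerSimonian-Laird
    random-effects model; returns the pooled SMD and its 95% CI."""
    smd, var = np.asarray(smd, float), np.asarray(var, float)
    w = 1.0 / var                                  # fixed-effect weights
    fe_mean = np.sum(w * smd) / np.sum(w)
    q = np.sum(w * (smd - fe_mean) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(smd) - 1)) / c)      # between-study variance
    w_re = 1.0 / (var + tau2)                      # random-effects weights
    pooled = np.sum(w_re * smd) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical studies, each favouring lower accuracy in insomnia
pooled, ci = pool_smd([-0.2, -0.4, -0.3], [0.04, 0.05, 0.03])
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

The pooled estimate always falls between the smallest and largest study-level SMDs, weighted toward the more precise (lower-variance) studies.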
Affiliation(s)
- Yujia Huang: Psychology Department, The Second Naval Hospital of Southern Theater Command, Sanya, China; Department of Medical Psychology, Faculty of Psychology, Naval Medical University (Second Military Medical University), Shanghai, China
- Jing Du, Xin Guo, Yinan Li, Hao Wang, Jingzhou Xu, Shuyu Xu, Yajing Wang, Ruike Zhang, Lei Xiao: Department of Medical Psychology, Faculty of Psychology, Naval Medical University (Second Military Medical University), Shanghai, China
- Tong Su, Yunxiang Tang: Faculty of Psychology, Naval Medical University (Second Military Medical University), Shanghai, China
3.
Doğdu C, Kessler T, Schneider D, Shadaydeh M, Schweinberger SR. A Comparison of Machine Learning Algorithms and Feature Sets for Automatic Vocal Emotion Recognition in Speech. Sensors (Basel). 2022;22:7561. PMID: 36236658; PMCID: PMC9571288; DOI: 10.3390/s22197561.
Abstract
Vocal emotion recognition (VER) in natural speech, often referred to as speech emotion recognition (SER), remains challenging for both humans and computers. Applied fields including clinical diagnosis and intervention, social interaction research, and human-computer interaction (HCI) increasingly benefit from efficient VER algorithms. Several feature sets have been used with machine-learning (ML) algorithms for discrete emotion classification, but there is no consensus on which low-level descriptors and classifiers are optimal. We therefore compared the performance of several ML algorithms across different feature sets. Concretely, seven ML algorithms were compared on the Berlin Database of Emotional Speech: Multilayer Perceptron Neural Network (MLP), J48 Decision Tree (DT), Support Vector Machine with Sequential Minimal Optimization (SMO), Random Forest (RF), k-Nearest Neighbor (KNN), Simple Logistic Regression (LOG), and Multinomial Logistic Regression (MLR), each with 10-fold cross-validation using four openSMILE feature sets (IS-09, emobase, GeMAPS, and eGeMAPS). Results indicated that SMO, MLP, and LOG perform better (reaching accuracies of 87.85%, 84.00%, and 83.74%, respectively) than RF, DT, MLR, and KNN (with minimum accuracies of 73.46%, 53.08%, 70.65%, and 58.69%, respectively). Overall, the emobase feature set performed best. We discuss the implications of these findings for applications in diagnosis, intervention, and HCI.
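The evaluation protocol described here (multiple classifiers scored by 10-fold cross-validation on fixed-dimensional acoustic feature vectors) can be sketched with scikit-learn. The classifier names J48 and SMO suggest Weka implementations, and the real inputs would be openSMILE features extracted from EMO-DB; neither is assumed available here, so scikit-learn analogues and synthetic 88-dimensional features (the size of the eGeMAPS set) stand in, and the resulting accuracies will not match those reported.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 7 emotion classes, 88 acoustic features per utterance
X, y = make_classification(n_samples=500, n_features=88, n_informative=30,
                           n_classes=7, random_state=0)

models = {
    "SMO (SVM)": SVC(kernel="rbf"),
    "MLP": MLPClassifier(max_iter=300, random_state=0),
    "LOG": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "DT": DecisionTreeClassifier(random_state=0),
}

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = {}
for name, clf in models.items():
    # Scaling is fit inside each fold to avoid leaking test statistics
    pipe = make_pipeline(StandardScaler(), clf)
    scores[name] = cross_val_score(pipe, X, y, cv=cv).mean()

for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {acc:.3f}")
```

Wrapping each classifier in a pipeline with the scaler mirrors the per-fold preprocessing a fair cross-validated comparison requires; swapping in real openSMILE feature matrices would leave the loop unchanged.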
Affiliation(s)
- Cem Doğdu: Department of Social Psychology, Institute of Psychology, Friedrich Schiller University Jena, Humboldtstraße 26, 07743 Jena, Germany; Michael Stifel Center Jena for Data-Driven and Simulation Science, Friedrich Schiller University Jena, 07743 Jena, Germany; Social Potential in Autism Research Unit, Friedrich Schiller University Jena, 07743 Jena, Germany
- Thomas Kessler: Department of Social Psychology, Institute of Psychology, Friedrich Schiller University Jena, Humboldtstraße 26, 07743 Jena, Germany
- Dana Schneider: Department of Social Psychology, Institute of Psychology, Friedrich Schiller University Jena, Humboldtstraße 26, 07743 Jena, Germany; Michael Stifel Center Jena for Data-Driven and Simulation Science, Friedrich Schiller University Jena, 07743 Jena, Germany; Social Potential in Autism Research Unit, Friedrich Schiller University Jena, 07743 Jena, Germany; DFG Scientific Network “Understanding Others”, 10117 Berlin, Germany
- Maha Shadaydeh: Michael Stifel Center Jena for Data-Driven and Simulation Science, Friedrich Schiller University Jena, 07743 Jena, Germany; Computer Vision Group, Department of Mathematics and Computer Science, Friedrich Schiller University Jena, 07743 Jena, Germany
- Stefan R. Schweinberger: Michael Stifel Center Jena for Data-Driven and Simulation Science, Friedrich Schiller University Jena, 07743 Jena, Germany; Social Potential in Autism Research Unit, Friedrich Schiller University Jena, 07743 Jena, Germany; Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Am Steiger 3/Haus 1, 07743 Jena, Germany; German Center for Mental Health (DZPG), Site Jena-Magdeburg-Halle, 07743 Jena, Germany
4.
Li X, Bai X, Conway CM, Shi W, Wang X. Statistical learning for non-social and socially-meaningful stimuli in individuals with high and low levels of autistic traits. Curr Psychol. 2022. DOI: 10.1007/s12144-022-02703-0.
5.
The structural neural correlates of atypical facial expression recognition in autism spectrum disorder. Brain Imaging Behav. 2022;16:1428-1440. PMID: 35048265; DOI: 10.1007/s11682-021-00626-1.
Abstract
Previous studies have demonstrated that individuals with autism spectrum disorder (ASD) are worse at recognizing facial expressions than typically developing (TD) individuals. The present study investigated differences in the structural neural correlates of emotion recognition between individuals with and without ASD using voxel-based morphometry (VBM). We acquired structural MRI data from 27 high-functioning adults with ASD and 27 age- and sex-matched TD individuals. The ability to recognize facial expressions was measured using a label-matching paradigm featuring six basic emotions (anger, disgust, fear, happiness, sadness, and surprise). The behavioural task revealed no emotion-recognition deficits in ASD after controlling for intellectual ability. However, the region-of-interest VBM analysis showed a positive correlation between mean percent accuracy across the six basic emotions and grey matter volume of the right inferior frontal gyrus in TD individuals, but not in individuals with ASD. Whole-brain VBM under each emotion condition revealed a positive correlation between percent accuracy for disgusted faces and grey matter volume of the left dorsomedial prefrontal cortex in individuals with ASD, but not in TD individuals. These different patterns of correlation suggest that individuals with and without ASD use different processing mechanisms when recognizing others' facial expressions.