1. Ringwald WR, Feltman S, Schwartz HA, Samaras D, Khudari C, Luft BJ, Kotov R. Day-to-day dynamics of facial emotion expressions in posttraumatic stress disorder. J Affect Disord 2025; 380:331-339. PMID: 40122249. DOI: 10.1016/j.jad.2025.03.109.
Abstract
Facial expressions are an essential component of emotions that may reveal mechanisms maintaining posttraumatic stress disorder (PTSD). However, most research on emotions in PTSD has relied on self-reports, which only capture subjective affect. The few studies on outward emotion expressions have been hampered by methodological limitations, including low ecological validity and failure to capture the dynamic nature of emotions and symptoms. Our study addresses these limitations with an approach that has not been applied to psychopathology: person-specific models of day-to-day facial emotion expression and PTSD symptom dynamics. We studied a sample of World Trade Center responders (N = 112) with elevated PTSD pathology who recorded a daily video diary and self-reported symptoms for 90 days (8953 videos altogether). Facial expressions were detected from the video recordings with a facial emotion recognition model. In data-driven, idiographic network models, most participants (80%) had at least one reliable expression-symptom link. Six expression-symptom dynamics were significant for more than 10% of the sample. Each of these dynamics showed statistically meaningful heterogeneity, with some people's symptoms related to over-expressivity and others' to under-expressivity. Our results provide the foundation for a more complete understanding of emotions in PTSD that includes not only subjective feelings but also outward emotion expressions.
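Person-specific dynamic models of this kind are commonly estimated as lagged regressions on each participant's own daily series. A minimal sketch, assuming a lag-1 vector autoregression over two daily channels (one expression score, one symptom score) fitted by ridge-regularized least squares; the variable names and simulated effect sizes are illustrative, not the study's:

```python
import numpy as np

def fit_person_var(series, ridge=1e-6):
    """Fit a lag-1 vector autoregression to one person's daily series.

    series: (days, channels) array of daily expression and symptom scores.
    Returns the (channels, channels) matrix of lagged effects, where entry
    [i, j] is the effect of channel j on day t-1 on channel i on day t.
    """
    X = series[:-1]               # predictors: day t-1
    Y = series[1:]                # outcomes:   day t
    X = X - X.mean(axis=0)        # center so no intercept is needed
    Y = Y - Y.mean(axis=0)
    # Ridge-regularized least squares: solve (X'X + rI) A = X'Y, then B = A'
    B = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y).T
    return B

# Simulated 90-day diary: anger expression (col 0) drives next-day
# intrusion symptoms (col 1) for this synthetic participant.
rng = np.random.default_rng(0)
days = np.zeros((90, 2))
for t in range(1, 90):
    days[t, 0] = 0.3 * days[t - 1, 0] + rng.normal(0, 1)
    days[t, 1] = 0.5 * days[t - 1, 0] + 0.2 * days[t - 1, 1] + rng.normal(0, 1)

B = fit_person_var(days)
print(B.round(2))   # B[1, 0] estimates the true cross-lag of 0.5
```

A network model in the idiographic sense is then read off B: nonzero off-diagonal entries are the expression-symptom links, estimated separately for each person.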
Affiliation(s)
- Whitney R Ringwald
- Department of Psychology, University of Minnesota, United States of America
- Scott Feltman
- Department of Applied Mathematics, Stony Brook University, United States of America
- H Andrew Schwartz
- Department of Computer Science, Stony Brook University, United States of America
- Dimitris Samaras
- Department of Computer Science, Stony Brook University, United States of America
- Christopher Khudari
- Department of Psychology, Rosalind Franklin University, United States of America
- Benjamin J Luft
- World Trade Center Health Program, Stony Brook University, United States of America; Department of Medicine, Stony Brook University, United States of America
- Roman Kotov
- Department of Psychiatry, Stony Brook University, United States of America
2. Shu L, Barradas VR, Qin Z, Koike Y. Facial expression recognition through muscle synergies and estimation of facial keypoint displacements through a skin-musculoskeletal model using facial sEMG signals. Front Bioeng Biotechnol 2025; 13:1490919. PMID: 40013307; PMCID: PMC11861201. DOI: 10.3389/fbioe.2025.1490919.
Abstract
The development of facial expression recognition (FER) and facial expression generation (FEG) systems is essential to enhance human-robot interactions (HRI). The facial action coding system is widely used in FER and FEG tasks, as it offers a framework to relate the action of facial muscles and the resulting facial motions to the execution of facial expressions. However, most FER and FEG studies are based on measuring and analyzing facial motions, leaving the facial muscle component relatively unexplored. This study introduces a novel framework using surface electromyography (sEMG) signals from facial muscles to recognize facial expressions and estimate the displacement of facial keypoints during the execution of the expressions. For the facial expression recognition task, we studied the coordination patterns of seven muscles, expressed as three muscle synergies extracted through non-negative matrix factorization, during the execution of six basic facial expressions. Muscle synergies are groups of muscles that show coordinated patterns of activity, as measured by their sEMG signals, and are hypothesized to form the building blocks of human motor control. We then trained two classifiers for the facial expressions based on extracted features from the sEMG signals and the synergy activation coefficients of the extracted muscle synergies, respectively. Both classifiers outperformed other systems that use sEMG to classify facial expressions, although the synergy-based classifier performed marginally worse than the sEMG-based one (classification accuracy: synergy-based 97.4%, sEMG-based 99.2%). However, the extracted muscle synergies revealed common coordination patterns across different facial expressions, allowing a low-dimensional quantitative visualization of the muscle control strategies involved in human facial expression generation.
We also developed a skin-musculoskeletal model enhanced by linear regression (SMSM-LRM) to estimate the displacement of facial keypoints during the execution of a facial expression based on sEMG signals. Our proposed approach achieved a relatively high fidelity in estimating these displacements (NRMSE 0.067). We propose that the identified muscle synergies could be used in combination with the SMSM-LRM model to generate motor commands and trajectories for desired facial displacements, potentially enabling the generation of more natural facial expressions in social robotics and virtual reality.
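Synergy extraction of the kind described, non-negative matrix factorization of an sEMG envelope matrix into muscle weights and activation coefficients, can be sketched with the classic Lee-Seung multiplicative updates. The muscle count (7) and synergy count (3) follow the abstract, but the data here are synthetic:

```python
import numpy as np

def extract_synergies(V, k, iters=500, seed=0):
    """Factor a non-negative sEMG envelope matrix V (muscles x samples)
    into W (muscles x k synergies) and H (k x samples activations)
    using Lee-Seung multiplicative updates, so that V ~= W @ H."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Toy "recording": 7 muscles whose activity is built from 3 latent synergies.
rng = np.random.default_rng(1)
W_true = rng.random((7, 3))
H_true = rng.random((3, 200))
V = W_true @ H_true

W, H = extract_synergies(V, k=3)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {err:.3f}")
```

The columns of W are the synergies; the rows of H are the activation coefficients that the paper feeds to its synergy-based classifier.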
Affiliation(s)
- Lun Shu
- Department of Information and Communications Engineering, Institute of Science Tokyo, Yokohama, Japan
- Victor R. Barradas
- Institute of Integrated Research, Institute of Science Tokyo, Yokohama, Japan
- Zixuan Qin
- Department of Information and Communications Engineering, Institute of Science Tokyo, Yokohama, Japan
- Yasuharu Koike
- Institute of Integrated Research, Institute of Science Tokyo, Yokohama, Japan
3. Tsikandilakis M, Bali P, Karlis A, Morfi P, Mével PA, Madan C, Milbank A. "Sentio ergo est": Unmasking the psychological realities of emotional misperception. Perception 2025; 54:3-31. PMID: 39648752. DOI: 10.1177/03010066241302996.
Abstract
Perception is an important aspect of our personal lives, interpersonal interactions, and professional activities and performance. A large body of psychological research has been dedicated to exploring how perception happens, whether and when it involves conscious awareness, and what physiological correlates, such as skin-conductance and heart-rate responses, occur when we perceive emotional elicitors. A more recent and less explored question in psychological science is how and when misperception happens, and what the physiological characteristics of the misperception of emotion are. Therefore, in the current study, for the first time in relevant research, we recruited participants using trial-contour power calculations for false-positive responses, such as incorrectly reporting that a briefly presented backward-masked face had appeared, and explored these responses thoroughly. We found that false-positive responses to backward-masked emotional faces were characterised by pre-trial arousal, post-trial arousal increases, high confidence ratings, and participant valence and arousal ratings corresponding to the misperceived stimulus type. These outcomes were most pronounced for false-positive responses to fearful faces. Based on these findings, we discuss the possibility of a mechanism for partial self-encapsulated emotional-experiential apperception and of a fear-primacy socio-emotional response module under combined visual ambiguity and high psychophysiological arousal.
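False-positive (false-alarm) responses in detection paradigms like this one are conventionally quantified with signal-detection indices rather than the study's own trial-contour calculations, which are not specified here. A sketch using the standard d' and criterion formulas, with a log-linear correction so extreme rates stay finite, on hypothetical trial counts:

```python
from statistics import NormalDist

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Signal-detection indices for a masked-face detection task.
    Returns (d_prime, criterion); the +0.5 log-linear correction keeps
    the z-transform finite when a rate would be exactly 0 or 1."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical participant: 40 signal trials, 40 noise trials.
d, c = sdt_indices(hits=30, misses=10, false_alarms=12, correct_rejections=28)
print(f"d' = {d:.2f}, criterion = {c:.2f}")
```

A negative criterion indicates a liberal observer, one prone to exactly the false-positive reports the study targets.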
Affiliation(s)
- Myron Tsikandilakis
- School of Psychology, University of Nottingham, Nottingham, UK
- Medical School, Faculty of Medicine and Health Sciences, University of Nottingham, Nottingham, UK
- School of Cultures, Languages and Area Studies, University of Nottingham, Nottingham, UK
- Persefoni Bali
- School of Psychology, University of Nottingham, Nottingham, UK
- Alexander Karlis
- Department of Physics, National and Kapodistrian University of Athens, Athens, Greece
- Patty Morfi
- School of Engineering, Nottingham Trent University, Nottingham, UK
- Pierre-Alexis Mével
- School of Cultures, Languages and Area Studies, University of Nottingham, Nottingham, UK
- Alison Milbank
- Department of Philosophy, University of Nottingham, Nottingham, UK
4. Simoncini G, Borghesi F, Cipresso P. Linking Affect Dynamics and Well-Being: A Novel Methodological Approach for Mental Health. Healthcare (Basel) 2024; 12:1690. PMID: 39273715; PMCID: PMC11395663. DOI: 10.3390/healthcare12171690.
Abstract
Emotions are dynamic processes; their variability relates to psychological well-being and psychopathology. Affective alterations have been linked to mental illnesses such as depression, although little is known about how similar patterns occur in healthy individuals. This study investigates the psychophysiological correlates of emotional processing in healthy subjects, specifically exploring the relationship between depressive traits, cognitive distortions, and facial electromyographic (f-EMG) responses during affective transitions. A cohort of 44 healthy participants underwent f-EMG recording while viewing emotional images from the International Affective Picture System (IAPS). Self-report measures included the Beck Depression Inventory (BDI) and the Cognitive Distortion Scale (CDS). Higher BDI scores were associated with increased EMG activity in the corrugator muscle during transitions between positive and negative emotional states. Cognitive distortions such as Catastrophizing, All-or-Nothing Thinking, and Minimization showed significant positive correlations with EMG activity, indicating that individuals with higher levels of these distortions experienced greater facial muscle activation during emotional transitions. These results indicate a bidirectional association between depressive features and cognitive distortions on the one hand and alterations in facial emotional processing on the other, even in healthy subjects. Facial EMG during dynamic affective transitions could therefore serve as a non-invasive method for detecting abnormal emotional reactions at an early stage, which might help identify individuals at risk of developing depression and guide therapies to prevent its progression.
Affiliation(s)
- Gloria Simoncini
- Department of Psychology, University of Turin, 10124 Turin, Italy
- Pietro Cipresso
- Department of Psychology, University of Turin, 10124 Turin, Italy
5. Ahmad I, Rashid J, Faheem M, Akram A, Khan NA, Amin RU. Autism spectrum disorder detection using facial images: A performance comparison of pretrained convolutional neural networks. Healthc Technol Lett 2024; 11:227-239. PMID: 39100502; PMCID: PMC11294932. DOI: 10.1049/htl2.12073.
Abstract
Autism spectrum disorder (ASD) is a complex psychological syndrome characterized by persistent difficulties in social interaction, restricted behaviours, speech, and nonverbal communication. The impacts of this disorder and the severity of symptoms vary from person to person. In most cases, symptoms of ASD appear between the ages of 2 and 5 and continue throughout adolescence and into adulthood. While this disorder cannot be cured completely, studies have shown that early detection can help maintain the behavioural and psychological development of children. Experts are currently studying various machine learning methods, particularly convolutional neural networks, to expedite the screening process. Convolutional neural networks are considered promising frameworks for the diagnosis of ASD. This study employs different pre-trained convolutional neural networks, including ResNet34, ResNet50, AlexNet, MobileNetV2, VGG16, and VGG19, to diagnose ASD and compares their performance. Transfer learning was applied to every model included in the study to achieve higher results than the initial models. The proposed ResNet50 model achieved the highest accuracy, 92%, compared to the other transfer-learning models. The proposed method also outperformed state-of-the-art models in terms of accuracy and computational cost.
Affiliation(s)
- Israr Ahmad
- Department of Automation Science, Beihang University, Beijing, China
- Javed Rashid
- Department of IT Services, University of Okara, Okara, Punjab, Pakistan
- MLC Lab, Okara, Punjab, Pakistan
- Muhammad Faheem
- Department of Computing Sciences, School of Technology and Innovations, University of Vaasa, Vaasa, Finland
- Arslan Akram
- MLC Lab, Okara, Punjab, Pakistan
- Department of Computer Science, University of Okara, Okara, Punjab, Pakistan
- Nafees Ahmad Khan
- MLC Lab, Okara, Punjab, Pakistan
- Department of Computer Science, University of Okara, Okara, Punjab, Pakistan
- Riaz ul Amin
- MLC Lab, Okara, Punjab, Pakistan
- Department of Computer Science, University of Okara, Okara, Punjab, Pakistan
6. Kent RD. The Feel of Speech: Multisystem and Polymodal Somatosensation in Speech Production. J Speech Lang Hear Res 2024; 67:1424-1460. PMID: 38593006. DOI: 10.1044/2024_jslhr-23-00575.
Abstract
Purpose: The oral structures such as the tongue and lips have remarkable somatosensory capacities, but understanding the roles of somatosensation in speech production requires a more comprehensive knowledge of somatosensation in the speech production system in its entirety, including the respiratory, laryngeal, and supralaryngeal subsystems. This review was conducted to summarize the system-wide somatosensory information available for speech production.
Method: The search was conducted with PubMed/Medline and Google Scholar for articles published until November 2023. Numerous search terms were used in conducting the review, which covered the topics of psychophysics, basic and clinical behavioral research, neuroanatomy, and neuroscience.
Results and Conclusions: The current understanding of speech somatosensation rests primarily on the two pillars of psychophysics and neuroscience. The confluence of polymodal afferent streams supports the development, maintenance, and refinement of speech production. Receptors are both canonical and noncanonical, with the latter occurring especially in the muscles innervated by the facial nerve. Somatosensory representation in the cortex is disproportionately large and provides for sensory interactions. Speech somatosensory function is robust over the lifespan, with possible declines in advanced aging. The understanding of somatosensation in speech disorders is largely disconnected from research and theory on speech production. A speech somatoscape is proposed as the generalized, system-wide sensation of speech production, with implications for speech development, speech motor control, and speech disorders.
7. Hall NT, Hallquist MN, Martin EA, Lian W, Jonas KG, Kotov R. Automating the analysis of facial emotion expression dynamics: A computational framework and application in psychotic disorders. Proc Natl Acad Sci U S A 2024; 121:e2313665121. PMID: 38530896; PMCID: PMC10998559. DOI: 10.1073/pnas.2313665121.
Abstract
Facial emotion expressions play a central role in interpersonal interactions; these displays are used to predict and influence the behavior of others. Despite their importance, quantifying and analyzing the dynamics of brief facial emotion expressions remains an understudied methodological challenge. Here, we present a method that leverages machine learning and network modeling to assess the dynamics of facial expressions. Using video recordings of clinical interviews, we demonstrate the utility of this approach in a sample of 96 people diagnosed with psychotic disorders and 116 never-psychotic adults. Participants diagnosed with schizophrenia tended to move from neutral expressions to uncommon expressions (e.g., fear, surprise), whereas participants diagnosed with other psychoses (e.g., mood disorders with psychosis) moved toward expressions of sadness. This method has broad applications to the study of normal and altered expressions of emotion and can be integrated with telemedicine to improve psychiatric assessment and treatment.
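One common way to formalize findings like "moving from neutral expressions to uncommon expressions" is an empirical transition matrix over frame-by-frame expression labels; each row gives the distribution of the next expression conditional on the current one. A sketch on a toy label sequence (the expression set and sequence are illustrative, not the paper's coding scheme):

```python
import numpy as np

EXPRESSIONS = ["neutral", "happy", "sad", "fear", "surprise"]

def transition_matrix(frame_labels):
    """Empirical probabilities of moving between facial-expression states
    across consecutive video frames; row i sums to 1 (when visited) and
    gives the distribution of the next expression given the current one."""
    idx = {e: i for i, e in enumerate(EXPRESSIONS)}
    k = len(EXPRESSIONS)
    counts = np.zeros((k, k))
    for a, b in zip(frame_labels, frame_labels[1:]):
        counts[idx[a], idx[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Toy frame sequence from one interview segment.
seq = ["neutral", "neutral", "fear", "neutral", "sad", "sad", "neutral"]
T = transition_matrix(seq)
print(T.round(2))
```

Group differences of the kind reported (e.g., elevated neutral-to-fear transitions in schizophrenia) would then be comparisons of the corresponding entries of T across diagnostic groups.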
Affiliation(s)
- Nathan T. Hall
- Department of Psychology and Neuroscience, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599
- Michael N. Hallquist
- Department of Psychology and Neuroscience, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599
- Elizabeth A. Martin
- Department of Psychological Science, University of California, Irvine, CA 92697
- Wenxuan Lian
- Department of Psychiatry, Stony Brook University, Stony Brook, NY 11794
- Roman Kotov
- Department of Psychiatry, Stony Brook University, Stony Brook, NY 11794
8. Broulidakis MJ, Kiprijanovska I, Severs L, Stankoski S, Gjoreski M, Mavridou I, Gjoreski H, Cox S, Bradwell D, Stone JM, Nduka C. Optomyography-based sensing of facial expression derived arousal and valence in adults with depression. Front Psychiatry 2023; 14:1232433. PMID: 37614653; PMCID: PMC10442807. DOI: 10.3389/fpsyt.2023.1232433.
Abstract
Background: Continuous assessment of affective behaviors could improve the diagnosis, assessment and monitoring of chronic mental health and neurological conditions such as depression. However, no technologies are well suited to this, limiting potential clinical applications.
Aim: To test whether we could replicate previous evidence of hypo-reactivity to emotionally salient material using an entirely new sensing technique, optomyography, which is well suited to remote monitoring.
Methods: Thirty-eight volunteers (aged 18-40 years) who met a research diagnosis of depression and 37 age-matched non-depressed controls took part. Changes in facial muscle activity over the brow (corrugator supercilii) and cheek (zygomaticus major) were measured while volunteers watched videos varying in emotional salience.
Results: Across all participants, videos rated as subjectively positive were associated with activation of muscles in the cheek relative to videos rated as neutral or negative. Videos rated as subjectively negative were associated with brow activation relative to videos judged as neutral or positive. Self-reported arousal was associated with a stepwise increase in facial muscle activation across the brow and cheek. Compared with controls, depressed volunteers showed significantly reduced activation in facial muscles during videos considered subjectively negative or rated as high arousal.
Conclusion: We demonstrate for the first time that it is possible to detect facial expression hypo-reactivity in adults with depression in response to emotional content using glasses-based optomyography sensing. We hope these results encourage the use of optomyography-based sensing to track facial expressions in the real world, outside of a specialized testing environment.
Affiliation(s)
- Martin Gjoreski
- Faculty of Informatics, Università della Svizzera italiana, Lugano, Switzerland
- Hristijan Gjoreski
- Ss. Cyril and Methodius University in Skopje (UKIM), Skopje, North Macedonia
- James M. Stone
- Brighton and Sussex Medical School, University of Sussex, Brighton, United Kingdom
- Charles Nduka
- Emteq Ltd., Brighton, United Kingdom
- Queen Victoria Hospital, East Grinstead, United Kingdom
9. Abdulghafor R, Abdelmohsen A, Turaev S, Ali MAH, Wani S. An Analysis of Body Language of Patients Using Artificial Intelligence. Healthcare (Basel) 2022; 10(12):2504. PMID: 36554028; PMCID: PMC9778650. DOI: 10.3390/healthcare10122504.
Abstract
In recent decades, epidemic and pandemic illnesses have grown prevalent and are a regular source of concern throughout the world. The extent to which the globe has been affected by the COVID-19 epidemic is well documented. Smart technology is now widely used in medical applications, with the automated detection of status and feelings becoming a significant study area. As a result, a variety of studies have begun to focus on the automated detection of symptoms in individuals infected with a pandemic or epidemic disease by studying their body language. The recognition and interpretation of arm and leg motions, facial expressions, and body postures is still a developing field, and there is a dearth of comprehensive studies that might aid in illness diagnosis utilizing artificial intelligence techniques and technologies. This literature review is a meta-review of past papers that utilized AI for body language classification, through full-body tracking or facial expression detection, for tasks such as fall detection and COVID-19 detection; it examines the methods proposed by each paper, their significance, and their results.
Affiliation(s)
- Rawad Abdulghafor
- Department of Computer Science, Faculty of Information and Communication Technology, International Islamic University Malaysia, Kuala Lumpur 53100, Malaysia
- Correspondence: (R.A.); (S.T.)
- Abdelrahman Abdelmohsen
- Department of Computer Science, Faculty of Information and Communication Technology, International Islamic University Malaysia, Kuala Lumpur 53100, Malaysia
- Sherzod Turaev
- Department of Computer Science and Software Engineering, College of Information Technology, United Arab Emirates University, Al Ain 15551, United Arab Emirates
- Correspondence: (R.A.); (S.T.)
- Mohammed A. H. Ali
- Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, Kuala Lumpur 50603, Malaysia
- Sharyar Wani
- Department of Computer Science, Faculty of Information and Communication Technology, International Islamic University Malaysia, Kuala Lumpur 53100, Malaysia
10. Kim M, Cho Y, Kim SY. Effects of diagnostic regions on facial emotion recognition: The moving window technique. Front Psychol 2022; 13:966623. PMID: 36186300; PMCID: PMC9518794. DOI: 10.3389/fpsyg.2022.966623.
Abstract
With regard to facial emotion recognition, previous studies have found that specific facial regions are attended to more in order to identify certain emotions. We investigated whether a preferential search for emotion-specific diagnostic regions contributes to the accurate recognition of facial emotions. Twenty-three neurotypical adults performed an emotion recognition task using six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. The participants' exploration patterns for the faces were measured using the Moving Window Technique (MWT). This technique presents a small window on a blurred face, and participants explored the face stimuli through a mouse-controlled window in order to recognize the emotions on the face. Our results revealed that when participants explored the diagnostic regions for each emotion more frequently, they recognized the emotions correctly at a faster rate. To the best of our knowledge, the current study is the first to present evidence that exploration of emotion-specific diagnostic regions can predict the reaction time of accurate emotion recognition among neurotypical adults. Such findings can be further applied in the evaluation and/or training of emotion recognition functions in both typically and atypically developing children with emotion recognition difficulties.
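A dwell-based operationalization of "preferential search for diagnostic regions", the proportion of moving-window positions falling inside an emotion's diagnostic region, can be sketched as follows; the region coordinates are assumed for illustration and are not taken from the study:

```python
def diagnostic_dwell(window_path, region):
    """Proportion of mouse-window samples that fall inside an
    emotion's diagnostic region, given as (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    inside = sum(1 for x, y in window_path if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / len(window_path)

# Hypothetical diagnostic regions on a 0-1 normalized face:
# eye area for fear, mouth area for happiness (assumed coordinates).
REGIONS = {"fear": (0.2, 0.2, 0.8, 0.45), "happiness": (0.3, 0.6, 0.7, 0.85)}

path = [(0.5, 0.3), (0.45, 0.35), (0.5, 0.7), (0.6, 0.75)]  # window centers
print({emo: diagnostic_dwell(path, r) for emo, r in REGIONS.items()})
```

The study's finding corresponds to a negative relationship between this dwell proportion for the displayed emotion and the reaction time of a correct response.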
Affiliation(s)
- Minhee Kim
- Department of Psychology, Duksung Women's University, Seoul, South Korea
- Youngwug Cho
- Department of Computer Science, Hanyang University, Seoul, South Korea
- So-Yeon Kim
- Department of Psychology, Duksung Women's University, Seoul, South Korea
- Correspondence: So-Yeon Kim
11. Abdulghafor R, Turaev S, Ali MAH. Body Language Analysis in Healthcare: An Overview. Healthcare (Basel) 2022; 10(7):1251. PMID: 35885777; PMCID: PMC9325107. DOI: 10.3390/healthcare10071251.
Abstract
Given the current COVID-19 pandemic, medical research today focuses on epidemic diseases. Innovative technology is incorporated in most medical applications, emphasizing the automatic recognition of physical and emotional states. Most research is concerned with the automatic identification of symptoms displayed by patients through analyzing their body language. The development of technologies for recognizing and interpreting arm and leg gestures, facial features, and body postures is still in its early stage. More extensive research is needed using artificial intelligence (AI) techniques in disease detection. This paper presents a comprehensive survey of the research performed on body language processing. Upon defining and explaining the different types of body language, we justify the use of automatic recognition and its application in healthcare. We briefly describe the automatic recognition framework using AI to recognize various body language elements and discuss automatic gesture recognition approaches that help better identify the external symptoms of epidemic and pandemic diseases. From this study, we found that prior work has established that the body has a language, body language, and that this language can be analyzed and understood by machine learning (ML). Since diseases also show clear and distinct symptoms in the body, body language is affected and exhibits features specific to a particular disease. From this examination, we discovered that it is possible to characterize the bodily features and body-language changes associated with each disease. Hence, ML can understand and detect diseases such as pandemic and epidemic diseases, among others.
Affiliation(s)
- Rawad Abdulghafor
- Department of Computer Science, Faculty of Information and Communication Technology, International Islamic University Malaysia, Kuala Lumpur 53100, Malaysia
- Correspondence: (R.A.); (S.T.); (M.A.H.A.)
- Sherzod Turaev
- Department of Computer Science and Software Engineering, College of Information Technology, United Arab Emirates University, Al-Ain, Abu Dhabi P.O. Box 15556, United Arab Emirates
- Correspondence: (R.A.); (S.T.); (M.A.H.A.)
- Mohammed A. H. Ali
- Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, Kuala Lumpur 50603, Malaysia
- Correspondence: (R.A.); (S.T.); (M.A.H.A.)
12. Rodríguez-Fuertes A, Alard-Josemaría J, Sandubete JE. Measuring the Candidates' Emotions in Political Debates Based on Facial Expression Recognition Techniques. Front Psychol 2022; 13:785453. PMID: 35615169; PMCID: PMC9126085. DOI: 10.3389/fpsyg.2022.785453.
Abstract
This article presents an analysis of the main Spanish political candidates in the elections held in April 2019. The analysis focuses on Facial Expression Analysis (FEA), a technique widely used in neuromarketing research. It identifies micro-expressions: very brief, involuntary signals of hidden emotions that cannot be controlled voluntarily. The video with the final intervention of every candidate was post-processed using the classification algorithms of iMotions' AFFDEX platform, and we then analyzed these data. First, we identified and compared the basic emotions shown by each politician. Second, we associated the basic emotions with specific moments of the candidate's speech, identifying the topics they address and relating them directly to the expressed emotion. Third, we analyzed whether the differences shown by each candidate in every emotion are statistically significant, applying the non-parametric chi-squared goodness-of-fit test; we also used ANOVA to test whether, on average, there are differences between the candidates. Finally, we checked whether the evaluations of the debate in surveys from the main media in Spain are consistent with the results of our empirical analysis. A predominance of negative emotions was observed, and some inconsistencies were found between the emotion expressed in the face and the verbal content of the message. The statistical analysis confirms that the differences observed between the candidates in the basic emotions are, on average, statistically significant.
In this sense, this article makes a methodological contribution to the analysis of public figures' communication, which could help politicians improve the effectiveness of their messages by identifying and evaluating the intensity of the expressed emotions.
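The chi-squared goodness-of-fit test mentioned above compares observed emotion frequencies against expected ones. A sketch with hypothetical per-frame counts for six basic emotions, tested against a uniform expectation; the counts are invented for illustration:

```python
import numpy as np

def chi2_goodness_of_fit(observed, expected=None):
    """Chi-squared goodness-of-fit statistic for emotion frequency counts.
    With no expected counts given, tests against a uniform distribution."""
    observed = np.asarray(observed, dtype=float)
    if expected is None:
        expected = np.full_like(observed, observed.sum() / len(observed))
    return ((observed - expected) ** 2 / expected).sum()

# Hypothetical per-frame counts of six basic emotions for one candidate.
counts = [120, 30, 45, 25, 20, 60]
stat = chi2_goodness_of_fit(counts)
df = len(counts) - 1
print(f"chi2({df}) = {stat:.1f}")  # compare with the 11.07 critical value
                                   # for df = 5 at alpha = .05
```

A statistic above the critical value indicates the candidate's emotion distribution departs reliably from uniform, the kind of per-candidate significance the article reports.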
Affiliation(s)
| | | | - Julio E Sandubete
- Complutense University of Madrid, Madrid, Spain.,CEU San Pablo University, Madrid, Spain
| |
Collapse
|
13
|
Camerini AL, Marciano L, Annoni AM, Ort A, Petrocchi S. Exploring the Emotional Experience During Instant Messaging Among Young Adults: An Experimental Study Incorporating Physiological Correlates of Arousal. Front Psychol 2022; 13:840845. [PMID: 35444584 PMCID: PMC9015695 DOI: 10.3389/fpsyg.2022.840845] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2021] [Accepted: 02/08/2022] [Indexed: 11/28/2022] Open
Abstract
Instant messaging (IM) is a highly diffused form of communication among younger populations, yet little is known about the emotional experience during IM. The present study investigated this experience by drawing on the Circumplex Model of Affect and measuring heart rate and electrodermal activity as indicators of arousal, in addition to self-reported emotional valence. Using an experimental design, we manipulated message latency (i.e., response after 1 min versus 7 min) and message valence (positive versus negative response). Based on data collected from 65 young adults (50% male; Mage = 23.28, SD = 3.75), we observed arousal as participants' electrodermal activity increased from the time a fictitious peer started typing a response to the receipt of that response, especially in the delayed condition. Electrodermal activity also increased in both the positive and the negative message conditions. No changes were observed for heart rate. Self-reported emotional valence revealed that positive messages were evaluated as more pleasant and the peer as more available, while message latency produced no difference in self-reports. These findings shed light on the emotional experience during IM, adding insights into the physiological processes underlying the anticipation of social reward, which emerged only during delayed IM exchanges, in human-computer interaction.
Affiliation(s)
- Anne-Linda Camerini
- Institute of Public Health, Università della Svizzera Italiana, Lugano, Switzerland
- Laura Marciano
- Institute of Public Health, Università della Svizzera Italiana, Lugano, Switzerland
- Anna Maria Annoni
- Institute of Public Health, Università della Svizzera Italiana, Lugano, Switzerland; Department of Business Economics, Health and Social Care, University of Applied Sciences and Arts of Southern Switzerland, Manno, Switzerland
- Alexander Ort
- Department of Health Sciences and Medicine, University of Lucerne, Lucerne, Switzerland
- Serena Petrocchi
- Faculty of Communication, Culture and Society, Università della Svizzera Italiana, Lugano, Switzerland
14
Cross MP, Acevedo AM, Leger KA, Pressman SD. How and Why Could Smiling Influence Physical Health? A Conceptual Review. Health Psychol Rev 2022; 17:321-343. [PMID: 35285408 DOI: 10.1080/17437199.2022.2052740] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Smiling has been a topic of interest to psychologists for decades, with a myriad of studies tying this behavior to well-being. Despite this, we know surprisingly little about the nature of the connections between smiling and physical health. We review the literature connecting both naturally occurring smiles and experimentally manipulated smiles to physical health and health-relevant outcomes. This work is discussed in the context of existing affect and health-relevant theoretical models that help explain the connection between smiling and physical health, including the facial feedback hypothesis, the undoing hypothesis, the generalized unsafety theory of stress, and polyvagal theory. We also describe a number of plausible pathways, some new and relatively untested, through which smiling may influence physical health, such as trait or state positive affect, social relationships, stress buffering, and the oculocardiac reflex. Finally, we discuss possible future directions, including the importance of cultural variation and replication. Although this field is still in its infancy, the findings from both naturally occurring and experimentally manipulated smile studies consistently suggest that smiling may have a number of health-relevant benefits, including beneficial effects on physiology during acute stress, improved stress recovery, and reduced illness over time.
Affiliation(s)
- Marie P Cross
- Department of Biobehavioral Health, Pennsylvania State University, University Park, PA, USA
- Amanda M Acevedo
- Department of Psychological Science, University of California, Irvine, Irvine, CA, USA
- Kate A Leger
- Department of Psychology, University of Kentucky, Lexington, KY, USA
- Sarah D Pressman
- Department of Psychological Science, University of California, Irvine, Irvine, CA, USA
15
Höfling TTA, Alpers GW, Büdenbender B, Föhl U, Gerdes ABM. What's in a face: Automatic facial coding of untrained study participants compared to standardized inventories. PLoS One 2022; 17:e0263863. [PMID: 35239654 PMCID: PMC8893617 DOI: 10.1371/journal.pone.0263863] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2020] [Accepted: 01/28/2022] [Indexed: 11/19/2022] Open
Abstract
Automatic facial coding (AFC) is a novel research tool to automatically analyze emotional facial expressions. AFC can classify emotional expressions with high accuracy in standardized picture inventories of intensively posed and prototypical expressions. However, classification of facial expressions of untrained study participants is more error prone. This discrepancy requires a direct comparison between these two sources of facial expressions. To this end, 70 untrained participants were asked to express joy, anger, surprise, sadness, disgust, and fear in a typical laboratory setting. Recorded videos were scored with a well-established AFC software (FaceReader, Noldus Information Technology) and compared with AFC measures of standardized pictures from 70 trained actors (i.e., standardized inventories). We report the probability estimates of specific emotion categories and, in addition, Action Unit (AU) profiles for each emotion. Based on this, we used a novel machine learning approach to determine the relevant AUs for each emotion, separately for both datasets. First, misclassification was more frequent for some emotions of untrained participants. Second, AU intensities were generally lower in pictures of untrained participants than in standardized pictures for all emotions. Third, although the profiles of relevant AUs overlapped substantially across the two datasets, there were also substantial differences between them. This research provides evidence that the application of AFC is not limited to standardized facial expression inventories but can also be used to code facial expressions of untrained participants in a typical laboratory setting.
Affiliation(s)
- T. Tim A. Höfling
- Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
- Georg W. Alpers
- Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
- Björn Büdenbender
- Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
- Ulrich Föhl
- Business School, Pforzheim University of Applied Sciences, Pforzheim, Germany
- Antje B. M. Gerdes
- Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
16
Romano JA, Vosper L, Kingslake JA, Dourish CT, Higgs S, Thomas JM, Raslescu A, Dawson GR. Validation of the P1vital® Faces Set for Use as Stimuli in Tests of Facial Emotion Recognition. Front Psychiatry 2022; 13:663763. [PMID: 35222109 PMCID: PMC8874121 DOI: 10.3389/fpsyt.2022.663763] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/03/2021] [Accepted: 01/19/2022] [Indexed: 11/13/2022] Open
Abstract
BACKGROUND Negative bias in facial emotion recognition is a well-established concept in mental disorders such as depression. However, existing face sets in emotion recognition tests may be of limited use in international research, which could benefit from more contemporary and diverse alternatives. Here, we developed and provide initial validation for the P1vital® Affective Faces set (PAFs) as a contemporary alternative to the widely used Pictures of Facial Affect (PoFA). METHODS The PAFs was constructed of 133 color photographs of facial expressions of ethnically diverse trained actors and compared with the PoFA, comprising 110 black-and-white photographs of facial expressions of generally Caucasian actors. Sixty-one recruits were asked to classify faces from both sets across six emotions (happy, sad, fear, anger, disgust, surprise) varying in intensity in 10% increments from 0 to 100%. RESULTS Participants were significantly more accurate in identifying the correct emotions when viewing faces from the PAFs. In both sets, participants identified happy faces more accurately than fearful faces, were least likely to misclassify facial expressions as happy, and were most likely to misclassify all emotions at low intensity as neutral. Accuracy in identifying facial expressions improved with increasing emotion intensity for both sets, peaking at 60% and 80% intensity for the PAFs and PoFA, respectively. The study was limited by the small sample size and narrow age range of the participants and by the limited ethnic diversity of the actors. CONCLUSIONS The PAFs successfully depicted a range of emotional expressions with improved performance over the PoFA and may be used as a contemporary set in facial expression recognition tests.
Affiliation(s)
- Suzanne Higgs
- School of Psychology, University of Birmingham, Birmingham, United Kingdom
- Jason M Thomas
- Department of Psychology, Aston University, Birmingham, United Kingdom
17
Khomchenkova A, Prokopenko S, Gurevich V, Peresunko P. Diagnosis of hypomimia in Parkinson’s disease. Zh Nevrol Psikhiatr Im S S Korsakova 2022; 122:24-29. [DOI: 10.17116/jnevro202212211224] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
18
Escelsior A, Amadeo MB, Esposito D, Rosina A, Trabucco A, Inuggi A, Pereira da Silva B, Serafini G, Gori M, Amore M. COVID-19 and psychiatric disorders: The impact of face masks in emotion recognition face masks and emotion recognition in psychiatry. Front Psychiatry 2022; 13:932791. [PMID: 36238943 PMCID: PMC9551300 DOI: 10.3389/fpsyt.2022.932791] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/30/2022] [Accepted: 08/26/2022] [Indexed: 11/13/2022] Open
Abstract
Since the outbreak of the COVID-19 pandemic, reading facial expressions has become more complex due to face masks covering the lower part of people's faces. A history of psychiatric illness has been associated with higher rates of complications, hospitalization, and mortality due to COVID-19. Psychiatric patients have well-documented difficulties reading emotions from facial expressions; accordingly, this study assesses how face masks, such as those worn to prevent COVID-19 transmission, impact the emotion recognition skills of patients with psychiatric disorders. To this end, the current study asked patients with bipolar disorder, major depressive disorder, schizophrenia, and healthy individuals to identify facial emotions in face images with and without masks. Results demonstrate that the emotion recognition skills of all participants were negatively influenced by face masks. Moreover, the main insight of the study is that the impairment was most pronounced when patients with major depressive disorder and schizophrenia had to identify happiness at a low intensity level. These findings have important implications for satisfactory social relationships and well-being. If emotions with positive valence are poorly recognized by certain psychiatric patients, doctor-patient interactions in public primary care require even greater attention.
Affiliation(s)
- Andrea Escelsior
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Maria Bianca Amadeo
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
- Davide Esposito
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
- Anna Rosina
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Alice Trabucco
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy
- Alberto Inuggi
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Beatriz Pereira da Silva
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy; U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
- Gianluca Serafini
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Monica Gori
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
- Mario Amore
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
19
Shuster A, Inzelberg L, Ossmy O, Izakson L, Hanein Y, Levy DJ. Lie to my face: An electromyography approach to the study of deceptive behavior. Brain Behav 2021; 11:e2386. [PMID: 34677007 PMCID: PMC8671780 DOI: 10.1002/brb3.2386] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/19/2021] [Revised: 08/25/2021] [Accepted: 09/06/2021] [Indexed: 11/09/2022] Open
Abstract
BACKGROUND Deception is present in all walks of life, from social interactions to matters of homeland security. Nevertheless, reliable indicators of deceptive behavior in real-life scenarios remain elusive. METHODS By integrating electrophysiological and communicative approaches, we demonstrate a new and objective detection approach to identify participant-specific indicators of deceptive behavior in an interactive scenario of a two-person deception task. We recorded participants' facial muscle activity using novel dry screen-printed electrode arrays and applied machine-learning algorithms to identify lies based on brief facial responses. RESULTS With an average accuracy of 73%, we identified two groups of participants: Those who revealed their lies by activating their cheek muscles and those who activated their eyebrows. We found that the participants lied more often with time, with some switching their telltale muscle groups. Moreover, while the automated classifier, reported here, outperformed untrained human detectors, their performance was correlated, suggesting reliance on shared features. CONCLUSIONS Our findings demonstrate the feasibility of using wearable electrode arrays in detecting human lies in a social setting and set the stage for future research on individual differences in deception expression.
Affiliation(s)
- Anastasia Shuster
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel; Coller School of Management, Tel Aviv University, Tel Aviv, Israel
- Lilah Inzelberg
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel; Center for Nanoscience and Nanotechnology, Tel Aviv University, Tel Aviv, Israel
- Ori Ossmy
- Department of Psychology and Center of Neural Science, New York University, New York City, New York, USA
- Liz Izakson
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel; Coller School of Management, Tel Aviv University, Tel Aviv, Israel
- Yael Hanein
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel; Center for Nanoscience and Nanotechnology, Tel Aviv University, Tel Aviv, Israel; School of Electrical Engineering, Tel Aviv University, Tel Aviv, Israel
- Dino J Levy
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel; Coller School of Management, Tel Aviv University, Tel Aviv, Israel
20
Koundourou C, Ioannou M, Stephanou C, Paparistodemou M, Katsigari T, Tsitsas G, Sotiropoulou K. Emotional Well-Being and Traditional Cypriot Easter Games: A Qualitative Analysis. Front Psychol 2021; 12:613173. [PMID: 34630192 PMCID: PMC8499803 DOI: 10.3389/fpsyg.2021.613173] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2020] [Accepted: 08/09/2021] [Indexed: 11/13/2022] Open
Abstract
The aim of the current study is to examine the effect of the Traditional Easter Games of Cyprus on the emotional well-being of the participants. Data were collected qualitatively, through interviews with 51 participants aged 32-93 years and through observations of audiovisual material of the Traditional Cypriot Easter Games played by a sample of 20 children aged 6-14 years and 43 adults aged 18-65 years. Demographic data were collected in the interviews and analyzed using the IBM SPSS program. The observations of the audiovisual material focused on the participants' emotions, which were grouped into prevailing and secondary emotions according to frequency and duration. The results indicate that the games produce emotions such as joy, excitement, and euphoria. Emotions such as embarrassment, frustration, and anger were also observed occasionally, specifically in situations of competitiveness and defeat. In addition, the differences and similarities between adults and children were recorded. The findings of the present study extend previous work by demonstrating the positive impact of traditional games on children's and adults' emotional well-being.
Affiliation(s)
- Christiana Koundourou
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
- Markella Ioannou
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
- Chara Stephanou
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
- Maria Paparistodemou
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
- Theodora Katsigari
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
- Georgios Tsitsas
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
- Kyriaki Sotiropoulou
- Department of Psychology, School of Health Sciences, Neapolis University Pafos, Paphos, Cyprus
21
Long F, Zhao S, Wei X, Ng SC, Ni X, Chi A, Fang P, Zeng W, Wei B. Positive and Negative Emotion Classification Based on Multi-channel. Front Behav Neurosci 2021; 15:720451. [PMID: 34512288 PMCID: PMC8428531 DOI: 10.3389/fnbeh.2021.720451] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2021] [Accepted: 07/29/2021] [Indexed: 11/13/2022] Open
Abstract
In this study, EEG features of different emotions were extracted from multi-channel and forehead-channel recordings. EEG signals from 26 subjects were collected using an emotional video evocation method. The results show that the frequency-band energy ratio and differential entropy can effectively classify positive and negative emotions, with the best results obtained using an SVM classifier. When only the forehead signals are used, classification accuracy reaches up to 66%; when data from all channels are used, accuracy reaches up to 82%. After channel selection, the best model of this study achieves an accuracy above 86%.
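The SVM classification of band features described in this abstract can be sketched as follows. This is an illustrative sketch only: the features here are synthetic stand-ins with an artificial class shift, not the study's EEG recordings, and the feature count is a made-up placeholder:

```python
# Illustrative sketch (synthetic data, not the study's EEG): an SVM trained
# on per-trial band features (e.g., band energy ratio and differential
# entropy per channel) to classify positive vs. negative emotion.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 120, 10            # hypothetical sizes
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)     # 0 = negative, 1 = positive
X[y == 1] += 0.8                          # make the synthetic classes separable

# Standardize features, then fit an RBF-kernel SVM; report cross-validated accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy, as used here, is the natural analogue of the per-channel-subset accuracies (66%, 82%, 86%) the abstract reports.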
Affiliation(s)
- Fangfang Long
- Department of Psychology, Nanjing University, Nanjing, China
- Shanguang Zhao
- Centre for Sport and Exercise Sciences, University of Malaya, Kuala Lumpur, Malaysia
- Xin Wei
- Institute of Social Psychology, School of Humanities and Social Sciences, Xi'an Jiaotong University, Xi'an, China; Key & Core Technology Innovation Institute of the Greater Bay Area, Guangdong, China
- Siew-Cheok Ng
- Faculty of Engineering, University of Malaya, Kuala Lumpur, Malaysia
- Xiaoli Ni
- Institute of Social Psychology, School of Humanities and Social Sciences, Xi'an Jiaotong University, Xi'an, China
- Aiping Chi
- School of Sports, Shaanxi Normal University, Xi'an, China
- Peng Fang
- Department of the Psychology of Military Medicine, Air Force Medical University, Xi'an, China
- Weigang Zeng
- Key & Core Technology Innovation Institute of the Greater Bay Area, Guangdong, China
- Bokun Wei
- Xi'an Middle School of Shaanxi Province, Xi'an, China
22
Schumann NP, Bongers K, Scholle HC, Guntinas-Lichius O. Atlas of voluntary facial muscle activation: Visualization of surface electromyographic activities of facial muscles during mimic exercises. PLoS One 2021; 16:e0254932. [PMID: 34280246 PMCID: PMC8289121 DOI: 10.1371/journal.pone.0254932] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2021] [Accepted: 07/06/2021] [Indexed: 12/29/2022] Open
Abstract
Complex facial muscle movements are essential for many motoric and emotional functions. Facial muscles are unique in the musculoskeletal system as they are interwoven, so that the contraction of one muscle influences the contractility characteristics of other mimic muscles. The facial muscles act more as a whole than as single facial muscle movements. The standard for clinical and psychosocial experiments to detect these complex interactions is surface electromyography (sEMG). What is missing is an atlas showing which facial muscles are activated during specific tasks. Based on high-resolution sEMG data of 10 facial muscles of both sides of the face, recorded simultaneously during 29 different facial muscle tasks, an atlas visualizing voluntary facial muscle activation was developed. For each task, the mean normalized EMG amplitudes of the examined facial muscles were visualized by colors spread between the lowest and highest EMG activity. Gray shades represent no to very low EMG activity, light and dark brown shades represent low to medium EMG activity, and red shades represent high to very high EMG activity relative to each task. The present atlas should become a helpful tool for designing sEMG experiments, not only for clinical trials and psychological experiments but also for speech therapy and orofacial rehabilitation studies.
Affiliation(s)
- Nikolaus P. Schumann
- Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Kevin Bongers
- Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Hans C. Scholle
- Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Orlando Guntinas-Lichius
- Department of Otolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
23
Burr DA, Pizzie RG, Kraemer DJM. Anxiety, not regulation tendency, predicts how individuals regulate in the laboratory: An exploratory comparison of self-report and psychophysiology. PLoS One 2021; 16:e0247246. [PMID: 33711022 PMCID: PMC7954312 DOI: 10.1371/journal.pone.0247246] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2020] [Accepted: 02/03/2021] [Indexed: 11/19/2022] Open
Abstract
Anxiety influences how individuals experience and regulate emotions in a variety of ways. For example, individuals with lower anxiety tend to cognitively reframe (reappraise) negative emotion and those with higher anxiety tend to suppress negative emotion. Research has also investigated these individual differences with psychophysiology. These lines of research assume coherence between how individuals regulate outside the laboratory, typically measured with self-report, and how they regulate during an experiment. Indeed, performance during experiments is interpreted as an indication of future behavior outside the laboratory, yet this relationship is seldom directly explored. To address this gap, we computed psychophysiological profiles of uninstructed (natural) regulation in the laboratory and explored the coherence between these profiles and a) self-reported anxiety and b) self-reported regulation tendency. Participants viewed negative images and were instructed to reappraise, suppress or naturally engage. Electrodermal and facial electromyography signals were recorded to compute a multivariate psychophysiological profile of regulation. Participants with lower anxiety exhibited similar profiles when naturally regulating and following instructions to reappraise, suggesting they naturally reappraised more. Participants with higher anxiety exhibited similar profiles when naturally regulating and following instructions to suppress, suggesting they naturally suppressed more. However, there was no association between self-reported reappraisal or suppression tendency and psychophysiology. These exploratory results indicate that anxiety, but not regulation tendency, predicts how individuals regulate emotion in the laboratory. These findings suggest that how individuals report regulating in the real world does not map on to how they regulate in the laboratory. Taken together, this underscores the importance of developing emotion-regulation interventions and paradigms that more closely align to and predict real-world outcomes.
Affiliation(s)
- Daisy A. Burr
- Department of Psychology and Neuroscience, Duke University, Durham, NC, United States of America
- Rachel G. Pizzie
- Program in Educational Neuroscience, Gallaudet University, Washington, D.C., United States of America
- David J. M. Kraemer
- Department of Education and Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH, United States of America
24
Facial features and head movements obtained with a webcam correlate with performance deterioration during prolonged wakefulness. Atten Percept Psychophys 2020; 83:525-540. [PMID: 33205369 DOI: 10.3758/s13414-020-02199-5] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/01/2020] [Indexed: 01/19/2023]
Abstract
We have performed a direct comparison between facial features obtained from a webcam and vigilance-task performance during prolonged wakefulness. Prolonged wakefulness deteriorates working performance through changes in cognition, emotion, and delayed responses. Facial features can potentially be collected everywhere using webcams located in the workplace. If this type of device can obtain information relevant to predicting performance deterioration, the technology could reduce serious accidents and fatalities. We extracted 34 facial indices, including head movements, facial expressions, and perceived facial emotions, from 20 participants undergoing the psychomotor vigilance task (PVT) over 25 hours. We studied the correlation between facial indices and performance indices derived from the PVT, and evaluated the feasibility of facial indices as detectors of diminished reaction time during the PVT. Furthermore, we tested the feasibility of classifying performance as normal or impaired using several machine learning algorithms with correlated facial indices. Twenty-one indices were found to be significantly correlated with PVT indices. Pitch, from the head movement indices, and four perceived facial emotions (anger, surprise, sadness, and disgust) exhibited significant correlations with indices of performance. The eye-related facial expression indices showed especially strong correlations and the highest feasibility as classifiers. The significantly correlated indices explained more variance than the other indices for most of the classifiers. The facial indices obtained from a webcam strongly correlate with working performance during 25 hours of prolonged wakefulness.
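The index-performance links reported above are bivariate correlations between a facial index and PVT-derived performance. A minimal sketch on fabricated numbers (a hypothetical eye-openness index and mean reaction times drifting with hours awake, not the paper's data):

```python
# Illustrative sketch (fabricated data, not the paper's): Pearson correlation
# between a hypothetical webcam-derived facial index and mean PVT reaction
# time across a 25-hour wakefulness session.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
hours_awake = np.arange(0, 25, 1.0)
# Hypothetical trends: eye openness drifts down and reaction time drifts up
# with accumulating fatigue, plus measurement noise.
eye_openness = 1.0 - 0.02 * hours_awake + rng.normal(0, 0.03, hours_awake.size)
reaction_ms = 250 + 4.0 * hours_awake + rng.normal(0, 10, hours_awake.size)

r, p = pearsonr(eye_openness, reaction_ms)
print(f"r = {r:.2f}, p = {p:.4g}")
```

A strongly negative r of this kind is what the abstract's "especially strong correlation" for eye-related indices would look like at the session level.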
25. The Study of Facial Muscle Movements for Non-Invasive Thermal Discomfort Detection via Bio-Sensing Technology. Part I: Development of the Experimental Design and Description of the Collected Data. Appl Sci (Basel) 2020. DOI: 10.3390/app10207315.
Abstract
In a time of climate change, as heat waves become more regular, indoor thermal comfort is an important factor in day-to-day life. Many researchers have therefore focused on finding an effective solution that not only enables thermal comfort but also increases satisfaction with the indoor environment and, as a result, productivity. The rapid development of the biometric field motivated this study, which investigates how biomarkers, in combination with artificial intelligence algorithms, can be collected in an experimental setting to create a new approach to non-invasive thermal discomfort detection. The developed experimental design combines automatic facial coding, pulse, and galvanic skin response measurements via the iMotions software in a controlled environment. The iMotions software has built-in machine vision algorithms and, with Shimmer sensors and post-processing through Affectiva AFFDEX, collects facial action data by detecting facial muscle movements alongside various biomarkers. The Zero Emission Building (ZEB) Test Cell laboratory at NTNU in Trondheim was used as the controlled environment, transformed to imitate an office space for the data collection campaign. This design makes it possible to build a large database of biomarkers linked to the subcortical level of the brain, indoor parameters, and direct feedback on occupants' comfort level in an office-like environment. In total, 111 data collection sessions were registered with iMotions. The discomfort button was pressed 240 times, and 1080 planned indoor comfort evaluation surveys were completed during the experiment. The button was pressed 49 times to indicate discomfort due to low temperature and 52 times due to high temperature.
The collected data revealed large deviations in participants' discomfort temperature values with respect to the applied temperature ramps. While it is common to use the same predefined temperature range for facility management, it became clear that the task is more complex and should not be approached at a human computational level. Implementing AI can potentially provide higher accuracy in thermal discomfort detection and enable a uniquely personal user experience in the workplace.
26. Facial-expression recognition: An emergent approach to the measurement of tourist satisfaction through emotions. Telematics and Informatics 2020. DOI: 10.1016/j.tele.2020.101404.
27. Mieronkoski R, Syrjälä E, Jiang M, Rahmani A, Pahikkala T, Liljeberg P, Salanterä S. Developing a pain intensity prediction model using facial expression: A feasibility study with electromyography. PLoS One 2020; 15:e0235545. PMID: 32645045; PMCID: PMC7347182; DOI: 10.1371/journal.pone.0235545.
Abstract
Automatic detection of facial expressions of pain is needed to ensure accurate pain assessment of patients who are unable to self-report pain. To address the challenges facing automatic systems that determine pain levels from facial expressions in clinical patient monitoring, a surface electromyography method was tested for feasibility in healthy volunteers. In the current study, two types of experimental, gradually increasing pain stimuli were induced in the thirty-one healthy volunteers who took part. We used surface electromyography to measure the activity of five facial muscles and detect facial expressions during pain induction. Statistical tests were used to analyze the continuous electromyography data, and supervised machine learning was applied to build a pain intensity prediction model. Muscle activation of the corrugator supercilii was most strongly associated with self-reported pain, and the levator labii superioris and orbicularis oculi showed a statistically significant increase in activation when the pain stimulus reached subjects' self-reported pain thresholds. The two features most strongly associated with pain, the waveform lengths of the corrugator supercilii and the levator labii superioris, were selected for a prediction model, which achieved a c-index of 0.64. The most detectable differences in muscle activity during the pain experience were connected to eyebrow lowering, nose wrinkling, and upper-lip raising. As the performance of the prediction model remains modest, albeit with statistically significant ordinal classification, we suggest testing with a larger sample size to further explore the variables that affect expressiveness and subjective pain experience.
Affiliation(s)
- Elise Syrjälä: Department of Future Technologies, University of Turku, Turku, Finland
- Mingzhe Jiang: Department of Future Technologies, University of Turku, Turku, Finland
- Amir Rahmani: Department of Computer Science, University of California, Irvine, California, United States of America; School of Nursing, University of California, Irvine, California, United States of America
- Tapio Pahikkala: Department of Future Technologies, University of Turku, Turku, Finland
- Pasi Liljeberg: Department of Future Technologies, University of Turku, Turku, Finland
- Sanna Salanterä: Department of Nursing Science, University of Turku, Turku, Finland; Turku University Hospital, Turku, Finland
28. Höfling TTA, Gerdes ABM, Föhl U, Alpers GW. Read My Face: Automatic Facial Coding Versus Psychophysiological Indicators of Emotional Valence and Arousal. Front Psychol 2020; 11:1388. PMID: 32636788; PMCID: PMC7316962; DOI: 10.3389/fpsyg.2020.01388.
Abstract
Facial expressions provide insight into a person's emotional experience. Automatically decoding these expressions has been made possible by tremendous progress in computer vision: researchers can now decode emotional facial expressions with impressive accuracy in standardized images of prototypical basic emotions. We tested the sensitivity of a well-established automatic facial coding software program in detecting spontaneous emotional reactions of individuals responding to emotional pictures. We compared the automatically generated valence and arousal scores of FaceReader (FR; Noldus Information Technology) with the current psychophysiological gold standards for measuring emotional valence (facial electromyography, EMG) and arousal (skin conductance, SC). We recorded physiological and behavioral measurements of 43 healthy participants while they looked at pleasant, unpleasant, or neutral scenes. For pleasant pictures, FR valence and EMG were comparably sensitive. For unpleasant pictures, however, FR valence showed the expected negative shift but did not differentiate well between responses to neutral and unpleasant stimuli, which were distinguishable with EMG. Furthermore, FR arousal values correlated more strongly with self-reported valence than with self-reported arousal, whereas SC was sensitive to, and specifically associated with, self-reported arousal. This is the first study to systematically compare FR measurements of spontaneous emotional reactions to standardized emotional images against established psychophysiological measurement tools. The technology has not yet surpassed the sensitivity of established psychophysiological measures, but it provides a promising new technique for non-contact assessment of emotional responses.
Affiliation(s)
- T. Tim A. Höfling: Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
- Antje B. M. Gerdes: Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
- Ulrich Föhl: Business Unit, Pforzheim University of Applied Sciences, Pforzheim, Germany
- Georg W. Alpers: Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
29. Association Between Hypomimia and Mild Cognitive Impairment in De Novo Parkinson's Disease Patients. Can J Neurol Sci 2020; 47:855-857. PMID: 32406363; DOI: 10.1017/cjn.2020.93.
30. Flynn M, Effraimidis D, Angelopoulou A, Kapetanios E, Williams D, Hemanth J, Towell T. Assessing the Effectiveness of Automated Emotion Recognition in Adults and Children for Clinical Investigation. Front Hum Neurosci 2020; 14:70. PMID: 32317947; PMCID: PMC7156005; DOI: 10.3389/fnhum.2020.00070.
Abstract
Recent success stories in automated object and face recognition, partly fuelled by deep learning artificial neural network (ANN) architectures, have led to the advancement of biometric research platforms and, to some extent, the resurrection of Artificial Intelligence (AI). In line with this general trend, interdisciplinary approaches have been taken to automate the recognition of emotions in adults and children for the benefit of various applications, such as identifying children's emotions prior to a clinical investigation. Within this context, automating emotion recognition turns out to be far from straightforward, with several challenges arising for both science (e.g., methodology underpinned by psychology) and technology (e.g., the iMotions biometric research platform). In this paper, we present a methodology, an experiment, and some interesting findings, which raise the following research questions for the recognition of emotions and attention in humans: (a) the adequacy of well-established techniques such as the International Affective Picture System (IAPS), (b) the adequacy of state-of-the-art biometric research platforms, and (c) the extent to which emotional responses may differ between children and adults. Our findings and first attempts to answer these questions are based on a mixed sample of adults and children who took part in the experiment, resulting in a statistical analysis of numerous variables related to both automatically and interactively captured responses of participants to a sample of IAPS pictures.
Affiliation(s)
- Maria Flynn: School of Social Sciences, University of Westminster, London, United Kingdom
- Dimitris Effraimidis: School of Computer Science and Engineering, University of Westminster, London, United Kingdom
- Anastassia Angelopoulou: School of Computer Science and Engineering, University of Westminster, London, United Kingdom
- Epaminondas Kapetanios: School of Computer Science and Engineering, University of Westminster, London, United Kingdom
- David Williams: School of Social Sciences, University of Westminster, London, United Kingdom
- Jude Hemanth: ECE Department, Karunya Institute of Technology and Sciences, Coimbatore, India
- Tony Towell: School of Social Sciences, University of Westminster, London, United Kingdom
31.

32. Cheong JH, Brooks S, Chang LJ. FaceSync: Open source framework for recording facial expressions with head-mounted cameras. F1000Res 2019; 8:702. PMID: 32185017; PMCID: PMC7059847; DOI: 10.12688/f1000research.18187.1.
Abstract
Advances in computer vision and machine learning algorithms have enabled researchers to extract facial expression data from face video recordings with greater ease and speed than standard manual coding methods, which has led to a dramatic increase in the pace of facial expression research. However, there are many limitations in recording facial expressions in laboratory settings. Conventional video recording setups using webcams, tripod-mounted cameras, or pan-tilt-zoom cameras require making compromises between cost, reliability, and flexibility. As an alternative, we propose the use of a mobile head-mounted camera that can be easily constructed from our open-source instructions and blueprints at a fraction of the cost of conventional setups. The head-mounted camera framework is supported by the open source Python toolbox FaceSync, which provides an automated method for synchronizing videos. We provide four proof-of-concept studies demonstrating the benefits of this recording system in reliably measuring and analyzing facial expressions in diverse experimental setups, including group interaction experiments.
Affiliation(s)
- Jin Hyun Cheong: Psychological and Brain Sciences, Dartmouth College, Hanover, NH, 03755, USA
- Sawyer Brooks: Department of Neuroscience, Oberlin College, Oberlin, Ohio, 44074, USA
- Luke J. Chang: Psychological and Brain Sciences, Dartmouth College, Hanover, NH, 03755, USA
33. Automatic Recognition of Posed Facial Expression of Emotion in Individuals with Autism Spectrum Disorder. J Autism Dev Disord 2019; 49:279-293. PMID: 30298462; DOI: 10.1007/s10803-018-3757-9.
Abstract
Facial expression is impaired in autism spectrum disorder (ASD) but has rarely been systematically studied. We focus on the ability of individuals with ASD to produce facial expressions of emotion in response to a verbal prompt. We used the Janssen Autism Knowledge Engine (JAKE®), including automated facial expression analysis software (FACET), to measure facial expressions in individuals with ASD (n = 144) and a typically developing (TD) comparison group (n = 41). Differences in the ability to produce facial expressions were observed between the ASD and TD groups, demonstrated by activation of facial action units for happy, scared, surprised, and disgusted, but not angry or sad, expressions. Activation of facial action units correlated with parent-reported social communication skills. This approach has potential for diagnostic and response-to-intervention measures. Trial registration: NCT02299700.
34.
Abstract
The goal of this study was to validate AFFDEX and FACET, two algorithms that classify emotions from facial expressions, within the iMotions software suite. In Study 1, pictures of standardized emotional facial expressions from three databases, the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP), the Amsterdam Dynamic Facial Expression Set (ADFES), and the Radboud Faces Database (RaFD), were classified with both modules. Accuracy (matching scores) was computed to assess and compare classification quality. Results show large variance in accuracy across emotions and databases, with a performance advantage for FACET over AFFDEX. In Study 2, the facial expressions of 110 participants were measured while they were exposed to emotionally evocative pictures from the International Affective Picture System (IAPS), the Geneva Affective Picture Database (GAPED), and the Radboud Faces Database (RaFD). Accuracy again differed across emotions, and FACET performed better. Overall, iMotions can achieve acceptable accuracy for standardized pictures of prototypical (vs. natural) facial expressions but performs worse for more natural facial expressions. We discuss potential sources of limited validity and suggest research directions in the broader context of emotion research.
35. Armbruster D, Grage T, Kirschbaum C, Strobel A. Processing emotions: Effects of menstrual cycle phase and premenstrual symptoms on the startle reflex, facial EMG and heart rate. Behav Brain Res 2018; 351:178-187. DOI: 10.1016/j.bbr.2018.05.030.
36. Hildebrandt T, Schulz K, Fleysher L, Griffen T, Heywood A, Sysko R. Development of a methodology to combine fMRI and EMG to measure emotional responses in patients with anorexia nervosa. Int J Eat Disord 2018; 51:722-729. PMID: 30120839; PMCID: PMC8720298; DOI: 10.1002/eat.22893.
Abstract
OBJECTIVE Individuals with eating disorders are theorized to have basic impairments in affective appraisal and social-emotional processing that contribute to the pathogenesis of the disease. We aimed to determine whether facial electromyography could be used to discriminate between happy and disgust emotions during simultaneous acquisition of an fMRI BOLD sequence, in an effort to establish a novel tool for investigating emotion-driven hypotheses about eating pathology. In line with standards for rigor and reproducibility, we provide detailed protocols and code for each step of this project. METHOD Sixteen adolescents with low-weight eating disorders viewed emotional faces (happy or disgust) and were asked to mimic the facial expression during simultaneous BOLD and EMG (corrugator supercilii, levator labii, zygomaticus major) acquisition. Trials were repeated with the scanner off and again with the scanner on (i.e., fatigue). RESULTS The levator and zygomaticus activation patterns successfully discriminated between disgust and happy faces. The pattern held between the scanner-on and scanner-off conditions, but muscle activation was attenuated in the fatigue condition, especially for the zygomaticus. DISCUSSION Simultaneous fMRI-EMG is a new tool capable of discriminating specific emotions based on muscle activation patterns and can be leveraged to test emotion-driven hypotheses about clinical populations characterized by difficulty labeling or processing emotions.
Affiliation(s)
- Tom Hildebrandt: Eating and Weight Disorders Program, Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Kurt Schulz: Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Lazar Fleysher: Department of Radiology, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Trevor Griffen: Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Ashley Heywood: Eating and Weight Disorders Program, Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Robyn Sysko: Eating and Weight Disorders Program, Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
37. Inzelberg L, Rand D, Steinberg S, David-Pur M, Hanein Y. A Wearable High-Resolution Facial Electromyography for Long Term Recordings in Freely Behaving Humans. Sci Rep 2018; 8:2058. PMID: 29391503; PMCID: PMC5794977; DOI: 10.1038/s41598-018-20567-y.
Abstract
Human facial expressions are a complex capacity carrying important psychological and neurological information. Facial expressions typically involve the co-activation of several muscles; they vary between individuals and between voluntary and spontaneous expressions, and they depend strongly on personal interpretation. Accordingly, while high-resolution recording of muscle activation in a non-laboratory setting offers exciting opportunities, it remains a major challenge. This paper describes a wearable, non-invasive method for objective mapping of facial muscle activation and demonstrates its application in a natural setting. We focus on muscle activation associated with "enjoyment", "social", and "masked" smiles, three categories with distinct social meanings. We use an innovative dry, soft electrode array designed specifically for facial surface electromyography recording, a customized independent component analysis algorithm, and a short training procedure to achieve the desired mapping. First, identification of the orbicularis oculi and the levator labii superioris was demonstrated from voluntary expressions. Second, the zygomaticus major was identified from voluntary and spontaneous Duchenne and non-Duchenne smiles. Finally, using a wireless device in an unmodified work environment revealed expressions of diverse emotions in face-to-face interaction. Our high-resolution, crosstalk-free mapping, along with excellent user convenience, opens new opportunities in gaming, virtual reality, biofeedback, and objective psychological and neurological assessment.
Affiliation(s)
- Lilah Inzelberg: Tel Aviv University Center for Nanoscience and Nanotechnology, Tel Aviv University, Tel Aviv, Israel; Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- David Rand: Tel Aviv University Center for Nanoscience and Nanotechnology, Tel Aviv University, Tel Aviv, Israel
- Moshe David-Pur: Tel Aviv University Center for Nanoscience and Nanotechnology, Tel Aviv University, Tel Aviv, Israel
- Yael Hanein: Tel Aviv University Center for Nanoscience and Nanotechnology, Tel Aviv University, Tel Aviv, Israel; School of Electrical Engineering, Tel Aviv University, Tel Aviv, Israel; Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
38. Ganster DC, Crain TL, Brossoit RM. Physiological Measurement in the Organizational Sciences: A Review and Recommendations for Future Use. Annu Rev Organ Psychol Organ Behav 2018. DOI: 10.1146/annurev-orgpsych-032117-104613.
Affiliation(s)
- Daniel C. Ganster: Department of Management, College of Business, Colorado State University, Fort Collins, Colorado 80523, USA
- Tori L. Crain: Department of Psychology, College of Natural Sciences, Colorado State University, Fort Collins, Colorado 80523, USA
- Rebecca M. Brossoit: Department of Psychology, College of Natural Sciences, Colorado State University, Fort Collins, Colorado 80523, USA