1
Kucukdereli H, Amsalem O, Pottala T, Lim M, Potgieter L, Hasbrouck A, Lutas A, Andermann ML. Repeated stress triggers seeking of a starvation-like state in anxiety-prone female mice. Neuron 2024:S0896-6273(24)00234-4. PMID: 38642553. DOI: 10.1016/j.neuron.2024.03.027.
Abstract
Elevated anxiety often precedes anorexia nervosa and persists after weight restoration. Patients with anorexia nervosa often describe self-starvation as pleasant, potentially because food restriction can be anxiolytic. Here, we tested whether repeated stress can cause animals to prefer a starvation-like state. We developed a virtual reality place preference paradigm in which head-fixed mice can voluntarily seek a starvation-like state induced by optogenetic stimulation of hypothalamic agouti-related peptide (AgRP) neurons. Prior to stress exposure, males but not females showed a mild aversion to AgRP stimulation. Strikingly, following multiple days of stress, a subset of females developed a strong preference for AgRP stimulation that was predicted by high baseline anxiety. Such stress-induced changes in preference were reflected in changes in facial expressions during AgRP stimulation. Our study suggests that stress may cause females predisposed to anxiety to seek a starvation state and provides a powerful experimental framework for investigating the underlying neural mechanisms.
Affiliation(s)
- Hakan Kucukdereli, Oren Amsalem, Trent Pottala, Michelle Lim, Leilani Potgieter, Amanda Hasbrouck, Andrew Lutas: Division of Endocrinology, Diabetes and Metabolism, Department of Medicine, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA 02215, USA
- Mark L Andermann: Division of Endocrinology, Diabetes and Metabolism, Department of Medicine, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA 02215, USA; Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
2
Cheng X, Wang S, Wei H, Sun X, Xin L, Li L, Li C, Wang Z. Application of Stereo Digital Image Correlation on Facial Expressions Sensing. Sensors (Basel) 2024; 24:2450. PMID: 38676067. PMCID: PMC11054127. DOI: 10.3390/s24082450.
Abstract
Facial expression is an important way to reflect human emotions and it represents a dynamic deformation process. Analyzing facial movements is an effective means of understanding expressions. However, there is currently a lack of methods capable of analyzing the dynamic details of full-field deformation in expressions. In this paper, in order to enable effective dynamic analysis of expressions, a classic optical measuring method called stereo digital image correlation (stereo-DIC or 3D-DIC) is employed to analyze the deformation fields of facial expressions. The forming processes of six basic facial expressions of certain experimental subjects are analyzed through the displacement and strain fields calculated by 3D-DIC. The displacement fields of each expression exhibit strong consistency with the action units (AUs) defined by the classical Facial Action Coding System (FACS). Moreover, it is shown that the gradient of the displacement, i.e., the strain fields, offers special advantages in characterizing facial expressions due to their localized nature, effectively sensing the nuanced dynamics of facial movements. By processing extensive data, this study demonstrates two featured regions in six basic expressions, one where deformation begins and the other where deformation is most severe. Based on these two regions, the temporal evolutions of the six basic expressions are discussed. The presented investigations demonstrate the superior performance of 3D-DIC in the quantitative analysis of facial expressions. The proposed analytical strategy might have potential value in objectively characterizing human expressions based on quantitative measurement.
Affiliation(s)
- Xuanshi Cheng, Shibin Wang, Xin Sun, Lipan Xin, Linan Li, Chuanwei Li, Zhiyong Wang: School of Mechanical Engineering, Tianjin University, Tianjin 300350, China
- Huixin Wei: School of Civil Engineering and Architecture, Nanchang University, Nanchang 330000, China
3
Montalti M, Mirabella G. Investigating the impact of surgical masks on behavioral reactions to facial emotions in the COVID-19 era. Front Psychol 2024; 15:1359075. PMID: 38638526. PMCID: PMC11025472. DOI: 10.3389/fpsyg.2024.1359075.
Abstract
Introduction: The widespread use of surgical masks during the COVID-19 pandemic has posed challenges in interpreting facial emotions. As the mouth is known to play a crucial role in decoding emotional expressions, covering it is likely to affect this process. Recent evidence suggests that facial expressions impact behavioral responses only when their emotional content is relevant to subjects' goals. Thus, this study investigates whether and how masked emotional faces alter this phenomenon.
Methods: Forty participants completed two reaching versions of the Go/No-go task in a counterbalanced fashion. In the Emotional Discrimination Task (EDT), participants were required to respond to angry, fearful, or happy expressions by performing a reaching movement and to withhold it when a neutral face was presented. In the Gender Discrimination Task (GDT), the same images were shown, but participants had to respond according to the poser's gender. The face stimuli were presented in two conditions: covered by a surgical mask (masked) or without any covering (unmasked).
Results: Consistent with previous studies, valence influenced behavioral control in the EDT but not in the GDT. Nevertheless, responses to facial emotions in the EDT differed significantly between the unmasked and masked conditions. In the former, angry expressions slowed participants' responses. Conversely, in the masked condition, behavioral reactions were affected by fearful and, to a greater extent, by happy expressions. Responses to fearful faces were slower, and those to happy faces exhibited increased variability in the masked condition compared to the unmasked condition. Furthermore, response accuracy to masked happy faces declined dramatically compared to the unmasked condition and to other masked emotions.
Discussion: In sum, our findings indicate that surgical masks disrupt reactions to emotional expressions, leading people to react less accurately and with heightened variability to happy expressions, provided that the emotional dimension is relevant to people's goals.
Affiliation(s)
- Martina Montalti: Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- Giovanni Mirabella: Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy; IRCCS Neuromed, Pozzilli, Italy
4
Labuschagne I, Dominguez JF, Grace S, Mizzi S, Henry JD, Peters C, Rabinak CA, Sinclair E, Lorenzetti V, Terrett G, Rendell PG, Pedersen M, Hocking DR, Heinrichs M. Specialization of amygdala subregions in emotion processing. Hum Brain Mapp 2024; 45:e26673. PMID: 38590248. PMCID: PMC11002533. DOI: 10.1002/hbm.26673.
Abstract
The amygdala is important for human fear processing. However, recent research has failed to reveal specificity, with evidence that the amygdala also responds to other emotions. A more nuanced understanding of the amygdala's role in emotion processing, particularly in relation to fear, is needed given the importance of effective emotional functioning for everyday life and mental health. We studied 86 healthy participants (44 females), aged 18-49 (mean 26.12 ± 6.6) years, who underwent multiband functional magnetic resonance imaging. We specifically examined the reactivity of four amygdala subregions (using region-of-interest analysis) and related brain connectivity networks (using generalized psychophysiological interaction) to fearful, angry, and happy facial stimuli in an emotional face-matching task. All amygdala subregions responded to all stimuli (p-FDR < .05), with this reactivity strongly driven by the superficial and centromedial amygdala (p-FDR < .001). Yet amygdala subregions showed selectively strong functional connectivity with other occipitotemporal and inferior frontal brain regions, with particular sensitivity to fear recognition, driven largely by the basolateral amygdala (p-FDR < .05). These findings suggest that amygdala specialization for fear may be reflected not in its local activity but in its connectivity with other brain regions within a specific face-processing network.
Affiliation(s)
- Izelle Labuschagne: Healthy Brain and Mind Research Centre, School of Behavioural and Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia; School of Psychology, The University of Queensland, Brisbane, Queensland, Australia
- Sally Grace: Healthy Brain and Mind Research Centre, School of Behavioural and Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Simone Mizzi: School of Health and Biomedical Science, RMIT University, Melbourne, Victoria, Australia
- Julie D. Henry: School of Psychology, The University of Queensland, Brisbane, Queensland, Australia
- Craig Peters: Department of Pharmacy Practice, Wayne State University, Detroit, Michigan, USA
- Erin Sinclair: Healthy Brain and Mind Research Centre, School of Behavioural and Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Valentina Lorenzetti: Healthy Brain and Mind Research Centre, School of Behavioural and Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Gill Terrett: Healthy Brain and Mind Research Centre, School of Behavioural and Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Peter G. Rendell: Healthy Brain and Mind Research Centre, School of Behavioural and Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Mangor Pedersen: Department of Psychology and Neuroscience, Auckland University of Technology, Auckland, New Zealand; The Florey Institute of Neuroscience and Mental Health, The University of Melbourne, Melbourne, Victoria, Australia
- Darren R. Hocking: Institute for Health & Sport, Victoria University, Melbourne, Victoria, Australia
- Markus Heinrichs: Department of Psychology, Albert-Ludwigs-University of Freiburg, Freiburg im Breisgau, Germany; Freiburg Brain Imaging Center, University Medical Center, Albert-Ludwigs-University of Freiburg, Freiburg im Breisgau, Germany
5
McKenna BS, Anthenelli RM, Schuckit MA. Sex differences in alcohol's effects on fronto-amygdalar functional connectivity during processing of emotional stimuli. Alcohol Clin Exp Res (Hoboken) 2024; 48:612-622. PMID: 38379361. PMCID: PMC11015979. DOI: 10.1111/acer.15279.
Abstract
BACKGROUND Amygdala function underlying emotion processing has been shown to vary with an individual's biological sex. Expanding upon functional magnetic resonance imaging (fMRI) findings reported previously, where a low level of response was the focus, we examined alcohol and sex effects on functional connectivity between the amygdala and other brain regions. The central hypothesis predicted that sex would influence alcohol's effects on frontal-limbic functional circuits underlying the processing of negative and positive facial emotions. METHODS Secondary analyses were conducted on data from a double-blind, placebo-controlled, within-subjects, cross-over study in 54 sex-matched pairs (N = 108) of 18- to 25-year-old individuals without an alcohol use disorder at baseline. Participants performed an emotional face-processing fMRI task after placebo or approximately 0.7 mL/kg of ethanol. Psychophysiological interaction analyses examined functional connectivity between the amygdala and other brain regions. RESULTS There were significant alcohol-by-sex interactions when processing negatively valenced faces. Whereas intoxicated men exhibited decreased functional connectivity between the amygdala and the ventral and dorsal anterior cingulate, angular gyrus, and middle frontal gyrus, connectivity in these regions was increased in intoxicated women. There was also a main effect of sex, whereby women exhibited less functional connectivity in the middle insula than men regardless of whether they received alcohol or placebo. For happy faces, main effects of both sex and alcohol were observed. Women exhibited less amygdala functional connectivity in the right inferior frontal gyrus than men. Both men and women exhibited greater functional connectivity in the superior frontal gyrus in response to alcohol than placebo. CONCLUSIONS Alcohol's effects on the amygdala functional circuits that underlie emotional processing vary by sex. Women had higher functional connectivity than men following exposure to a moderate dose of alcohol, which could indicate that women are better than men at processing affectively laden stimuli when intoxicated.
Affiliation(s)
- Benjamin S McKenna: Department of Psychiatry, University of California, San Diego, Health Sciences, La Jolla, California, USA; VA San Diego Healthcare System, San Diego, California, USA
- Robert M Anthenelli: Department of Psychiatry, University of California, San Diego, Health Sciences, La Jolla, California, USA
- Marc A Schuckit: Department of Psychiatry, University of California, San Diego, Health Sciences, La Jolla, California, USA
6
Tal S, Ben-David Sela T, Dolev-Amit T, Hel-Or H, Zilcha-Mano S. Reactivity and stability in facial expressions as an indicator of therapeutic alliance strength. Psychother Res 2024:1-15. PMID: 38442022. DOI: 10.1080/10503307.2024.2311777.
Abstract
Objective: Aspects of our emotional state are constantly being broadcast via our facial expressions. Psychotherapeutic theories highlight the importance of emotional dynamics between patients and therapists for an effective therapeutic relationship. Two emotional dynamics suggested by the literature are emotional reactivity (i.e., when one person is reacting to the other) and emotional stability (i.e., when a person has a tendency to remain in a given emotional state). Yet, little is known empirically about the association between these dynamics and the therapeutic alliance. This study investigates the association between the therapeutic alliance and the emotional dynamics of reactivity and stability, as manifested in the facial expressions of patients and therapists within the session. Methods: Ninety-four patients with major depressive disorder underwent short-term treatment for depression (N = 1256 sessions). Results: Both therapist reactivity and stability were associated with the alliance, across all time spans. Patient reactivity was associated with the alliance only in a short time span (1 s). Conclusions: These findings may potentially guide therapists in the field to attenuate not only their emotional reaction to their patients, but also their own unique presence in the therapy room.
Affiliation(s)
- Shachaf Tal: Department of Psychology, University of Haifa, Haifa, Israel
- Hagit Hel-Or: Department of Computer Science, University of Haifa, Haifa, Israel
7
Quettier T, Moro E, Tsuchiya N, Sessa P. When mind and body align: examining the role of cross-modal congruency in conscious representations of happy facial expressions. Cogn Emot 2024; 38:267-275. PMID: 37997901. DOI: 10.1080/02699931.2023.2285823.
Abstract
This study explored how congruency between facial mimicry and observed expressions affects the stability of conscious facial expression representations. Focusing on the congruency effect between proprioceptive/sensorimotor signals and visual stimuli for happy expressions, participants underwent a binocular rivalry task displaying neutral and happy faces. Mimicry was either facilitated with a chopstick or left unrestricted. Key metrics included Initial Percept (a bias indicator), Onset Resolution Time (the time from stimulus onset to the Initial Percept), and Cumulative Time (a measure of content stabilization). Results indicated that the mimicry manipulation significantly impacted Cumulative Time for happy faces, highlighting the importance of congruent mimicry in stabilizing conscious awareness of facial expressions. This supports embodied cognition models, showing that the integration of proprioceptive information significantly biases the conscious visual perception of facial expressions.
Affiliation(s)
- Thomas Quettier: Department of Developmental and Social Psychology, University of Padova, Padua, Italy; Padova Neuroscience Center (PNC), University of Padova, Padova, Italy
- Elena Moro: Department of Developmental and Social Psychology, University of Padova, Padua, Italy
- Naotsugu Tsuchiya: Turner Institute for Brain and Mental Health & School of Psychological Sciences, Faculty of Medicine, Nursing, and Health Sciences, Monash University, Melbourne, VIC, Australia; Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology (NICT), Osaka, Japan; Advanced Telecommunications Research Computational Neuroscience Laboratories, Kyoto, Japan
- Paola Sessa: Department of Developmental and Social Psychology, University of Padova, Padua, Italy; Padova Neuroscience Center (PNC), University of Padova, Padova, Italy
8
Onuma K, Watanabe M, Sasaki N. The grimace scale: a useful tool for assessing pain in laboratory animals. Exp Anim 2024:24-0010. PMID: 38382945. DOI: 10.1538/expanim.24-0010.
Abstract
Accurately and promptly assessing pain in experimental animals is extremely important to avoid unnecessary suffering of the animals and to enhance the reproducibility of experiments. This is a key concern for veterinarians, animal caretakers, and researchers from the perspectives of veterinary care and animal welfare. Various methods including ethology, immunohistochemistry, electrophysiology, and molecular biology are used for pain assessment. However, the grimace scale, which was developed by taking cues from interpreting pain through facial expressions of non-verbal infants, has become recognized as a very simple and practical method for objectively evaluating pain levels by scoring changes in an animal's expressions. This method, which was first implemented with mice approximately 10 years ago, is now being applied to various experimental animals and is widely used in research settings. This review focuses on the usability of the grimace scale from the "cage-side" perspective, aiming to make it a more user-friendly tool for those involved in animal experiments. Differences in facial expressions in response to pain in various animals, examples of applying the grimace scale, current automated analytical methods, and future prospects are discussed.
Affiliation(s)
- Kenta Onuma, Masaki Watanabe, Nobuya Sasaki: Laboratory of Laboratory Animal Science and Medicine, School of Veterinary Medicine, Kitasato University
9
Pu L, Coppieters MW, Smalbrugge M, Jones C, Byrnes J, Todorovic M, Moyle W. Associations between facial expressions and observational pain in residents with dementia and chronic pain. J Adv Nurs 2024. PMID: 38334268. DOI: 10.1111/jan.16063.
Abstract
AIM To identify specific facial expressions associated with pain behaviors using the PainChek application in residents with dementia. DESIGN This is a secondary analysis of a study, conducted from June to November 2021, exploring the feasibility of PainChek for evaluating the effectiveness of a social robot (PARO) intervention on pain in residents with dementia. METHODS Participants interacted with PARO individually for 15 min, once or twice per day, five days per week, for three consecutive weeks. The PainChek app assessed each resident's pain levels before and after each session. The association between nine facial expressions and the adjusted PainChek scores was analyzed using a linear mixed model. RESULTS A total of 1820 assessments were completed with 46 residents. Six facial expressions were significantly associated with a higher adjusted PainChek score. Horizontal mouth stretch showed the strongest association with the score, followed by brow lowering, parting of the lips, wrinkling of the nose, raising of the upper lip, and closing of the eyes. However, the presence of cheek raising, tightening of the eyelids, and pulling at the corner of the lip were not significantly associated with the score. Limitations of using the PainChek app were identified. CONCLUSION Six specific facial expressions were associated with observational pain scores in residents with dementia. Results indicate that automated real-time facial analysis is a promising approach to assessing pain in people with dementia. However, it requires further validation by human observers before it can be used for decision-making in clinical practice. IMPACT Pain is common in people with dementia, yet assessing pain is challenging in this group. This study generated new evidence on facial expressions of pain in residents with dementia. Results will inform the development of valid artificial intelligence-based algorithms that will support healthcare professionals in identifying pain in people with dementia in clinical situations.
REPORTING METHOD The study adheres to the CONSORT reporting guidelines. PATIENT OR PUBLIC CONTRIBUTION One resident with dementia and two family members of people with dementia were consulted and involved in the study design, where they provided advice on the protocol, information sheets and consent forms, and offered valuable insights to ensure research quality and relevance. TRIAL REGISTRATION Australian and New Zealand Clinical Trials Registry number (ACTRN12621000837820).
Affiliation(s)
- Lihui Pu: Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia; School of Nursing and Midwifery, Griffith University, Brisbane, Queensland, Australia
- Michel W Coppieters: Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia; School of Health Sciences and Social Work, Griffith University, Brisbane, Queensland, Australia; Amsterdam Movement Sciences - Program Musculoskeletal Health, Faculty of Behavioral and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Martin Smalbrugge: Department of Medicine for Older People, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands; Amsterdam Public Health Research Institute, Aging & Later Life, Amsterdam, The Netherlands
- Cindy Jones: Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia; Faculty of Health Sciences & Medicine, Bond University, Gold Coast, Queensland, Australia
- Joshua Byrnes: Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia; Centre for Applied Health Economics, School of Medicine and Dentistry, Griffith University, Brisbane, Queensland, Australia
- Michael Todorovic: Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia; School of Nursing and Midwifery, Griffith University, Brisbane, Queensland, Australia; Faculty of Health Sciences & Medicine, Bond University, Gold Coast, Queensland, Australia
- Wendy Moyle: Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia; School of Nursing and Midwifery, Griffith University, Brisbane, Queensland, Australia
10
Bianchini E, Rinaldi D, Alborghetti M, Simonelli M, D’Audino F, Onelli C, Pegolo E, Pontieri FE. The Story behind the Mask: A Narrative Review on Hypomimia in Parkinson's Disease. Brain Sci 2024; 14:109. PMID: 38275529. PMCID: PMC10814039. DOI: 10.3390/brainsci14010109.
Abstract
Facial movements are crucial for social and emotional interaction and well-being. Reduced facial expression (i.e., hypomimia) is a common feature in patients with Parkinson's disease (PD), and previous studies have linked this manifestation to both the motor symptoms of the disease and altered emotion recognition and processing. Nevertheless, research on facial motor impairment in PD has been rather scarce, and only a limited number of clinical evaluation tools are available, often suffering from poor validation and high inter- and intra-rater variability. In recent years, the availability of technology-enhanced methods for quantifying facial movements, such as automated video analysis and machine learning, has led to increasing interest in studying hypomimia in PD. In this narrative review, we summarize current knowledge on the pathophysiological hypotheses underlying hypomimia in PD, with particular focus on the association between reduced facial expression and emotional processing, and we analyze the current evaluation tools and management strategies for this symptom, as well as future research perspectives.
Affiliation(s)
- Edoardo Bianchini: Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; AGEIS, Université Grenoble Alpes, 38000 Grenoble, France; Sant’Andrea University Hospital, 00189 Rome, Italy
- Domiziana Rinaldi: Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; Sant’Andrea University Hospital, 00189 Rome, Italy
- Marika Alborghetti: Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; Sant’Andrea University Hospital, 00189 Rome, Italy
- Marta Simonelli: Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; Ospedale dei Castelli, ASL Rome 6, 00040 Ariccia, Italy
- Camilla Onelli: Department of Molecular Medicine, University of Padova, 35121 Padova, Italy
- Elena Pegolo: Department of Information Engineering, University of Padova, 35131 Padova, Italy
- Francesco E. Pontieri: Department of Neuroscience, Mental Health and Sensory Organs (NESMOS), Sapienza University of Rome, 00189 Rome, Italy; Sant’Andrea University Hospital, 00189 Rome, Italy; Fondazione Santa Lucia IRCCS, 00179 Rome, Italy
11
Atwell R, Vankan D. Prospective Study of 506 Dogs with Tick Paralysis: Investigating Measures of Severity and Clinical Signs as Predictors of Mortality and Assessing the Benefits of Different Therapeutics. Animals (Basel) 2024; 14:188. PMID: 38254357. PMCID: PMC10812437. DOI: 10.3390/ani14020188.
Abstract
Survey data from 42 Australian eastern seaboard veterinary practices involving 506 cases are reported with regard to clinical signs, disease severity, mortality, use of pharmaceuticals, and recovery times. New measures of disease severity (visual analogue scales (VAS) and facial expressions) were tested alongside "gold standard" measures (neuromuscular junction (NMJ) scores). Univariable and multivariable logistic regression analyses were conducted to evaluate associations between variables. The VAS scores were progressive, prognostic (especially the respiratory scores) and correlated with the NMJ scores. The presence of inspiratory dyspnoea and crackles on the day of hospitalisation, progressing to expiratory dyspnoea and an expiratory wheeze 24 h later, were highly predictive of mortality. Altered facial features on hospital admission were also highly predictive of mortality. The previously used respiratory score (using various clinical signs) was not predictive of mortality. Older animals had a higher mortality rate, and no gender or breed susceptibility was found. The only pharmaceuticals that were positively associated with mortality were tick antiserum and, in severe cases, antibiotics. The use of many pharmaceutical products (acepromazine, atropine, steroids, antihistamines, antiemetics, diuretics, and S8 anti-anxiety and sedation drugs) had no effect on mortality. More drug classes were used with increasing clinical severity and specific factors (e.g., vomiting/retching, hydration) affected the period of hospitalisation. Geographic variation in respiratory signs and toxicity scores was evident, whereas mortality and disease severity were not different across regions.
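The univariable/multivariable logistic regression analyses described above can be sketched as follows. This is an illustrative toy example on synthetic data, assuming numpy is available; the predictor names (a respiratory VAS score and age), coefficients, and sample size are invented, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: predict mortality (0/1) from a respiratory VAS
# score (0-10) and age (years). All numbers here are invented.
n = 400
vas = rng.uniform(0, 10, n)
age = rng.uniform(1, 15, n)
logit = -4.0 + 0.6 * vas + 0.25 * age          # planted "true" model
p_death = 1 / (1 + np.exp(-logit))
died = rng.binomial(1, p_death)

# Multivariable logistic regression fitted by Newton-Raphson (IRLS).
X = np.column_stack([np.ones(n), vas, age])    # intercept + 2 predictors
beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = 1 / (1 + np.exp(-np.clip(X @ beta, -30, 30)))
    grad = X.T @ (died - mu)                   # score vector
    W = mu * (1 - mu)                          # IRLS weights
    hess = X.T @ (X * W[:, None])              # observed information matrix
    beta += np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])                 # per-unit odds ratios
```

With both planted coefficients positive, both fitted odds ratios come out above 1, mirroring how a positive association with mortality is reported.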
Affiliation(s)
| | - Dianne Vankan
- School of Veterinary Science, The University of Queensland, Gatton 4343, Australia
| |
|
12
|
Valderrama CE, Gomes Ferreira MG, Mayor Torres JM, Garcia-Ramirez AR, Camorlinga SG. Editorial: Machine learning approaches to recognize human emotions. Front Psychol 2024; 14:1333794. [PMID: 38239471 PMCID: PMC10794425 DOI: 10.3389/fpsyg.2023.1333794] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/06/2023] [Accepted: 12/13/2023] [Indexed: 01/22/2024] Open
Affiliation(s)
- Camilo E. Valderrama
- Department of Applied Computer Science, University of Winnipeg, Winnipeg, MB, Canada
| | | | - Juan Manuel Mayor Torres
- Department of Information Engineering and Computer Science, University of Trento, Trento, Italy
- Digital Ware Advisory Inc., Bogota, Colombia
| | | | - Sergio G. Camorlinga
- Department of Applied Computer Science, University of Winnipeg, Winnipeg, MB, Canada
| |
|
13
|
Buçgün İ, Korkmaz ŞA, Öyekçin DG. Facial emotion recognition is associated with executive functions and depression scores, but not staging of dementia, in mild-to-moderate Alzheimer's disease. Brain Behav 2024; 14:e3390. [PMID: 38376045 PMCID: PMC10808849 DOI: 10.1002/brb3.3390] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/08/2023] [Revised: 12/20/2023] [Accepted: 12/22/2023] [Indexed: 02/21/2024] Open
Abstract
BACKGROUND Although deficits in facial emotion recognition (FER) significantly affect interpersonal communication and social functioning, there is no consensus on how FER is affected in Alzheimer's disease (AD). In this study, we aimed to investigate the clinical and neuropsychological factors affecting possible deficits in the FER abilities of patients with AD. METHODS This cross-sectional study included 37 patients with mild [clinical dementia rating (CDR) scale score = 1] or moderate (CDR = 2) AD, in whom vascular dementia and depression were excluded, and 24 cognitively normal (CDR = 0) subjects. FER ability was determined using the facial emotion identification test (FEIT) and facial emotion discrimination test (FEDT). All participants underwent the mini-mental state examination (MMSE), frontal assessment battery (FAB), and geriatric depression scale (GDS). The neuropsychiatric inventory-clinician rating scale (NPI-C), Katz index of independence in activities of daily living, and Lawton instrumental activities of daily living were also administered to patients with AD. RESULTS The FEIT and FEDT total scores showed that patients with mild and moderate AD had significant FER deficits compared to healthy controls. However, no significant difference in FEIT and FEDT total scores was observed between patients with mild and moderate AD. FEIT and FEDT scores were not correlated with the MMSE or with NPI-C total and subscale scores in patients with AD. Linear regression indicated that FEIT and FEDT total scores were significantly related to age and FAB scores. The GDS score negatively moderated the relationship between FAB and FEDT. CONCLUSIONS This study demonstrated decreased FER ability in patients with AD. The critical determinant of FER deficits in AD is the presence of dementia, not the dementia stage. Executive functions and depression (even at a subsyndromal level), areas about which knowledge remains limited, were found to be associated with FER abilities.
Affiliation(s)
| | - Şükrü Alperen Korkmaz
- Department of Psychiatry, Faculty of Medicine, Çanakkale Onsekiz Mart University, Çanakkale, Turkey
| | - Demet Güleç Öyekçin
- Department of Psychiatry, Faculty of Medicine, Çanakkale Onsekiz Mart University, Çanakkale, Turkey
| |
|
14
|
Thomas PJN, Caharel S. Do masks cover more than just a face? A study on how facemasks affect the perception of emotional expressions according to their degree of intensity. Perception 2024; 53:3-16. [PMID: 37709269 DOI: 10.1177/03010066231201230] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 09/16/2023]
Abstract
Emotional facial expressions convey crucial information in nonverbal communication and serve as a mediator in face-to-face relationships. Their recognition is thought to rely on specific facial traits that vary with the perceived emotion. During the COVID-19 pandemic, wearing a facemask thus disrupted the human ability to read emotions from faces. Yet, these effects are usually assessed across studies with faces expressing stereotypical and exaggerated emotions, which is far removed from real-life conditions. The objective of the present study was to evaluate the impact of facemasks through an emotion categorization task using morphs ranging between a neutral face and an expressive face (anger, disgust, fear, happiness, and sadness) (from 0% neutral to 100% expressive in 20% steps). Our results revealed a strong impact of facemasks on the recognition of expressions of disgust, happiness, and sadness, resulting in a decrease in performance and an increase in misinterpretations, at both low and high levels of intensity. In contrast, the recognition of anger and fear, as well as of neutral expressions, was less affected by mask-wearing. Future studies should address this issue from a more ecological point of view, with the aim of taking concrete adaptive measures in the context of daily interactions.
|
15
|
Cheng AJ, Malo A, Garbin M, Monteiro BP, Steagall PV. Construct validity, responsiveness and reliability of the Feline Grimace Scale in kittens. J Feline Med Surg 2023; 25:1098612X231211765. [PMID: 38095930 DOI: 10.1177/1098612x231211765] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/18/2023]
Abstract
OBJECTIVES The aim of the present study was to investigate the construct validity, responsiveness and reliability of the Feline Grimace Scale (FGS) in kittens. METHODS A total of 36 healthy female kittens (aged 10 weeks to 6 months) were included in a prospective, randomized, blinded study. Video recordings of all kittens were made before and 1 and 2 h after ovariohysterectomy using an opioid-free injectable anesthetic protocol with or without multimodal analgesia. Additional recordings were taken before and 1 h after administration of rescue analgesia (buprenorphine 0.02 mg/kg IM) to painful kittens. Screenshots of facial images were collected from the video recordings for FGS scoring. Four observers blinded to treatment groups and time points scored 111 randomized images twice with a 5-week interval using the FGS. Five action units (AUs) were scored (ear position, orbital tightening, muzzle tension, whiskers position and head position; 0-2 each). Construct validity, responsiveness, and inter- and intra-rater reliability were evaluated using linear models with Benjamini-Hochberg correction, Wilcoxon signed-rank test and single intraclass correlation coefficients (ICCsingle), respectively (P <0.05). RESULTS FGS total ratio scores were higher at 1 and 2 h after ovariohysterectomy (median [interquartile range, IQR]: 0.30 [0.20-0.40] and 0.30 [0.20-0.40], respectively) than at baseline (median [IQR]: 0.10 [0.00-0.30]) (P <0.001). FGS total ratio scores were lower after the administration of rescue analgesia (median [IQR] before and after rescue analgesia) 0.40 [0.20-0.50] and 0.20 [0.10-0.38], respectively (P <0.001). Inter-rater ICCsingle was 0.68 for the FGS total ratio scores and 0.35-0.70 for all AUs considered individually. Intra-rater ICCsingle was 0.77-0.91 for the FGS total ratio scores and 0.55-1.00 for all AUs considered individually. 
CONCLUSIONS AND RELEVANCE The FGS is a valid and responsive acute pain-scoring instrument with moderate inter-rater reliability and good to excellent intra-rater reliability in kittens.
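A single-measures intraclass correlation of the kind reported above (ICCsingle; here the two-way random-effects, absolute-agreement form ICC(2,1)) can be computed directly from a subjects-by-raters score matrix. The data below are invented for illustration, not the study's scores:

```python
import numpy as np

def icc_single(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `scores` is an (n_subjects, k_raters) matrix."""
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-rater means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((scores - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative data: 6 images scored by 4 raters (0-1 FGS total ratio scores)
scores = np.array([
    [0.1, 0.2, 0.1, 0.2],
    [0.3, 0.3, 0.4, 0.3],
    [0.5, 0.4, 0.5, 0.6],
    [0.2, 0.2, 0.3, 0.2],
    [0.7, 0.6, 0.7, 0.8],
    [0.0, 0.1, 0.0, 0.1],
])
icc = icc_single(scores)
```

Because subjects here differ far more than raters disagree, the resulting ICC is high; real inter-rater values, as in the abstract, are typically lower.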
Affiliation(s)
- Alice J Cheng
- Department of Clinical Sciences, Faculty of Veterinary Medicine, Université de Montréal, Saint-Hyacinthe, QC, Canada
| | - Annie Malo
- Department of Clinical Sciences, Faculty of Veterinary Medicine, Université de Montréal, Saint-Hyacinthe, QC, Canada
| | - Marta Garbin
- Department of Clinical Sciences, Faculty of Veterinary Medicine, Université de Montréal, Saint-Hyacinthe, QC, Canada
| | - Beatriz P Monteiro
- Department of Clinical Sciences, Faculty of Veterinary Medicine, Université de Montréal, Saint-Hyacinthe, QC, Canada
| | - Paulo V Steagall
- Department of Clinical Sciences, Faculty of Veterinary Medicine, Université de Montréal, Saint-Hyacinthe, QC, Canada
- Department of Veterinary Clinical Sciences, Centre for Companion Animal Health and Welfare, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong SAR, China
| |
|
16
|
Garrido MV, Godinho S. The influence of consonant wanderings and facial expressions in warmth and competence judgments. Cogn Emot 2023; 37:1272-1280. [PMID: 37675963 DOI: 10.1080/02699931.2023.2253423] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2023] [Revised: 08/18/2023] [Accepted: 08/22/2023] [Indexed: 09/08/2023]
Abstract
The preference for usernames whose oral pronunciation implies inward wandering articulatory movements over those involving outward movements - the in-out effect - has been shown to shape person perception judgments. Across three studies, we further tested the boundary conditions to this effect by combining the manipulation of the articulation direction of mock online usernames with one of the most critical cues for interpersonal judgments - facial expressions. As expected, users displaying smiling faces were rated as warmer and more competent than those displaying angry expressions. Notably, even in the presence of such diagnostic cues for social judgment, the articulatory activity involved in pronouncing a person's name still affected the impressions formed, particularly in the warmth dimension. These results show that the in-out effect did not vanish even when highly diagnostic visual information was available. Overall, the current work further emphasises the role of sensorimotor experience in person perception while providing additional evidence for the in-out effect, its boundary conditions, and potential mechanisms.
Affiliation(s)
- Margarida V Garrido
- Iscte - Instituto Universitário de Lisboa, Centro de Investigação e Intervenção Social, Lisboa, Portugal
| | - Sandra Godinho
- Iscte - Instituto Universitário de Lisboa, Centro de Investigação e Intervenção Social, Lisboa, Portugal
| |
|
17
|
Constantinou E, Vlemincx E, Panayiotou G. Testing emotional response coherence assumptions: Comparing emotional versus non-emotional states. Psychophysiology 2023; 60:e14359. [PMID: 37282750 DOI: 10.1111/psyp.14359] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2022] [Revised: 05/09/2023] [Accepted: 05/10/2023] [Indexed: 06/08/2023]
Abstract
Although central to theories of emotion, emotional response coherence, that is, coordination among various emotion response systems, has received inconsistent empirical support. This study tests a basic assumption of response coherence, that is, that it characterizes emotional states defining their beginning and end. To do so, we (a) compare response coherence between emotional versus non-emotional states and (b) examine how emotional coherence changes over time, before, during, and after an emotional episode. Seventy-nine participants viewed neutral, pleasant, and unpleasant film clips and rated continuously how pleasant they felt (experience) before (anticipation), during, and after (recovery) each clip. Autonomic physiological arousal responses (skin conductance level, heart rate; physiology) and facial expressions (corrugator, zygomatic activity; expression) were recorded. Within-person cross-correlations between all emotional response pairs were calculated for each phase. Analyses comparing coherence during emotional versus neutral film viewing showed that only experience-expression coherence was higher for emotional versus neutral films, indicating specificity for emotional states. Examining coherence across phases indicated that coherence increased from anticipation to emotional film viewing, as expected, for experience-expression and experience-physiology pairs (SCL only). Of those pairs, increased coherence returned to baseline during recovery, as theoretically assumed, only for experience-corrugator activity coherence. Current findings provide empirical support for theoretical views of response coherence as a defining feature of emotional episodes, but mostly for the coherence between experience and facial expressions. Further research needs to investigate the role of sympathetic arousal indices, as well as the role of response coherence in emotional recovery.
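The within-person cross-correlation used above as a coherence index can be sketched as a peak lagged Pearson correlation between two response channels. The signals below are synthetic, a sinusoidal "experience" trace with a lagged, noisy "corrugator" copy, purely for illustration:

```python
import numpy as np

def max_cross_corr(x, y, max_lag):
    """Peak Pearson correlation between two equal-length response channels
    across lags of -max_lag..+max_lag samples (a common coherence index)."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]
        else:
            a, b = x, y
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, r)
    return best

# Illustrative signals: corrugator activity lagging felt unpleasantness by 2 samples
rng = np.random.default_rng(1)
t = np.arange(200)
experience = np.sin(t / 10.0) + 0.1 * rng.standard_normal(200)
corrugator = np.roll(experience, 2) + 0.1 * rng.standard_normal(200)
coherence = max_cross_corr(experience, corrugator, max_lag=5)
```

Searching over lags matters because physiological channels respond on different delays; the two planted signals here align at a lag of two samples and yield a coherence near 1.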
Affiliation(s)
- Elena Constantinou
- Department of Psychology, University of Cyprus, Nicosia, Cyprus
- Department of Social and Behavioural Sciences, European University Cyprus, Nicosia, Cyprus
| | - Elke Vlemincx
- Department of Health Sciences, Vrije Universiteit Amsterdam, Amsterdam Public Health Research Institute, Amsterdam Movement Sciences Research Institute, Amsterdam, The Netherlands
| | | |
|
18
|
Vaessen M, Van der Heijden K, de Gelder B. Modality-specific brain representations during automatic processing of face, voice and body expressions. Front Neurosci 2023; 17:1132088. [PMID: 37869514 PMCID: PMC10587395 DOI: 10.3389/fnins.2023.1132088] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/26/2022] [Accepted: 09/05/2023] [Indexed: 10/24/2023] Open
Abstract
A central question in affective science, and one that is relevant for its clinical applications, is how emotions provided by different stimuli are experienced and represented in the brain. Following the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals, such as an aggressive gesture, trigger rapid automatic behavioral responses, and this may take place before or independently of a full abstract representation of the emotion. This argues in favor of specific emotion signals that may trigger rapid adaptive behavior by mobilizing only modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target detection task presented in a different sensory modality than the stimulus. By using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that under ecological conditions emotion expressions of the face, body, and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they may all exemplify the same emotion. This has implications for a neuroethologically grounded emotion research program that should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.
|
19
|
Gong M, Chen Y, Li F, Lin Z. The availability of attentional resources modulates the anger superiority effect. Psych J 2023; 12:628-636. [PMID: 37421365 DOI: 10.1002/pchj.664] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2022] [Accepted: 05/16/2023] [Indexed: 07/10/2023]
Abstract
It is much debated whether there is an anger superiority effect (ASE) in the recognition of facial expressions. Recent research has shown that the attentional demand of a task plays a vital role in the emergence and magnitude of the ASE. However, only a visual crowding task has been employed to manipulate attentional demands, and it is unclear whether the emergence and magnitude of the ASE were contingent on the availability of attentional resources in general. The present study employed a dual-task paradigm to manipulate the availability of attentional resources for facial expression discrimination, in which participants performed a central letter discrimination task and a peripheral facial expression discrimination task concurrently. Experiment 1 showed an ASE in the dual task, but no ASE emerged when the facial expression discrimination task was performed alone. Experiment 2 replicated this finding and further demonstrated a gradual shift from no ASE to an attenuated ASE and finally to a strong ASE as the attentional resources available for facial expression discrimination gradually became limited. Together, these results suggest that the emergence and magnitude of the ASE are modulated by the availability of attentional resources, which supports an Attentional Demands Modulation Hypothesis of the ASE.
Affiliation(s)
- Mingliang Gong
- School of Psychology, Jiangxi Normal University, Nanchang, People's Republic of China
| | - Yufei Chen
- School of Psychology, Jiangxi Normal University, Nanchang, People's Republic of China
| | - Fanghui Li
- School of Psychology, Jiangxi Normal University, Nanchang, People's Republic of China
| | - Zhen Lin
- School of Psychology, Jiangxi Normal University, Nanchang, People's Republic of China
| |
|
20
|
Rodosky SE, Stephens JE, Hittner EF, Rompilla DB, Mittal VA, Haase CM. Facial expressions in adolescent-parent interactions and mental health: A proof-of-concept study. Emotion 2023; 23:2110-2115. [PMID: 36729505 PMCID: PMC10394109 DOI: 10.1037/emo0001216] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/03/2023]
Abstract
Parent-child relationships are hotbeds of emotion and play a key role in mental health. The present proof-of-concept study examined facial expressions of emotion during adolescent-parent interactions and links with internalizing mental health symptoms. Neutral, negative, and positive facial expressions were objectively measured in 28 parent-adolescent dyads during three 10-min dyadic interactions. Internalizing mental health symptoms were measured using anxiety and depressive symptom questionnaires. Data were analyzed using actor-partner interdependence modeling. Results revealed that higher levels of (a) adolescents' neutral facial expressions as well as (b) parents' negative facial expressions were associated with higher levels of adolescents' mental health symptoms. Findings did not support a robust link between (c) positive expressions and mental health symptoms. Together, these results demonstrate the utility of facial expressions of emotion during parent-child interactions as behavioral correlates of adolescents' internalizing mental health symptoms, highlight the need for replication with larger samples, and suggest directions for future research.
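Actor-partner interdependence modeling proper is a dyadic multilevel technique; as a drastically simplified, hypothetical analogue of one pathway, an adolescent's symptom score can be regressed on their own ("actor") and the parent's ("partner") expression measures. All data and effect sizes below are invented, not the study's:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical sketch: predict an adolescent's internalizing symptom score
# from their own neutral-expression proportion (actor effect) and the
# parent's negative-expression proportion (partner effect).
n = 28                                   # dyads, as in the abstract
adol_neutral = rng.uniform(0, 1, n)
parent_negative = rng.uniform(0, 1, n)
symptoms = 1.0 + 2.0 * adol_neutral + 1.5 * parent_negative + rng.normal(0, 0.4, n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), adol_neutral, parent_negative])
coef, *_ = np.linalg.lstsq(A, symptoms, rcond=None)
actor_effect, partner_effect = coef[1], coef[2]
```

Both planted effects are positive, echoing the reported pattern in which adolescents' neutral and parents' negative expressions track higher symptom levels; a full APIM would additionally model both dyad members' outcomes and their nonindependence.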
Affiliation(s)
| | - Jacquelyn E Stephens
- Department of Medical Social Sciences, Osher Center for Integrative Medicine, Northwestern University Feinberg School of Medicine
| | - Emily F Hittner
- School of Education and Social Policy, Northwestern University
| | | | | | - Claudia M Haase
- School of Education and Social Policy, Northwestern University
| |
|
21
|
Guillin A, Chaby L, Vergilino-Perez D. He must be mad; she might be sad: perceptual and decisional aspects of emotion recognition in ambiguous faces. Cogn Emot 2023:1-10. [PMID: 37732611 DOI: 10.1080/02699931.2023.2258585] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2023] [Accepted: 07/02/2023] [Indexed: 09/22/2023]
Abstract
While the recognition of ambiguous emotions is crucial for successful social interactions, previous work has shown that they are perceived differently depending on whether they are viewed on male or female faces. The present paper aims to shed light on this phenomenon by exploring two hypotheses: the confounded signal hypothesis, which posits the existence of perceptual overlaps between emotions and gendered morphotypes, and the social role hypothesis, according to which the observer's responses are biased by stereotypes. Participants were asked to categorise blended faces (i.e. artificial faces made ambiguous by mixing two emotions) in a forced-choice task. Six emotions were used to create each blend (neutral, surprise, sadness, fear, happiness, anger), for a total of 15 expressions. We then applied signal detection theory - considering both the morphotype of the stimuli and the participants' gender - to distinguish participants' perceptual processes from their response biases. The results showed a perceptual advantage for anger on male faces and for sadness on female faces. However, different strategies were deployed when labelling emotions on gendered morphotypes. In particular, a response bias towards angry male faces establishes their special status, as they resulted in both excellent detection and a tendency to be over-reported, especially by women.
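The signal detection theory decomposition used above separates sensitivity (d') from response bias (criterion c). A minimal sketch with a standard log-linear correction and invented counts; a negative c corresponds to the liberal, over-reporting tendency described for angry male faces:

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response bias (c) from a 2x2 outcome table,
    with a log-linear correction to avoid infinite z-scores at 0 or 1."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d = z(hit_rate) - z(fa_rate)            # perceptual sensitivity
    c = -0.5 * (z(hit_rate) + z(fa_rate))   # negative = liberal bias
    return d, c

# Illustrative counts for "anger on a male face": good detection, liberal bias
d, c = dprime_criterion(hits=45, misses=5, false_alarms=15, correct_rejections=35)
```

Separating d' from c is exactly what lets a study conclude both "excellent detection" and "a tendency to be over-reported" for the same stimulus class.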
Affiliation(s)
- Amandine Guillin
- Vision Action Cognition, Université Paris Cité, Boulogne-Billancourt, France
| | - Laurence Chaby
- CNRS, Institut des Systèmes Intelligents et de Robotique, ISIR, Sorbonne Université, Paris, France
- Université Paris Cité, UFR de Psychologie, Boulogne-Billancourt, France
| | | |
|
22
|
Matsufuji Y, Ueji K, Yamamoto T. Predicting Perceived Hedonic Ratings through Facial Expressions of Different Drinks. Foods 2023; 12:3490. [PMID: 37761199 PMCID: PMC10528552 DOI: 10.3390/foods12183490] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2023] [Revised: 09/07/2023] [Accepted: 09/18/2023] [Indexed: 09/29/2023] Open
Abstract
Previous studies have established the utility of facial expressions as an objective assessment approach for determining the hedonics (overall pleasure) of food and beverages. This study endeavors to validate the conclusions drawn from preceding research, illustrating that facial expressions prompted by tastants possess the capacity to forecast the perceived hedonic ratings of these tastants. Facial expressions of 29 female participants, aged 18-55 years, were captured using a digital camera during their consumption of diverse concentrations of solutions representative of five basic tastes. Employing the widely used facial expression analysis application FaceReader, the facial expressions were meticulously assessed, identifying seven emotions (surprise, happiness, scare, neutral, disgust, sadness, and anger), each characterized by a score ranging from 0 to 1, a numerical manifestation of emotional intensity. Simultaneously, participants rated the hedonics of each solution, utilizing a scale spanning from -5 (extremely unpleasant) to +5 (extremely pleasant). Employing a multiple linear regression analysis, a predictive model for perceived hedonic ratings was devised. The model's efficacy was scrutinized by assessing emotion scores from 11 additional taste solutions, sampled from 20 other participants. The anticipated hedonic ratings demonstrated robust alignment and agreement with the observed ratings, underpinning the validity of earlier findings even when incorporating diverse software and taste stimuli across a varied participant base. We discuss some limitations and practical implications of our technique in predicting food and beverage hedonics using facial expressions.
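The multiple-linear-regression prediction step can be sketched as below. The emotion labels follow the abstract, but the weights, sample sizes, and data are synthetic placeholders, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: 7 FaceReader emotion scores (0-1) per sample, and a
# hedonic rating on -5..+5. The weights below are invented for illustration.
emotions = ["surprise", "happiness", "scare", "neutral", "disgust", "sadness", "anger"]
n = 60
X = rng.uniform(0, 1, size=(n, len(emotions)))
true_w = np.array([0.5, 4.0, -1.0, 0.2, -4.0, -1.5, -2.0])
y = X @ true_w + rng.normal(0, 0.3, n)       # noisy hedonic ratings

# Fit ordinary least squares (with intercept), then predict held-out samples
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
X_new = rng.uniform(0, 1, size=(5, len(emotions)))
pred = np.column_stack([np.ones(5), X_new]) @ coef
```

Evaluating the fitted coefficients on emotion scores from new solutions, as in the last step here, mirrors how the study tested its model on 11 additional tastants from different participants.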
Affiliation(s)
| | | | - Takashi Yamamoto
- Department of Nutrition, Faculty of Health Sciences, Kio University, 4-2-2 Umami-naka, Koryo, Kitakatsuragi, Nara 635-0832, Japan
| |
|
23
|
Frazier TW, Busch RM, Klaas P, Lachlan K, Jeste S, Kolevzon A, Loth E, Harris J, Speer L, Pepper T, Anthony K, Graglia JM, Delagrammatikas CG, Bedrosian-Sermone S, Smith-Hicks C, Huba K, Longyear R, Green-Snyder L, Shic F, Sahin M, Eng C, Hardan AY, Uljarević M. Development of webcam-collected and artificial-intelligence-derived social and cognitive performance measures for neurodevelopmental genetic syndromes. Am J Med Genet C Semin Med Genet 2023; 193:e32058. [PMID: 37534867 PMCID: PMC10543620 DOI: 10.1002/ajmg.c.32058] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/22/2023] [Accepted: 07/19/2023] [Indexed: 08/04/2023]
Abstract
This study focused on the development and initial psychometric evaluation of a set of online, webcam-collected, and artificial-intelligence-derived patient performance measures for neurodevelopmental genetic syndromes (NDGS). Initial testing and qualitative input were used to develop four stimulus paradigms capturing social and cognitive processes, including social attention, receptive vocabulary, processing speed, and single-word reading. The paradigms were administered to a sample of 375 participants, including 163 with NDGS, 56 with idiopathic neurodevelopmental disability (NDD), and 156 neurotypical controls. Twelve measures were created from the four stimulus paradigms. Valid completion rates varied from 87 to 100% across measures, with lower but adequate completion rates in participants with intellectual disability. Adequate to excellent internal consistency reliability (α = 0.67 to 0.95) was observed across measures. Test-retest reproducibility at 1-month follow-up and stability at 4-month follow-up were fair to good (r = 0.40-0.73) for 8 of the 12 measures. All gaze-based measures showed evidence of convergent and discriminant validity with parent-report measures of other cognitive and behavioral constructs. Comparisons across NDGS groups revealed distinct patterns of social and cognitive functioning, including people with PTEN mutations showing a less impaired overall pattern and people with SYNGAP1 mutations showing more attentional, processing speed, and social processing difficulties relative to people with NFIX mutations. Webcam-collected performance measures appear to be a reliable and potentially useful method for objective characterization and monitoring of social and cognitive processes in NDGS and idiopathic NDD. Additional validation work, including more detailed convergent and discriminant validity analyses and examination of sensitivity to change, is needed to replicate and extend these observations.
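The internal-consistency index reported above (Cronbach's α) has a compact closed form: the item-count-scaled ratio of summed item variances to total-score variance. A sketch on synthetic item scores (a shared "ability" factor plus item noise, all invented):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_participants, k_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

# Illustrative data: 100 participants, 6 items driven by one latent ability
rng = np.random.default_rng(3)
ability = rng.standard_normal(100)
items = ability[:, None] + 0.5 * rng.standard_normal((100, 6))
alpha = cronbach_alpha(items)
```

Because the six synthetic items share a strong common factor, α lands near the top of the 0.67-0.95 range the abstract reports.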
Affiliation(s)
- Thomas W Frazier
- Department of Psychology, John Carroll University, University Heights, Ohio, USA
- Departments of Pediatrics and Psychiatry, SUNY Upstate Medical University, Syracuse, New York, USA
- Robyn M Busch
- Department of Neurology, Neurological Institute, Cleveland Clinic, Cleveland, Ohio, USA
- Genomic Medicine Institute, Lerner Research Institute, Cleveland Clinic, Cleveland, Ohio, USA
- Patricia Klaas
- Department of Neurology, Neurological Institute, Cleveland Clinic, Cleveland, Ohio, USA
- Katherine Lachlan
- Human Genetics and Genomic Medicine, Faculty of Medicine, University of Southampton and Wessex Clinical Genetics Service, University Hospital Southampton NHS Foundation Trust, Southampton, UK
- Shafali Jeste
- Division of Neurology, Children's Hospital of Los Angeles, Los Angeles, California, USA
- Alexander Kolevzon
- Departments of Psychiatry and Pediatrics, Seaver Autism Center for Research and Treatment, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Eva Loth
- Department of Forensic and Neurodevelopmental Science, Institute of Psychiatry, Psychology and Neuroscience, Kings College London, London, UK
- Jacqueline Harris
- Department of Neurology, Kennedy Krieger Institute and Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Tom Pepper
- PTEN Research Foundation, Cheltenham, UK
- Kristin Anthony
- PTEN Hamartoma Tumor Syndrome Foundation, Huntsville, Alabama, USA
- Constance Smith-Hicks
- Department of Neurology, Kennedy Krieger Institute and Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Katie Huba
- Department of Psychology, John Carroll University, University Heights, Ohio, USA
- Frederick Shic
- Department of Pediatrics, University of Washington and Seattle Children's Research Institute, Seattle, Washington, USA
- Mustafa Sahin
- Rosamund Stone Zander Translational Neuroscience Center, Department of Neurology, Boston Children's Hospital and Harvard Medical School, Boston, Massachusetts, USA
- Charis Eng
- Genomic Medicine Institute, Lerner Research Institute, Cleveland Clinic, Cleveland, Ohio, USA
- Antonio Y Hardan
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, California, USA
- Mirko Uljarević
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, California, USA
- Melbourne School of Psychological Sciences, Faculty of Medicine, Dentistry, and Health Sciences, The University of Melbourne, Melbourne, Victoria, Australia
24. Gupta T, Osborne KJ, Nadig A, Haase CM, Mittal VA. Alterations in facial expressions in individuals at risk for psychosis: a facial electromyography approach using emotionally evocative film clips. Psychol Med 2023;53:5829-5838. PMID: 36285533; PMCID: PMC10130238; DOI: 10.1017/s0033291722003087.
Abstract
BACKGROUND Negative symptoms such as blunted facial expressivity are characteristic of schizophrenia. However, it is not well understood whether, and which, abnormalities are present in individuals at clinical high risk (CHR) for psychosis. METHODS This experimental study employed facial electromyography (left zygomaticus major and left corrugator supercilii) in a sample of CHR individuals (N = 34) and healthy controls (N = 32) to detect alterations in facial expressions in response to emotionally evocative film clips and to determine links with symptoms. RESULTS Findings revealed that the CHR group showed facial blunting, manifested in reduced zygomatic activity in response to an excitement (but not amusement, fear, or sadness) film clip compared to controls. Reductions in zygomatic activity in the CHR group emerged in response to the emotionally evocative peak period of the excitement film clip. Lower zygomatic activity during the excitement clip was related to anxiety, while lower rates of change in zygomatic activity during the excitement clip were related to higher psychosis risk conversion scores. CONCLUSIONS Together, these findings inform vulnerability/disease-driving mechanisms and biomarker and treatment development.
Affiliation(s)
- Tina Gupta
- Department of Psychiatry, University of Pittsburgh, Pittsburgh, PA, USA
- K. Juston Osborne
- Department of Psychology, Northwestern University, Evanston, IL, USA
- Ajay Nadig
- Harvard/MIT MD-PhD Program, Harvard Medical School, Boston, MA 02115, USA
- Claudia M. Haase
- Department of Psychology, Northwestern University, Evanston, IL, USA
- School of Education and Social Policy, Northwestern University, Evanston, IL, USA
- Vijay A. Mittal
- Department of Psychology, Northwestern University, Evanston, IL, USA
25. Othman E, Werner P, Saxen F, Al-Hamadi A, Gruss S, Walter S. Automated Electrodermal Activity and Facial Expression Analysis for Continuous Pain Intensity Monitoring on the X-ITE Pain Database. Life (Basel) 2023;13:1828. PMID: 37763232; PMCID: PMC10533107; DOI: 10.3390/life13091828.
Abstract
This study focuses on improving healthcare quality by introducing an automated system that continuously monitors patient pain intensity. The system analyzes the Electrodermal Activity (EDA) sensor modality, compares the results obtained from the EDA and facial expression modalities, and performs late fusion of the two. This work extends our previous studies of pain intensity monitoring via an expanded analysis of the two informative methods. The EDA sensor modality and facial expression analysis play a prominent role in pain recognition; the extracted features reflect the patient's responses to different pain levels. Three different approaches were applied: Random Forest (RF) baseline methods, a Long Short-Term Memory network (LSTM), and LSTM with a sample-weighting method (LSTM-SW). Evaluation metrics included the micro-average F1-score for classification and the Mean Squared Error (MSE) and intraclass correlation coefficient (ICC [3, 1]) for both classification and regression. The results highlight the effectiveness of late fusion of EDA and facial expressions, particularly in almost balanced datasets (micro-average F1-score around 61%, ICC about 0.35). EDA regression models, particularly LSTM and LSTM-SW, showed superiority in imbalanced datasets and outperformed both trivial guessing (where the majority vote indicates no pain) and the baseline methods (Random Forest classifier (RFc) and Random Forest regression (RFr)). In conclusion, integrating both modalities, or using EDA alone, can provide medical centers with reliable and valuable insights into patients' pain experiences and responses.
Affiliation(s)
- Ehsan Othman
- Department of Neuro-Information Technology, Institute for Information Technology and Communications, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany
- Philipp Werner
- Department of Neuro-Information Technology, Institute for Information Technology and Communications, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany
- Frerk Saxen
- Department of Neuro-Information Technology, Institute for Information Technology and Communications, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany
- Ayoub Al-Hamadi
- Department of Neuro-Information Technology, Institute for Information Technology and Communications, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany
- Sascha Gruss
- Department of Medical Psychology, Ulm University, 89081 Ulm, Germany
- Steffen Walter
- Department of Medical Psychology, Ulm University, 89081 Ulm, Germany
26. Broulidakis MJ, Kiprijanovska I, Severs L, Stankoski S, Gjoreski M, Mavridou I, Gjoreski H, Cox S, Bradwell D, Stone JM, Nduka C. Optomyography-based sensing of facial expression derived arousal and valence in adults with depression. Front Psychiatry 2023;14:1232433. PMID: 37614653; PMCID: PMC10442807; DOI: 10.3389/fpsyt.2023.1232433.
Abstract
Background Continuous assessment of affective behaviors could improve the diagnosis, assessment and monitoring of chronic mental health and neurological conditions such as depression. However, there are no technologies well suited to this, limiting potential clinical applications. Aim To test whether we could replicate previous evidence of hypo-reactivity to emotionally salient material using an entirely new sensing technique called optomyography, which is well suited to remote monitoring. Methods Thirty-eight volunteers aged 18-40 years who met a research diagnosis of depression and an age-matched non-depressed control group of 37 were recruited. Changes in facial muscle activity over the brow (corrugator supercilii) and cheek (zygomaticus major) were measured whilst volunteers watched videos varying in emotional salience. Results Across all participants, videos rated as subjectively positive were associated with activation of muscles in the cheek relative to videos rated as neutral or negative. Videos rated as subjectively negative were associated with brow activation relative to videos judged as neutral or positive. Self-reported arousal was associated with a step increase in facial muscle activation across the brow and cheek. Group differences manifested as significantly reduced activation in facial muscles during videos considered subjectively negative or rated as high arousal in depressed volunteers compared with controls. Conclusion We demonstrate for the first time that it is possible to detect facial expression hypo-reactivity in adults with depression in response to emotional content using glasses-based optomyography sensing. It is hoped these results may encourage the use of optomyography-based sensing to track facial expressions in the real world, outside of a specialized testing environment.
Affiliation(s)
- Martin Gjoreski
- Faculty of Informatics, Università della Svizzera italiana, Lugano, Switzerland
- Hristijan Gjoreski
- Ss. Cyril and Methodius University in Skopje (UKIM), Skopje, North Macedonia
- James M. Stone
- Brighton and Sussex Medical School, University of Sussex, Brighton, United Kingdom
- Charles Nduka
- Emteq Ltd., Brighton, United Kingdom
- Queen Victoria Hospital, East Grinstead, United Kingdom
27. Juuse L, Kreegipuu K, Põldver N, Kask A, Mogom T, Anbarjafari G, Allik J. Processing emotions from faces and words measured by event-related brain potentials. Cogn Emot 2023;37:959-972. PMID: 37338015; DOI: 10.1080/02699931.2023.2223906.
Abstract
Affective aspects of a stimulus can be processed rapidly and before cognitive attribution, acting much earlier for verbal stimuli than previously considered. To identify the specific mechanisms, event-related brain potentials (ERPs) evoked by six basic emotions (anger, disgust, fear, happiness, sadness, and surprise), expressed in facial expressions or word meaning, were analysed relative to emotionally neutral stimuli in a sample of 116 participants. Brain responses in the occipital and left temporal regions elicited by sadness in facial expressions or words were indistinguishable from responses evoked by neutral faces or words. Confirming previous findings, facial fear elicited an early and strong posterior negativity. Instead of the expected parietal positivity, both happy faces and happy words produced significantly more negative responses compared to neutral stimuli. Surprise in facial expressions and words elicited a strong early response in the left temporal cortex, which could be a signature of appraisal. The results of this study are consistent with the view that both types of affective stimuli, facial emotions and word meaning, set off rapid processing, and that responses occur very early in the processing stage.
Affiliation(s)
- Liina Juuse
- Institute of Psychology, University of Tartu, Tartu, Estonia
- Institute of Technology, University of Tartu, Tartu, Estonia
- Doctoral School of Behavioural, Social and Health Sciences, University of Tartu, Tartu, Estonia
- Kairi Kreegipuu
- Institute of Psychology, University of Tartu, Tartu, Estonia
- Nele Põldver
- Institute of Psychology, University of Tartu, Tartu, Estonia
- Annika Kask
- Institute of Psychology, University of Tartu, Tartu, Estonia
- Doctoral School of Behavioural, Social and Health Sciences, University of Tartu, Tartu, Estonia
- Tiit Mogom
- Institute of Psychology, University of Tartu, Tartu, Estonia
- Jüri Allik
- Institute of Psychology, University of Tartu, Tartu, Estonia
- Estonian Academy of Sciences, Tartu, Estonia
28. Xu Q, Wang W, Yang Y, Li W. Effects of emotion words activation and satiation on facial expression perception: evidence from behavioral and ERP investigations. Front Psychiatry 2023;14:1192450. PMID: 37588024; PMCID: PMC10425554; DOI: 10.3389/fpsyt.2023.1192450.
Abstract
Objective The present study investigated the impact of emotion concepts, acquired from experiences of the external environment, on the perception of facial expressions by manipulating the activation and satiation of emotion words, motivated by the debate between basic emotion theory and constructed emotion theory. Methods Experiment 1 used a behavioral facial expression judgment task to explore the effects of activating emotion-label words and emotion-laden words on the perception of happy and disgusted faces. Experiment 2 explored the effect of semantic satiation of emotion-label words and emotion-laden words using the event-related potential technique. Results Experiment 1 found that facial expression perception was influenced by both types of emotion words and showed a significant emotional consistency effect. Experiment 2 found that the N170 exhibited a more negative amplitude in the consistent condition compared to the inconsistent condition in the right hemisphere. More importantly, in the later stage of facial expression processing, both emotion-label words and emotion-laden words obstructed the perception of disgusted facial expressions and elicited a more negative N400 amplitude in the emotion consistency condition, showing a reversed N400 effect. Conclusion These results suggested that emotion concepts in the form of language influence the perception of facial expressions, but with differences between happy and disgusted faces: disgusted faces depended more on emotion concept information and behaved differently under semantic activation and satiation conditions.
Affiliation(s)
- Qiang Xu
- Department of Psychology, Ningbo University, Ningbo, China
- Weihan Wang
- Department of Psychology, Ningbo University, Ningbo, China
- Yaping Yang
- Department of Psychology, Ningbo University, Ningbo, China
- Wanyue Li
- Department of Psychology, Ningbo University, Ningbo, China
- School of Psychology, South China Normal University, Guangzhou, China
29. Burgess R, Culpin I, Costantini I, Bould H, Nabney I, Pearson RM. Quantifying the efficacy of an automated facial coding software using videos of parents. Front Psychol 2023;14:1223806. PMID: 37583610; PMCID: PMC10425266; DOI: 10.3389/fpsyg.2023.1223806.
Abstract
Introduction This work explores the use of automated facial coding software, FaceReader, as an alternative and/or complementary method to manual coding. Methods We used videos of parents (fathers, n = 36; mothers, n = 29) taken from the Avon Longitudinal Study of Parents and Children. The videos, obtained during real-life parent-infant interactions in the home, were coded both manually (using an existing coding scheme) and by FaceReader. We established a correspondence between the manual and automated coding categories, namely Positive, Neutral, Negative, and Surprise, before contingency tables were employed to examine the software's detection rate and quantify the agreement between manual and automated coding. By employing binary logistic regression, we examined the predictive potential of FaceReader outputs in determining manually classified facial expressions. An interaction term was used to investigate the impact of gender on our models, seeking to estimate its influence on the predictive accuracy. Results We found that the automated facial detection rate was low (25.2% for fathers, 24.6% for mothers) compared to manual coding, and discuss some potential explanations for this (e.g., poor lighting and facial occlusion). Our logistic regression analyses found that Surprise and Positive expressions had strong predictive capabilities, whilst Negative expressions performed poorly. Mothers' faces were more important for predicting Positive and Neutral expressions, whilst fathers' faces were more important in predicting Negative and Surprise expressions. Discussion We discuss the implications of our findings in the context of future automated facial coding studies, and we emphasise the need to consider gender-specific influences in automated facial coding research.
Affiliation(s)
- R. Burgess
- The Digital Health Engineering Group, Merchant Venturers Building, University of Bristol, Bristol, United Kingdom
- I. Culpin
- The Centre for Academic Mental Health, Bristol Medical School, Bristol, United Kingdom
- Florence Nightingale Faculty of Nursing, Midwifery and Palliative Care, King’s College London, London, United Kingdom
- I. Costantini
- The Centre for Academic Mental Health, Bristol Medical School, Bristol, United Kingdom
- H. Bould
- The Centre for Academic Mental Health, Bristol Medical School, Bristol, United Kingdom
- The Medical Research Council Integrative Epidemiology Unit, University of Bristol, Bristol, United Kingdom
- The Gloucestershire Health and Care NHS Foundation Trust, Gloucester, United Kingdom
- I. Nabney
- The Digital Health Engineering Group, Merchant Venturers Building, University of Bristol, Bristol, United Kingdom
- R. M. Pearson
- The Centre for Academic Mental Health, Bristol Medical School, Bristol, United Kingdom
- The Department of Psychology, Manchester Metropolitan University, Manchester, United Kingdom
30. Stahelski A, Radeke MK, Reavis M. Social perception inferences of computer-generated faces: an Asian Indian and United States cultural comparison. Front Psychol 2023;14:1174662. PMID: 37554135; PMCID: PMC10406511; DOI: 10.3389/fpsyg.2023.1174662.
Abstract
Results from research with computer-generated faces have demonstrated that participants are able to make different trait inferences to different generated faces. However, only a few studies have used computer-generated faces with cross-cultural samples. This study compared facial trait inference results from India and the United States, using three validated neutral-expression computer-generated faces from the University of Chicago Perception and Judgment Lab database as facial stimuli. The three faces varied in perceived threat. Participants were asked about the attractiveness, pleasingness (to look at), honesty, and potential threat in each of the three faces. Results indicated that participants from both cultural samples made the same inferences: participants rated attractiveness, pleasingness, and honesty highest in the low-threat face and lowest in the high-threat face. Indian participants perceived the high-threat face to be less threatening than the United States participants did. Participants were also asked about the emotional expression on each of the faces, even though the faces were presumably neutral. United States participants were significantly more likely than Indian participants to indicate that the faces in all three threat conditions were emotionally neutral, reflecting a cultural in-group bias, in which members of a culture are more accurately able to identify expressions on faces from their own culture.
Affiliation(s)
- Anthony Stahelski
- Department of Psychology, Central Washington University, Ellensburg, WA, United States
- Mary Katherine Radeke
- Department of Psychology, Central Washington University, Ellensburg, WA, United States
- Maxie Reavis
- College of Osteopathic Medicine, Pacific Northwest University of Health Sciences, Yakima, WA, United States
31. Eilert DW, Buchheim A. Attachment-Related Differences in Emotion Regulation in Adults: A Systematic Review on Attachment Representations. Brain Sci 2023;13:884. PMID: 37371364; DOI: 10.3390/brainsci13060884.
Abstract
In recent years, there has been an increase in the prevalence of mental disorders connected with affective dysregulation and insecure attachment. Therefore, it is even more important to understand the interplay between an individual's attachment representation and patterns of emotion regulation. To our knowledge, this is the first systematic review to examine this association. PsycInfo, PsyArticles, and PubMed were searched for studies that examined attachment-related differences in emotion regulation in adults. To examine the unconscious attachment representation, only studies using the Adult Attachment Interview or the Adult Attachment Projective Picture System were included. Thirty-seven peer-reviewed studies (with a total of 2006 subjects) matched the PICO criteria. Emotion regulation was measured via four objective approaches: autonomic nervous system, brain activity, biochemistry, or nonverbal behavior. Across all measurements, results reveal a significant correlation between attachment representation and emotion regulation. Secure attachment correlates consistently with balanced emotion regulation, whereas it is impaired in insecure and dysfunctional in unresolved attachment. Specifically, unresolved individuals display counterintuitive responses and fail to use attachment as a resource. Insecure-dismissing attachment is associated with an emotionally deactivating strategy, while on a physiological, biochemical, and nonverbal level, emotional stress is still present. There is still a lack of studies examining preoccupied individuals. In addition to interpreting the results, we also discuss the risk of bias, implications for psychotherapy and coaching, and an outlook for future research.
Affiliation(s)
- Dirk W Eilert
- Institute of Psychology, University of Innsbruck, 6020 Innsbruck, Austria
- Anna Buchheim
- Institute of Psychology, University of Innsbruck, 6020 Innsbruck, Austria
32. Liu S, Wang Y, Song Y. Atypical facial mimicry for basic emotions in children with autism spectrum disorder. Autism Res 2023. PMID: 37246606; DOI: 10.1002/aur.2957.
Abstract
During social encounters, people tend to reproduce the facial expressions of others, termed "facial mimicry," which is believed to underlie many important social cognitive functions. Clinically, atypical mimicry is closely associated with serious social dysfunction. However, findings regarding the facial mimicry ability of children with autism spectrum disorder (ASD) are inconsistent; it is necessary to test whether deficits in facial mimicry are core defects of autism and to explore the potential mechanism underlying this process. Using quantitative analysis, this study investigated voluntary and automatic facial mimicry performance for six basic expressions in children with and without ASD. There was no significant group difference in mimicry accuracy, but children with ASD showed less intensity in voluntary and automatic mimicry than typically developing children; they also presented less voluntary mimicry intensity for happy, sad, and fearful expressions. Performance on voluntary and automatic mimicry was significantly correlated with the level of autistic symptoms (r > -.43) and theory of mind (r > .34). Furthermore, theory of mind mediated the relationship between autistic symptoms and the intensity of facial mimicry. These results suggest that individuals with ASD show atypical facial mimicry (i.e., less intensity for both voluntary and automatic mimicry, mainly for voluntary mimicry of happiness, sadness, and fear), which might offer a potential cognitive marker for quantifying syndrome manifestations in children with ASD. These findings suggest that theory of mind plays a mediating role in facial mimicry, which may provide insight into the theoretical mechanism of social dysfunction in children with autism.
Affiliation(s)
- Shuo Liu
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Yue Wang
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Yongning Song
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Shanghai Key Laboratory of Mental Health and Psychological Crisis Intervention, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
33. Gu S, Jiang Y, Liu M, Li Y, Liang Y, Feng R, Xu M, Wang F, Huang JH. Eye movements and ERP biomarkers for face processing problems in avoidant attachment-style individuals. Front Behav Neurosci 2023;17:1135909. PMID: 37273280; PMCID: PMC10235504; DOI: 10.3389/fnbeh.2023.1135909.
Abstract
Background Avoidant attachment poses a serious risk to intimate relationships and offspring. However, there are few studies on the face-processing characteristics and impairments of avoidant individuals based on basic emotion theory. Therefore, this study investigated emotional processing and deactivation strategies in individuals with avoidant attachment. Methods Avoidant and secure individuals were recruited to participate in an eye-tracking experiment and a two-choice oddball task in which they had to distinguish facial expressions of basic emotions (sadness, anger, fear, disgust, and neutral). Eye fixation durations on various parts of the face, including the eyes, nose, and mouth, were measured, and three event-related potential (ERP) components (P100, N170, and P300) were monitored. Results Avoidant individuals could not process facial expressions as easily as secure individuals. Avoidant individuals focused less on the eyes of angry faces compared to secure individuals. They also exhibited a more positive P100 component and a less negative N170 component when processing faces, and a larger P300 amplitude than secure individuals when processing emotional expressions. Conclusion Avoidant individuals use deactivating strategies and exhibit specific characteristics at different processing stages, which are of great significance in social interaction.
Affiliation(s)
- Simeng Gu
- Department of Psychology, School of Medicine, Jiangsu University, Zhenjiang, Jiangsu, China
- Yao Jiang
- Institute of Brain and Psychological Science, Sichuan Normal University, Chengdu, Sichuan, China
- Mei Liu
- Institute of Brain and Psychological Science, Sichuan Normal University, Chengdu, Sichuan, China
- Yumeng Li
- Department of Psychology, School of Medicine, Jiangsu University, Zhenjiang, Jiangsu, China
- Yuan Liang
- Institute of Brain and Psychological Science, Sichuan Normal University, Chengdu, Sichuan, China
- Rou Feng
- Institute of Brain and Psychological Science, Sichuan Normal University, Chengdu, Sichuan, China
- Minghong Xu
- Department of Neurology, Lianyungang Hospital of Chinese Medicine, Affiliated Hospital of Nanjing University of Chinese Medicine, Nanjing, China
- Fushun Wang
- Institute of Brain and Psychological Science, Sichuan Normal University, Chengdu, Sichuan, China
- Jason H. Huang
- Department of Neurosurgery, Baylor Scott & White Health Center, Temple, TX, United States
- Department of Surgery, Texas A&M University, Temple, TX, United States
34. Horigome T, Yoshida S, Tanikawa T, Mimura M, Kishimoto T. Modification of the therapist's facial expressions using virtual reality technology during the treatment of social anxiety disorder: a case series. Front Psychol 2023;14:1030050. PMID: 37255521; PMCID: PMC10225735; DOI: 10.3389/fpsyg.2023.1030050.
Abstract
Exposure therapy is a mainstay of treatment for social anxiety disorder (SAD). However, effort and time are required to recreate interpersonal situations that produce moderate anxiety. Virtual reality exposure therapy, by contrast, can easily control anxiety-inducing conditions and allow for graduated exposure. However, artificial intelligence and animations that speak as naturally as actual humans are not yet practical, adding to the limitations of these treatments. The authors propose the use of a virtual reality technology that can transform facial expressions into smiling or sad faces in real time and display them on a monitor, potentially solving the above-mentioned problems associated with virtual reality animations. This feasibility study was conducted to determine whether this system can be safely applied to the treatment of SAD patients. A total of four SAD patients received 16 exposure therapy sessions led by an experienced therapist over a monitor; throughout the sessions, the facial expressions of the therapist were modified using software to display expressions ranging from smiling to sad on the monitor viewed by the patient. Client satisfaction, treatment alliance, and symptoms were then assessed. Although one patient dropped out of the study, treatment satisfaction and treatment alliance were scored high in all cases. In two of the four cases, the improvement in symptoms was sustained over time. Exposure therapy in which the interviewer's facial expressions are modified to induce appropriate levels of anxiety can be safely used for the treatment of SAD patients and may be effective for some of them.
Affiliation(s)
- Toshiro Horigome: Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan; Department of Psychiatry, Shonan Keiiku Hospital, Kanagawa, Japan
- Shigeo Yoshida: Research Center for Advanced Science and Technology, The University of Tokyo, Tokyo, Japan; OMRON SINIC X Corporation, Tokyo, Japan
- Tomohiro Tanikawa: Next Generation Artificial Intelligence Research Center, The University of Tokyo, Tokyo, Japan
- Masaru Mimura: Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan
- Taishiro Kishimoto: Hills Joint Research Laboratory for Future Preventive Medicine and Wellness, Keio University School of Medicine, Tokyo, Japan; Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, New York, NY, United States
35
Dureux A, Zanini A, Everling S. Face-Selective Patches in Marmosets Are Involved in Dynamic and Static Facial Expression Processing. J Neurosci 2023; 43:3477-3494. [PMID: 37001990 PMCID: PMC10184744 DOI: 10.1523/jneurosci.1484-22.2023]
Abstract
The correct identification of facial expressions is critical for understanding the intentions of others during social communication in the daily life of all primates. Here we used ultra-high-field fMRI at 9.4 T to investigate the neural network activated by facial expressions in awake New World common marmosets of both sexes, and to determine the effect of facial motion on this network. We further explored how the face-patch network is involved in the processing of facial expressions. Our results show that dynamic and static facial expressions activate face patches in temporal and frontal areas (O, PV, PD, MD, AD, and PL) as well as in the amygdala, with stronger responses for negative faces that were also associated with increased respiration rates in the monkeys. Processing of dynamic facial expressions involves an extended network recruiting additional regions not known to be part of the face-processing network, suggesting that facial motion may facilitate the recognition of facial expressions. We report for the first time in New World marmosets that the perception and identification of changeable facial expressions, vital for social communication, recruit face-selective brain patches also involved in face detection and are associated with increased arousal. SIGNIFICANCE STATEMENT: Recent research in humans and nonhuman primates has highlighted the importance of correctly recognizing and processing facial expressions to understand others' emotions in social interactions. The current study focuses on fMRI responses to emotional facial expressions in the common marmoset (Callithrix jacchus), a New World primate species sharing several features of social behavior with humans. Our results reveal that temporal and frontal face patches are involved in both basic face detection and facial expression processing. The specific recruitment of these patches for negative faces, together with the associated increase in arousal, shows that marmosets process the facial expressions of their conspecifics, a capacity vital for social communication.
Affiliation(s)
- Audrey Dureux: Centre for Functional and Metabolic Mapping, Robarts Research Institute, University of Western Ontario, London, Ontario N6A 5K8, Canada
- Alessandro Zanini: Centre for Functional and Metabolic Mapping, Robarts Research Institute, University of Western Ontario, London, Ontario N6A 5K8, Canada
- Stefan Everling: Centre for Functional and Metabolic Mapping, Robarts Research Institute, University of Western Ontario, London, Ontario N6A 5K8, Canada; Department of Physiology and Pharmacology, University of Western Ontario, London, Ontario N6A 5K8, Canada
36
Freiburger E, Sim M, Halberstadt AG, Hugenberg K. A Race-Based Size Bias for Black Adolescent Boys: Size, Innocence, and Threat. Pers Soc Psychol Bull 2023:1461672231167978. [PMID: 37158215 DOI: 10.1177/01461672231167978]
Abstract
We adopted an intersectional stereotyping lens to investigate whether the race-based size bias (the tendency to judge Black men as larger than White men) extends to adolescents. Participants judged Black boys as taller than White boys, despite no real size differences (Studies 1A and 1B), and even when the boys were matched in age (Study 1B). The size bias persisted when participants viewed computer-generated faces that varied only in apparent race (Study 2A) and extended to perceptions of physical strength, with Black boys judged as stronger than White boys (Study 2B). The size bias was associated with threat-related perceptions, including beliefs that Black boys were less innocent than White boys (Study 3). Finally, the size bias was moderated by a valid threat signal (i.e., anger expressions; Studies 4A and 4B). Thus, adult-like threat stereotypes are applied to Black boys, leading them to be erroneously perceived as more physically formidable than White boys.
37
Draganov M, Galiano-Landeira J, Doruk Camsari D, Ramírez JE, Robles M, Chanes L. Noninvasive modulation of predictive coding in humans: causal evidence for frequency-specific temporal dynamics. Cereb Cortex 2023:7156779. [PMID: 37154618 DOI: 10.1093/cercor/bhad127]
Abstract
Increasing evidence indicates that the brain predicts sensory input based on past experiences, importantly constraining how we experience the world. Despite growing interest in this framework, known as predictive coding, most such approaches across psychological domains remain theoretical or provide primarily correlational evidence. Here we explored the neural basis of predictive processing using noninvasive brain stimulation and provide causal evidence of frequency-specific modulations in humans. Participants received 20 Hz (associated with top-down predictions), 50 Hz (associated with bottom-up prediction errors), or sham transcranial alternating current stimulation over the left dorsolateral prefrontal cortex while performing a social perception task in which facial expression predictions were induced and subsequently confirmed or violated. Left prefrontal 20 Hz stimulation reinforced stereotypical predictions, whereas 50 Hz and sham stimulation failed to yield any significant behavioral effects. Moreover, the frequency-specific effect was further supported by electroencephalography data, which showed a boost of brain activity at the stimulated frequency band. These observations provide causal evidence for how predictive processing may be enabled in the human brain, setting up a framework for understanding how it may be disrupted in brain-related conditions and potentially restored through noninvasive methods.
Affiliation(s)
- Metodi Draganov: Department of Clinical and Health Psychology, Universitat Autònoma de Barcelona, Barcelona 08193, Spain
- Jordi Galiano-Landeira: Department of Clinical and Health Psychology, Universitat Autònoma de Barcelona, Barcelona 08193, Spain
- Deniz Doruk Camsari: Department of Psychiatry and Psychology, Mayo Clinic, Rochester, MN 55905, United States
- Jairo-Enrique Ramírez: Department of Clinical and Health Psychology, Universitat Autònoma de Barcelona, Barcelona 08193, Spain
- Marta Robles: Department of Clinical and Health Psychology, Universitat Autònoma de Barcelona, Barcelona 08193, Spain; Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Munich, Munich 80336, Germany
- Lorena Chanes: Department of Clinical and Health Psychology, Universitat Autònoma de Barcelona, Barcelona 08193, Spain; Institut de Neurociències, Universitat Autònoma de Barcelona, Barcelona 08193, Spain; Serra Húnter Programme, Generalitat de Catalunya, Barcelona 08002, Spain
38
Quettier T, Maffei A, Gambarota F, Ferrari PF, Sessa P. Testing EEG functional connectivity between sensorimotor and face processing visual regions in individuals with congenital facial palsy. Front Syst Neurosci 2023; 17:1123221. [PMID: 37215358 PMCID: PMC10196055 DOI: 10.3389/fnsys.2023.1123221]
Abstract
Moebius syndrome (MBS) is characterized by the congenital absence or underdevelopment of cranial nerves VII and VI, leading to facial palsy and impaired lateral eye movements. As a result, individuals with MBS cannot produce facial expressions and have never developed motor programs for them. The latest sensorimotor simulation models propose an iterative communication between somatosensory, motor/premotor cortices, and visual regions that should allow more efficient discrimination among subtle facial expressions. Accordingly, individuals with congenital facial motor disability, specifically MBS, should exhibit atypical communication within this network. Here, we aimed to test this facet of the sensorimotor simulation models by estimating the functional connectivity between visual cortices for face processing and sensorimotor cortices in healthy and MBS individuals. To this end, we studied the strength of beta-band functional connectivity between these two systems using high-density EEG, combined with a change detection task with facial expressions (and a control condition involving non-face stimuli). The results supported our hypothesis: when discriminating subtle facial expressions, participants with congenital facial palsy (compared with healthy controls) showed reduced connectivity strength between sensorimotor regions and visual face-processing regions. This effect was absent for the condition with non-face stimuli. These findings support sensorimotor simulation models and a role for communication between sensorimotor and visual areas during subtle facial expression processing.
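Band-limited EEG functional connectivity of the kind described above is often estimated as magnitude-squared coherence between channel pairs, averaged over trials and over the band of interest. The following is a minimal, hypothetical numpy sketch of that computation for a 13-30 Hz beta band; it is an illustration of the general technique only, not the authors' high-density EEG pipeline, whose channel selection and connectivity estimator are not specified in the abstract.

```python
import numpy as np

def beta_band_coherence(x_trials, y_trials, fs, band=(13.0, 30.0)):
    """Magnitude-squared coherence between two channels, averaged
    over trials and over the given frequency band (beta by default).

    x_trials, y_trials: arrays of shape (n_trials, n_samples).
    """
    X = np.fft.rfft(x_trials, axis=1)
    Y = np.fft.rfft(y_trials, axis=1)
    freqs = np.fft.rfftfreq(x_trials.shape[1], d=1.0 / fs)

    Sxy = np.mean(X * np.conj(Y), axis=0)    # trial-averaged cross-spectrum
    Sxx = np.mean(np.abs(X) ** 2, axis=0)    # trial-averaged auto-spectra
    Syy = np.mean(np.abs(Y) ** 2, axis=0)
    coh = np.abs(Sxy) ** 2 / (Sxx * Syy)     # coherence in [0, 1] per bin

    mask = (freqs >= band[0]) & (freqs <= band[1])
    return coh[mask].mean()

# Toy demo: two channels sharing a phase-locked 20 Hz component
rng = np.random.default_rng(1)
fs, n_trials, n_samples = 250, 40, 500
t = np.arange(n_samples) / fs
shared = np.sin(2 * np.pi * 20 * t)
x = shared + 0.5 * rng.standard_normal((n_trials, n_samples))
y = shared + 0.5 * rng.standard_normal((n_trials, n_samples))
coupled = beta_band_coherence(x, y, fs)
uncoupled = beta_band_coherence(x, rng.standard_normal((n_trials, n_samples)), fs)
```

Because the 20 Hz component is phase-locked across trials in both channels, the coupled pair yields higher band-averaged coherence than the channel-versus-noise pair.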
Affiliation(s)
- Thomas Quettier: Department of Developmental and Social Psychology, University of Padua, Padua, Italy; Padova Neuroscience Center (PNC), University of Padua, Padua, Italy
- Antonio Maffei: Department of Developmental and Social Psychology, University of Padua, Padua, Italy; Padova Neuroscience Center (PNC), University of Padua, Padua, Italy
- Filippo Gambarota: Department of Developmental and Social Psychology, University of Padua, Padua, Italy; Padova Neuroscience Center (PNC), University of Padua, Padua, Italy
- Pier Francesco Ferrari: Institut des Sciences Cognitives Marc Jeannerod, CNRS/Université Claude Bernard Lyon 1, Bron, France
- Paola Sessa: Department of Developmental and Social Psychology, University of Padua, Padua, Italy; Padova Neuroscience Center (PNC), University of Padua, Padua, Italy
39
Venkitakrishnan S, Wu YH. Facial Expressions as an Index of Listening Difficulty and Emotional Response. Semin Hear 2023; 44:166-187. [PMID: 37122878 PMCID: PMC10147507 DOI: 10.1055/s-0043-1766104]
Abstract
Knowledge about the listening difficulty experienced during a task can be used to better understand speech perception processes and to guide amplification outcomes, and individuals can use it to decide whether to participate in communication. Another factor affecting these decisions is an individual's emotional response, which has not previously been measured objectively. In this study, we describe a novel method of measuring the listening difficulty and affect of individuals in adverse listening situations using an automatic facial expression algorithm. The purpose of our study was to determine whether facial expressions of confusion and frustration are sensitive to changes in listening difficulty. We recorded speech recognition scores, facial expressions, subjective listening effort scores, and subjective emotional responses in 33 young participants with normal hearing. We used signal-to-noise ratios of -1, +2, and +5 dB, plus a quiet condition, to vary the difficulty level. We found that facial expressions of confusion and frustration increased with the overall difficulty level, but not with each step between levels. We also found a relationship between facial expressions and both subjective emotion ratings and subjective listening effort. Emotional responses in the form of facial expressions show promise as a measure of affect and listening difficulty. Further research is needed to determine the specific contribution of affect to communication in challenging listening environments.
Affiliation(s)
- Soumya Venkitakrishnan: Department of Communication Sciences and Disorders, California State University, Sacramento, California
- Yu-Hsiang Wu: Department of Communication Sciences and Disorders, University of Iowa, Iowa City, Iowa
40
Zhang X, Li T, Wang C, Tian T, Pang H, Pang J, Su C, Shi X, Li J, Ren L, Wang J, Li L, Ma Y, Li S, Wang L. Recognizing schizophrenia using facial expressions based on convolutional neural network. Brain Behav 2023; 13:e3002. [PMID: 37062964 PMCID: PMC10175991 DOI: 10.1002/brb3.3002]
Abstract
OBJECTIVE: Facial expressions have long served as clinical signs conveying mental conditions in psychiatry. This paper proposes to recognize patients with schizophrenia (SCZ) from their facial images using a deep learning algorithm, and to investigate objective differences in facial expressions between SCZ patients and healthy controls using deep learning and statistical analyses. METHODS: The study consists of two parts. The first part recruited 106 SCZ patients and 101 healthy controls and videotaped their facial expressions under a fixed experimental paradigm. The video data were randomly divided into two sets, one for training a convolutional neural network (CNN) with the classification "healthy control" or "SCZ patient" as output, and the other for evaluating the classification performance of the trained CNN. In the second part, all facial images of the recruited participants were fed separately into another CNN, which had been pretrained on a facial expression database and output the most likely facial expression for each participant. Statistical analyses were performed on the obtained facial expressions to identify objective differences between the two recruited groups. RESULTS: The trained CNN achieved an overall accuracy of 95.18% for classifying "healthy control" versus "SCZ patient." Statistical results on the obtained facial expressions demonstrated that the differences between the two recruited groups were statistically significant (p < .05). CONCLUSIONS: With the help of deep learning, facial expressions hold great promise as clues to SCZ. The proposed approach could potentially be deployed on mobile devices for automatic SCZ recognition in clinical and daily-life contexts.
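The first part's pipeline (a CNN mapping a facial image to a two-class output) can be sketched in miniature. The following hypothetical numpy illustration shows a single convolution-pool-softmax forward pass with random weights; it is not the authors' architecture or trained model, whose details the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D convolution of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def forward(img, kernel, w_fc, b_fc):
    """Conv -> ReLU -> global average pool -> linear head -> softmax."""
    feat = np.maximum(conv2d(img, kernel), 0.0)  # ReLU feature map
    pooled = feat.mean()                         # global average pooling
    logits = pooled * w_fc + b_fc                # 2-class linear head
    e = np.exp(logits - logits.max())            # numerically stable softmax
    return e / e.sum()                           # [P(control), P(SCZ)]

# One toy 16x16 "face image" through the untrained network
img = rng.random((16, 16))
kernel = rng.standard_normal((3, 3))
w_fc = rng.standard_normal(2)
b_fc = np.zeros(2)
probs = forward(img, kernel, w_fc, b_fc)
```

In a real system the kernel and head weights would be learned from the training split by backpropagation, and the held-out split would estimate accuracy, as in the paper's two-set design.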
Affiliation(s)
- Xiaofei Zhang, Conghui Wang, Tian Tian, Haizhu Pang, Jisong Pang, Chen Su, Xiaomei Shi, Jiangong Li, Lina Ren, Jing Wang, Lulu Li, Yanyan Ma, Lili Wang: Department of Psychiatry, Tianjin Anding Hospital, Mental Health Center of Tianjin Medical University, Tianjin, China
- Tongxin Li, Shen Li: Institute of Mental Health, Tianjin Anding Hospital, Mental Health Center of Tianjin Medical University, Tianjin, China
41
Benuzzi F, Ballotta D, Casadio C, Zanelli V, Porro CA, Nichelli PF, Lui F. "When You're Smiling": How Posed Facial Expressions Affect Visual Recognition of Emotions. Brain Sci 2023; 13:668. [PMID: 37190633 DOI: 10.3390/brainsci13040668]
Abstract
Facial imitation occurs automatically during the perception of an emotional facial expression, and preventing it may interfere with the accuracy of emotion recognition. In the present fMRI study, we evaluated the effect of posing a facial expression on the recognition of ambiguous facial expressions. Since facial activity is affected by various factors, such as empathic aptitudes, the Interpersonal Reactivity Index (IRI) questionnaire was administered and its scores were correlated with brain activity. Twenty-six healthy female subjects took part in the experiment. The volunteers were asked to pose a facial expression (happy, disgusted, or neutral), then watch an ambiguous emotional face, and finally indicate whether the emotion perceived was happiness or disgust. Blends of happy and disgusted faces were used as stimuli. Behavioral results showed that posing an emotional face increased the percentage of congruence with the perceived emotion. When participants posed a facial expression and perceived a non-congruent emotion, a neural network comprising the bilateral anterior insula was activated. Brain activity also correlated with empathic traits, particularly empathic concern, fantasy, and personal distress. Our findings support the idea that facial mimicry plays a crucial role in identifying emotions and that empathic abilities can modulate the brain circuits involved in this process.
Affiliation(s)
- Francesca Benuzzi, Daniela Ballotta, Claudia Casadio, Vanessa Zanelli, Carlo Adolfo Porro, Paolo Frigio Nichelli, Fausta Lui: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
42
Meulemeester CD, Lowyck B, Boets B, van der Donck S, Verhaest Y, Luyten P. "Feeling Invisible": Individuals With Borderline Personality Disorder Underestimate the Transparency of Their Emotions. J Pers Disord 2023; 37:213-232. [PMID: 37002937 DOI: 10.1521/pedi.2023.37.2.213]
Abstract
The present study investigated transparency estimation, that is, the ability to estimate how observable one's own emotions are, in patients diagnosed with borderline personality disorder (BPD; n = 35) and healthy controls (HCs; n = 35). Participants watched emotionally evocative video clips and estimated the transparency of their own emotional experience while watching each clip. Facial expression coding software (FaceReader) quantified their objective transparency. BPD patients felt significantly less transparent than HCs, but there were no differences in objective transparency. BPD patients tended to underestimate the transparency of their emotions, whereas HCs overestimated theirs. This suggests that BPD patients expect that others will not know how they feel, irrespective of how observable their emotions actually are. We link these findings to low emotional awareness and a history of emotional invalidation in BPD, and we discuss their impact on BPD patients' social functioning.
Affiliation(s)
- Benedicte Lowyck: University Psychiatric Hospital UPC KU Leuven, Campus Kortenberg, and Department of Neurosciences, Faculty of Medicine, KU Leuven, Belgium
- Bart Boets: Center for Developmental Psychiatry, Department of Neurosciences, KU Leuven, Belgium
- Yannic Verhaest: University Psychiatric Hospital UPC KU Leuven, Campus Kortenberg, and Department of Neurosciences, Faculty of Medicine, KU Leuven, Belgium
- Patrick Luyten: Faculty of Psychology and Educational Sciences, KU Leuven, Belgium; Research Department of Clinical, Educational and Health Psychology, University College London, United Kingdom
43
Chamberland JA, Collin CA. Effects of forward mask duration variability on the temporal dynamics of brief facial expression categorization. Iperception 2023; 14:20416695231162580. [PMID: 36968319 PMCID: PMC10031613 DOI: 10.1177/20416695231162580]
Abstract
The Japanese and Caucasian Brief Affect Recognition Task (JACBART) has been proposed as a standardized method for measuring people's ability to accurately categorize briefly presented images of facial expressions. However, the factors that impact performance in this task are not entirely understood. The current study explored the role of the forward mask's duration (i.e., fixed vs. variable) in brief affect categorization across expressions of the six basic emotions (i.e., anger, disgust, fear, happiness, sadness, and surprise) and three presentation times (i.e., 17, 67, and 500 ms). The current findings provide no evidence that a variable-duration forward mask negatively impacts brief affect categorization. However, efficiency and necessity thresholds were observed to vary across the expressions of emotion. Further exploration of the temporal dynamics of facial affect categorization will therefore require consideration of these differences.
Affiliation(s)
- Justin A. Chamberland: School of Psychology/École de psychologie, University of Ottawa/Université d’Ottawa, Ottawa, Ontario, K1N 6N5, Canada
44
Dildine TC, Amir CM, Parsons J, Atlas LY. How Pain-Related Facial Expressions Are Evaluated in Relation to Gender, Race, and Emotion. Affect Sci 2023. [PMCID: PMC9982800 DOI: 10.1007/s42761-023-00181-6]
Abstract
Inequities in pain assessment are well-documented; however, the psychological mechanisms underlying such biases are poorly understood. We investigated potential perceptual biases in the judgments of faces displaying pain-related movements. Across five online studies, 956 adult participants viewed images of computer-generated faces (“targets”) that varied in features related to race (Black and White) and gender (women and men). Target identity was manipulated across participants, and each target had equivalent facial movements that displayed varying intensities of movement in facial action-units related to pain (Studies 1–4) or pain and emotion (Study 5). On each trial, participants provided categorical judgments as to whether a target was in pain (Studies 1–4) or which expression the target displayed (Study 5) and then rated the perceived intensity of the expression. Meta-analyses of Studies 1–4 revealed that movement intensity was positively associated with both categorizing a trial as painful and perceived pain intensity. Target race and gender did not consistently affect pain-related judgments, contrary to well-documented clinical inequities. In Study 5, in which pain was equally likely relative to other emotions, pain was the least frequently selected emotion (5%). Our results suggest that perceivers can utilize facial movements to evaluate pain in other individuals, but perceiving pain may depend on contextual factors. Furthermore, assessments of computer-generated, pain-related facial movements online do not replicate sociocultural biases observed in the clinic. These findings provide a foundation for future studies comparing CGI and real images of pain and emphasize the need for further work on the relationship between pain and emotion.
Affiliation(s)
- Troy C. Dildine: National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA; Department of Clinical Neuroscience, Karolinska Institute, 171 77 Solna, Sweden
- Carolyn M. Amir: National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA
- Julie Parsons: National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA
- Lauren Y. Atlas: National Center for Complementary and Integrative Health, National Institutes of Health, Bethesda, MD 20892, USA; National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA; National Institute on Drug Abuse, National Institutes of Health, Baltimore, MD 21224, USA
45
Min Y, Kim SH. How Do Looming and Receding Emotional Faces Modulate Duration Perception? Percept Mot Skills 2023; 130:54-79. [PMID: 36355475 DOI: 10.1177/00315125221138394]
Abstract
The direction of visual motion has been shown to affect the perception of interval duration; objects moving towards an observer (i.e., looming) are perceived to last longer than objects moving away (i.e., receding), and this has been explained in terms of arousal- or attention-based modulation. To dissociate the two competing accounts, we investigated how the influence of motion direction on duration perception is modulated by the emotional content of stimuli. Participants were given the temporal bisection task with images of emotional faces (angry, happy, and neutral) presented in a static (Experiment 1) or dynamic (Experiment 2) display. In Experiment 1, we found no influence of facial emotion on perceived duration. In Experiment 2, however, looming (i.e., expanding) stimuli were perceived as lasting longer than receding (contracting) ones. More importantly, we found an interaction between participant-rated arousal to faces and motion direction: The looming/receding asymmetry was pronounced when the stimulus arousal was rated low, but this asymmetry diminished with increasing arousal ratings. Thus, looming/receding temporal asymmetry seems to be reduced when arousing facial expressions enhance attentional engagement.
Affiliation(s)
- Yeji Min: Department of Psychology, Ewha Womans University, Seoul, South Korea
- Sung-Ho Kim: Department of Psychology, Ewha Womans University, Seoul, South Korea
46
Zalmenson T, Azriel O, Bar-Haim Y. Enhanced recognition of disgusted expressions occurs in spite of attentional avoidance at encoding. Front Psychol 2023; 13:1063073. [PMID: 36687960 PMCID: PMC9846063 DOI: 10.3389/fpsyg.2022.1063073]
Abstract
Introduction: Negative emotional content is prioritized in memory, and prioritized attention to negative stimuli has been suggested to mediate this valence-memory association. However, research suggests only a limited role for attention in this memory advantage. We tested the role of attention in memory for disgusted facial expressions, a powerful social-emotional stimulus. Methods: We measured attention using an incidental, free-viewing encoding task and memory using a surprise memory test for the viewed expressions. Results and Discussion: Replicating prior studies, we found increased attentional dwell time for neutral over disgusted expressions at encoding. However, contrary to the attention-memory link hypothesis, disgusted faces were better remembered than neutral faces. Although dwell time was found to partially mediate the association between valence and memory, this effect was much weaker than the opposite direct effect. These findings point to the independence of memory for disgusted faces from attention during encoding.
|
47
|
Hoffmann J, Travers-Podmaniczky G, Pelzl MA, Brück C, Jacob H, Hölz L, Martinelli A, Wildgruber D. Impairments in recognition of emotional facial expressions, affective prosody, and multisensory facilitation of response time in high-functioning autism. Front Psychiatry 2023; 14:1151665. [PMID: 37168084] [PMCID: PMC10165112] [DOI: 10.3389/fpsyt.2023.1151665] [Received: 01/26/2023] [Accepted: 04/03/2023] [Indexed: 05/13/2023]
Abstract
Introduction: Deficits in emotional perception are common in autistic people, but it remains unclear to what extent these perceptual impairments are linked to specific sensory modalities, specific emotions, or multisensory facilitation.
Methods: This study investigated unimodal and bimodal perception of emotional cues, as well as multisensory facilitation, in autistic (n = 18, mean age: 36.72 years, SD: 11.36) compared to non-autistic (n = 18, mean age: 36.41 years, SD: 12.18) people using auditory, visual, and audiovisual stimuli.
Results: High-functioning autistic people showed lower identification accuracy and longer response times. These differences were independent of modality and emotion and showed large effect sizes (Cohen's d = 0.8-1.2). Furthermore, multisensory facilitation of response time was observed in non-autistic people but was absent in autistic people, whereas the two groups did not differ in multisensory facilitation of accuracy.
Discussion: These findings suggest that the auditory and visual components of audiovisual stimuli are processed more separately in autistic individuals (with temporal demands equivalent to processing the respective unimodal cues), yet with a similar relative improvement in accuracy, whereas earlier integrative merging of multimodal stimulus properties seems to occur in non-autistic individuals.
Affiliation(s)
- Jonatan Hoffmann (corresponding author): Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Michael Alexander Pelzl: Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Carolin Brück: Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Heike Jacob: Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Lea Hölz: Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
- Anne Martinelli: School of Psychology, Fresenius University of Applied Sciences, Frankfurt am Main, Germany
- Dirk Wildgruber: Department of General Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany
|
48
|
Santos IM, Bem-Haja P, Silva A, Rosa C, Queiroz DF, Alves MF, Barroso T, Cerri L, Silva CF. The Interplay between Chronotype and Emotion Regulation in the Recognition of Facial Expressions of Emotion. Behav Sci (Basel) 2022; 13. [PMID: 36661610] [DOI: 10.3390/bs13010038] [Received: 11/26/2022] [Revised: 12/12/2022] [Accepted: 12/22/2022] [Indexed: 01/03/2023]
Abstract
Emotion regulation strategies affect the experience and processing of emotions and emotional stimuli. Chronotype has also been shown to influence the processing of emotional stimuli, with late chronotypes showing a bias towards better processing of negative stimuli. Additionally, greater eveningness has been associated with increased difficulties in emotion regulation and preferential use of expressive suppression strategies. Therefore, the present study aimed to understand the interplay between chronotype and emotion regulation in the recognition of dynamic facial expressions of emotion. To that end, 287 participants completed self-report measures and performed an online facial emotion recognition task using short video clips in which a neutral face gradually morphed into a full-emotion expression (one of the six basic emotions). Participants were instructed to press the spacebar to stop each video as soon as they could recognize the emotional expression, and then to identify it from six provided labels/emotions. Greater eveningness was associated with shorter response times (RT) in the identification of sadness, disgust and happiness. Higher scores of expressive suppression were associated with longer RT in identifying sadness, disgust, anger and surprise. Expressive suppression significantly moderated the relationship between chronotype and the recognition of sadness and anger, with chronotype being a significant predictor of emotion recognition times only at higher levels of expressive suppression. No significant effects were observed for cognitive reappraisal. These results are consistent with a negative bias in emotion processing in late chronotypes and with increased difficulty in anger and sadness recognition for morning-types who rely on expressive suppression.
|
49
|
Yu X, Xu B, Zhang E. Others' Facial Expressions Influence Individuals Making Choices and Processing Feedback: The Event-Related Potential and Behavioral Evidence. Int J Environ Res Public Health 2022; 20:568. [PMID: 36612890] [PMCID: PMC9819307] [DOI: 10.3390/ijerph20010568] [Received: 12/06/2022] [Revised: 12/27/2022] [Accepted: 12/27/2022] [Indexed: 06/17/2023]
Abstract
To date, several studies using event-related potentials (ERPs) have examined the effect of facial expressions on trust decisions. However, little is known about the neural mechanisms by which facial expressions modulate choice and the subsequent evaluation of outcomes. In the present study, we used ERPs to investigate for the first time how others' facial expressions influence the neural processing of making choices and evaluating subsequent outcomes. Specifically, participants played a modified version of the Trust Game in which they viewed a photo of the trustee before making choices. Critically, the trustees' faces differed in emotional expression (happy, neutral, or angry) and gender (female or male). Behaviorally, an interaction between expression and gender was observed on investment rates. At the neural level, N2 and P3 amplitudes were modulated by facial expressions during the choice stage, and the feedback-related P3 was likewise modulated by facial expressions. The present study demonstrates the effect of facial expressions on both making choices and subsequent outcome evaluation.
Affiliation(s)
- Xin Yu: Institute of Cognition, Brain & Health, Henan University, Kaifeng 475001, China; Institute of Psychology and Behavior, Henan University, Kaifeng 475001, China
- Bo Xu: Institute of Cognition, Brain & Health, Henan University, Kaifeng 475001, China; Institute of Psychology and Behavior, Henan University, Kaifeng 475001, China
- Entao Zhang: Institute of Cognition, Brain & Health, Henan University, Kaifeng 475001, China; Institute of Psychology and Behavior, Henan University, Kaifeng 475001, China
|
50
|
Koizumi M, Tomoda A, Takiguchi S, Kosaka H. Impact of Stable Environments on Maltreated Children. Pediatr Int 2022; 65:e15443. [PMID: 36528865] [DOI: 10.1111/ped.15443] [Received: 12/29/2021] [Revised: 12/02/2022] [Accepted: 12/11/2022] [Indexed: 12/23/2022]
Abstract
BACKGROUND: The development of the ability to understand others' facial expressions is thought to depend on the environment in which one is reared.
METHODS: This study compared the ability to understand others' facial expressions among 15 children who were in an unstable environment, 11 children who had previously been maltreated but were now in a stable environment, such as a foster family, and 33 children who had never been maltreated. We used the "Reading the Mind in the Eyes Test" (RMET) as the measure.
RESULTS: Children in an unstable environment scored higher on the RMET than children who had never been maltreated.
CONCLUSIONS: The results suggest that hypersensitivity to others' facial expressions may be an adaptive response to a harmful environment, and that it may decline in a stable environment, where such sensitivity is no longer needed.
Affiliation(s)
- Michiko Koizumi: Research Center for Child Mental Development, University of Fukui, Fukui, Japan
- Akemi Tomoda: Research Center for Child Mental Development, University of Fukui, Fukui, Japan; Department of Child and Adolescent Psychological Medicine, University of Fukui Hospital, Fukui, Japan; Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, Matsuoka-Shimoaizuki, Fukui, Japan
- Shinichiro Takiguchi: Department of Child and Adolescent Psychological Medicine, University of Fukui Hospital, Fukui, Japan; Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, Matsuoka-Shimoaizuki, Fukui, Japan
- Hirotaka Kosaka: Research Center for Child Mental Development, University of Fukui, Fukui, Japan; Department of Child and Adolescent Psychological Medicine, University of Fukui Hospital, Fukui, Japan; Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, Matsuoka-Shimoaizuki, Fukui, Japan; Department of Neuropsychiatry, University of Fukui, Fukui, Japan
|