1. Sex differences in facial expressions of pain: results from a combined sample. Pain 2024:00006396-990000000-00522. [PMID: 38334501] [DOI: 10.1097/j.pain.0000000000003180]
Abstract
Facial expressions of pain play an important role in pain diagnostics and social interactions. Given the prominent impact of sex on various aspects of pain, it is not surprising that sex differences have also been explored regarding facial expressions of pain; findings, however, have been inconclusive. We aimed to further investigate sex differences in facial expressions of pain by using a large, combined sample to maximize statistical power. Data from 7 previous studies of our group were merged, combining in total the data of 392 participants (male: 192, female: 200). All participants received phasic heat pain, with intensities tailored to the individual pain threshold. Pain intensity ratings were assessed, and facial responses were manually analyzed using the Facial Action Coding System (FACS). To compare facial and subjective responses between sexes, linear mixed-effects models were used, with study ID as a random effect. We found significant sex differences in facial responses, with females showing elevated facial responses to pain even though they received lower physical heat intensities (women had lower pain thresholds). In contrast, pain intensity ratings did not differ between sexes. Additionally, facial and subjective responses to pain were significantly associated in both sexes, with females showing slightly stronger associations. Although variation in facial expressions of pain is very large even within each sex, our findings demonstrate that women facially communicate pain more intensively, and with a better match to their subjective experience, than men. This indicates that women might be better at using facial communication of pain in an intensity-discriminative manner.
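As a rough illustration of the analytic approach this abstract names (a linear mixed-effects model comparing facial responses between sexes, with study ID as a random effect), the sketch below fits such a model with statsmodels on synthetic data. The variable names and the data are assumptions for illustration, not the authors' materials.

```python
# Illustrative only: linear mixed-effects model of facial responses on sex,
# with study ID as a random effect. Data are synthetic; names are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "sex": rng.choice(["female", "male"], size=n),
    "study_id": rng.integers(1, 8, size=n),  # 7 merged studies
})
# Simulate slightly elevated facial responses for female participants,
# plus a study-level offset that the random effect should absorb.
df["facial_response"] = (
    2.0
    + 0.5 * (df["sex"] == "female")
    + 0.1 * df["study_id"]
    + rng.normal(0, 1, size=n)
)

model = smf.mixedlm("facial_response ~ sex", df, groups=df["study_id"])
result = model.fit()
sex_effect = result.params["sex[T.male]"]  # negative: males respond less
```

Grouping by study ID lets each merged study keep its own baseline while the sex effect is estimated across all of them, which is the usual motivation for pooling heterogeneous samples this way.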
2. Brain mechanisms associated with facial encoding of affective states. Cognitive, Affective & Behavioral Neuroscience 2023; 23:1281-1290. [PMID: 37349604] [PMCID: PMC10545577] [DOI: 10.3758/s13415-023-01114-3]
Abstract
Affective states are typically accompanied by facial expressions, but these behavioral manifestations are highly variable. Even highly arousing and negatively valenced experiences, such as pain, show great instability in facial affect encoding. The present study investigated which neural mechanisms are associated with variations in facial affect encoding by focusing on facial encoding of sustained pain experiences. Facial expressions, pain ratings, and brain activity (BOLD-fMRI) during tonic heat pain were recorded in 27 healthy participants. We analyzed facial expressions by using the Facial Action Coding System (FACS) and examined brain activations during epochs of painful stimulation that were accompanied by facial expressions of pain. Epochs of facial expressions of pain were coupled with activity increases in motor areas (M1, premotor and SMA) as well as in areas involved in nociceptive processing, including primary and secondary somatosensory cortex, posterior and anterior insula, and the anterior part of the mid-cingulate cortex. In contrast, prefrontal structures (ventrolateral and medial prefrontal) were less activated during incidences of facial expressions, consistent with a role in down-regulating facial displays. These results indicate that incidences of facial encoding of pain reflect activity within nociceptive pathways interacting, or possibly competing, with prefrontal inhibitory systems that gate the level of expressiveness.
3. The effect of survey administration mode on youth mental health measures: Social desirability bias and sensitive questions. Heliyon 2023; 9:e20131. [PMID: 37809858] [PMCID: PMC10559918] [DOI: 10.1016/j.heliyon.2023.e20131]
Abstract
Aim: Research on trends in youth mental health is used to inform government policy and service funding decisions. It often uses interviewer-administered surveys, which may be affected by mode effects related to social desirability bias. This study sought to determine the impact of survey administration mode on mental health measures, comparing mode effects for sensitive mental health measures (psychological distress and wellbeing) and non-sensitive measures (physical activity).
Methods: Data were from two large national community samples of young Australians aged 12-25 years, collected in 2020 (N = 6238) and 2022 (N = 4122), which used both interviewer-administered and self-report modes of data collection.
Results: Participants reported lower psychological distress and higher wellbeing in the interviewer-assisted mode compared with the self-report mode. No mode effects were found for the non-sensitive physical activity measures. No interaction between mode and gender was found, but an age group by mode interaction revealed that those in the 18-21 and 22-25-year age groups were more strongly affected than younger adolescents.
Conclusions: These findings suggest that interview survey formats underestimate mental health issues, particularly for young adults. The results show how even a weak mode effect can have a large impact on mental health prevalence indicators. Researchers and policy makers need to be aware of the impact social desirability bias can have on mental health measures and consider taking steps to mitigate this effect.
4. Facial Regulation During Dyadic Interaction: Interpersonal Effects on Cooperation. Affective Science 2023; 4:506-516. [PMID: 37744968] [PMCID: PMC10514003] [DOI: 10.1007/s42761-023-00208-y]
Abstract
This study investigated interpersonal effects of regulating naturalistic facial signals on cooperation during an iterative Prisoner's Dilemma (IPD) game. Fifty pairs of participants played ten IPD rounds across a video link and then reported on their own and their partner's expressed emotion and facial regulation in a video-cued recall (VCR) procedure. iMotions software allowed us to auto-code actors' and partners' facial activity following the outcome of each round. We used two-level mixed effects logistic regression to assess over-time actor and partner effects of auto-coded facial activity, self-reported facial regulation, and perceptions of the partner's facial regulation on the actor's subsequent cooperation. Actors were significantly less likely to cooperate when their partners had defected on the previous round. None of the lagged scores based on auto-coded facial activity were significant predictors of cooperation. However, VCR variables representing the partner's positive regulation of expressions and the actor's perception of the partner's positive regulation both significantly increased the probability of subsequent actor cooperation after controlling for prior defection. These results offer preliminary evidence about interpersonal effects of facial regulation in interactive contexts and illustrate how dynamic dyadic emotional processes can be systematically investigated in controlled settings. Supplementary Information: The online version contains supplementary material available at 10.1007/s42761-023-00208-y.
5. Optomyography-based sensing of facial expression derived arousal and valence in adults with depression. Front Psychiatry 2023; 14:1232433. [PMID: 37614653] [PMCID: PMC10442807] [DOI: 10.3389/fpsyt.2023.1232433]
Abstract
Background: Continuous assessment of affective behaviors could improve the diagnosis, assessment and monitoring of chronic mental health and neurological conditions such as depression. However, there are no technologies well suited to this, limiting potential clinical applications.
Aim: To test whether we could replicate previous evidence of hypo-reactivity to emotionally salient material using an entirely new sensing technique called optomyography, which is well suited to remote monitoring.
Methods: Participants were 38 adults (aged 18-40 years) who met a research diagnosis of depression and 37 age-matched non-depressed controls. Changes in facial muscle activity over the brow (corrugator supercilii) and cheek (zygomaticus major) were measured while volunteers watched videos varying in emotional salience.
Results: Across all participants, videos rated as subjectively positive were associated with activation of muscles in the cheek relative to videos rated as neutral or negative. Videos rated as subjectively negative were associated with brow activation relative to videos judged as neutral or positive. Self-reported arousal was associated with a step increase in facial muscle activation across the brow and cheek. As for group differences, depressed volunteers showed significantly reduced activation in facial muscles during videos considered subjectively negative or rated as high arousal, compared with controls.
Conclusion: We demonstrate for the first time that it is possible to detect facial expression hypo-reactivity in adults with depression in response to emotional content using glasses-based optomyography sensing. It is hoped these results may encourage the use of optomyography-based sensing to track facial expressions in the real world, outside of a specialized testing environment.
6. Are women truly “more emotional” than men? Sex differences in an indirect model-based measure of emotional feelings. Current Psychology 2023. [DOI: 10.1007/s12144-022-04227-z]
7. Assessing patient attitudes toward genetic testing for hereditary hematologic malignancy. Eur J Haematol 2023; 110:109-116. [PMID: 36209474] [DOI: 10.1111/ejh.13880]
Abstract
Since 2003, more than 15 genes have been identified that predispose to hereditary hematologic malignancy (HHM). Although the yield of germline analysis for leukemia appears similar to that of solid tumors, genetic referrals for adults with leukemia remain infrequent. We assessed leukemia patients' attitudes toward genetic testing and leukemia-related distress through a survey of 1093 patients diagnosed with acute or chronic leukemia, myelodysplastic syndrome, or aplastic anemia. Principal component analysis (PCA) was used to analyze patient attitudes. Distress was measured with the Impact of Event Scale-Revised (IES-R). In total, 19.8% of eligible respondents completed the survey. The majority reported interest in (77%) or choosing to have (78%) genetic testing for HHM. Slightly over half identified worry about the cost of genetic testing (58%) or health insurance coverage (61%) as possible barriers. PCA identified relevant themes: interest in genetic testing, impact on leukemia treatment, discrimination and confidentiality, psychosocial and familial impacts, and cost of testing. The majority reported low distress. Leukemia patients report high interest in genetic testing, few barriers, and relatively low distress.
8. Automated detection of smiles as discrete episodes. J Oral Rehabil 2022; 49:1173-1180. [PMID: 36205621] [PMCID: PMC9828522] [DOI: 10.1111/joor.13378]
Abstract
Background: Patients seeking restorative and orthodontic treatment expect an improvement in their smiles and oral health-related quality of life. Nonetheless, the qualitative and quantitative characteristics of dynamic smiles are yet to be understood.
Objective: To develop, validate, and introduce open-access software for automated analysis of smiles in terms of their frequency, genuineness, duration, and intensity.
Materials and methods: A software script was developed using the Facial Action Coding System (FACS) and artificial intelligence to assess activations of (1) the cheek raiser, a marker of smile genuineness; (2) the lip corner puller, a marker of smile intensity; and (3) the perioral lip muscles, a marker of lips apart. Thirty study participants were asked to view a series of amusing videos. A full-face video was recorded using a webcam. The onset and cessation of smile episodes were identified by two examiners trained in FACS coding. A receiver operating characteristic (ROC) curve was then used to assess detection accuracy and optimise thresholding. The videos of participants were then analysed off-line to automatedly assess the features of smiles.
Results: The area under the ROC curve for smile detection was 0.94, with a sensitivity of 82.9% and a specificity of 89.7%. The software correctly identified 90.0% of smile episodes. While watching the amusing videos, study participants smiled 1.6 (±0.8) times per minute.
Conclusions: Features of smiles such as frequency, duration, genuineness, and intensity can be automatedly assessed with an acceptable level of accuracy. The software can be used to investigate the impact of oral conditions and their rehabilitation on smiles.
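The detection pipeline this abstract describes (threshold a per-frame smile-muscle signal using a ROC-optimised cutoff, then group frames into discrete episodes) can be sketched as below. This is not the authors' software: the synthetic AU12 ("lip corner puller") signal, the Youden's J threshold choice, and the rising-edge episode count are all illustrative assumptions.

```python
# Illustrative sketch: ROC-thresholded smile-episode detection on a
# synthetic per-frame AU12 intensity signal (not the authors' software).
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
frames = 1800  # 60 s at 30 fps
au12 = rng.normal(0.2, 0.1, frames).clip(min=0)  # baseline AU12 intensity
truth = np.zeros(frames, dtype=bool)
for start, stop in [(300, 450), (1000, 1200)]:  # two ground-truth smiles
    truth[start:stop] = True
    au12[start:stop] += 1.0

# Pick the detection threshold from the ROC curve (Youden's J statistic)
fpr, tpr, thresholds = roc_curve(truth, au12)
threshold = thresholds[np.argmax(tpr - fpr)]
detected = au12 >= threshold

# Count discrete episodes as rising edges of the detected signal
rises = int(np.sum(np.diff(np.concatenate(([0], detected.astype(int)))) == 1))
roc_auc = auc(fpr, tpr)
```

On this cleanly separable toy signal the two injected smile episodes are recovered; the study's reported AUC of 0.94 reflects the much noisier real-video setting.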
9. Comunicación no verbal de emociones: variables sociodemográficas y ventaja endogrupal [Nonverbal communication of emotions: sociodemographic variables and in-group advantage]. Revista Iberoamericana de Psicología 2022. [DOI: 10.33881/2027-1786.rip.15209]
Abstract
In the field of nonverbal communication of emotions, there is still debate about the universality of emotional expressions and the effect culture has on them. Two theories currently attempt to explain this phenomenon: neurocultural theory and dialect theory. Both focus on explaining the nonverbal communication of emotions, but the former centers on universal aspects, while the latter centers on culture. The aim of the present study was to investigate the in-group advantage within a culture. A quasi-experiment was designed in which 107 participants were asked to indicate the emotion expressed in 42 stimuli across three different presentation formats. The results indicate that this advantage exists among women and young people. These results illustrate the effects of culture on this phenomenon.
10. Subjective and objective difficulty of emotional facial expression perception from dynamic stimuli. PLoS One 2022; 17:e0269156. [PMID: 35709093] [PMCID: PMC9202844] [DOI: 10.1371/journal.pone.0269156]
Abstract
This study aimed to discover predictors of subjective and objective difficulty in emotion perception from dynamic facial expressions. We used a multidimensional emotion perception framework, in which observers rated the perceived emotion along a number of dimensions instead of choosing from the traditionally used discrete categories of emotions. Data were collected online from 441 participants who rated facial expression stimuli in a novel paradigm designed to separately measure subjective (self-reported) and objective (deviation from the population consensus) difficulty. We targeted person-specific (sex and age of observers and actors) and stimulus-specific (valence and arousal values) predictors of those difficulty scores. Our findings suggest that increasing age of actors makes emotion perception more difficult for observers, and that perception difficulty is underestimated by men in comparison to women, and by younger and older adults in comparison to middle-aged adults. The results also yielded an increase in the objective difficulty measure for female observers and female actors. The stimulus-specific factors, valence and arousal, exhibited quadratic relationships with subjective and objective difficulty: very positive and very negative stimuli were linked to reduced subjective and objective difficulty, whereas stimuli of very low and very high arousal were linked to decreased subjective but increased objective difficulty. Exploratory analyses revealed low relevance of person-specific variables for the prediction of difficulty but highlighted the importance of valence in emotion perception, in line with functional accounts of emotions. Our findings highlight the need to complement traditional emotion recognition paradigms with novel designs, like the one presented here, to grasp the “big picture” of human emotion perception.
11.
Abstract
The transformation that COVID-19 has brought upon the world is unparalleled. The impact on mental health is equally unprecedented and yet unexplored in depth. An online-based survey was administered to 413 community-based adults during COVID-19 confinement to explore psychological impact and identify high risk profiles. Young females concerned about the future, expressing high COVID-related distress, already following psychological therapy and suffering from pre-existing chronic conditions, were those at highest risk of psychological impact due to the COVID-19 situation. Findings could be employed to design tailored psychological interventions in the early stages of the outbreak to avoid the onset/exacerbation of psychopathology.
12. Exploring the moderating role of gender in the relation between emotional expressivity and posttraumatic stress disorder symptom severity among Black trauma-exposed college students at a historically Black university. J Clin Psychol 2022; 78:343-356. [PMID: 34320220] [PMCID: PMC8795200] [DOI: 10.1002/jclp.23226]
Abstract
Objectives: Posttraumatic stress disorder (PTSD) is characterized in part by negative alterations of cognition or mood, including alterations in emotional expressivity, or the extent to which one outwardly displays emotions. Yet, research in this area has relied on predominantly white samples and neglected to consider the potential role of gender, despite there being demonstrated gender differences in both PTSD symptom severity and emotional expressivity, separately. The goal of the current study was to fill a critical gap in the literature by examining the moderating role of gender in the relation between PTSD symptom severity and emotional expressivity in a sample of trauma-exposed Black adults.
Methods: Participants were 207 Black individuals enrolled in a historically Black university in the Southern United States (68.6% female; Mage = 22.32 years).
Results: Findings provided support for the moderating role of gender in the association between PTSD symptom severity and emotional expressivity. Specifically, greater PTSD symptom severity was inversely related to emotional expressivity among trauma-exposed Black males and positively associated with emotional expressivity among trauma-exposed Black females.
Discussion: These results suggest the potential need for gender-specific assessment and treatment techniques for PTSD symptom severity among trauma-exposed Black college students.
13.
We studied the neuroanatomical correlates of involuntary unilateral blinking in humans, using patients with focal epilepsy as a model. Patients with drug-resistant focal epilepsy undergoing presurgical evaluation with stereotactically implanted EEG electrodes (sEEG) were recruited from the local epilepsy monitoring unit. Only patients showing ictal unilateral blinking or unilateral blinking elicited by direct electrical stimulation were included (n = 16). MRI and CT data were used for visualization of the electrode positions. In two patients, probabilistic tractography with seeding from the respective electrodes was additionally performed. Three main findings were made: (1) involuntary unilateral blinking was associated with activation of the anterior temporal region; (2) tractography showed widespread projections to the ipsilateral frontal, pericentral, occipital, limbic and cerebellar regions; and (3) blinking was observed predominantly in female patients with temporal lobe epilepsies. Unilateral blinking was thus associated with ipsilateral activation of the anterior temporal region. We suggest that the identified network is not part of primary blinking control but might have a modulating influence on ipsilateral blinking by integrating contextual information.
14.
Reciprocating smiles is important for maintaining social bonds, as it both signals affiliative intent and elicits affiliative responses. Feelings of social exclusion may increase mimicry as a means to regulate affiliative bonds with others. In this study, we examined whether feelings of exclusion lead people to selectively reciprocate the facial expressions of more affiliative-looking people. Participants first wrote about either a time they were excluded or a neutral event. They then classified 20 smiles: half spontaneous and half posed. Facial electromyography recorded smile muscle activity. Excluded participants distinguished the two smile types better than controls. Excluded participants also showed greater zygomaticus major (mouth smiling) activity toward enjoyment smiles compared to posed smiles; control participants did not. Orbicularis oculi (eye crinkle) activity matched the smile type viewed but did not vary by exclusion condition. Affiliative social regulation is discussed as a possible explanation for these effects.
15. Exploring the Meanings of the “Heartfelt” Gesture: A Nonverbal Signal of Heartfelt Emotion and Empathy. Journal of Nonverbal Behavior 2021. [DOI: 10.1007/s10919-021-00371-5]
16. Correlations Between Facial Expressivity and Apathy in Elderly People With Neurocognitive Disorders: Exploratory Study. JMIR Form Res 2021; 5:e24727. [PMID: 33787499] [PMCID: PMC8047819] [DOI: 10.2196/24727]
Abstract
Background: Neurocognitive disorders are often accompanied by behavioral symptoms such as anxiety, depression, and/or apathy. These symptoms can occur very early in the disease progression and are often difficult to detect and quantify in nonspecialized clinical settings.
Objective: We focus in this study on apathy, one of the most common and debilitating neuropsychiatric symptoms in neurocognitive disorders. Specifically, we investigated whether facial expressivity extracted through computer vision software correlates with the severity of apathy symptoms in elderly subjects with neurocognitive disorders.
Methods: A total of 63 subjects (38 females and 25 males) with neurocognitive disorder participated in the study. Apathy was assessed using the Apathy Inventory (AI), a scale comprising 3 domains of apathy: loss of interest, loss of initiation, and emotional blunting. The higher the scale score, the more severe the apathy symptoms. Participants were asked to recall a positive and a negative event of their life, while their voice and face were recorded using a tablet device. Action units (AUs), which are basic facial movements, were extracted using OpenFace 2.0. A total of 17 AUs (intensity and presence) were extracted for each frame of the video in both positive and negative storytelling. Average intensity and frequency of AU activation were calculated for each participant in each video. Partial correlations (controlling for the level of depression and cognitive impairment) were performed between these indexes and AI subscales.
Results: AU intensity and frequency were negatively correlated with apathy scale scores, in particular with the emotional blunting component. The more severe the apathy symptoms, the less expressivity in specific emotional and nonemotional AUs was displayed by participants while recalling an emotional event. Different AUs showed significant correlations depending on the sex of the participant and the task's valence (positive vs negative story), suggesting the importance of assessing male and female participants independently.
Conclusions: Our study suggests the value of employing computer vision-based facial analysis to quantify facial expressivity and assess the severity of apathy symptoms in subjects with neurocognitive disorders. This may represent a useful tool for preliminary apathy assessment in nonspecialized settings and could be used to complement classical clinical scales. Future studies including larger samples should confirm the clinical relevance of this kind of instrument.
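The two expressivity indexes described in the abstract (average AU intensity and frequency of AU activation per participant) can be sketched from per-frame output in the style of OpenFace 2.0. The tiny hand-made data frame and the `AU12_r`/`AU12_c` column naming are illustrative assumptions, not the authors' data.

```python
# Illustrative sketch: per-participant AU expressivity indexes computed
# from per-frame AU intensity (0-5) and presence (0/1) columns, as
# OpenFace-2.0-style output provides. Data are hand-made toy values.
import pandas as pd

frames = pd.DataFrame({
    "participant": ["p1"] * 4 + ["p2"] * 4,
    "AU12_r": [0.0, 2.0, 3.0, 1.0,  0.0, 0.0, 1.0, 0.0],  # intensity
    "AU12_c": [0,   1,   1,   1,    0,   0,   1,   0],    # presence
})

indexes = frames.groupby("participant").agg(
    mean_intensity=("AU12_r", "mean"),
    activation_frequency=("AU12_c", "mean"),  # share of frames AU is active
)
```

Indexes like these could then be entered into partial correlations with apathy subscale scores, controlling for depression and cognitive impairment as the study describes.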
17. Sex differences in the social brain and in social cognition. J Neurosci Res 2021; 101:730-738. [PMID: 33608982] [DOI: 10.1002/jnr.24787]
Abstract
Many studies have reported sex differences in empathy and social skills. In this review, several lines of empirical evidence about sex differences in the functions and anatomy of the social brain are discussed. The most relevant differences involve face processing, facial expression recognition, response to baby schema, the ability to see faces in things, the processing of social interactions, the response to others' pain, interest in social information, the processing of gestures and actions, and the processing of biological motion and of erotic and affective stimuli. Sex differences in oxytocin-based parental response are also reported. In conclusion, the female and male brains show several neuro-functional differences in various aspects of social cognition, especially in emotional coding, face processing, and response to baby schema. An interpretation of this sexual dimorphism is provided from the perspective of evolutionary psychobiology.
18.
19. Encoding differences in posed negative emotional expressions between prosocials and proselfs. Current Psychology 2021. [DOI: 10.1007/s12144-018-9986-4]
20. A Novel Test of the Duchenne Marker: Smiles After Botulinum Toxin Treatment for Crow's Feet Wrinkles. Front Psychol 2021; 11:612654. [PMID: 33510690] [PMCID: PMC7835207] [DOI: 10.3389/fpsyg.2020.612654]
Abstract
Smiles that vary in muscular configuration also vary in how they are perceived. Previous research suggests that “Duchenne smiles,” indicated by the combined actions of the orbicularis oculi (cheek raiser) and zygomaticus major (lip corner puller) muscles, signal enjoyment. This research has compared perceptions of Duchenne smiles with non-Duchenne smiles among individuals voluntarily innervating or inhibiting the orbicularis oculi muscle. Here we tested perception of smiles using a novel set of highly controlled stimuli: photographs of patients taken before and after receiving botulinum toxin treatment for crow’s feet lines, which selectively paralyzed the lateral orbicularis oculi muscle and removed visible lateral eye wrinkles. Smiles in which the orbicularis muscle was active (prior to treatment) were rated as more felt, spontaneous, intense, and happier. Post-treatment, patients looked younger, although not more attractive. We discuss the potential implications of these findings within the context of emotion science and clinical research on botulinum toxin.
21. The Study of Facial Muscle Movements for Non-Invasive Thermal Discomfort Detection via Bio-Sensing Technology. Part I: Development of the Experimental Design and Description of the Collected Data. Applied Sciences (Basel) 2020. [DOI: 10.3390/app10207315]
Abstract
In a time of climate change, as heat waves become a more regular occurrence, indoor thermal comfort is an important factor in day-to-day life. Under such circumstances, many researchers have focused their studies on finding an effective solution that will not only enable thermal comfort, but also increase satisfaction within the indoor environment and, as a result, productivity. The fast development of the biometric field encouraged this study, which investigates how bio-markers, in combination with artificial intelligence algorithms, can be collected within an experimental setting to create a new approach for non-invasive thermal discomfort detection. The developed experimental design provides synergy between automatic facial coding, pulse, and galvanic skin response measurements via iMotions software in a controlled environment. The iMotions software has built-in machine vision algorithms and, together with Shimmer sensors and post-processing through Affectiva AFFDEX, is able to collect facial action data by detecting facial muscle movements as well as various bio-markers. The Zero Emission Building (ZEB) Test Cell laboratory was used as the controlled environment and was transformed to imitate an office space for the data collection campaign at NTNU in Trondheim. The experimental design provides an opportunity to create an extensive database of bio-markers linked to the subcortical level of the brain, indoor parameters, and direct feedback on the comfort level of occupants within an office-like environment. In total, 111 data collection sessions were registered with iMotions. The discomfort button was pressed 240 times, and 1080 planned indoor comfort evaluation surveys were administered during the experiment. The discomfort button was pressed 49 times to indicate discomfort due to low temperature and 52 times due to high temperature. The collected data revealed large variation in discomfort temperature values across experiment participants with respect to the performed temperature ramps. While it is common to use the same predefined temperature range for facility management, it became clear that the task is more complex and should not be approached at a human computational level. Implementation of AI can potentially provide higher accuracy in thermal discomfort detection and enable a unique personal user experience at the workplace.
|
22
|
Gender‐related differences in the facial aging of Chinese subjects and their relations with perceived ages. Skin Res Technol 2020; 26:905-913. [DOI: 10.1111/srt.12893] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2020] [Accepted: 05/30/2020] [Indexed: 01/26/2023]
|
23
|
Audiovisual emotion perception develops differently from audiovisual phoneme perception during childhood. PLoS One 2020; 15:e0234553. [PMID: 32555620 PMCID: PMC7302908 DOI: 10.1371/journal.pone.0234553] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/27/2019] [Accepted: 05/28/2020] [Indexed: 11/18/2022] Open
Abstract
This study investigated the developmental paths in the use of audiovisual information for the perception of emotions and phonemes by Japanese speakers. Children aged 5 to 12 years and adults aged 30 to 39 years engaged in an emotion perception task in which speakers expressed their emotions through their faces and voices, and a phoneme perception task using phonemic information in speakers' lip movements and speech sounds. Results indicated that Japanese children's use of auditory information for judging emotions increased with age, whereas their use of audiovisual information for judging phonemes remained constant across age. Moreover, adults were affected by visual information more than children. We discuss whether these differences in developmental patterns are due to differential integration processes for information indicative of emotions and phonemes, as well as possible cultural/linguistic reasons for these differences.
|
24
|
Gender-related differences in the facial aging of Caucasian French subjects and their relations with perceived ages and tiredness. J Cosmet Dermatol 2020; 20:227-236. [PMID: 32315489 DOI: 10.1111/jocd.13446] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2020] [Revised: 04/08/2020] [Accepted: 04/16/2020] [Indexed: 02/01/2023]
Abstract
OBJECTIVES (a) To assess and compare the changes in five facial signs with age between genders of Caucasian subjects and (b) to evaluate their links with perceived ages and tiredness. MATERIAL AND METHODS Once zoomed from standardized digital photographs, five facial signs of 518 Caucasian French subjects of both genders and different ages (18-69 years) were graded by 15 experts, using a referential Skin Aging Atlas. A large naïve panel of 1000 French subjects (500 men and 500 women) was asked to attribute a perceived age and a degree of tiredness to 200 subjects (among the 518). RESULTS The severity of the facial signs increases with time at a linear-like rate. The changes in marionette lines significantly differ between genders, being much more pronounced in women, whereas the nasolabial fold was found more pronounced in men at older ages (>50 years). Before their 50s, men present slightly more severe forehead wrinkles, whereas in their 50s women present more severe ptosis. Crow's feet wrinkles did not show significant changes. Perceived ages were found significantly correlated with the severities of the facial signs, and the perception of tiredness was associated with perceived ages in men, but not in women older than 40 years. The gender-related perceptions from the naïve panel showed low discrepancy in both perceived ages and tiredness. Interestingly, as for changes in facial signs, the upper-half face seems more affected in men and the lower-half face in women; after 40 years, the naïve panel seems to focus on the same areas to predict a perceived age. CONCLUSION Compared with the previous Chinese study, the present work reveals some slight ethnicity-related differences, indicating that the facial signs of the lower face play a major role in the assessment of perceived age for both genders across ethnicities.
|
25
|
Explicit and Implicit Responses of Seeing Own vs. Others' Emotions: An Electromyographic Study on the Neurophysiological and Cognitive Basis of the Self-Mirroring Technique. Front Psychol 2020; 11:433. [PMID: 32296363 PMCID: PMC7136519 DOI: 10.3389/fpsyg.2020.00433] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/11/2019] [Accepted: 02/25/2020] [Indexed: 01/29/2023] Open
Abstract
Facial mimicry is described by embodied cognition theories as a human mirror system-based neural mechanism underpinning emotion recognition. This could play a critical role in the Self-Mirroring Technique (SMT), a method used in psychotherapy to foster patients' emotion recognition by showing them a video of their own face recorded during an emotionally salient moment. However, dissociation in facial mimicry during the perception of own and others' emotions has not been investigated so far. In the present study, we measured electromyographic (EMG) activity from three facial muscles, namely, the zygomaticus major (ZM), the corrugator supercilii (CS), and the levator labii superioris (LLS), while participants were presented with video clips depicting their own face or other unknown faces expressing anger, happiness, sadness, disgust, fear, or a neutral state. The results showed that processing self vs. other expressions differently modulated emotion perception at the explicit and implicit muscular levels. Participants were significantly less accurate in recognizing their own vs. others' neutral expressions and rated fearful, disgusted, and neutral expressions as more arousing in the self condition than in the other condition. Facial EMG also evidenced different activations for self vs. other facial expressions. Increased activation of the ZM muscle was found in the self condition compared to the other condition for anger and disgust. Activation of the CS muscle was lower for self than for others' expressions while processing a happy, sad, fearful, or neutral expression. Finally, the LLS muscle showed increased activation in the self condition compared to the other condition for sad and fearful expressions but increased activation in the other condition compared to the self condition for happy and neutral expressions. Taken together, our complex pattern of results suggests a dissociation at both the explicit and implicit levels in the emotional processing of self vs. other emotions that, in the light of the Emotion in Context view, suggests that SMT effectiveness is primarily due to a contextual–interpretative process that occurs before facial mimicry takes place.
|
26
|
Are Traditional, Negative Gender Attitudes Associated with Violent Attitudes toward Women? Insights from a New, Culturally Adapted Measure in India. SEX ROLES 2019. [DOI: 10.1007/s11199-019-01102-3] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/25/2022]
|
27
|
Distinct Habits Of Sun Exposures Lead To Different Impacts On Some Facial Signs Of Chinese Men Of Different Ages. Clin Cosmet Investig Dermatol 2019; 12:833-841. [PMID: 31814750 PMCID: PMC6863120 DOI: 10.2147/ccid.s226331] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/07/2019] [Accepted: 10/19/2019] [Indexed: 11/23/2022]
Abstract
Objective To clinically evaluate the impacts of sun exposure on some facial signs of differently aged Chinese men with distinct behaviors vis-à-vis sun exposure. Methods Two comparable cohorts of Chinese men (aged 18-75 years), living in two cities (Shanghai, Hong Kong), were created according to their usual behavior towards sun exposure and their variable use of a photo-protective product, i.e. non-sun-phobic (N = 127) and sun-phobic (N = 134). Standard photographs (full-face and 45° lateral) made it possible to focus on 13 facial signs that were further graded by 15 experts and dermatologists, using a referential Skin Aging Atlas. Absolute differences in the scores of each sign (non-sun-phobic minus sun-phobic) were used, by age-class, to better ascertain the impact of sun exposure and of a photo-protective product, when used. Results Most facial signs, particularly wrinkles and skin texture, differentiated the two cohorts. Some others showed erratic changes with age, albeit more pronounced at older ages. In contrast with previous results obtained in Chinese women, the changes observed in men were not only of a lessened severity but were undetected at early ages (<30 years). Overall, these different behaviors with regard to sun exposure led to significant differences in the facial signs of Chinese men. The latter can be illustrated by two virtual morphings that combine the impacts of both intrinsic and extrinsic aging processes. Conclusion The present work illustrates, for the first time, some specificities of the impacts of sun exposure on the facial skin of Chinese men, more pronounced at older ages, in contrast to those observed in Chinese women, which occur at younger ages.
|
28
|
Emotional expressions associated with therapeutic inertia in multiple sclerosis care. Mult Scler Relat Disord 2019; 34:17-28. [PMID: 31226545 DOI: 10.1016/j.msard.2019.05.029] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2019] [Revised: 05/28/2019] [Accepted: 05/31/2019] [Indexed: 11/24/2022]
Abstract
BACKGROUND Emotions play a critical role in our daily decisions. However, it remains unclear how and what sort of emotional expressions are associated with therapeutic decisions in multiple sclerosis (MS) care. Our goal was to evaluate the relationship between emotions and affective states (as captured by facial muscle activity and emotional expressions) and therapeutic inertia (TI) amongst neurologists caring for MS patients when making therapeutic decisions. METHODS 38 neurologists with expertise in MS were invited to participate in a face-to-face study across Canada. Participants answered questions regarding their clinical practice, aversion to ambiguity, and the management of 10 simulated case-scenarios. TI was defined as lack of treatment initiation or escalation when there was clear evidence of clinical and radiological disease activity. We recorded facial muscle activations and their associated emotional expressions during the study, while participants made therapeutic choices. We used a validated machine learning algorithm of the AFFDEX software to code for facial muscle activations and a predefined mapping to emotional expressions (disgust, fear, surprise, etc.). Mixed effects models and mediation analyses were used to evaluate the relationship between ambiguity aversion, facial muscle activity/emotional expressions, and TI, measured as a binary variable and as a continuous score. RESULTS 34 (89.4%) neurologists completed the study. The mean age [standard deviation (SD)] was 44.6 (11.5) years; 38.3% were female and 58.8% self-identified as MS specialists. Overall, 17 (50%) participants showed TI in at least one case-scenario, and the mean (SD) TI score was 0.74 (0.90). Nineteen (55.9%) participants had aversion to ambiguity in the financial domain. The multivariate analysis adjusted for age, sex, and MS expertise showed that aversion to ambiguity in the financial domain (OR 1.56, 95% CI 1.32-1.86) was associated with TI.
Most common muscle activations included mouth open (23.4%), brow furrow (20.9%), brow raise (17.6%), and eye widening (13.1%). Most common emotional expressions included fear (5.1%), disgust (3.2%), sadness (2.9%), and surprise (2.8%). After adjustment for age, sex, and physicians' expertise, the multivariate analysis revealed that brow furrow (OR 1.04; 95% CI 1.003-1.09) and lip suck (OR 1.06; 95% CI 1.01-1.11) were associated with an increase in TI prevalence, whereas upper lip raise (OR 0.30; 95% CI 0.15-0.59) and chin raise (OR 0.90; 95% CI 0.83-0.98) were associated with a lower likelihood of TI. Disgust and surprise were associated with a lower TI score (disgust: p < 0.001; surprise: p = 0.008) and a lower prevalence of TI (OR disgust: 0.14, 95% CI 0.03-0.65; OR surprise: 0.66, 95% CI 0.47-0.92) after adjusting for covariates. The mediation analysis showed that brow furrow was a partial mediator explaining 21.2% (95% CI 14.9%-38.9%) of the association between aversion to ambiguity and the TI score, followed by nose wrinkle at 12.8% (95% CI 8.9%-23.4%). Similarly, disgust was the single emotional expression (partial mediator) that attenuated (-13.2%, 95% CI -9.2% to -24.3%) the effect of aversion to ambiguity on TI. CONCLUSIONS TI was observed in half of participants in at least one case-scenario. Our data suggest that facial metrics (e.g. brow furrow, nose wrinkle) and emotional expressions (e.g. disgust) are associated with physicians' choices and partially mediate the effect of aversion to ambiguity on TI.
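The odds ratios and confidence intervals quoted above can be made concrete with a toy computation (hypothetical counts, for illustration only; the study itself used mixed-effects models, not a raw 2x2 table): an odds ratio with a 95% Wald confidence interval.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and 95% Wald confidence interval for a 2x2 table:
    a = exposed cases,     b = exposed non-cases,
    c = non-exposed cases, d = non-exposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts, not the study's data.
or_, lo, hi = odds_ratio_ci(12, 8, 5, 9)
print(f"OR {or_:.2f}; 95% CI {lo:.2f}-{hi:.2f}")
```

An OR above 1 with a CI excluding 1 (as for brow furrow) indicates a positive association; an OR below 1 (as for disgust) indicates a lower likelihood of TI.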
|
29
|
Abstract
Advances in computer vision and machine learning algorithms have enabled researchers to extract facial expression data from face video recordings with greater ease and speed than standard manual coding methods, which has led to a dramatic increase in the pace of facial expression research. However, there are many limitations in recording facial expressions in laboratory settings. Conventional video recording setups using webcams, tripod-mounted cameras, or pan-tilt-zoom cameras require making compromises between cost, reliability, and flexibility. As an alternative, we propose the use of a mobile head-mounted camera that can be easily constructed from our open-source instructions and blueprints at a fraction of the cost of conventional setups. The head-mounted camera framework is supported by the open source Python toolbox FaceSync, which provides an automated method for synchronizing videos. We provide four proof-of-concept studies demonstrating the benefits of this recording system in reliably measuring and analyzing facial expressions in diverse experimental setups, including group interaction experiments.
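The FaceSync toolbox mentioned above automates video synchronization; one common way to synchronize such recordings is to cross-correlate their signals (e.g. audio envelopes) and take the lag at the correlation peak. A minimal numpy sketch of that general technique (an illustration under that assumption, not the FaceSync implementation):

```python
import numpy as np

def find_offset(ref: np.ndarray, other: np.ndarray) -> int:
    """Estimate how many samples `other` lags behind `ref` by
    locating the peak of the full cross-correlation."""
    corr = np.correlate(other, ref, mode="full")
    # Shift the argmax so that 0 means "already aligned".
    return int(np.argmax(corr)) - (len(ref) - 1)

# Toy signals: `other` is `ref` delayed by 5 samples.
rng = np.random.default_rng(0)
ref = rng.standard_normal(200)
other = np.concatenate([np.zeros(5), ref])[:200]
print(find_offset(ref, other))  # → 5
```

Once the offset is known, the later-starting recording can be trimmed by that many samples so that all cameras share a common timeline.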
|
30
|
The minute-scale dynamics of online emotions reveal the effects of affect labeling. Nat Hum Behav 2018; 3:92-100. [PMID: 30932057 DOI: 10.1038/s41562-018-0490-5] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/26/2018] [Accepted: 11/06/2018] [Indexed: 11/09/2022]
Abstract
Putting one's feelings into words (also called affect labeling) can attenuate positive and negative emotions. Here, we track the evolution of specific emotions for 74,487 Twitter users by analysing the emotional content of their tweets before and after they explicitly report experiencing a positive or negative emotion. Our results describe the evolution of emotions and their expression at a temporal resolution of one minute. The expression of positive emotions is preceded by a short, steep increase in positive valence and followed by a short decay to normal levels. Negative emotions, however, build up more slowly and are followed by a sharp reversal to previous levels, consistent with previous studies demonstrating the attenuating effects of affect labeling. We estimate that positive and negative emotions last approximately 1.25 and 1.5 h, respectively, from onset to evanescence. A separate analysis for male and female individuals suggests the potential for gender-specific differences in emotional dynamics.
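Minute-scale profiles of the kind described above rest on event-locked averaging: binning a signal by its time relative to each labeling event and averaging within bins. A minimal sketch on hypothetical toy data (not the study's Twitter corpus or method):

```python
from collections import defaultdict

def epoch_average(observations, event_times, window=3):
    """Event-locked average: mean signal value in minute bins
    relative to each event time (lags -window..+window)."""
    bins = defaultdict(list)
    for t0 in event_times:
        for t, v in observations:
            lag = t - t0
            if -window <= lag <= window:
                bins[lag].append(v)
    return {lag: sum(vs) / len(vs) for lag, vs in sorted(bins.items())}

# Toy series: valence rises shortly before the labeling event at t=10,
# then returns to baseline, echoing the build-up/decay pattern above.
obs = [(t, 1.0 if 8 <= t <= 10 else 0.0) for t in range(21)]
profile = epoch_average(obs, [10])
print(profile)  # → {-3: 0.0, -2: 1.0, -1: 1.0, 0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0}
```

With many events, the per-lag means trace the average rise before and decay after the moment of labeling.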
|
31
|
Sex Differences in Affective Facial Reactions Are Present in Childhood. Front Integr Neurosci 2018; 12:19. [PMID: 29875642 PMCID: PMC5974214 DOI: 10.3389/fnint.2018.00019] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/28/2017] [Accepted: 05/04/2018] [Indexed: 11/13/2022] Open
Abstract
Adults exposed to affective facial displays produce specific rapid facial reactions (RFRs), which are of lower intensity in males than in females. We investigated this sex difference in a population of 60 primary school children (30 F; 30 M), aged 7–10 years. We recorded the surface electromyographic (EMG) signal from the corrugator supercilii and the zygomatici muscles while children watched affective facial displays. Results showed the expected smiling RFR to smiling faces and the expected frowning RFR to sad faces. A systematic difference between male and female participants was observed, with boys showing smaller EMG responses than age-matched girls. We demonstrate that sex differences in the somatic component of affective motor patterns are also present in childhood.
|
32
|
Abstract
The conflicting findings from the few studies conducted on gender differences in the recognition of vocal expressions of emotion have left the exact nature of these differences unclear. Several investigators have argued that a comprehensive understanding of gender differences in vocal emotion recognition can only be achieved by replicating these studies while accounting for influential factors such as stimulus type, gender-balanced samples, and the number of encoders, decoders, and emotional categories. This study aimed to account for these factors by investigating whether emotion recognition from vocal expressions differs as a function of both listeners' and speakers' gender. A total of N = 290 participants were randomly and equally allocated to two groups. One group listened to words and pseudo-words, while the other group listened to sentences and affect bursts. Participants were asked to categorize the stimuli with respect to the expressed emotions in a fixed-choice response format. Overall, females were more accurate than males when decoding vocal emotions; however, when testing for specific emotions, these differences were small in magnitude. Speakers' gender had a significant impact on how listeners judged emotions from the voice. The group listening to words and pseudo-words had higher identification rates for emotions spoken by male than by female actors, whereas in the group listening to sentences and affect bursts the identification rates were higher when emotions were uttered by female than by male actors. The mixed pattern of emotion-specific effects, however, indicates that, in the vocal channel, the reliability of emotion judgments is not systematically influenced by speakers' gender and the related stereotypes of emotional expressivity. Together, these results extend previous findings by showing effects of listeners' and speakers' gender on the recognition of vocal emotions. They stress the importance of distinguishing these factors to explain recognition ability in the processing of emotional prosody.
|