1
Murray T, O'Brien J, Sagiv N, Garrido L. The role of stimulus-based cues and conceptual information in processing facial expressions of emotion. Cortex 2021; 144:109-132. PMID: 34666297; DOI: 10.1016/j.cortex.2021.08.007.
Abstract
Face shape and surface texture are two important cues that aid the perception of facial expressions of emotion, and this perception is also influenced by high-level emotion concepts. Across two studies, we used representational similarity analysis to investigate the relative roles of shape, surface, and conceptual information in the perception, categorisation, and neural representation of facial expressions. In Study 1, 50 participants completed a perceptual task designed to measure the perceptual similarity of expression pairs, and a categorical task designed to measure the confusability between expression pairs when assigning emotion labels to a face. We constructed three models of the similarities between emotions, each based on distinct information: two on stimulus-based cues (face shapes and surface textures) and one on emotion concepts. Using multiple linear regression, we found that behaviour during both tasks was related to the similarity of emotion concepts. The model based on face shapes was more strongly related to behaviour in the perceptual task than in the categorical task, whereas the model based on surface textures was more strongly related to behaviour in the categorical task than in the perceptual task. In Study 2, 30 participants viewed facial expressions while undergoing fMRI, allowing measurement of the brain's representational geometries of facial expressions of emotion in three core face-responsive regions (the Fusiform Face Area, Occipital Face Area, and Superior Temporal Sulcus) and in a region involved in theory of mind (Medial Prefrontal Cortex). Across all four regions, the representational distances between facial expression pairs were related to the similarities of emotion concepts, but not to either of the stimulus-based cues.
Together, these results highlight the important top-down influence of high-level emotion concepts both in behavioural tasks and in the neural representation of facial expressions.
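The multiple-regression step of the representational similarity analysis described above can be sketched as follows. This is a minimal illustration with toy 4x4 dissimilarity matrices (hypothetical values, not the study's data or stimuli): a behavioural RDM is regressed on candidate model RDMs using only the lower-triangle entries.

```python
import numpy as np

def rsa_regression(behav_rdm, model_rdms):
    """Regress a behavioural (or neural) RDM on candidate model RDMs.

    Each RDM is a square dissimilarity matrix; only the lower triangle
    (excluding the diagonal) enters the regression, as is standard in
    representational similarity analysis.
    """
    n = behav_rdm.shape[0]
    idx = np.tril_indices(n, k=-1)
    y = behav_rdm[idx]
    # Design matrix: one column per model RDM, plus an intercept.
    X = np.column_stack([m[idx] for m in model_rdms] + [np.ones(y.size)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[:-1]  # drop the intercept term

# Toy RDMs for illustration only (hypothetical dissimilarities).
shape_rdm = np.array([[0, 1, 2, 3],
                      [1, 0, 1, 2],
                      [2, 1, 0, 1],
                      [3, 2, 1, 0]], float)
concept_rdm = np.array([[0, 3, 1, 2],
                        [3, 0, 2, 1],
                        [1, 2, 0, 3],
                        [2, 1, 3, 0]], float)
# Behaviour constructed here as a known mixture, so the regression
# weights are recoverable by inspection.
behav_rdm = 0.5 * shape_rdm + 2.0 * concept_rdm
betas = rsa_regression(behav_rdm, [shape_rdm, concept_rdm])
```

A larger beta for one model RDM indicates that its similarity structure accounts for more of the behavioural (or neural) representational geometry, which is how the relative contributions of shape, surface, and concept models are compared.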
Affiliation(s)
- Thomas Murray
- Psychology Department, School of Biological and Behavioural Sciences, Queen Mary University London, United Kingdom.
- Justin O'Brien
- Centre for Cognitive Neuroscience, Department of Life Sciences, Brunel University London, United Kingdom
- Noam Sagiv
- Centre for Cognitive Neuroscience, Department of Life Sciences, Brunel University London, United Kingdom
- Lúcia Garrido
- Department of Psychology, City, University of London, United Kingdom
2
Abstract
This paper describes a method for measuring an individual's sensitivity to different facial expressions. It shows that individual participants are more sensitive to happy than to fearful expressions and that the differences are statistically significant under a model-comparison approach. Sensitivity was measured by asking participants to discriminate between an emotional facial expression and a neutral expression of the same face. The expression was diluted to different degrees by combining it in different proportions with the neutral expression using morphing software. Sensitivity was defined as the proportion of neutral expression in a stimulus at which participants discriminated the emotional expression on 75% of presentations. Participants could reliably discriminate happy expressions diluted with a greater proportion of the neutral expression than was required for discrimination of fearful expressions, indicating that individual participants are more sensitive to happy than to fearful expressions. Sensitivity was equivalent across two testing sessions, and the greater sensitivity to happy expressions was maintained at short stimulus durations and with stimuli generated using different morphing software, although it was affected at smaller image sizes for some participants. Applications of the approach to clinical populations, and to understanding the relative contributions of perceptual and affective processing to facial expression recognition, are discussed.
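The 75%-correct threshold described above can be sketched as follows. As a simplifying assumption, linear interpolation over hypothetical proportion-correct data stands in for the full psychometric-function fit such a method would normally use; the data values are illustrative, not the paper's.

```python
import numpy as np

def threshold_75(morph_levels, prop_correct, criterion=0.75):
    """Interpolate the morph level at which discrimination reaches the
    criterion (here 75% correct) -- the sensitivity measure.

    morph_levels: proportion of *emotional* expression in the morph,
    in increasing order; prop_correct: observed proportion correct,
    assumed monotonically increasing for this simple interpolation.
    """
    return float(np.interp(criterion, prop_correct, morph_levels))

# Hypothetical data: accuracy rises with emotional content.
levels = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
happy  = np.array([0.50, 0.60, 0.75, 0.90, 0.98])
fear   = np.array([0.50, 0.55, 0.62, 0.75, 0.92])

# A lower threshold means the expression is still discriminable in a
# more diluted morph, i.e. greater sensitivity.
happy_thr = threshold_75(levels, happy)
fear_thr  = threshold_75(levels, fear)
```

With these illustrative numbers the happy threshold falls below the fearful one, mirroring the paper's finding of greater sensitivity to happy expressions.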
3
Nonlinear transduction of emotional facial expression. Vision Res 2020; 170:1-11. PMID: 32217366; DOI: 10.1016/j.visres.2020.03.004.
Abstract
To create neural representations of external stimuli, the brain performs a number of processing steps that transform its inputs. For fundamental attributes, such as stimulus contrast, this involves one or more nonlinearities that are believed to optimise the neural code to represent features of the natural environment. Here we ask if the same is also true of more complex stimulus dimensions, such as emotional facial expression. We report the results of three experiments combining morphed facial stimuli with electrophysiological and psychophysical methods to measure the function mapping emotional expression intensity to internal response. The results converge on a nonlinearity that accelerates over weak expressions, and then becomes shallower for stronger expressions, similar to the situation for lower level stimulus properties. We further demonstrate that the nonlinearity is not attributable to the morphing procedure used in stimulus generation.
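A nonlinearity that accelerates over weak inputs and compresses over strong ones, as described above, can be illustrated with a Legge-Foley-style transducer. The functional form and parameter values here are assumptions for illustration, not the function fitted in the paper.

```python
import numpy as np

def transducer(x, p=2.4, q=2.0, s=0.3):
    """Legge-Foley-style transducer: response accelerates for weak
    inputs (effective exponent p near zero) and compresses for strong
    inputs (effective exponent p - q once x dominates s)."""
    return x**p / (x**q + s**q)

# Map expression intensity (0..1) through the nonlinearity.
x = np.linspace(0.01, 1.0, 100)
r = transducer(x)

# Successive response increments grow at the weak end (acceleration)
# and shrink at the strong end (compression).
increments = np.diff(r)
```

Plotting `r` against `x` would show the characteristic sigmoid-like shape: steepening over weak expressions, then flattening for stronger ones, analogous to contrast transduction.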
4
Trilla I, Weigand A, Dziobek I. Affective states influence emotion perception: evidence for emotional egocentricity. Psychol Res 2020; 85:1005-1015. PMID: 32206856; PMCID: PMC8049894; DOI: 10.1007/s00426-020-01314-3.
Abstract
Research in social cognition has shown that our own emotional experiences are an important source of information for understanding what other people are feeling. The current study investigated whether individuals project their own affective states when reading others' emotional expressions. We used brief autobiographical recall and audiovisual stimuli to induce happy, neutral, and sad transient states. After each emotion induction, participants made emotion judgments about ambiguous faces displaying a mixture of happiness and sadness. Using an adaptive psychophysical procedure, we estimated the tendency to perceive the faces as happy under each of the induced affective states. Results demonstrate the occurrence of egocentric projections, such that faces were more likely to be judged as happy when participants reported being happy than when they reported being sad. Moreover, the degree of emotional egocentricity was associated with individual differences in perspective-taking, with smaller biases observed in individuals with a higher disposition to take the perspective of others. Our findings extend previous literature on emotional egocentricity by showing that self-projection occurs when we make emotion attributions based on others' emotional expressions, and they support the notion that perspective-taking tendencies play a role in the ability to understand others' affective states.
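The bias estimated by such a procedure is often summarised as a point of subjective equality (PSE): the morph level at which "happy" and "sad" judgments are equally likely. The sketch below uses linear interpolation over hypothetical judgment data rather than the paper's adaptive algorithm; all values are illustrative.

```python
import numpy as np

def pse(morph_levels, prop_happy):
    """Point of subjective equality: the happy/sad morph level at
    which 'happy' responses reach 50%. Assumes prop_happy increases
    monotonically with the morph level."""
    return float(np.interp(0.5, prop_happy, morph_levels))

# Morph axis: 0 = fully sad, 1 = fully happy. Hypothetical proportion
# of 'happy' responses under two induced moods.
levels     = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
happy_mood = np.array([0.10, 0.30, 0.60, 0.85, 0.97])
sad_mood   = np.array([0.05, 0.15, 0.40, 0.70, 0.95])

# Egocentric projection: in a happy mood, less facial happiness is
# needed before a face is judged happy, i.e. a lower PSE.
pse_happy = pse(levels, happy_mood)
pse_sad   = pse(levels, sad_mood)
```

The PSE shift between mood conditions is the egocentricity measure: the further apart the two PSEs, the stronger the projection of one's own state onto the face.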
Affiliation(s)
- Irene Trilla
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany. .,Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany.
- Anne Weigand
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany.,Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany
- Isabel Dziobek
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany.,Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany
5
Leleu A, Dzhelyova M, Rossion B, Brochard R, Durand K, Schaal B, Baudouin JY. Tuning functions for automatic detection of brief changes of facial expression in the human brain. Neuroimage 2018; 179:235-251. DOI: 10.1016/j.neuroimage.2018.06.048.
6
Fantoni C, Rigutti S, Gerbino W. Bodily action penetrates affective perception. PeerJ 2016; 4:e1677. PMID: 26893964; PMCID: PMC4756752; DOI: 10.7717/peerj.1677.
Abstract
Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observers' internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that a facial emotion detection (not identification) task performed after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; that is, we show that observers' internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable reaches than after fifty uncomfortable ones, while the detection threshold for anger was lower after fifty uncomfortable reaches than after fifty comfortable ones. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches affected sensitivity in an approximately symmetric way relative to a baseline inaction condition.
Together, these findings constitute compelling evidence of a genuine top-down effect on perception: specifically, the perception of facial expressions of emotion is penetrable by action-induced mood. Affective priming by action valence is a candidate mechanism for the influence of observers' internal states on properties experienced as phenomenally objective and yet loaded with meaning.
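The distinction drawn above between sensitivity changes and response-bias shifts is standard signal detection theory: d' indexes sensitivity and c indexes bias. A minimal sketch, using hypothetical hit and false-alarm rates (not the study's data):

```python
from statistics import NormalDist

def dprime_criterion(hit_rate, fa_rate):
    """Signal-detection indices: d' (sensitivity) and c (response
    bias/criterion). A change in d' with a stable c is the signature
    of a genuinely perceptual effect rather than a decision shift."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d, c

# Hypothetical detection performance for a congruent expression after
# the two action conditions; bias is deliberately held neutral here.
d_congruent, c_congruent = dprime_criterion(0.85, 0.15)
d_incongruent, c_incongruent = dprime_criterion(0.70, 0.30)
```

Higher d' with unchanged c in the congruent condition is the pattern the abstract describes: a sensitivity improvement without a criterion shift.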
Affiliation(s)
- Carlo Fantoni
- Department of Life Sciences, Psychology Unit "Gaetano Kanizsa," University of Trieste , Trieste , Italy
- Sara Rigutti
- Department of Life Sciences, Psychology Unit "Gaetano Kanizsa," University of Trieste , Trieste , Italy
- Walter Gerbino
- Department of Life Sciences, Psychology Unit "Gaetano Kanizsa," University of Trieste , Trieste , Italy
7
Hass NC, Schneider EJS, Lim SL. Emotional expressions of old faces are perceived as more positive and less negative than young faces in young adults. Front Psychol 2015; 6:1276. PMID: 26379599; PMCID: PMC4549556; DOI: 10.3389/fpsyg.2015.01276.
Abstract
Interpreting the emotions of others through their facial expressions can provide important social information, yet the way in which we judge an emotion is subject to psychosocial factors. We hypothesized that the age of a face would bias how its emotional expression is judged, with older faces generally more likely than younger faces to be viewed as having more positive and less negative expressions. Using two-alternative forced-choice perceptual decision tasks, participants sorted young and old faces whose emotional expressions were gradually morphed, into one of two categories: "neutral vs. happy" and "neutral vs. angry." The results indicated that, in young adults, old faces were more frequently perceived as having a happy expression at lower emotional intensity levels, and less frequently perceived as having an angry expression at higher emotional intensity levels, than young faces. Critically, the perceptual decision threshold at which old faces were judged as happy was lower than that for young faces, and the threshold at which they were judged as angry was higher than that for young faces. These findings suggest that the age of a face influences how its emotional expression is interpreted in social interactions.
Affiliation(s)
- Seung-Lark Lim
- Department of Psychology, University of Missouri-Kansas City, Kansas City, MO, USA
8
Lee JG, Jung SJ, Lee HJ, Seo JH, Choi YJ, Bae HS, Park JT, Kim HJ. Quantitative anatomical analysis of facial expression using a 3D motion capture system: Application to cosmetic surgery and facial recognition technology. Clin Anat 2015; 28:735-44. PMID: 25872024; DOI: 10.1002/ca.22542.
Abstract
The topography of the facial muscles differs between males and females and among individuals of the same gender. To explain the unique expressions that people can make, it is important to define the shapes of the muscles, their associations with the skin, and their relative functions. Three-dimensional (3D) motion-capture analysis, often used to study facial expression, was used in this study to identify characteristic skin movements in males and females as they made six representative basic expressions. The movements of 44 reflective markers (RMs) positioned on anatomical landmarks were measured. Their mean displacement was large in males [ranging from 14.31 mm (fear) to 41.15 mm (anger)] and 3.35-4.76 mm smaller in females [ranging from 9.55 mm (fear) to 37.80 mm (anger)]. The percentage of RMs involved in the ten highest mean maximum displacement values for at least one expression was 47.6% in males and 61.9% in females; that is, RM movements were larger in males than in females but were concentrated in fewer markers. Expanding our understanding of facial expression requires morphological studies of the facial muscles together with studies of their complex functionality. Conducting these alongside quantitative analyses, as in the present study, will yield data valuable for medicine, dentistry, and engineering: for example, for surgical operations on facial regions, software for predicting changes in facial features and expressions after corrective surgery, and the development of face-mimicking robots.
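The displacement measure reported above reduces to averaging Euclidean distances between corresponding 3D marker positions in a neutral frame and an expression frame. A minimal sketch with three hypothetical markers (coordinates are illustrative, not the study's measurements):

```python
import numpy as np

def mean_marker_displacement(neutral, expression):
    """Mean Euclidean displacement of reflective markers (RMs)
    between a neutral frame and an expression frame.

    Both arrays have shape (n_markers, 3): x, y, z coordinates in mm.
    """
    return float(np.linalg.norm(expression - neutral, axis=1).mean())

# Hypothetical coordinates (mm) for three markers at rest...
neutral = np.array([[0.0, 0.0, 0.0],
                    [10.0, 0.0, 0.0],
                    [0.0, 10.0, 0.0]])
# ...and their offsets during an expression: 5, 12, and 10 mm.
expression = neutral + np.array([[3.0, 4.0, 0.0],
                                 [0.0, 0.0, 12.0],
                                 [6.0, 8.0, 0.0]])

disp = mean_marker_displacement(neutral, expression)  # (5 + 12 + 10) / 3
```

The per-marker norms, rather than the mean alone, are what allow the "ten highest mean maximum displacement" ranking reported in the study.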
Affiliation(s)
- Jae-Gi Lee
- Department of Dental Hygiene, School of Health and Medicine, Namseoul University, Cheonan, South Korea
- Su-Jin Jung
- Division in Anatomy and Developmental Biology, Department of Oral Biology, Human Identification Research Center, Yonsei University College of Dentistry, Seoul, South Korea
- Hyung-Jin Lee
- Division in Anatomy and Developmental Biology, Department of Oral Biology, Human Identification Research Center, Yonsei University College of Dentistry, Seoul, South Korea
- Jung-Hyuk Seo
- Department of Advanced General Dentistry, Yonsei University College of Dentistry, Seoul, South Korea
- You-Jin Choi
- Division in Anatomy and Developmental Biology, Department of Oral Biology, Human Identification Research Center, Yonsei University College of Dentistry, Seoul, South Korea
- Hyun-Sook Bae
- Department of Dental Hygiene, School of Health and Medicine, Namseoul University, Cheonan, South Korea
- Jong-Tae Park
- Department of Oral Anatomy, Dankook University College of Dentistry, Cheonan, South Korea
- Hee-Jin Kim
- Division in Anatomy and Developmental Biology, Department of Oral Biology, Human Identification Research Center, Yonsei University College of Dentistry, Seoul, South Korea
9
Abstract
Perception, cognition, and emotion do not operate along segregated pathways; rather, various sources of evidence support their adaptive interaction. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the perception of facial expressions of emotion towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the comfort/discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we had participants perform comfortable/uncomfortable visually-guided reaches and then tested them in a facial emotion identification task. Through the alleged mediation of motor action-induced mood, action comfort enhanced the quality of the participants' global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, possibly an effect of hyper-arousal from an unpleasant bodily experience.
10
Marneweck M, Hammond G. Discriminating facial expressions of emotion and its link with perceiving visual form in Parkinson's disease. J Neurol Sci 2014; 346:149-55. PMID: 25179875; DOI: 10.1016/j.jns.2014.08.014.
Abstract
We investigated the link between the ability to perceive facial expressions of emotion and the ability to perceive visual form in Parkinson's disease (PD). In individuals with PD and in healthy controls, we assessed the ability to discriminate graded intensities of angry facial expressions from neutral expressions and the ability to discriminate radial frequency (RF) patterns with amplitude modulations from a perfect circle. Those with PD were, as a group, impaired relative to controls both in discriminating graded intensities of angry from neutral expressions and in discriminating modulated amplitudes of RF patterns from perfect circles; the two abilities showed moderate to high positive correlations, even after removing the variance shared with disease progression and general cognitive functioning. The results indicate that the impaired ability to perceive visual form likely contributes to the impaired ability to perceive facial expressions of emotion in PD, and that both are related to the progression of the disease.
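An RF pattern of the kind used in the discrimination task is a circle whose radius is sinusoidally modulated around a base radius; with zero amplitude the shape is a perfect circle. A minimal sketch of the standard construction (parameter values are illustrative):

```python
import numpy as np

def rf_pattern(n_points=360, r0=1.0, amplitude=0.05, freq=5, phase=0.0):
    """Radial frequency (RF) pattern: a closed contour whose radius is
    sinusoidally modulated as a function of polar angle.

    amplitude = 0 gives a perfect circle of radius r0; the
    discrimination task asks how small the amplitude can be before
    the shape becomes indistinguishable from a circle.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    r = r0 * (1.0 + amplitude * np.sin(freq * theta + phase))
    return r * np.cos(theta), r * np.sin(theta)

x, y = rf_pattern(amplitude=0.05)
radii = np.hypot(x, y)
# The radius swings between r0 * (1 - amplitude) and r0 * (1 + amplitude).
```

Thresholding the smallest discriminable `amplitude` yields the visual-form sensitivity measure that the study correlated with expression discrimination.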
Affiliation(s)
- Geoff Hammond
- School of Psychology, University of Western Australia, Australia