1. Kim H, Küster D, Girard JM, Krumhuber EG. Human and machine recognition of dynamic and static facial expressions: prototypicality, ambiguity, and complexity. Front Psychol 2023; 14:1221081. PMID: 37794914; PMCID: PMC10546417; DOI: 10.3389/fpsyg.2023.1221081
Abstract
A growing body of research suggests that movement aids facial expression recognition. However, less is known about the conditions under which the dynamic advantage occurs. The aim of this research was to test emotion recognition in static and dynamic facial expressions, thereby exploring the role of three featural parameters (prototypicality, ambiguity, and complexity) in human and machine analysis. In two studies, facial expression videos and corresponding images depicting the peak of the target and non-target emotion were presented to human observers and the machine classifier (FACET). Results revealed higher recognition rates for dynamic stimuli compared to non-target images. Such benefit disappeared in the context of target-emotion images which were similarly well (or even better) recognised than videos, and more prototypical, less ambiguous, and more complex in appearance than non-target images. While prototypicality and ambiguity exerted more predictive power in machine performance, complexity was more indicative of human emotion recognition. Interestingly, recognition performance by the machine was found to be superior to humans for both target and non-target images. Together, the findings point towards a compensatory role of dynamic information, particularly when static-based stimuli lack relevant features of the target emotion. Implications for research using automatic facial expression analysis (AFEA) are discussed.
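The central comparison in this study — recognition accuracy broken down by rater type (human observer versus automated classifier) and stimulus format (dynamic video versus static target or non-target image) — can be illustrated with a brief analysis sketch. The data frame below is purely hypothetical; the column names and values are illustrative assumptions, not output from FACET or from the reported experiments.

```python
import pandas as pd

# Hypothetical trial-level records: one row per stimulus judgement.
# 'rater' is "human" or "machine"; 'stimulus' is "dynamic",
# "static_target", or "static_nontarget"; 'correct' marks whether
# the target emotion was recognised on that trial.
trials = pd.DataFrame({
    "rater":    ["human"] * 3 + ["machine"] * 3,
    "stimulus": ["dynamic", "static_target", "static_nontarget"] * 2,
    "correct":  [1, 1, 0, 1, 1, 1],
})

# Mean recognition rate per rater type and stimulus format,
# mirroring the dynamic-vs-static comparison described above.
accuracy = trials.groupby(["rater", "stimulus"])["correct"].mean().unstack()
print(accuracy)
```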
Affiliation(s)
- Hyunwoo Kim
- Department of Experimental Psychology, University College London, London, United Kingdom
- Dennis Küster
- Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
- Jeffrey M. Girard
- Department of Psychology, University of Kansas, Lawrence, KS, United States
- Eva G. Krumhuber
- Department of Experimental Psychology, University College London, London, United Kingdom
2. Chen Y, Xu Q, Fan C, Wang Y, Jiang Y. Eye gaze direction modulates nonconscious affective contextual effect. Conscious Cogn 2022; 102:103336. DOI: 10.1016/j.concog.2022.103336
3. Krumhuber EG, Hyniewska S, Orlowska A. Contextual effects on smile perception and recognition memory. Curr Psychol 2021. DOI: 10.1007/s12144-021-01910-5
Abstract
Most past research has focused on the role played by social context information in emotion classification, such as whether a display is perceived as belonging to one emotion category or another. The current study aims to investigate whether the effect of context extends to the interpretation of emotion displays, i.e. smiles that could be judged either as posed or spontaneous readouts of underlying positive emotion. A between-subjects design (N = 93) was used to investigate the perception and recall of posed smiles, presented together with a happy or polite social context scenario. Results showed that smiles seen in a happy context were judged as more spontaneous than the same smiles presented in a polite context. Also, smiles were misremembered as having more of the physical attributes (i.e., Duchenne marker) associated with spontaneous enjoyment when they appeared in the happy than polite context condition. Together, these findings indicate that social context information is routinely encoded during emotion perception, thereby shaping the interpretation and recognition memory of facial expressions.
4. The inherently contextualized nature of facial emotion perception. Curr Opin Psychol 2017; 17:47-54. DOI: 10.1016/j.copsyc.2017.06.006
5. Nelson NL, Mondloch CJ. Adults’ and children’s perception of facial expressions is influenced by body postures even for dynamic stimuli. Vis Cogn 2017. DOI: 10.1080/13506285.2017.1301615
Affiliation(s)
- Nicole L. Nelson
- School of Psychology, The University of Queensland, Brisbane, Australia
- Catherine J. Mondloch
- Department of Psychology, Brock University, St. Catharines, Canada
- ARC Centre of Excellence in Cognition and its Disorders, School of Psychology, University of Western Australia, Perth, Australia
6. Moors A. Integration of Two Skeptical Emotion Theories: Dimensional Appraisal Theory and Russell's Psychological Construction Theory. Psychol Inq 2017. DOI: 10.1080/1047840x.2017.1235900
Affiliation(s)
- Agnes Moors
- Research Group of Quantitative Psychology and Individual Differences, KU Leuven, Leuven, Belgium
- Center for Social and Cultural Psychology, KU Leuven, Leuven, Belgium
- Experimental Clinical and Health Psychology, Department of Psychology, Ghent University, Ghent, Belgium
7.
Abstract
As a highly social species, humans frequently exchange social information to support almost all facets of life. One of the richest and most powerful tools in social communication is the face, from which observers can quickly and easily make a number of inferences - about identity, gender, sex, age, race, ethnicity, sexual orientation, physical health, attractiveness, emotional state, personality traits, pain or physical pleasure, deception, and even social status. With the advent of the digital economy, increasing globalization and cultural integration, understanding precisely which face information supports social communication and which produces misunderstanding is central to the evolving needs of modern society (for example, in the design of socially interactive digital avatars and companion robots). Doing so is challenging, however, because the face can be thought of as comprising a high-dimensional, dynamic information space, and this impacts cognitive science and neuroimaging, and their broader applications in the digital economy. New opportunities to address this challenge are arising from the development of new methods and technologies, coupled with the emergence of a modern scientific culture that embraces cross-disciplinary approaches. Here, we briefly review one such approach that combines state-of-the-art computer graphics, psychophysics and vision science, cultural psychology and social cognition, and highlight the main knowledge advances it has generated. In the light of current developments, we provide a vision of the future directions in the field of human facial communication within and across cultures.
Affiliation(s)
- Rachael E Jack
- School of Psychology, University of Glasgow, Scotland G12 8QB, UK; Institute of Neuroscience and Psychology, University of Glasgow, Scotland G12 8QB, UK.
- Philippe G Schyns
- School of Psychology, University of Glasgow, Scotland G12 8QB, UK; Institute of Neuroscience and Psychology, University of Glasgow, Scotland G12 8QB, UK.
8. Perceiving expressions of emotion: What evidence could bear on questions about perceptual experience of mental states? Conscious Cogn 2015; 36:438-51. DOI: 10.1016/j.concog.2015.03.008
9. Calvo MG, Nummenmaa L. Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cogn Emot 2015. PMID: 26212348; DOI: 10.1080/02699931.2015.1049124
Abstract
Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.
Affiliation(s)
- Manuel G Calvo
- Department of Cognitive Psychology, University of La Laguna, Tenerife, Spain
- Lauri Nummenmaa
- School of Science, Aalto University, Espoo, Finland
- Department of Psychology and Turku PET Centre, University of Turku, Turku, Finland
10. Parkinson B, Manstead ASR. Current Emotion Research in Social Psychology: Thinking About Emotions and Other People. Emot Rev 2015. DOI: 10.1177/1754073915590624
Abstract
This article discusses contemporary social psychological approaches to (a) the social relations and appraisals associated with specific emotions; (b) other people’s impact on appraisal processes; (c) effects of emotion on other people; and (d) interpersonal emotion regulation. We argue that single-minded cognitive perspectives restrict our understanding of interpersonal and group-related emotional processes, and that new methodologies addressing real-time interpersonal and group processes present promising opportunities for future progress.
Affiliation(s)
- Brian Parkinson
- Department of Experimental Psychology, University of Oxford, UK
11. Grossman RB. Judgments of social awkwardness from brief exposure to children with and without high-functioning autism. Autism 2015; 19:580-7. PMID: 24923894; PMCID: PMC4485991; DOI: 10.1177/1362361314536937
Abstract
We form first impressions of many traits based on very short interactions. This study examines whether typical adults judge children with high-functioning autism to be more socially awkward than their typically developing peers based on very brief exposure to still images, audio-visual, video-only, or audio-only information. We used video and audio recordings of children with and without high-functioning autism captured during a story-retelling task. Typically developing adults were presented with 1 s and 3 s clips of these children, as well as still images, and asked to judge whether the person in the clip was socially awkward. Our findings show that participants who are naïve to diagnostic differences between the children in the clips judged children with high-functioning autism to be socially awkward at a significantly higher rate than their typically developing peers. These results remain consistent for exposures as short as 1 s to visual and/or auditory information, as well as for still images. These data suggest that typical adults use subtle nonverbal and non-linguistic cues produced by children with high-functioning autism to form rapid judgments of social awkwardness with the potential for significant repercussions in social interactions.
Affiliation(s)
- Ruth B Grossman
- Department of Communication Sciences and Disorders, Emerson College, USA
- Department of Psychiatry, University of Massachusetts Medical School, USA
12. Volkova E, de la Rosa S, Bülthoff HH, Mohler B. The MPI emotional body expressions database for narrative scenarios. PLoS One 2014; 9:e113647. PMID: 25461382; PMCID: PMC4252031; DOI: 10.1371/journal.pone.0113647
Abstract
Emotion expression in human-human interaction takes place via various types of information, including body motion. Research on the perceptual-cognitive mechanisms underlying the processing of natural emotional body language can benefit greatly from datasets of natural emotional body expressions that facilitate stimulus manipulation and analysis. The existing databases have so far focused on few emotion categories which display predominantly prototypical, exaggerated emotion expressions. Moreover, many of these databases consist of video recordings which limit the ability to manipulate and analyse the physical properties of these stimuli. We present a new database consisting of a large set (over 1400) of natural emotional body expressions typical of monologues. To achieve close-to-natural emotional body expressions, amateur actors were narrating coherent stories while their body movements were recorded with motion capture technology. The resulting 3-dimensional motion data recorded at a high frame rate (120 frames per second) provides fine-grained information about body movements and allows the manipulation of movement on a body joint basis. For each expression it gives the positions and orientations in space of 23 body joints for every frame. We report the results of physical motion properties analysis and of an emotion categorisation study. The reactions of observers from the emotion categorisation study are included in the database. Moreover, we recorded the intended emotion expression for each motion sequence from the actor to allow for investigations regarding the link between intended and perceived emotions. The motion sequences along with the accompanying information are made available in a searchable MPI Emotional Body Expression Database. We hope that this database will enable researchers to study expression and perception of naturally occurring emotional body expressions in greater depth.
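As a rough illustration of the data format the abstract describes (per-frame 3D positions and orientations for 23 body joints, recorded at 120 frames per second), a motion sequence might be represented along the following lines. The class and field names here are assumptions made for illustration, not the database's actual schema or API.

```python
from dataclasses import dataclass
import numpy as np

N_JOINTS = 23     # body joints per frame, as described in the abstract
FRAME_RATE = 120  # capture rate in frames per second

@dataclass
class MotionSequence:
    """Hypothetical container for one recorded emotional body expression."""
    intended_emotion: str     # emotion label reported by the actor
    positions: np.ndarray     # shape (n_frames, N_JOINTS, 3): x, y, z per joint
    orientations: np.ndarray  # shape (n_frames, N_JOINTS, 4): quaternion per joint

    @property
    def duration_seconds(self) -> float:
        return self.positions.shape[0] / FRAME_RATE

# Example: a placeholder 2-second sequence filled with zeros.
seq = MotionSequence(
    intended_emotion="amusement",
    positions=np.zeros((2 * FRAME_RATE, N_JOINTS, 3)),
    orientations=np.zeros((2 * FRAME_RATE, N_JOINTS, 4)),
)
print(seq.duration_seconds)  # -> 2.0
```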
Affiliation(s)
- Ekaterina Volkova
- Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Graduate School of Neural & Behavioural Sciences, Tübingen, Germany
- Stephan de la Rosa
- Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Heinrich H. Bülthoff
- Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
- Betty Mohler
- Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
13. Calvo MG, Gutiérrez-García A, Fernández-Martín A, Nummenmaa L. Recognition of Facial Expressions of Emotion is Related to their Frequency in Everyday Life. J Nonverbal Behav 2014. DOI: 10.1007/s10919-014-0191-3
14. Waller BM, Misch A, Whitehouse J, Herrmann E. Children, but not chimpanzees, have facial correlates of determination. Biol Lett 2014; 10:20130974. PMID: 24598107; DOI: 10.1098/rsbl.2013.0974
Abstract
Facial expressions have long been proposed to be important agents in forming and maintaining cooperative interactions in social groups. Human beings are inordinately cooperative when compared with their closest-living relatives, the great apes, and hence one might expect species differences in facial expressivity in contexts in which cooperation could be advantageous. Here, human children and chimpanzees were given an identical task designed to induce an element of frustration (it was impossible to solve). In children, but not chimpanzees, facial expressions associated with effort and determination positively correlated with persistence at the task. By contrast, bodily indicators of stress (self-directed behaviour) negatively correlated with task persistence in chimpanzees. Thus, children exhibited more behaviour as they persisted, and chimpanzees exhibited less. The facial expressions produced by children could, therefore, function to solicit prosocial assistance from others.
Affiliation(s)
- B M Waller
- Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth PO1 2DY, UK
15. Reisenzein R, Studtmann M, Horstmann G. Coherence between Emotion and Facial Expression: Evidence from Laboratory Experiments. Emot Rev 2013. DOI: 10.1177/1754073912457228
Abstract
Evidence on the coherence between emotion and facial expression in adults from laboratory experiments is reviewed. High coherence has been found in several studies between amusement and smiling; low to moderate coherence between other positive emotions and smiling. The available evidence for surprise and disgust suggests that these emotions are accompanied by their “traditional” facial expressions, and even components of these expressions, only in a minority of cases. Evidence concerning sadness, anger, and fear is very limited. For sadness, one study suggests that high emotion–expression coherence may exist in specific situations, whereas for anger and fear, the evidence points to low coherence. Insufficient emotion intensity and inhibition of facial expressions seem unable to account for the observed dissociations between emotion and facial expression.