1. Zhang M, Zhou Y, Xu X, Ren Z, Zhang Y, Liu S, Luo W. Multi-view emotional expressions dataset using 2D pose estimation. Sci Data 2023; 10:649. [PMID: 37739952] [PMCID: PMC10516935] [DOI: 10.1038/s41597-023-02551-y]
Abstract
Human body expressions convey emotional shifts and intentions to act and, in some cases, are even more effective than other channels of emotional expression. Although many body-expression datasets incorporating motion capture are available, widely distributable datasets of naturalized body expressions based on 2D video remain scarce. In this paper, we therefore report the multi-view emotional expressions dataset (MEED), built using 2D pose estimation. Twenty-two actors presented six emotional (anger, disgust, fear, happiness, sadness, surprise) and neutral body movements from three viewpoints (left, front, right), yielding 4102 captured videos. The MEED consists of the corresponding pose estimation results (397,809 PNG files and 397,809 JSON files) and exceeds 150 GB in size. We believe this dataset will benefit research in various fields, including affective computing, human-computer interaction, social neuroscience, and psychiatry.
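The dataset pairs each video frame with a pose-estimation JSON file. As a minimal sketch of how such files might be consumed, assuming an OpenPose-style layout (a `people` array whose `pose_keypoints_2d` field is a flat [x, y, confidence, ...] list) — a common convention for 2D pose estimators, though the exact MEED schema is not confirmed here:

```python
import json

def parse_pose_json(text):
    """Parse one OpenPose-style frame: return, per detected person,
    a list of (x, y, confidence) keypoint triples."""
    frame = json.loads(text)
    people = []
    for person in frame.get("people", []):
        flat = person["pose_keypoints_2d"]  # [x0, y0, c0, x1, y1, c1, ...]
        triples = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
        people.append(triples)
    return people

# Hypothetical two-keypoint frame for illustration only.
sample = json.dumps(
    {"people": [{"pose_keypoints_2d": [10.0, 20.0, 0.9, 30.0, 40.0, 0.8]}]}
)
keypoints = parse_pose_json(sample)
```

Real MEED files may use different field names; the flat-triple unpacking pattern is the part that carries over to most 2D pose formats.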
Affiliation(s)
- Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yanan Zhou
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Xinye Xu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Ziwei Ren
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yihan Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shenglan Liu
- School of Innovation and Entrepreneurship, Dalian University of Technology, Dalian, 116024, Liaoning, China
- Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024, Liaoning, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
2. Zhang M, Yu L, Zhang K, Du B, Zhan B, Jia S, Chen S, Han F, Li Y, Liu S, Yi X, Liu S, Luo W. Construction and validation of the Dalian emotional movement open-source set (DEMOS). Behav Res Methods 2023; 55:2353-2366. [PMID: 35931937] [DOI: 10.3758/s13428-022-01887-4]
Abstract
Human body movements are important for emotion recognition and social communication and have received extensive attention from researchers. In this field, emotional biological motion stimuli, as depicted by point-light displays, are widely used. However, the number of stimuli in existing material libraries is small, and standardized indicators are lacking, which limits experimental design and implementation. Therefore, based on our prior kinematic dataset, we constructed the Dalian Emotional Movement Open-source Set (DEMOS) using computational modeling. The DEMOS covers three views (frontal 0°, left 45°, and left 90°) and comprises 2664 high-quality videos of emotional biological motion in total, each displaying happiness, sadness, anger, fear, disgust, or a neutral state. All stimuli were validated in terms of recognition accuracy, emotional intensity, and subjective movement; the objective movement of each expression was also calculated. The DEMOS can be downloaded for free from https://osf.io/83fst/. To our knowledge, this is the largest multi-view emotional biological motion set based on the whole body. The DEMOS can be applied in many fields, including affective computing, social cognition, and psychiatry.
Affiliation(s)
- Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Lu Yu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Keye Zhang
- School of Social and Behavioral Sciences, Nanjing University, Nanjing, 210023, China
- Bixuan Du
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Bin Zhan
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Shuxin Jia
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shaohua Chen
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Fengxu Han
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yiwen Li
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shuaicheng Liu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Xi Yi
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shenglan Liu
- School of Innovation and Entrepreneurship, Dalian University of Technology, Dalian, 116024, China
- Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
3. Smith RA, Cross ES. The McNorm library: creating and validating a new library of emotionally expressive whole body dance movements. Psychol Res 2023; 87:484-508. [PMID: 35385989] [PMCID: PMC8985749] [DOI: 10.1007/s00426-022-01669-9]
Abstract
The ability to exchange affective cues with others plays a key role in creating and maintaining meaningful social relationships. We express our emotions through a variety of socially salient cues, including facial expressions, the voice, and body movement. While significant advances have been made in our understanding of verbal and facial communication, understanding of the role played by human body movement in our social interactions remains incomplete. To this end, we describe the creation and validation of a new set of emotionally expressive whole-body dance movement stimuli, the Motion Capture Norming (McNorm) Library, designed to address a number of limitations of previous movement stimulus sets. The library comprises a series of point-light representations of a dancer's movements, performed to communicate neutrality, happiness, sadness, anger, and fear to observers. Across two validation experiments, participants could reliably discriminate the intended emotion of the clips in this stimulus set, with accuracy rates up to 60% (chance = 20%). We further explored the impact of dance experience and trait empathy on emotion recognition and found that neither significantly affected emotion discrimination. As all materials for presenting and analysing this movement library are openly available, we hope this resource will aid other researchers in further exploring affective communication expressed by human bodily movement.
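The chance level quoted above follows from the five response categories. A quick sketch of that arithmetic, with an illustrative exact binomial check of whether an observed accuracy exceeds chance (the trial count of 100 is hypothetical, not taken from the study):

```python
from math import comb

k_categories = 5              # neutrality, happiness, sadness, anger, fear
chance = 1 / k_categories     # 0.20, as reported

def binom_sf(successes, n, p):
    """P(X >= successes) for X ~ Binomial(n, p): exact tail sum."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x)
               for x in range(successes, n + 1))

# Illustrative: 60% correct over a hypothetical 100 trials vs. 20% chance.
p_value = binom_sf(60, 100, chance)
```

With these (assumed) numbers the tail probability is vanishingly small, which is why 60% against a 20% baseline counts as reliable discrimination.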
Affiliation(s)
- Rebecca A. Smith
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland
- Emily S. Cross
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland
- Department of Cognitive Science, Macquarie University, Sydney, Australia
4. Ke H, Vuong QC, Geangu E. Three- and six-year-old children are sensitive to natural body expressions of emotion: An event-related potential emotional priming study. J Exp Child Psychol 2022; 224:105497. [PMID: 35850023] [DOI: 10.1016/j.jecp.2022.105497]
Abstract
Body movements provide a rich source of emotional information during social interactions. Although the ability to perceive biological motion cues related to those movements begins to develop during infancy, processing those cues to identify emotions likely continues to develop into childhood. Previous studies used posed or exaggerated body movements, which might not reflect the kind of body expressions children experience. The current study used an event-related potential (ERP) priming paradigm to investigate the development of emotion recognition from more naturalistic body movements. Point-light displays (PLDs) of male adult bodies expressing happy or angry emotional movements while narrating a story were used as prime stimuli, whereas audio recordings of the words "happy" and "angry" spoken with an emotionally neutral prosody were used as targets. We recorded the ERPs time-locked to the onset of the auditory target from 3- and 6-year-old children, and we compared amplitude and latency of the N300 and N400 responses between the two age groups in the different prime-target conditions. There was an overall effect of prime for the N300 amplitude, with more negative-going responses for happy PLDs compared with angry PLDs. There was also an interaction between prime and target for the N300 latency, suggesting that all children were sensitive to the emotional congruency between body movements and words. For the N400 component, there was only an interaction among age, prime, and target for latency, suggesting an age-dependent modulation of this component when prime and target did not match in emotional information. Overall, our results suggest that the emergence of more complex emotion processing of body expressions occurs around 6 years of age, but it is not fully developed at this point in ontogeny.
Affiliation(s)
- Han Ke
- Department of Psychology, Lancaster University, Lancaster LA1 4YF, UK
- Quoc C Vuong
- Biosciences Institute, Newcastle University, Newcastle upon Tyne NE2 4HH, UK
- Elena Geangu
- Department of Psychology, University of York, York YO10 5DD, UK
5. de Gelder B, Poyo Solanas M. A computational neuroethology perspective on body and expression perception. Trends Cogn Sci 2021; 25:744-756. [PMID: 34147363] [DOI: 10.1016/j.tics.2021.05.010]
Abstract
Survival prompts organisms to prepare adaptive behavior in response to environmental and social threat. But what specific features of a conspecific's appearance trigger such adaptive behaviors? For social species, the prime candidates for triggering defense systems are the visual features of the face and the body. We propose a novel approach for studying the brain's ability to gather survival-relevant information from seeing conspecific body features. Specifically, we propose that behaviorally relevant information from bodies and body expressions is coded at the level of midlevel features in the brain. These features are relatively independent of higher-order cognitive and conscious perception of bodies and emotions. Instead, our approach is embedded in an ethological framework and mobilizes computational models for feature discovery.
Affiliation(s)
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Department of Computer Science, University College London, London WC1E 6BT, UK
- Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
6. Bieńkiewicz MMN, Smykovskyi AP, Olugbade T, Janaqi S, Camurri A, Bianchi-Berthouze N, Björkman M, Bardy BG. Bridging the gap between emotion and joint action. Neurosci Biobehav Rev 2021; 131:806-833. [PMID: 34418437] [DOI: 10.1016/j.neubiorev.2021.08.014]
Abstract
Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (e.g., in team sports), or strangers navigating through a crowd. Joint action brings individuals, and the embodiment of their emotions, together in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion and, conversely, joint action research has not yet found a way to include emotion as one of the key parameters for modeling socio-motor interaction. In this review, we first identify this gap and then marshal evidence from various branches of science showing the strong entanglement between emotion and acting together. We propose an integrative approach to bridge the gap, highlight five research avenues for doing so in behavioral neuroscience and the digital sciences, and address some of the key challenges in the area faced by modern societies.
Affiliation(s)
- Marta M N Bieńkiewicz
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Montpellier, France
- Andrii P Smykovskyi
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Montpellier, France
- Stefan Janaqi
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Montpellier, France
- Benoît G Bardy
- EuroMov Digital Health in Motion, Univ. Montpellier IMT Mines Ales, Montpellier, France
7. Scheer C, Kubowitsch S, Dendorfer S, Jansen P. Happy Enough to Relax? How Positive and Negative Emotions Activate Different Muscular Regions in the Back - an Explorative Study. Front Psychol 2021; 12:511746. [PMID: 34135791] [PMCID: PMC8201496] [DOI: 10.3389/fpsyg.2021.511746]
Abstract
Embodiment theories propose a reciprocal relationship between emotional state and bodily reactions. Beyond gross body postures, recent studies have found that emotions also affect subtler bodily expressions, such as a slumped or upright sitting posture. This study investigated back muscle activity as an index of the effect of positive and negative emotions on sitting position. The electromyographic (EMG) activity of six back muscles was recorded in 31 healthy subjects during exposure to positive and negative affective pictures, with a resting period serving as the control condition. Increased muscle activity was found during exposure to negative emotional stimuli, mainly in the lumbar and thoracic regions, whereas the positive emotion condition elicited no elevated activity. The findings show that negative emotions lead to increased differential muscle activity in the back, corroborating previous research indicating that emotion affects subtle bodily expressions.
Affiliation(s)
- Clara Scheer
- Institute of Sport Science, Faculty of Humanities, University of Regensburg, Regensburg, Germany
- Simone Kubowitsch
- Laboratory for Biomechanics, Ostbayerische Technische Hochschule Regensburg, Regensburg, Germany
- Sebastian Dendorfer
- Laboratory for Biomechanics, Ostbayerische Technische Hochschule Regensburg, Regensburg, Germany
- Petra Jansen
- Institute of Sport Science, Faculty of Humanities, University of Regensburg, Regensburg, Germany
8. Botta A, Lagravinese G, Bove M, Avenanti A, Avanzino L. Modulation of Response Times During Processing of Emotional Body Language. Front Psychol 2021; 12:616995. [PMID: 33716882] [PMCID: PMC7947862] [DOI: 10.3389/fpsyg.2021.616995]
Abstract
The investigation of how humans perceive and respond to emotional signals conveyed by the human body has long been secondary to the investigation of facial expression and emotional scene recognition. The aims of this behavioral study were to assess the ability to process emotional body postures and to test whether the motor response is driven mainly by the emotional content of the picture or is influenced by motor resonance. Emotional body postures and scenes (IAPS), divided into three clusters (fear, happiness, and neutral), were shown to 25 healthy subjects (13 males; mean age ± SD: 22.3 ± 1.8 years) in a three-alternative forced-choice task. Subjects were asked to recognize the emotional content of the pictures by pressing one of three keys as fast as possible, allowing response times (RTs) to be estimated; valence and arousal ratings were also collected. We found shorter RTs for fearful body postures than for happy and neutral postures, whereas no differences across emotional categories were found for the IAPS stimuli. Analysis of valence and arousal, and the subsequent item analysis, showed excellent reliability of the two sets of images used in the experiment. Our results show that fearful body postures are rapidly recognized and processed, probably owing to the automatic activation of central nervous system structures orchestrating defensive threat reactions, strengthening and supporting previous neurophysiological and behavioral findings on body language processing.
Affiliation(s)
- Alessandro Botta
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- Giovanna Lagravinese
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- IRCCS Policlinico San Martino, Genoa, Italy
- Marco Bove
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
- IRCCS Policlinico San Martino, Genoa, Italy
- Alessio Avenanti
- Centro di Neuroscienze Cognitive and Dipartimento di Psicologia, Campus Cesena, Alma Mater Studiorum – University of Bologna, Cesena, Italy
- Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
- Laura Avanzino
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
- IRCCS Policlinico San Martino, Genoa, Italy
9. Poyo Solanas M, Vaessen M, de Gelder B. Computation-Based Feature Representation of Body Expressions in the Human Brain. Cereb Cortex 2020; 30:6376-6390. [DOI: 10.1093/cercor/bhaa196]
Abstract
Humans and other primate species are experts at recognizing body expressions. To understand the underlying perceptual mechanisms, we computed postural and kinematic features from affective whole-body movement videos and related them to brain processes. Using representational similarity and multivoxel pattern analyses, we showed systematic relations between computation-based body features and brain activity. Our results revealed that postural rather than kinematic features reflect the affective category of the body movements. The feature limb contraction made a central contribution to the perception of fearful body expressions and was differentially represented in action observation, motor preparation, and affect coding regions, including the amygdala. The posterior superior temporal sulcus differentiated fearful from other affective categories using limb contraction rather than kinematics. The extrastriate body area and fusiform body area also showed greater tuning to postural features. The discovery of midlevel body feature encoding in the brain moves affective neuroscience beyond research on high-level emotion representations and provides insights into the perceptual features that may drive automatic emotion perception.
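As a rough illustration of what a postural feature such as limb contraction might look like computationally, here is a hypothetical proxy: the mean distance of wrist and ankle keypoints from the skeleton's centroid. The keypoint names and the formula are assumptions for illustration, not the authors' actual feature definition:

```python
import math

def limb_contraction(keypoints):
    """Hypothetical proxy: mean Euclidean distance of extremity
    keypoints (wrists, ankles) from the centroid of all keypoints.
    Smaller values indicate a more contracted posture."""
    cx = sum(x for x, _ in keypoints.values()) / len(keypoints)
    cy = sum(y for _, y in keypoints.values()) / len(keypoints)
    extremities = ["l_wrist", "r_wrist", "l_ankle", "r_ankle"]
    dists = [math.hypot(keypoints[k][0] - cx, keypoints[k][1] - cy)
             for k in extremities]
    return sum(dists) / len(dists)

# Toy skeletons: arms pulled in vs. stretched out.
contracted = {"head": (0, 10), "l_wrist": (-1, 5), "r_wrist": (1, 5),
              "l_ankle": (-1, 0), "r_ankle": (1, 0)}
expanded = {"head": (0, 10), "l_wrist": (-6, 5), "r_wrist": (6, 5),
            "l_ankle": (-2, 0), "r_ankle": (2, 0)}
```

A fearful, contracted pose scores lower than an expansive one under this proxy, which is the qualitative behavior a contraction feature needs.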
Affiliation(s)
- Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Maarten Vaessen
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Department of Computer Science, University College London, London WC1E 6BT, UK
10. Emotional expressions in human and non-human great apes. Neurosci Biobehav Rev 2020; 115:378-395. [DOI: 10.1016/j.neubiorev.2020.01.027]
11. Melzer A, Shafir T, Tsachor RP. How Do We Recognize Emotion From Movement? Specific Motor Components Contribute to the Recognition of Each Emotion. Front Psychol 2019; 10:1389. [PMID: 31333524] [PMCID: PMC6617736] [DOI: 10.3389/fpsyg.2019.01389]
Abstract
Are there movement features that most people recognize as expressing each basic emotion and, if so, what are they? In our previous study we identified sets of Laban movement components that, when moved, elicited the basic emotions of anger, sadness, fear, and happiness. Our current study aimed to investigate whether movements composed from those sets would be recognized as expressing those emotions, regardless of any instruction to the mover to portray emotion. Our stimuli included 113 video clips of five Certified Laban Movement Analysts (CMAs) moving combinations of two to four movement components from each set associated with only one emotion: happiness, sadness, fear, or anger. Each three-second clip showed one CMA moving a single combination, using only the combination's required components. Sixty-two physically and mentally healthy men (n = 31) and women (n = 31), ages 19–48, watched the clips and rated the perceived emotion and its intensity. To confirm participants' ability to recognize emotions from movement, and to compare our stimuli to existing validated emotional expression stimuli, participants also rated 50 clips of bodily motor expressions of the same emotions validated by Atkinson et al. (2004). For both stimulus types, all emotions were recognized far above chance level. Comparing recognition accuracy across the two clip types revealed better recognition of anger, fear, and neutral expressions from Atkinson's clips of actors expressing emotions, and similar recognition accuracy for happiness and sadness. Further analysis determined the contribution of specific movement components to the recognition of the studied emotions.
Our results indicated that these specific Laban motor components not only enhance the associated emotions when moved but also contribute to recognition of those emotions when observed, even when the mover was not instructed to portray emotion, indicating that the presence of these movement components alone is sufficient for emotion recognition. This research-based knowledge of the relationship between Laban motor components and bodily emotional expression can be used by dance-movement and drama therapists to better understand clients' emotional movements, create appropriate interventions, and enhance communication with other practitioners regarding bodily emotional expression.
Affiliation(s)
- Ayelet Melzer
- Faculty of Social Welfare and Health Sciences, The Graduate School of Creative Arts Therapies, University of Haifa, Haifa, Israel
- Tal Shafir
- The Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Haifa, Israel
12. Pollux PM, Craddock M, Guo K. Gaze patterns in viewing static and dynamic body expressions. Acta Psychol (Amst) 2019; 198:102862. [PMID: 31226535] [DOI: 10.1016/j.actpsy.2019.05.014]
Abstract
Evidence for the importance of bodily cues in emotion recognition has grown over the last two decades. Despite this growing literature, it remains underspecified how observers view whole bodies during body expression recognition. Here we investigate to what extent body viewing is face- and context-specific when participants categorise whole-body expressions in static (Experiment 1) and dynamic displays (Experiment 2). Eye-movement recordings showed that observers viewed the face exclusively when it was visible in dynamic displays, whereas viewing was distributed over head, torso, and arms in static displays and in dynamic displays in which the face was not visible. The strong face bias for dynamic face-visible expressions suggests that body viewing responds flexibly to the informativeness of facial cues for emotion categorisation. When facial expressions are static or not visible, however, observers adopt a viewing strategy that includes all upper-body regions, further shaped by subtle biases towards emotion-specific body postures and movements that optimise the recruitment of diagnostic information for emotion categorisation.
13. Tsachor RP, Shafir T. How Shall I Count the Ways? A Method for Quantifying the Qualitative Aspects of Unscripted Movement With Laban Movement Analysis. Front Psychol 2019; 10:572. [PMID: 31001158] [PMCID: PMC6455080] [DOI: 10.3389/fpsyg.2019.00572]
Abstract
There is significant clinical evidence showing that creative and expressive movement processes involved in dance/movement therapy (DMT) enhance psycho-social well-being. Yet, because movement is a complex phenomenon, statistically validating which aspects of movement change during interventions or lead to significant positive therapeutic outcomes is challenging because movement has multiple, overlapping variables appearing in unique patterns in different individuals and situations. One factor contributing to the therapeutic effects of DMT is movement's effect on clients' emotional states. Our previous study identified sets of movement variables which, when executed, enhanced specific emotions. In this paper, we describe how we selected movement variables for statistical analysis in that study, using a multi-stage methodology to identify, reduce, code, and quantify the multitude of variables present in unscripted movement. We suggest a set of procedures for using Laban Movement Analysis (LMA)-described movement variables as research data. Our study used LMA, an internationally accepted comprehensive system for movement analysis, and a primary DMT clinical assessment tool for describing movement. We began with Davis's (1970) three-stepped protocol for analyzing movement patterns and identifying the most important variables: (1) We repeatedly observed video samples of validated (Atkinson et al., 2004) emotional expressions to identify prevalent movement variables, eliminating variables appearing minimally or absent. (2) We use the criteria repetition, frequency, duration and emphasis to eliminate additional variables. (3) For each emotion, we analyzed motor expression variations to discover how variables cluster: first, by observing ten movement samples of each emotion to identify variables common to all samples; second, by qualitative analysis of the two best-recognized samples to determine if phrasing, duration or relationship among variables was significant. 
We added three new steps to this protocol: (4) we created Motifs (LMA symbols) combining movement variables extracted in steps 1-3; (5) in a pilot study, we asked participants to move these combinations and quantify their emotional experience. Based on the results of the pilot study, we eliminated more variables; (6) we quantified the remaining variables' prevalence in each Motif for statistical analysis that examined which variables enhanced each emotion. We posit that our method successfully quantified unscripted movement data for statistical analysis.
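Step (6) above, quantifying each motor element's prevalence in each emotion's Motifs, can be illustrated with a minimal sketch. The Motifs, element names, and counts below are hypothetical stand-ins for illustration, not the study's actual coding:

```python
# Sketch (hypothetical data): code each Motif as the set of LMA motor
# elements it contains, then compute each element's prevalence per emotion,
# i.e. the fraction of that emotion's Motifs containing the element.
from collections import defaultdict

# Hypothetical Motifs: (target emotion, motor elements present in the Motif)
motifs = [
    ("anger",   {"punch", "advance", "strong"}),
    ("anger",   {"punch", "strong", "sudden"}),
    ("sadness", {"sink", "passive_weight", "light"}),
    ("sadness", {"sink", "passive_weight"}),
]

def element_prevalence(motifs):
    """Fraction of Motifs per emotion that contain each motor element."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for emotion, elements in motifs:
        totals[emotion] += 1
        for el in elements:
            counts[emotion][el] += 1
    return {
        emotion: {el: n / totals[emotion] for el, n in els.items()}
        for emotion, els in counts.items()
    }

prevalence = element_prevalence(motifs)
```

A table of such prevalences per Motif is what a subsequent regression analysis would take as predictors.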
Affiliation(s)
- Tal Shafir
- The Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Haifa, Israel
- Department of Psychiatry, University of Michigan, Ann Arbor, MI, United States
14
Wade MD, McDowell AR, Ziermann JM. Innervation of the Long Head of the Triceps Brachii in Humans-A Fresh Look. Anat Rec (Hoboken) 2018; 301:473-483. [PMID: 29418118 DOI: 10.1002/ar.23741] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2017] [Revised: 06/30/2017] [Accepted: 07/13/2017] [Indexed: 11/11/2022]
Abstract
The triceps brachii muscle occupies the posterior compartment of the arm in humans and has three heads. The lateral and medial heads originate from the humerus, and the long head arises from the infraglenoid tubercle of the scapula. All heads form a common tendon that inserts onto the olecranon and the deep antebrachial fascia on each side of it. Each head receives its own motor branch, all of which are thought to originate from the radial nerve. However, several studies reported that the motor branch of the long head of the triceps (LHT) arises from the axillary nerve or the posterior cord. Here, we dissected 27 triceps in 15 cadavers to analyze the innervation of the LHT and found only radial innervation, which contradicts those studies. We examined studies reporting that the motor branch to the LHT in humans does not arise from the radial nerve, as well as studies of the triceps in primates. Occasional variations in the innervation of skeletal muscles are normal, but a change of principal motor innervation from the radial to the axillary nerve has important implications, because the axillary nerve is often involved in shoulder injuries. The precise identification of the prevalence of axillary versus radial innervation is therefore clinically relevant for surgery, nerve grafting, and occupational and physical therapy. We conclude that the primary motor branch to the LHT arises from the radial nerve, but axillary/posterior cord innervations occur occasionally. We suggest the development of a standard methodology for further studies. Anat Rec, 301:473-483, 2018. © 2018 Wiley Periodicals, Inc.
Affiliation(s)
- Michael D Wade
- Department of Anatomy, Howard University College of Medicine, Washington, DC
- Arthur R McDowell
- Department of Anatomy, Howard University College of Medicine, Washington, DC
- Janine M Ziermann
- Department of Anatomy, Howard University College of Medicine, Washington, DC
15
Wilson AD, Kolesar TA, Kornelsen J, Smith SD. Neural Responses to Consciously and Unconsciously Perceived Emotional Faces: A Spinal fMRI Study. Brain Sci 2018; 8:brainsci8080156. [PMID: 30126119 PMCID: PMC6119943 DOI: 10.3390/brainsci8080156] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/29/2018] [Revised: 08/08/2018] [Accepted: 08/14/2018] [Indexed: 11/17/2022] Open
Abstract
Emotional stimuli modulate activity in brain areas related to attention, perception, and movement. Similar increases in neural activity have been detected in the spinal cord, suggesting that this understudied component of the central nervous system is an important part of our emotional responses. To date, previous studies of emotion-dependent spinal cord activity have utilized long presentations of complex emotional scenes. The current study differs from this research by (1) examining whether emotional faces will lead to enhanced spinal cord activity and (2) testing whether these stimuli require conscious perception to influence neural responses. Fifteen healthy undergraduate participants completed six spinal functional magnetic resonance imaging (fMRI) runs in which three one-minute blocks of fearful, angry, or neutral faces were interleaved with 40-s rest periods. In half of the runs, the faces were clearly visible while in the other half, the faces were displayed for only 17 ms. Spinal fMRI consisted of half-Fourier acquisition single-shot turbo spin-echo (HASTE) sequences targeting the cervical spinal cord. The results indicated that consciously perceived faces expressing anger elicited significantly more activity than fearful or neutral faces in ventral (motoric) regions of the cervical spinal cord. When stimuli were presented below the threshold of conscious awareness, neutral faces elicited significantly more activity than angry or fearful faces. Together, these data suggest that the emotional modulation of spinal cord activity is most impactful when the stimuli are consciously perceived and imply a potential threat toward the observer.
Affiliation(s)
- Alyssia D Wilson
- Department of Psychology, University of Winnipeg, Winnipeg, MB R3B 2E9, Canada
- Tiffany A Kolesar
- Department of Physiology and Pathophysiology, University of Manitoba, Winnipeg, MB R3T 2N2, Canada
- Jennifer Kornelsen
- Department of Psychology, University of Winnipeg, Winnipeg, MB R3B 2E9, Canada
- Department of Physiology and Pathophysiology, University of Manitoba, Winnipeg, MB R3T 2N2, Canada
- Department of Radiology, University of Manitoba, Winnipeg, MB R3T 2N2, Canada
- Stephen D Smith
- Department of Psychology, University of Winnipeg, Winnipeg, MB R3B 2E9, Canada
16
Engelen T, Zhan M, Sack AT, de Gelder B. The Influence of Conscious and Unconscious Body Threat Expressions on Motor Evoked Potentials Studied With Continuous Flash Suppression. Front Neurosci 2018; 12:480. [PMID: 30061812 PMCID: PMC6054979 DOI: 10.3389/fnins.2018.00480] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2018] [Accepted: 06/25/2018] [Indexed: 12/21/2022] Open
Abstract
The observation of a threatening expression in others is a strong cue for triggering an action response. One method of capturing such action responses is measuring the amplitude of motor evoked potentials (MEPs) elicited with single-pulse TMS over the primary motor cortex. Indeed, it has been shown that viewing whole-body expressions of threat modulates MEP amplitude. Furthermore, emotional cues have been shown to act on certain brain areas even outside of conscious awareness. In the current study, we explored whether the influence of viewing whole-body expressions of threat extends to stimuli presented outside of conscious awareness in healthy participants. To accomplish this, we combined the measurement of MEPs with a continuous flash suppression task. In experiment 1, participants were presented with images of neutral bodies, fearful bodies, or objects that were either perceived consciously or unconsciously, while single pulses of TMS were applied at different times after stimulus onset (200, 500, or 700 ms). In experiment 2, stimuli consisted of neutral bodies, angry bodies, or objects, and pulses were applied at either 200 or 400 ms post stimulus onset. In experiment 1, there was a general effect of the time of stimulation, but no condition-specific effects were evident. In experiment 2, there were no significant main effects or interactions. Future studies should examine earlier effects of MEP modulation by emotional body stimuli, especially when presented outside of conscious awareness, and explore other outcome measures such as intracortical facilitation.
Affiliation(s)
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
17
Witkower Z, Tracy JL. Bodily Communication of Emotion: Evidence for Extrafacial Behavioral Expressions and Available Coding Systems. EMOTION REVIEW 2018. [DOI: 10.1177/1754073917749880] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Although scientists dating back to Darwin have noted the importance of the body in communicating emotion, current research on emotion communication tends to emphasize the face. In this article we review the evidence for bodily expressions of emotions—that is, the handful of emotions that are displayed and recognized from certain bodily behaviors (i.e., pride, joy, sadness, shame, embarrassment, anger, fear, and disgust). We also review the previously developed coding systems available for identifying emotions from bodily behaviors. Although no extant coding system provides an exhaustive list of bodily behaviors known to communicate a panoply of emotions, our review provides the foundation for developing such a system.
Affiliation(s)
- Zachary Witkower
- Department of Psychology, University of British Columbia, Canada
- Jessica L. Tracy
- Department of Psychology, University of British Columbia, Canada
18
Coverage of Emotion Recognition for Common Wearable Biosensors. BIOSENSORS-BASEL 2018; 8:bios8020030. [PMID: 29587375 PMCID: PMC6023004 DOI: 10.3390/bios8020030] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/20/2018] [Revised: 03/16/2018] [Accepted: 03/22/2018] [Indexed: 11/21/2022]
Abstract
The present research proposes a novel emotion recognition framework for the computer prediction of human emotions using common wearable biosensors. Emotional perception promotes specific patterns of biological responses in the human body, and this can be sensed and used to predict emotions using only biomedical measurements. Based on theoretical and empirical psychophysiological research, the foundation of autonomic specificity facilitates the establishment of a strong background for recognising human emotions using machine learning on physiological patterning. However, a systematic way of choosing the physiological data covering the elicited emotional responses for recognising the target emotions is not obvious. The current study demonstrates through experimental measurements the coverage of emotion recognition using common off-the-shelf wearable biosensors based on the synchronisation between audiovisual stimuli and the corresponding physiological responses. The work forms the basis of validating the hypothesis for emotional state recognition in the literature and presents coverage of the use of common wearable biosensors coupled with a novel preprocessing algorithm to demonstrate the practical prediction of the emotional states of wearers.
19
Hoogerwerf MD, Veldhuizen IJT, Tarvainen MP, Merz EM, Huis In 't Veld EMJ, de Kort WLAM, Sluiter JK, Frings-Dresen MHW. Physiological stress response patterns during a blood donation. Vox Sang 2018; 113:357-367. [PMID: 29574883 DOI: 10.1111/vox.12646] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2017] [Revised: 01/31/2018] [Accepted: 02/03/2018] [Indexed: 11/26/2022]
Abstract
BACKGROUND Donating blood is associated with increased psychological stress. This study investigates whether a blood donation induces physiological stress and whether response patterns differ by gender, donation experience and non-acute stress. STUDY DESIGN AND METHODS In 372 donors, physiological stress [blood pressure, pulse rate, pulse rate variability (PRV)] was measured at seven moments during routine donation. PRV was assessed using time-domain [root mean square of successive differences (RMSSD)] and frequency-domain [high-frequency (HF) and low-frequency (LF) power] parameters. Non-acute stress was assessed by questionnaire. The shape and significance of time-course patterns were assessed by fitting multilevel models for each stress measure and comparing men and women, first-time and experienced donors, and donors with high and low levels of non-acute stress. RESULTS Significant response patterns were found for all stress measures: levels of systolic blood pressure (F(1,1315) = 24·2, P < 0·001), RMSSD (F(1,1315) = 24·2, P < 0·001), LF (F(1,1627) = 14·1, P < 0·001) and HF (F(1,1624) = 34·0, P < 0·001) increased towards needle insertion and then decreased to values lower than on arrival at the donation centre. Diastolic blood pressure (F(1,1326) = 50·9, P < 0·001) increased, and pulse rate (F(1,1393) = 507·4, P < 0·001) showed a U-shaped curve. Significant group effects were found: higher systolic blood pressure/pulse rate in women; higher pulse rate in first-time donors; higher RMSSD at arrival and from screening until leaving in first-time donors; and higher LF and HF in first-time donors. CONCLUSION This study shows an increase in physiological stress related to needle insertion, followed by a decrease when leaving the donation centre. Some group effects were also found.
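The time-course patterns reported above (e.g. the U-shaped pulse-rate curve across the seven measurement moments) were tested with multilevel models over 372 donors. As a much-simplified stand-in, the sketch below fits a quadratic trend to a single hypothetical donor's pulse-rate series; the data values are invented, and a positive quadratic coefficient indicates a U-shaped course:

```python
# Sketch (synthetic data): quadratic time trend for one donor's pulse rate
# across the seven measurement moments (arrival ... leaving the centre).
import numpy as np

moments = np.arange(7)  # the seven measurement moments
pulse = np.array([72, 70, 68, 66, 68, 70, 73], dtype=float)  # hypothetical

coeffs = np.polyfit(moments, pulse, deg=2)  # fits a*t^2 + b*t + c
a, b, c = coeffs
t_min = -b / (2 * a)  # vertex of the fitted parabola (time of minimum)
# a > 0 here, i.e. the fitted course is U-shaped, with its minimum
# around the middle of the donation.
```

The published analysis additionally models donor-level random effects and group contrasts, which this single-series fit omits.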
Affiliation(s)
- M D Hoogerwerf
- Department Donor Studies, Sanquin Research, Amsterdam, The Netherlands
- Landsteiner Laboratory, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands
- Coronel Institute of Occupational Health, Academic Medical Center, Amsterdam Public Health research institute, University of Amsterdam, Amsterdam, The Netherlands
- I J T Veldhuizen
- Landsteiner Laboratory, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands
- M P Tarvainen
- Department of Applied Physics, University of Eastern Finland, Kuopio, Finland
- Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, Kuopio, Finland
- E-M Merz
- Department Donor Studies, Sanquin Research, Amsterdam, The Netherlands
- Department of Sociology, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- W L A M de Kort
- Department Donor Studies, Sanquin Research, Amsterdam, The Netherlands
- Landsteiner Laboratory, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands
- Department of Social Medicine, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands
- J K Sluiter
- Coronel Institute of Occupational Health, Academic Medical Center, Amsterdam Public Health research institute, University of Amsterdam, Amsterdam, The Netherlands
- M H W Frings-Dresen
- Coronel Institute of Occupational Health, Academic Medical Center, Amsterdam Public Health research institute, University of Amsterdam, Amsterdam, The Netherlands
20
Affective vocalizations influence body ownership as measured in the rubber hand illusion. PLoS One 2017; 12:e0186009. [PMID: 28982176 PMCID: PMC5628997 DOI: 10.1371/journal.pone.0186009] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2017] [Accepted: 09/22/2017] [Indexed: 11/20/2022] Open
Abstract
Emotional signals, like threatening sounds, automatically ready the perceiver to prepare an appropriate defense behavior. Conjecturing that this would manifest itself in an extension of the safety zone around the body, we used the rubber hand illusion (RHI) to test this prediction. The RHI is a perceptual illusion in which body ownership is manipulated by synchronously stroking a rubber hand and the real hand, which is occluded from view. Many factors, both internal and external, have been shown to influence the strength of the illusion, yet the effect of emotion perception on body ownership remains unexplored. We predicted that listening to affective vocalizations would influence how strongly participants experience the RHI. In the first experiment, four groups were tested that listened to either affective sounds (angry or happy vocalizations), non-vocal sounds, or no sound while undergoing synchronous or asynchronous stroking of the real and rubber hands. In a second experiment, three groups were tested, comparing angry vocalizations, neutral vocalizations, and a no-sound condition. There was a significantly larger drift towards the rubber hand in the emotion versus the no-emotion conditions. We interpret these results in the framework that the spatial increase in the RHI indicates that, under threat, the body has the capacity to extend its safety zone.
21
Kordts-Freudinger R, Oergel K, Wuennemann M. Feel Bad and Keep Steady: Emotional Images and Words and Postural Control during Bipedal Stance. JOURNAL OF NONVERBAL BEHAVIOR 2017. [DOI: 10.1007/s10919-017-0260-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
22
Wang L, Xia L, Zhang D. Face-body integration of intense emotional expressions of victory and defeat. PLoS One 2017; 12:e0171656. [PMID: 28245245 PMCID: PMC5330456 DOI: 10.1371/journal.pone.0171656] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/16/2016] [Accepted: 01/24/2017] [Indexed: 11/23/2022] Open
Abstract
Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for the stimuli with losing compared to winning bodies. The main effect of body expression was also observed in N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies did when face and body conveyed congruent emotional signals. Beyond the knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.
Affiliation(s)
- Lili Wang
- School of Educational Science, Huaiyin Normal University, Huaian, China
- Lisheng Xia
- Institute of Affective and Social Neuroscience, Shenzhen University, Shenzhen, China
- Dandan Zhang
- Institute of Affective and Social Neuroscience, Shenzhen University, Shenzhen, China
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
23
Coutinho TV, Reis SPS, da Silva AG, Miranda DM, Malloy-Diniz LF. Deficits in Response Inhibition in Patients with Attention-Deficit/Hyperactivity Disorder: The Impaired Self-Protection System Hypothesis. Front Psychiatry 2017; 8:299. [PMID: 29403397 PMCID: PMC5786525 DOI: 10.3389/fpsyt.2017.00299] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/14/2017] [Accepted: 12/14/2017] [Indexed: 12/24/2022] Open
Abstract
Problems in inhibitory control are regarded in psychology as a key problem associated with attention-deficit/hyperactivity disorder (ADHD). They, however, might not be primary deficits, but instead a consequence of inattention. At least two components of inhibitory control have been identified and dissociated in studies: interference suppression, responsible for controlling interference by resisting irrelevant or misleading information, and response inhibition, referring to withholding a response or overriding an ongoing behavior. Poor error awareness and self-monitoring undermine an individual's ability to inhibit inadequate responses and change course of action. In non-social contexts, an individual depends on his own cognition to regulate his mistakes. In social contexts, however, there are many social cues that should help that individual to perceive his mistakes and inhibit inadequate responses. The processes involved in perceiving and interpreting those social cues are arguably part of a self-protection system (SPS). Individuals with ADHD not only present impulsive behaviors in social contexts, but also have difficulty perceiving their inadequate responses and overriding ongoing actions toward more appropriate ones. In this paper, we argue that those difficulties are a consequence of an impaired SPS, due to visual attention deficits and the subsequent failure to perceive and accurately recognize negative emotions in facial expressions, especially anger. We discuss evidence that children with ADHD exhibit problems in a series of components involved in the activation of that system and advocate that the inability to identify the anger expressed by others, and thus not experiencing the fear response that should follow, is, ultimately, what prevents them from inhibiting the ongoing inappropriate behavior, since a potential threat is not registered. Getting involved in high-risk situations, such as reckless driving, could also be a consequence of not registering a threat and thus not experiencing fear.
Affiliation(s)
- Thales Vianna Coutinho
- Laboratório de Investigações em Neurociência Clínica, Department of Mental Health, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil
- iLumina Neurociências, Belo Horizonte, Brazil
- Samara Passos Santos Reis
- Quantitative Methods and Predictive Psychometrics Laboratory, Department of Psychology, Universidade Federal da Bahia, Salvador, Brazil
- Leandro Fernandes Malloy-Diniz
- Laboratório de Investigações em Neurociência Clínica, Department of Mental Health, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil
- iLumina Neurociências, Belo Horizonte, Brazil
24
Aung MSH, Kaltwang S, Romera-Paredes B, Martinez B, Singh A, Cella M, Valstar M, Meng H, Kemp A, Shafizadeh M, Elkins AC, Kanakam N, de Rothschild A, Tyler N, Watson PJ, de C Williams AC, Pantic M, Bianchi-Berthouze N. The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset. IEEE TRANSACTIONS ON AFFECTIVE COMPUTING 2016; 7:435-451. [PMID: 30906508 PMCID: PMC6430129 DOI: 10.1109/taffc.2015.2462830] [Citation(s) in RCA: 40] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Pain-related emotions are a major barrier to effective self-rehabilitation in chronic pain. Automated coaching systems capable of detecting these emotions are a potential solution. This paper lays the foundation for the development of such systems by making three contributions. First, through literature reviews, an overview of how pain is expressed in chronic pain and the motivation for detecting it in physical rehabilitation is provided. Second, a fully labelled multimodal dataset (named 'EmoPain') containing high-resolution multiple-view face videos, head-mounted and room audio signals, full-body 3D motion capture and electromyographic signals from back muscles is supplied. Natural unconstrained pain-related facial expressions and body movement behaviours were elicited from people with chronic pain carrying out physical exercises. Both instructed and non-instructed exercises were considered to reflect traditional scenarios of physiotherapist-directed therapy and home-based self-directed therapy. Two sets of labels were assigned: level of pain from facial expressions, annotated by eight raters, and the occurrence of six pain-related body behaviours, segmented by four experts. Third, through exploratory experiments grounded in the data, the factors and challenges in the automated recognition of such expressions and behaviour are described; the paper concludes by discussing potential avenues in the context of these findings, also highlighting differences between the two exercise scenarios addressed.
Affiliation(s)
- Min S H Aung
- UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Sebastian Kaltwang
- Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Brais Martinez
- Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Aneesha Singh
- UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Matteo Cella
- Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Michel Valstar
- Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Hongying Meng
- UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Andrew Kemp
- Physiotherapy Department, Maidstone & Tunbridge Wells NHS Trust, TN2 4QJ
- Moshen Shafizadeh
- UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Aaron C Elkins
- Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Natalie Kanakam
- Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Amschel de Rothschild
- Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Nick Tyler
- Department of Civil, Environmental & Geomatic Engineering, University College London, London WC1E 6BT, United Kingdom
- Paul J Watson
- Department of Health Sciences, University of Leicester, Leicester LE5 7PW, United Kingdom
- Amanda C de C Williams
- Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Maja Pantic
- Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
25
Koole SL, Tschacher W. Synchrony in Psychotherapy: A Review and an Integrative Framework for the Therapeutic Alliance. Front Psychol 2016; 7:862. [PMID: 27378968 PMCID: PMC4907088 DOI: 10.3389/fpsyg.2016.00862] [Citation(s) in RCA: 180] [Impact Index Per Article: 20.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2016] [Accepted: 05/24/2016] [Indexed: 12/30/2022] Open
Abstract
During psychotherapy, patient and therapist tend to spontaneously synchronize their vocal pitch, bodily movements, and even their physiological processes. In the present article, we consider how this pervasive phenomenon may shed new light on the therapeutic relationship, or alliance, and its role within psychotherapy. We first review clinical research on the alliance and the multidisciplinary area of interpersonal synchrony. We then integrate both literatures in the Interpersonal Synchrony (In-Sync) model of psychotherapy. According to the model, the alliance is grounded in the coupling of patient and therapist's brains. Because brains do not interact directly, movement synchrony may help to establish inter-brain coupling. Inter-brain coupling may provide patient and therapist with access to one another's internal states, which facilitates common understanding and emotional sharing. Over time, these interpersonal exchanges may improve patients' emotion-regulatory capacities and related therapeutic outcomes. We discuss the empirical assessment of interpersonal synchrony and review preliminary research on synchrony in psychotherapy. Finally, we summarize our main conclusions and consider the broader implications of viewing psychotherapy as the product of two interacting brains.
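Movement synchrony of the kind reviewed here is commonly quantified as the peak lagged cross-correlation between patient and therapist movement time series. The sketch below illustrates the idea on synthetic, noisy, time-shifted signals; the signals, the lag window, and the sampling choices are illustrative assumptions, not the In-Sync model's actual procedure:

```python
# Sketch (synthetic signals): find the lag at which two movement time
# series are maximally correlated, a common index of interpersonal synchrony.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
patient = np.sin(t) + 0.1 * rng.standard_normal(t.size)
therapist = np.sin(t - 0.3) + 0.1 * rng.standard_normal(t.size)  # lags patient

def lagged_corr(a, b, lag):
    """Pearson correlation of a[t] with b[t+lag] (lag in samples)."""
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    elif lag < 0:
        a, b = a[-lag:], b[:lag]
    return np.corrcoef(a, b)[0, 1]

lags = range(-50, 51)
best = max(lags, key=lambda L: lagged_corr(patient, therapist, L))
# 'best' recovers the imposed ~15-sample lag of the therapist signal.
```

Windowed versions of this statistic, computed over short sliding segments, are what synchrony studies typically aggregate across a session.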
26
de Borst AW, de Gelder B. Clear signals or mixed messages: inter-individual emotion congruency modulates brain activity underlying affective body perception. Soc Cogn Affect Neurosci 2016; 11:1299-309. [PMID: 27025242 DOI: 10.1093/scan/nsw039] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2015] [Accepted: 03/17/2016] [Indexed: 11/12/2022] Open
Abstract
The neural basis of emotion perception has mostly been investigated with single face or body stimuli. However, in daily life one may also encounter affective expressions by groups, e.g. an angry mob or an exhilarated concert crowd. In what way is brain activity modulated when several individuals express similar rather than different emotions? We investigated this question using an experimental design in which we presented two stimuli simultaneously, with same or different emotional expressions. We hypothesized that, in the case of two same-emotion stimuli, brain activity would be enhanced, while in the case of two different emotions, one emotion would interfere with the effect of the other. The results showed that the simultaneous perception of different affective body expressions leads to a deactivation of the amygdala and a reduction of cortical activity. It was revealed that the processing of fearful bodies, compared with different-emotion bodies, relied more strongly on saliency and action triggering regions in inferior parietal lobe and insula, while happy bodies drove the occipito-temporal cortex more strongly. We showed that this design could be used to uncover important differences between brain networks underlying fearful and happy emotions. The enhancement of brain activity for unambiguous affective signals expressed by several people simultaneously supports adaptive behaviour in critical situations.
Affiliation(s)
- A W de Borst
- Department of Cognitive Neuroscience, Brain and Emotion Laboratory, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands
- B de Gelder
- Department of Cognitive Neuroscience, Brain and Emotion Laboratory, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Department of Psychiatry and Mental Health, University of Cape Town, Cape Town, South Africa
27
Shafir T, Tsachor RP, Welch KB. Emotion Regulation through Movement: Unique Sets of Movement Characteristics are Associated with and Enhance Basic Emotions. Front Psychol 2016; 6:2030. [PMID: 26793147 PMCID: PMC4707271 DOI: 10.3389/fpsyg.2015.02030] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2015] [Accepted: 12/21/2015] [Indexed: 11/13/2022] Open
Abstract
We have recently demonstrated that motor execution, observation, and imagery of movements expressing certain emotions can enhance corresponding affective states and therefore could be used for emotion regulation. But which specific movement(s) should one use in order to enhance each emotion? This study aimed to identify, using Laban Movement Analysis (LMA), the Laban motor elements (motor characteristics) that characterize movements whose execution enhances each of the basic emotions: anger, fear, happiness, and sadness. LMA provides a system of symbols describing its motor elements, which gives a written instruction (motif) for the execution of a movement or movement-sequence over time. Six senior LMA experts analyzed a validated set of video clips showing whole body dynamic expressions of anger, fear, happiness and sadness, and identified the motor elements that were common to (appeared in) all clips expressing the same emotion. For each emotion, we created motifs of different combinations of the motor elements common to all clips of the same emotion. Eighty subjects from around the world read and moved those motifs, to identify the emotion evoked when moving each motif and to rate the intensity of the evoked emotion. All subjects together moved and rated 1241 motifs, which were produced from 29 different motor elements. Using logistic regression, we found a set of motor elements associated with each emotion which, when moved, predicted the feeling of that emotion. Each emotion was predicted by a unique set of motor elements and each motor element predicted only one emotion. Knowledge of which specific motor elements enhance specific emotions can enable emotional self-regulation through adding some desired motor qualities to one's personal everyday movements (rather than mimicking others' specific movements) and through decreasing motor behaviors which include elements that enhance negative emotions.
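The analysis described above, logistic regression associating motor-element features with an evoked emotion, can be illustrated with a minimal, self-contained sketch. Everything here is synthetic: the element names, probabilities, and data are invented for illustration and are not the study's data or model.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=300):
    """Plain stochastic-gradient logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            for j, xj in enumerate(xi):
                w[j] -= lr * err * xj
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)

# Synthetic motifs: presence (1) / absence (0) of two hypothetical motor
# elements per motif; label 1 = "the mover reported feeling the emotion".
random.seed(0)
X, y = [], []
for _ in range(200):
    strong_direct = random.random() < 0.5  # invented element driving the label
    sinking = random.random() < 0.5        # invented element unrelated to the label
    evoked = random.random() < (0.9 if strong_direct else 0.1)
    X.append([float(strong_direct), float(sinking)])
    y.append(int(evoked))

w, b = fit_logistic(X, y)
# The element built into the label generation should receive a positive weight,
# raising the predicted odds of the emotion when it is present.
print(w[0] > 0, predict(w, b, [1.0, 0.0]) > predict(w, b, [0.0, 0.0]))
```

In the study's setting each of the four emotions would get its own model over the 29 motor elements; the sketch reduces this to one invented element driving one label.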
Affiliation(s)
- Tal Shafir
- The Graduate School of Creative Arts Therapies, Faculty of Social Welfare and Health Sciences, University of Haifa, Haifa, Israel; The Department of Psychiatry, University of Michigan, Ann Arbor, MI, USA
- Rachelle P Tsachor
- Department of Theatre, School of Theatre and Music, University of Illinois at Chicago, Chicago, IL, USA
- Kathleen B Welch
- Center for Statistical Consultation and Research, University of Michigan, Ann Arbor, MI, USA
28
Caneiro JP, Labie C, Sulley E, Briggs AM, Straker LM, Burnett AF, O'Sullivan PB. An exploration of familial associations of two movement pattern-derived subgroups of chronic disabling low back pain; a cross-sectional cohort study. Man Ther 2016; 22:202-10. [PMID: 26874816 DOI: 10.1016/j.math.2015.12.009] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2015] [Revised: 12/17/2015] [Accepted: 12/20/2015] [Indexed: 10/22/2022]
Abstract
BACKGROUND Altered movement patterns with pain have been demonstrated in children, adolescents and adults with chronic disabling low back pain (CDLBP). A previously developed classification system has identified different subgroups, including active extension and multidirectional patterns, in patients with CDLBP. While familial associations have been identified for certain spinal postures in standing, it is unknown whether a familial relationship exists between movement pattern-derived subgroups in families with CDLBP. OBJECTIVES This study explored whether familial associations in movement pattern-derived subgroups existed within and between members of families with CDLBP. DESIGN Cross-sectional cohort study. METHOD 33 parents and 28 children with CDLBP were classified into two subgroups based on clinical analysis of video footage of postures and functional movements, combined with aggravating factors obtained from the Oswestry Disability Questionnaire. The prevalence of subgroups within family members was determined, associations between parent and child subgroup membership were evaluated using Fisher's exact test, and Spearman's correlation coefficient was used to determine the strength of association between familial dyads. RESULTS The majority of parents were classified as active extenders, sons were predominantly multidirectional, and daughters were evenly distributed between the two subgroups. No significant association was found when comparing subgroups in nine parent-child relationships. CONCLUSIONS The exploration of a small cohort of family dyads in this study demonstrated that children's movement pattern-derived subgroups could not be explained by their parents' subgroup membership. These results cannot be generalised to the CDLBP population due to this study's small sample. Larger studies are needed to further elucidate this issue.
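Fisher's exact test, used above to evaluate parent-child subgroup association, can be sketched with the standard library alone. The 2x2 counts below are hypothetical and are not the paper's data.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins whose probability does not exceed that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):  # P(top-left cell == x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)   # smallest feasible top-left cell
    hi = min(col1, row1)       # largest feasible top-left cell
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: parents' subgroup (rows) by children's subgroup (columns)
p = fisher_exact_two_sided(5, 2, 3, 6)
print(round(p, 3))
```

With such small cell counts the test is typically non-significant, which mirrors the paper's conclusion that a small dyad sample limits what can be detected.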
Affiliation(s)
- Joao Paulo Caneiro
- School of Physiotherapy and Exercise Science, Faculty of Health Science, Curtin University of Technology, GPO Box U1987, Perth, Western Australia, 6845, Australia.
- Céline Labie
- School of Physiotherapy and Exercise Science, Faculty of Health Science, Curtin University of Technology, GPO Box U1987, Perth, Western Australia, 6845, Australia.
- Emma Sulley
- School of Physiotherapy and Exercise Science, Faculty of Health Science, Curtin University of Technology, GPO Box U1987, Perth, Western Australia, 6845, Australia.
- Andrew M Briggs
- School of Physiotherapy and Exercise Science, Faculty of Health Science, Curtin University of Technology, GPO Box U1987, Perth, Western Australia, 6845, Australia; Arthritis and Osteoporosis Victoria, Australia.
- Leon M Straker
- School of Physiotherapy and Exercise Science, Faculty of Health Science, Curtin University of Technology, GPO Box U1987, Perth, Western Australia, 6845, Australia.
- Angus F Burnett
- ASPETAR, Qatar Orthopaedic and Sports Medicine Hospital, PO Box 29222, Doha, Qatar; School of Exercise and Health Sciences, Edith Cowan University, Joondalup, Western Australia, Australia.
- Peter B O'Sullivan
- School of Physiotherapy and Exercise Science, Faculty of Health Science, Curtin University of Technology, GPO Box U1987, Perth, Western Australia, 6845, Australia.
29
de Borst AW, de Gelder B. Is it the real deal? Perception of virtual characters versus humans: an affective cognitive neuroscience perspective. Front Psychol 2015; 6:576. [PMID: 26029133 PMCID: PMC4428060 DOI: 10.3389/fpsyg.2015.00576] [Citation(s) in RCA: 47] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2014] [Accepted: 04/20/2015] [Indexed: 01/30/2023] Open
Abstract
Recent developments in neuroimaging research support the increased use of naturalistic stimulus material such as film, avatars, or androids. These stimuli allow for a better understanding of how the brain processes information in complex situations while maintaining experimental control. While avatars and androids are well suited to study human cognition, they should not be equated with human stimuli. For example, the uncanny valley hypothesis theorizes that artificial agents with high human-likeness may evoke feelings of eeriness in the human observer. Here we review if, when, and how the perception of human-like avatars and androids differs from the perception of humans and consider how this influences their utilization as stimulus material in social and affective neuroimaging studies. First, we discuss how the appearance of virtual characters affects perception. When stimuli are morphed across categories from non-human to human, the most ambiguous stimuli, rather than the most human-like stimuli, show prolonged classification times and increased eeriness. Human-like to human stimuli show a positive linear relationship with familiarity. Second, we show that expressions of emotion in human-like avatars can be perceived similarly to human emotions, with corresponding behavioral, physiological, and neuronal activations, with the exception of physical dissimilarities. Subsequently, we consider if and when one perceives differences in action representation by artificial agents versus humans. Motor resonance and predictive coding models may account for empirical findings, such as an interference effect on action for observed human-like, naturally moving characters. However, the expansion of these models to explain more complex behavior, such as empathy, still needs to be investigated in more detail. Finally, we broaden our outlook to social interaction, where virtual reality stimuli can be utilized to imitate complex social situations.
Affiliation(s)
- Aline W de Borst
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Beatrice de Gelder
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
30
Zacharatos H, Gatzoulis C, Chrysanthou YL. Automatic emotion recognition based on body movement analysis: a survey. IEEE COMPUTER GRAPHICS AND APPLICATIONS 2014; 34:35-45. [PMID: 25216477 DOI: 10.1109/mcg.2014.106] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Humans are emotional beings, and their feelings influence how they perform and interact with computers. One of the most expressive modalities for humans is body posture and movement, which researchers have recently started exploiting for emotion recognition. This survey describes emerging techniques and modalities related to emotion recognition based on body movement, as well as recent advances in automatic emotion recognition. It also describes application areas and notation systems and explains the importance of movement segmentation. It then discusses unsolved problems and provides promising directions for future research. The Web extra (a PDF file) contains tables with additional information related to the article.
31
Huis In 't Veld EMJ, van Boxtel GJM, de Gelder B. The Body Action Coding System II: muscle activations during the perception and expression of emotion. Front Behav Neurosci 2014; 8:330. [PMID: 25294993 PMCID: PMC4172051 DOI: 10.3389/fnbeh.2014.00330] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/02/2014] [Accepted: 09/03/2014] [Indexed: 11/13/2022] Open
Abstract
Research into the expression and perception of emotions has mostly focused on facial expressions. Recently, body postures have become increasingly important in research, but knowledge of muscle activity during the perception or expression of emotion is lacking. The current study continues the development of the Body Action Coding System (BACS), initiated in a previous study that described the involvement of muscles in the neck, shoulders, and arms during the expression of fear and anger. The current study expands the BACS by assessing the activity patterns of three additional muscles. Surface electromyography of muscles in the neck (upper trapezius descendens), forearms (extensor carpi ulnaris), lower back (erector spinae longissimus) and calves (peroneus longus) was measured during active expression and passive viewing of fearful and angry body expressions. The muscles in the forearm were strongly active during anger expression and, to a lesser extent, during fear expression. In contrast, muscles in the calves were recruited slightly more for fearful expressions. It was also found that muscles automatically responded to the perception of emotion, without any overt movement. The observer's forearms responded to the perception of fear, while the muscles used for leaning backwards were activated when faced with an angry adversary. Lastly, the calf responded immediately when a fearful person was seen, but responded more slowly to anger. There is increasing interest in developing systems that are able to create or recognize emotional body language for the development of avatars, robots, and online environments. To that end, multiple coding systems have been developed that can either interpret or create bodily expressions based on static postures, motion capture data, or videos. However, the BACS is the first coding system based on muscle activity.
Affiliation(s)
- Elisabeth M J Huis In 't Veld
- Brain and Emotion Laboratory, Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands
- Geert J M van Boxtel
- Department of Cognitive Neuropsychology, Tilburg University, Tilburg, Netherlands
- Beatrice de Gelder
- Brain and Emotion Laboratory, Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands; Brain and Emotion Laboratory, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
32
Volkova EP, Mohler BJ, Dodds TJ, Tesch J, Bülthoff HH. Emotion categorization of body expressions in narrative scenarios. Front Psychol 2014; 5:623. [PMID: 25071623 PMCID: PMC4075474 DOI: 10.3389/fpsyg.2014.00623] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2014] [Accepted: 06/02/2014] [Indexed: 11/13/2022] Open
Abstract
Humans can recognize emotions expressed through body motion with high accuracy, even when the stimuli are impoverished. However, most of the research on body motion has relied on exaggerated displays of emotions. In this paper we present two experiments in which we investigated whether emotional body expressions could be recognized when they were recorded during natural narration. Our actors were free to use their entire body, face, and voice to express emotions, but our resulting visual stimuli used only the upper body motion trajectories in the form of animated stick figures. Observers were asked to perform an emotion recognition task on short motion sequences using a large and balanced set of emotions (amusement, joy, pride, relief, surprise, anger, disgust, fear, sadness, shame, and neutral). Even with only upper body motion available, our results show recognition accuracy significantly above chance level and high consistency rates among observers. In our first experiment, which used a more classic emotion-induction setup, all emotions were well recognized. In the second study, which employed narrations, four basic emotion categories (joy, anger, fear, and sadness), three non-basic emotion categories (amusement, pride, and shame) and the "neutral" category were recognized above chance. Interestingly, especially in the second experiment, observers showed a bias toward anger when recognizing the motion sequences. We discovered that similarities between motion sequences across emotions along such properties as mean motion speed, number of peaks in the motion trajectory, and mean motion span can explain a large percentage of the variation in observers' responses. Overall, our results show that upper body motion is informative for emotion recognition in narrative scenarios.
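The motion properties named above (mean motion speed, number of peaks, motion span) are straightforward to compute from a 2D joint trajectory. A minimal sketch on a made-up trajectory; this is not the study's data or pipeline, and the feature definitions are plausible readings of the terms rather than the authors' exact ones.

```python
import math

def mean_speed(points):
    """Average Euclidean distance travelled per frame for one (x, y) trajectory."""
    steps = [math.dist(p, q) for p, q in zip(points, points[1:])]
    return sum(steps) / len(steps)

def count_peaks(values):
    """Number of strict local maxima in a 1D signal (e.g. per-frame speed)."""
    return sum(1 for a, b, c in zip(values, values[1:], values[2:]) if b > a and b > c)

def motion_span(points):
    """Diagonal of the bounding box covered by the trajectory."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return math.hypot(max(xs) - min(xs), max(ys) - min(ys))

# Hypothetical wrist trajectory: a slow drift in x with an oscillation in y
traj = [(t * 0.1, math.sin(t * 0.5)) for t in range(25)]
speeds = [math.dist(p, q) for p, q in zip(traj, traj[1:])]
print(mean_speed(traj), count_peaks(speeds), motion_span(traj))
```

Features like these, computed per sequence, are the kind of predictors one could regress observers' responses on to estimate how much of the response variation simple kinematics explain.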
Affiliation(s)
- Ekaterina P Volkova
- Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany; Graduate School of Neural and Behavioural Sciences, Tübingen, Germany
- Betty J Mohler
- Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Trevor J Dodds
- Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Joachim Tesch
- Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Heinrich H Bülthoff
- Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany; Department of Brain and Cognitive Engineering, College of Information and Communication, Korea University, Seoul, Korea