1
Mestre-Sansó F, Canals V, Montoya P, Riquelme I. Combination of motor, sensory and affective tasks in an EEG paradigm for children with developmental disabilities. MethodsX 2024; 13:102997. PMID: 39498122; PMCID: PMC11532964; DOI: 10.1016/j.mex.2024.102997.
Abstract
Individuals with neurodevelopmental disorders exhibit overlapping emotional, somatosensory and motor deficits. Although the brain processes underlying these impairments have been extensively studied separately, how the brain integrates these inputs is an innovative line of research. Here we present a new EEG methodology for exploring the interactive brain processing of sensorimotor and affective stimuli. The task consists of presenting affective stimuli of different modalities (e.g. affective pictures, affective touch) while an arthromotor simultaneously performs passive joint movements, unseen by the participant. Participants were then required to press one of two buttons to indicate whether their joint position matched a picture shown on a screen. Pilot electroencephalography recordings revealed distinct somatosensory event-related potentials (SEPs) when movement followed affective stimuli rather than neutral stimuli, as well as a differentiation of SEPs across neurodevelopmental conditions. Behavioral responses further showed that children with cerebral palsy made more errors in identifying their hand position when exposed to affective stimuli. This paradigm is a valuable tool for exploring the modulatory influence of emotion on sensorimotor brain processing in populations with joint emotional and sensorimotor impairments, such as children with neurodevelopmental disorders or patients with stroke.
• This method allows exploring the interaction between affective and sensorimotor inputs in an EEG paradigm.
Affiliation(s)
- Francesc Mestre-Sansó
- Industrial Engineering and Construction Department, University of the Balearic Islands, 07122 Palma, Spain
- Vicent Canals
- Industrial Engineering and Construction Department, University of the Balearic Islands, 07122 Palma, Spain
- Pedro Montoya
- Research Institute on Health Sciences (IUNICS-IdISBa), University of the Balearic Islands, 07122 Palma, Spain
- Health Research Institute of the Balearic Islands (IdISBa), 07010 Palma, Spain
- Inmaculada Riquelme
- Research Institute on Health Sciences (IUNICS-IdISBa), University of the Balearic Islands, 07122 Palma, Spain
- Health Research Institute of the Balearic Islands (IdISBa), 07010 Palma, Spain
- Department of Nursing and Physiotherapy, University of the Balearic Islands, 07122 Palma, Spain
2
Zhang M, Zhou Y, Xu X, Ren Z, Zhang Y, Liu S, Luo W. Multi-view emotional expressions dataset using 2D pose estimation. Sci Data 2023; 10:649. PMID: 37739952; PMCID: PMC10516935; DOI: 10.1038/s41597-023-02551-y.
Abstract
Human body expressions convey emotional shifts and intentions of action and, in some cases, are even more effective than other models of emotional expression. Although many body-expression datasets incorporating motion capture are available, widely distributed datasets of naturalistic body expressions based on 2D video remain scarce. In this paper, therefore, we report the multi-view emotional expressions dataset (MEED) using 2D pose estimation. Twenty-two actors presented six emotional (anger, disgust, fear, happiness, sadness, surprise) and neutral body movements from three viewpoints (left, front, right), and a total of 4102 videos were captured. MEED consists of the corresponding pose estimation results (i.e., 397,809 PNG files and 397,809 JSON files) and exceeds 150 GB in size. We believe this dataset will benefit research in various fields, including affective computing, human-computer interaction, social neuroscience, and psychiatry.
Affiliation(s)
- Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yanan Zhou
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Xinye Xu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Ziwei Ren
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yihan Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shenglan Liu
- School of Innovation and Entrepreneurship, Dalian University of Technology, Dalian, 116024, Liaoning, China
- Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024, Liaoning, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, Liaoning, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
3
Zhang M, Yu L, Zhang K, Du B, Zhan B, Jia S, Chen S, Han F, Li Y, Liu S, Yi X, Liu S, Luo W. Construction and validation of the Dalian emotional movement open-source set (DEMOS). Behav Res Methods 2023; 55:2353-2366. PMID: 35931937; DOI: 10.3758/s13428-022-01887-4.
Abstract
Human body movements are important for emotion recognition and social communication and have received extensive attention from researchers. In this field, emotional biological motion stimuli, as depicted by point-light displays, are widely used. However, existing stimulus libraries are small and lack standardized indicators, which limits the design and conduct of experiments. Therefore, based on our prior kinematic dataset, we constructed the Dalian Emotional Movement Open-source Set (DEMOS) using computational modeling. DEMOS offers three views (frontal 0°, left 45°, and left 90°) and comprises 2664 high-quality videos of emotional biological motion, each displaying happiness, sadness, anger, fear, disgust, or a neutral state. All stimuli were validated in terms of recognition accuracy, emotional intensity, and subjective movement; the objective movement for each expression was also calculated. DEMOS can be downloaded for free from https://osf.io/83fst/. To our knowledge, this is the largest multi-view emotional biological motion set based on the whole body. DEMOS can be applied in many fields, including affective computing, social cognition, and psychiatry.
Affiliation(s)
- Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Lu Yu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Keye Zhang
- School of Social and Behavioral Sciences, Nanjing University, Nanjing, 210023, China
- Bixuan Du
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Bin Zhan
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Shuxin Jia
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shaohua Chen
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Fengxu Han
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yiwen Li
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shuaicheng Liu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Xi Yi
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shenglan Liu
- School of Innovation and Entrepreneurship, Dalian University of Technology, Dalian, 116024, China
- Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
4
Troncoso A, Soto V, Gomila A, Martínez-Pernía D. Moving beyond the lab: investigating empathy through the Empirical 5E approach. Front Psychol 2023; 14:1119469. PMID: 37519389; PMCID: PMC10374225; DOI: 10.3389/fpsyg.2023.1119469.
Abstract
Empathy is a complex and multifaceted phenomenon that plays a crucial role in human social interactions. Recent developments in social neuroscience have provided valuable insights into the neural underpinnings and bodily mechanisms of empathy, using methods that prioritize precision, replicability, internal validity, and confound control. However, the complexity of empathy cannot be fully understood by relying solely on artificial, controlled laboratory settings while overlooking a comprehensive, ecological view of the phenomenon. In this article, we propose an integrative theoretical and methodological framework based on the 5E approach (the "E"s stand for the embodied, embedded, enacted, emotional, and extended perspectives of empathy), highlighting the relevance of studying empathy as an active interaction between embodied agents embedded in a shared real-world environment. In addition, we illustrate how a novel multimodal approach, combining mobile brain and body imaging (MoBI) with phenomenological methods and interactive paradigms implemented in natural contexts, is well suited to studying empathy from the 5E perspective. In doing so, we present the Empirical 5E approach (E5E) as an integrative scientific framework bridging brain/body and phenomenological attributes in an interbody interactive setting. Progressing toward an E5E approach can be crucial to understanding empathy in accordance with the complexity of how it is experienced in the real world.
Affiliation(s)
- Alejandro Troncoso
- Center for Social and Cognitive Neuroscience, School of Psychology, Adolfo Ibáñez University, Santiago, Chile
- Vicente Soto
- Center for Social and Cognitive Neuroscience, School of Psychology, Adolfo Ibáñez University, Santiago, Chile
- Antoni Gomila
- Department of Psychology, University of the Balearic Islands, Palma de Mallorca, Spain
- David Martínez-Pernía
- Center for Social and Cognitive Neuroscience, School of Psychology, Adolfo Ibáñez University, Santiago, Chile
5
Snoek L, Jack RE, Schyns PG, Garrod OG, Mittenbühler M, Chen C, Oosterwijk S, Scholte HS. Testing, explaining, and exploring models of facial expressions of emotions. Sci Adv 2023; 9:eabq8421. PMID: 36763663; PMCID: PMC9916981; DOI: 10.1126/sciadv.abq8421.
Abstract
Models are the hallmark of mature scientific inquiry. In psychology, this maturity has been reached in a pervasive question: what models best represent facial expressions of emotion? Several hypotheses propose different combinations of facial movements [action units (AUs)] as best representing the six basic emotions and four conversational signals across cultures. We developed a new framework to formalize such hypotheses as predictive models, compare their ability to predict human emotion categorizations in Western and East Asian cultures, explain the causal role of individual AUs, and explore updated, culture-accented models that improve performance by reducing a prevalent Western bias. Our predictive models also provide a noise ceiling to inform the explanatory power and limitations of different factors (e.g., AUs and individual differences). Thus, our framework provides a new approach to test models of social signals, explain their predictive power, and explore their optimization, with direct implications for theory development.
Affiliation(s)
- Lukas Snoek
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
- School of Psychology and Neuroscience, University of Glasgow, Glasgow, UK
- Rachael E. Jack
- School of Psychology and Neuroscience, University of Glasgow, Glasgow, UK
- Philippe G. Schyns
- School of Psychology and Neuroscience, University of Glasgow, Glasgow, UK
- Maximilian Mittenbühler
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
- Department of Computer Science, University of Tübingen, Tübingen, Germany
- Chaona Chen
- School of Psychology and Neuroscience, University of Glasgow, Glasgow, UK
- Suzanne Oosterwijk
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
- H. Steven Scholte
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
6
Smith RA, Cross ES. The McNorm library: creating and validating a new library of emotionally expressive whole body dance movements. Psychol Res 2023; 87:484-508. PMID: 35385989; PMCID: PMC8985749; DOI: 10.1007/s00426-022-01669-9.
Abstract
The ability to exchange affective cues with others plays a key role in our ability to create and maintain meaningful social relationships. We express our emotions through a variety of socially salient cues, including facial expressions, the voice, and body movement. While significant advances have been made in our understanding of verbal and facial communication, understanding of the role played by human body movement in our social interactions remains incomplete. To this end, here we describe the creation and validation of a new set of emotionally expressive whole-body dance movement stimuli, named the Motion Capture Norming (McNorm) Library, designed to address a number of limitations associated with previous movement stimuli. This library comprises a series of point-light representations of a dancer's movements, which were performed to communicate neutrality, happiness, sadness, anger, and fear to observers. Based on results from two validation experiments, participants could reliably discriminate the intended emotion expressed in the clips in this stimulus set, with accuracy rates up to 60% (chance = 20%). We further explored the impact of dance experience and trait empathy on emotion recognition and found that neither significantly affected emotion discrimination. As all materials for presenting and analysing this movement library are openly available, we hope this resource will aid other researchers in further exploring affective communication expressed by human bodily movement.
Affiliation(s)
- Rebecca A. Smith
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland
- Emily S. Cross
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland
- Department of Cognitive Science, Macquarie University, Sydney, Australia
7
Botta A, Lagravinese G, Bove M, Pelosin E, Bonassi G, Avenanti A, Avanzino L. Sensorimotor inhibition during emotional processing. Sci Rep 2022; 12:6998. PMID: 35488018; PMCID: PMC9054825; DOI: 10.1038/s41598-022-10981-8.
Abstract
Visual processing of emotional stimuli has been shown to engage complex cortical and subcortical networks, but it is still unclear how it affects sensorimotor integration processes. To fill this gap, we used a TMS protocol, short-latency afferent inhibition (SAI), which captures sensorimotor interactions, while healthy participants observed emotional body language (EBL) and International Affective Picture System (IAPS) stimuli. Participants were presented with emotional (fear- and happiness-related) or non-emotional (neutral) EBL and IAPS stimuli while SAI was tested at 120 ms and 300 ms after picture presentation. At the earlier time point (120 ms), fear-related EBL and IAPS stimuli selectively enhanced SAI, as indexed by the greater inhibitory effect of somatosensory afferents on motor excitability. Larger early SAI enhancement was associated with lower scores on the Behavioural Inhibition Scale (BIS). At the later time point (300 ms), we found a generalized SAI decrease for all kinds of stimuli (fearful, happy, or neutral). Because the SAI index reflects the integrative activity of cholinergic sensorimotor circuits, our findings suggest greater sensitivity of such circuits during early (120 ms) processing of threat-related information. Moreover, the correlation with BIS scores may indicate increased attention and sensory vigilance in participants with greater anxiety-related dispositions. In conclusion, the results of this study show that sensorimotor inhibition is rapidly enhanced while processing threatening stimuli and that the SAI protocol may be a valuable option for evaluating emotional-motor interactions in physiological and pathological conditions.
Affiliation(s)
- Alessandro Botta
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Viale Benedetto XV/3, 16132 Genoa, Italy
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- Giovanna Lagravinese
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Marco Bove
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Viale Benedetto XV/3, 16132 Genoa, Italy
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Elisa Pelosin
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Gaia Bonassi
- S.C. Medicina Fisica e Riabilitazione Ospedaliera, ASL4, Azienda Sanitaria Locale Chiavarese, Chiavari, Italy
- Alessio Avenanti
- Centro di Neuroscienze Cognitive and Dipartimento di Psicologia, Campus Cesena, Alma Mater Studiorum-University of Bologna, Cesena, Italy
- Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
- Laura Avanzino
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Viale Benedetto XV/3, 16132 Genoa, Italy
- IRCCS Ospedale Policlinico San Martino, Genoa, Italy
8
Scheer C, Kubowitsch S, Dendorfer S, Jansen P. Happy Enough to Relax? How Positive and Negative Emotions Activate Different Muscular Regions in the Back - an Explorative Study. Front Psychol 2021; 12:511746. PMID: 34135791; PMCID: PMC8201496; DOI: 10.3389/fpsyg.2021.511746.
Abstract
Embodiment theories have proposed a reciprocal relationship between emotional state and bodily reactions. Beyond gross body postures, recent studies have found that emotions affect subtler bodily expressions, such as slumped or upright sitting posture. This study investigated back muscle activity as an indicator of the effect of positive and negative emotions on sitting posture. The electromyography (EMG) activity of six back muscles was recorded in 31 healthy subjects during exposure to positive and negative affective pictures, with a resting period as the control condition. Increased muscle activity in the back was found during exposure to negative emotional stimuli, mainly in the lumbar and thoracic regions. The positive emotion condition caused no elevated activity. The findings show that negative emotions lead to increased differential muscle activity in the back and thus corroborate previous findings that emotion affects subtle bodily expressions.
Affiliation(s)
- Clara Scheer
- Institute of Sport Science, Faculty of Humanities, University of Regensburg, Regensburg, Germany
- Simone Kubowitsch
- Laboratory for Biomechanics, Ostbayerische Technische Hochschule Regensburg, Regensburg, Germany
- Sebastian Dendorfer
- Laboratory for Biomechanics, Ostbayerische Technische Hochschule Regensburg, Regensburg, Germany
- Petra Jansen
- Institute of Sport Science, Faculty of Humanities, University of Regensburg, Regensburg, Germany
9
Botta A, Lagravinese G, Bove M, Avenanti A, Avanzino L. Modulation of Response Times During Processing of Emotional Body Language. Front Psychol 2021; 12:616995. PMID: 33716882; PMCID: PMC7947862; DOI: 10.3389/fpsyg.2021.616995.
Abstract
The investigation of how humans perceive and respond to emotional signals conveyed by the human body has long been secondary to the investigation of facial expressions and emotional scene recognition. The aims of this behavioral study were to assess the ability to process emotional body postures and to test whether the motor response is mainly driven by the emotional content of the picture or is influenced by motor resonance. Emotional body postures and scenes (IAPS) divided into three clusters (fear, happiness, and neutral) were shown to 25 healthy subjects (13 males, mean age ± SD: 22.3 ± 1.8 years) in a three-alternative forced choice task. Subjects were asked to recognize the emotional content of the pictures by pressing one of three keys as fast as possible, in order to estimate response times (RTs). Valence and arousal ratings were also collected. We found shorter RTs for fearful body postures compared with happy and neutral postures, whereas no differences across emotional categories were found for the IAPS stimuli. Analyses of valence and arousal and the subsequent item analysis showed excellent reliability of the two sets of images used in the experiment. Our results show that fearful body postures are rapidly recognized and processed, probably thanks to the automatic activation of a series of central nervous system structures orchestrating defensive threat reactions, strengthening and supporting previous neurophysiological and behavioral findings on body language processing.
Affiliation(s)
- Alessandro Botta
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- Giovanna Lagravinese
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal Child Health (DINOGMI), University of Genoa, Genoa, Italy
- IRCCS Policlinico San Martino, Genoa, Italy
- Marco Bove
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
- IRCCS Policlinico San Martino, Genoa, Italy
- Alessio Avenanti
- Centro di Neuroscienze Cognitive and Dipartimento di Psicologia, Campus Cesena, Alma Mater Studiorum-University of Bologna, Cesena, Italy
- Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
- Laura Avanzino
- Department of Experimental Medicine (DIMES), Section of Human Physiology, University of Genoa, Genoa, Italy
- IRCCS Policlinico San Martino, Genoa, Italy
10
Melzer A, Shafir T, Tsachor RP. How Do We Recognize Emotion From Movement? Specific Motor Components Contribute to the Recognition of Each Emotion. Front Psychol 2019; 10:1389. PMID: 31333524; PMCID: PMC6617736; DOI: 10.3389/fpsyg.2019.01389.
Abstract
Are there movement features that are recognized as expressing each basic emotion by most people, and what are they? In our previous study, we identified sets of Laban movement components that, when moved, elicited the basic emotions of anger, sadness, fear, and happiness. Our current study aimed to investigate whether movements composed from those sets would be recognized as expressing those emotions, regardless of any instruction to the mover to portray emotion. Our stimuli included 113 video clips of five Certified Laban Movement Analysts (CMAs) moving combinations of two to four movement components from each set associated with only one emotion: happiness, sadness, fear, or anger. Each three-second clip showed one CMA moving a single combination, using only the combination's required components. Sixty-two physically and mentally healthy men (n = 31) and women (n = 31), ages 19-48, watched the clips and rated the perceived emotion and its intensity. To confirm participants' ability to recognize emotions from movement, and to compare our stimuli to existing validated emotional expression stimuli, participants also rated 50 additional clips of bodily motor expressions of these same emotions validated by Atkinson et al. (2004). For both stimulus types, all emotions were recognized far above chance level. Comparing recognition accuracy across the two clip types revealed better recognition of anger, fear, and neutral emotion from Atkinson's clips of actors expressing emotions, and similar levels of recognition accuracy for happiness and sadness. Further analysis was performed to determine the contribution of specific movement components to the recognition of the studied emotions.
Our results indicated that these specific Laban motor components not only enhance the feeling of the associated emotions when moved, but also contribute to recognition of the associated emotions when observed, even when the mover was not instructed to portray emotion, indicating that the presence of these movement components alone is sufficient for emotion recognition. This research-based knowledge about the relationship between Laban motor components and bodily emotional expression can be used by dance-movement and drama therapists to better understand clients' emotional movements, create appropriate interventions, and enhance communication with other practitioners regarding bodily emotional expression.
Affiliation(s)
- Ayelet Melzer
- Faculty of Social Welfare and Health Sciences, The Graduate School of Creative Arts Therapies, University of Haifa, Haifa, Israel
- Tal Shafir
- The Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Haifa, Israel
11
Bernardet U, Fdili Alaoui S, Studd K, Bradley K, Pasquier P, Schiphorst T. Assessing the reliability of the Laban Movement Analysis system. PLoS One 2019; 14:e0218179. PMID: 31194794; PMCID: PMC6564005; DOI: 10.1371/journal.pone.0218179.
Abstract
The Laban Movement Analysis system (LMA) is a widely used system for the description of human movement. Here we present results of an empirical analysis of the reliability of the LMA system. Firstly, we developed a directed graph-based representation for the formalization of LMA. Secondly, we implemented a custom video annotation tool for stimulus presentation and annotation of the formalized LMA. Using these two elements, we conducted an experimental assessment of LMA reliability. In the experimental assessment of the reliability, experts-Certified Movement Analysts (CMA)-were tasked with identifying the differences between a "neutral" movement and the same movement executed with a specific variation in one of the dimensions of the LMA parameter space. The videos represented variations on the pantomimed movement of knocking at a door or giving directions. To be as close as possible to the annotation practice of CMAs, participants were given full control over the number of times and order in which they viewed the videos. The LMA annotation was captured by means of the video annotation tool that guided the participants through the LMA graph by asking them multiple-choice questions at each node. Participants were asked to first annotate the most salient difference (round 1), and then the second most salient one (round 2) between a neutral and gesture and the variation. To quantify the overall reliability of LMA, we computed Krippendorff's α. The quantitative data shows that the reliability, depending on how the two rounds are integrated, ranges between a weak and an acceptable reliability of LMA. The analysis of viewing behavior showed that, despite relatively large differences at the inter-individual level, there is no simple relationship between viewing behavior and individual performance (quantified as the level of agreement of the individual with the dominant rating). 
This research advances the state of the art in formalizing and implementing a reliability measure for the Laban Movement Analysis system. The experimental study we conducted allowed us to identify some of the strengths and weaknesses of this widely used movement coding system. Additionally, we gained useful insights into the assessment procedure itself.
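The reliability measure named in this abstract, Krippendorff's α, can be sketched for nominal annotations using the standard coincidence-matrix form. This is a generic textbook implementation, not the authors' code; the function name and the units-by-raters data layout are assumptions.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal ratings.

    units: a list of units, each a list of the category labels assigned
    to that unit by its raters (at least two raters per usable unit).
    Assumes at least two distinct categories occur overall."""
    o = Counter()    # coincidence matrix o[(c, k)] over ordered label pairs
    n_c = Counter()  # marginal count of each category
    n = 0            # total number of pairable ratings
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue  # a unit with a single rating contributes no pairs
        n += m
        for v in ratings:
            n_c[v] += 1
        for a, b in permutations(range(m), 2):
            o[(ratings[a], ratings[b])] += 1.0 / (m - 1)
    # observed vs. expected disagreement (off-diagonal mass)
    d_o = sum(v for (c, k), v in o.items() if c != k)
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 - d_o / d_e
```

Perfect agreement yields α = 1.0, and systematic disagreement drives α below zero, which is the behaviour the abstract's "weak to acceptable" range is measured against.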
Affiliation(s)
- Ulysses Bernardet, School of Engineering and Applied Science, Aston University, Birmingham, United Kingdom
- Sarah Fdili Alaoui, Université Paris-Sud, CNRS, Inria, Université Paris-Saclay, Orsay, France
- Karen Studd, Laban/Bartenieff Institute of Movement Studies, New York, New York, United States of America
- Karen Bradley, University of Maryland, College Park, Maryland, United States of America
- Philippe Pasquier, School of Interactive Arts and Technology, Simon Fraser University, Vancouver, Canada
- Thecla Schiphorst, School of Interactive Arts and Technology, Simon Fraser University, Vancouver, Canada
12
Tsachor RP, Shafir T. How Shall I Count the Ways? A Method for Quantifying the Qualitative Aspects of Unscripted Movement With Laban Movement Analysis. Front Psychol 2019; 10:572. [PMID: 31001158 PMCID: PMC6455080 DOI: 10.3389/fpsyg.2019.00572] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2018] [Accepted: 02/28/2019] [Indexed: 12/30/2022] Open
Abstract
There is significant clinical evidence showing that creative and expressive movement processes involved in dance/movement therapy (DMT) enhance psycho-social well-being. Yet, because movement is a complex phenomenon with multiple, overlapping variables appearing in unique patterns in different individuals and situations, statistically validating which aspects of movement change during interventions or lead to significant positive therapeutic outcomes is challenging. One factor contributing to the therapeutic effects of DMT is movement's effect on clients' emotional states. Our previous study identified sets of movement variables which, when executed, enhanced specific emotions. In this paper, we describe how we selected movement variables for statistical analysis in that study, using a multi-stage methodology to identify, reduce, code, and quantify the multitude of variables present in unscripted movement. We suggest a set of procedures for using Laban Movement Analysis (LMA)-described movement variables as research data. Our study used LMA, an internationally accepted comprehensive system for movement analysis and a primary DMT clinical assessment tool for describing movement. We began with Davis's (1970) three-step protocol for analyzing movement patterns and identifying the most important variables: (1) We repeatedly observed video samples of validated (Atkinson et al., 2004) emotional expressions to identify prevalent movement variables, eliminating variables that appeared minimally or were absent. (2) We used the criteria of repetition, frequency, duration, and emphasis to eliminate additional variables. (3) For each emotion, we analyzed variations in motor expression to discover how variables cluster: first, by observing ten movement samples of each emotion to identify variables common to all samples; second, by qualitative analysis of the two best-recognized samples to determine whether phrasing, duration, or relationships among variables were significant.
We added three new steps to this protocol: (4) we created Motifs (LMA symbols) combining movement variables extracted in steps 1-3; (5) we asked participants in the pilot study to move these combinations and quantify their emotional experience. Based on the results of the pilot study, we eliminated more variables; (6) we quantified the remaining variables' prevalence in each Motif for statistical analysis that examined which variables enhanced each emotion. We posit that our method successfully quantified unscripted movement data for statistical analysis.
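The variable-reduction steps in this protocol (keeping variables common to all samples of an emotion, eliminating those that appear minimally) amount to a prevalence filter. The sketch below is an illustrative reading of those steps, not the authors' procedure; the function name, threshold parameter, and example variable names are made up.

```python
from collections import Counter

def prevalent_variables(samples, min_prevalence=1.0):
    """Filter movement variables by prevalence across observed samples.

    samples: one set of observed variable names per movement sample
    of a given emotion.
    min_prevalence=1.0 keeps only variables common to every sample;
    lower thresholds drop only rarely appearing variables."""
    counts = Counter()
    for observed in samples:
        counts.update(observed)
    n = len(samples)
    return {v for v, c in counts.items() if c / n >= min_prevalence}
```

Running the filter twice, once strictly (common to all samples) and once with a looser frequency cutoff, mirrors the two elimination passes described in steps (1) and (2).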
Affiliation(s)
- Tal Shafir, The Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Haifa, Israel; Department of Psychiatry, University of Michigan, Ann Arbor, MI, United States
13
Vetter P, Badde S, Phelps EA, Carrasco M. Emotional faces guide the eyes in the absence of awareness. eLife 2019; 8:43467. [PMID: 30735123 PMCID: PMC6382349 DOI: 10.7554/elife.43467] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2018] [Accepted: 02/07/2019] [Indexed: 12/14/2022] Open
Abstract
The ability to respond quickly to a threat is a key survival skill. When perceived with awareness, threat-related emotional information, such as an angry or fearful face, not only has perceptual advantages but also guides rapid actions such as eye movements. Emotional information that is suppressed from awareness still confers perceptual and attentional benefits. However, it is unknown whether suppressed emotional information can directly guide actions, or whether emotional information has to enter awareness to do so. We suppressed emotional faces from awareness using continuous flash suppression and tracked eye gaze position. Under successful suppression, as indicated by objective and subjective measures, gaze moved towards fearful faces but away from angry faces. Our findings reveal that: (1) threat-related emotional stimuli can guide eye movements in the absence of visual awareness; (2) threat-related emotional face information guides distinct oculomotor actions depending on the type of threat conveyed by the emotional expression.
Affiliation(s)
- Petra Vetter, Department of Psychology, Center for Neural Science, New York University, New York, United States; Department of Psychology, Royal Holloway, University of London, Egham, United Kingdom
- Stephanie Badde, Department of Psychology, Center for Neural Science, New York University, New York, United States
- Elizabeth A Phelps, Department of Psychology, Center for Neural Science, New York University, New York, United States; Department of Psychology, Harvard University, Cambridge, United States
- Marisa Carrasco, Department of Psychology, Center for Neural Science, New York University, New York, United States
14
Plusquellec P, Denault V. The 1000 Most Cited Papers on Visible Nonverbal Behavior: A Bibliometric Analysis. JOURNAL OF NONVERBAL BEHAVIOR 2018. [DOI: 10.1007/s10919-018-0280-9] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/30/2023]
15
Coverage of Emotion Recognition for Common Wearable Biosensors. BIOSENSORS-BASEL 2018; 8:bios8020030. [PMID: 29587375 PMCID: PMC6023004 DOI: 10.3390/bios8020030] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/20/2018] [Revised: 03/16/2018] [Accepted: 03/22/2018] [Indexed: 11/21/2022]
Abstract
The present research proposes a novel emotion recognition framework for the computer prediction of human emotions using common wearable biosensors. Emotional perception promotes specific patterns of biological responses in the human body, and this can be sensed and used to predict emotions using only biomedical measurements. Based on theoretical and empirical psychophysiological research, the foundation of autonomic specificity facilitates the establishment of a strong background for recognising human emotions using machine learning on physiological patterning. However, a systematic way of choosing the physiological data covering the elicited emotional responses for recognising the target emotions is not obvious. The current study demonstrates through experimental measurements the coverage of emotion recognition using common off-the-shelf wearable biosensors based on the synchronisation between audiovisual stimuli and the corresponding physiological responses. The work forms the basis of validating the hypothesis for emotional state recognition in the literature and presents coverage of the use of common wearable biosensors coupled with a novel preprocessing algorithm to demonstrate the practical prediction of the emotional states of wearers.
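The synchronisation between audiovisual stimuli and physiological responses described in this abstract implies slicing each biosensor recording into stimulus-locked windows. The sketch below shows that generic epoching step only; the function name, sampling-rate framing, and the drop-incomplete-window policy are assumptions, not details from the paper.

```python
def epoch_signal(signal, fs, onsets_s, win_s):
    """Slice a sampled physiological signal into stimulus-locked windows.

    signal: sequence of samples; fs: sampling rate in Hz;
    onsets_s: stimulus onset times in seconds; win_s: window length
    in seconds. Windows that would run past the end of the recording
    are dropped."""
    n = int(win_s * fs)
    epochs = []
    for t in onsets_s:
        start = int(t * fs)
        if start + n <= len(signal):
            epochs.append(signal[start:start + n])
    return epochs
```

Each returned window can then be reduced to features (mean heart rate, skin-conductance peaks, and so on) and paired with the emotion label of the stimulus shown at that onset.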
16
Aung MSH, Kaltwang S, Romera-Paredes B, Martinez B, Singh A, Cella M, Valstar M, Meng H, Kemp A, Shafizadeh M, Elkins AC, Kanakam N, de Rothschild A, Tyler N, Watson PJ, de C Williams AC, Pantic M, Bianchi-Berthouze N. The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset. IEEE TRANSACTIONS ON AFFECTIVE COMPUTING 2016; 7:435-451. [PMID: 30906508 PMCID: PMC6430129 DOI: 10.1109/taffc.2015.2462830] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Pain-related emotions are a major barrier to effective self-rehabilitation in chronic pain. Automated coaching systems capable of detecting these emotions are a potential solution. This paper lays the foundation for the development of such systems by making three contributions. First, through literature reviews, it provides an overview of how pain is expressed in chronic pain and the motivation for detecting it in physical rehabilitation. Second, it supplies a fully labelled multimodal dataset (named 'EmoPain') containing high-resolution multiple-view face videos, head-mounted and room audio signals, full-body 3D motion capture, and electromyographic signals from back muscles. Natural, unconstrained pain-related facial expressions and body movement behaviours were elicited from people with chronic pain carrying out physical exercises. Both instructed and non-instructed exercises were considered, to reflect traditional scenarios of physiotherapist-directed therapy and home-based self-directed therapy. Two sets of labels were assigned: the level of pain from facial expressions, annotated by eight raters, and the occurrence of six pain-related body behaviours, segmented by four experts. Third, through exploratory experiments grounded in the data, the factors and challenges in the automated recognition of such expressions and behaviours are described. The paper concludes by discussing potential avenues in the context of these findings, also highlighting differences between the two exercise scenarios addressed.
Affiliation(s)
- Min S H Aung, UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Sebastian Kaltwang, Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Brais Martinez, Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Aneesha Singh, UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Matteo Cella, Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Michel Valstar, Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Hongying Meng, UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Andrew Kemp, Physiotherapy Department, Maidstone & Tunbridge Wells NHS Trust, TN2 4QJ, United Kingdom
- Mohsen Shafizadeh, UCL Interaction Centre, University College London, London WC1E 6BT, United Kingdom
- Aaron C Elkins, Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
- Natalie Kanakam, Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Amschel de Rothschild, Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Nick Tyler, Department of Civil, Environmental & Geomatic Engineering, University College London, London WC1E 6BT, United Kingdom
- Paul J Watson, Department of Health Sciences, University of Leicester, Leicester LE5 7PW, United Kingdom
- Amanda C de C Williams, Department of Clinical, Educational & Health Psychology, University College London, London WC1E 6BT, United Kingdom
- Maja Pantic, Department of Computing, Imperial College London, London SW7 2AZ, United Kingdom
17
de Borst AW, de Gelder B. Clear signals or mixed messages: inter-individual emotion congruency modulates brain activity underlying affective body perception. Soc Cogn Affect Neurosci 2016; 11:1299-309. [PMID: 27025242 DOI: 10.1093/scan/nsw039] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2015] [Accepted: 03/17/2016] [Indexed: 11/12/2022] Open
Abstract
The neural basis of emotion perception has mostly been investigated with single face or body stimuli. However, in daily life one may also encounter affective expressions by groups, e.g. an angry mob or an exhilarated concert crowd. In what way is brain activity modulated when several individuals express similar rather than different emotions? We investigated this question using an experimental design in which we presented two stimuli simultaneously, with the same or different emotional expressions. We hypothesized that, in the case of two same-emotion stimuli, brain activity would be enhanced, while in the case of two different emotions, one emotion would interfere with the effect of the other. The results showed that the simultaneous perception of different affective body expressions leads to a deactivation of the amygdala and a reduction of cortical activity. The processing of fearful bodies, compared with different-emotion bodies, relied more strongly on saliency and action-triggering regions in the inferior parietal lobe and insula, while happy bodies drove the occipito-temporal cortex more strongly. We showed that this design can be used to uncover important differences between the brain networks underlying fearful and happy emotions. The enhancement of brain activity for unambiguous affective signals expressed by several people simultaneously supports adaptive behaviour in critical situations.
Affiliation(s)
- A W de Borst, Department of Cognitive Neuroscience, Brain and Emotion Laboratory, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands
- B de Gelder, Department of Cognitive Neuroscience, Brain and Emotion Laboratory, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands; Department of Psychiatry and Mental Health, University of Cape Town, Cape Town, South Africa
18
Shafir T, Tsachor RP, Welch KB. Emotion Regulation through Movement: Unique Sets of Movement Characteristics are Associated with and Enhance Basic Emotions. Front Psychol 2016; 6:2030. [PMID: 26793147 PMCID: PMC4707271 DOI: 10.3389/fpsyg.2015.02030] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2015] [Accepted: 12/21/2015] [Indexed: 11/13/2022] Open
Abstract
We have recently demonstrated that motor execution, observation, and imagery of movements expressing certain emotions can enhance corresponding affective states and therefore could be used for emotion regulation. But which specific movement(s) should one use in order to enhance each emotion? This study aimed to identify, using Laban Movement Analysis (LMA), the Laban motor elements (motor characteristics) that characterize movements whose execution enhances each of the basic emotions: anger, fear, happiness, and sadness. LMA provides a system of symbols describing its motor elements, which gives a written instruction (motif) for the execution of a movement or movement-sequence over time. Six senior LMA experts analyzed a validated set of video clips showing whole body dynamic expressions of anger, fear, happiness and sadness, and identified the motor elements that were common to (appeared in) all clips expressing the same emotion. For each emotion, we created motifs of different combinations of the motor elements common to all clips of the same emotion. Eighty subjects from around the world read and moved those motifs, to identify the emotion evoked when moving each motif and to rate the intensity of the evoked emotion. All subjects together moved and rated 1241 motifs, which were produced from 29 different motor elements. Using logistic regression, we found a set of motor elements associated with each emotion which, when moved, predicted the feeling of that emotion. Each emotion was predicted by a unique set of motor elements and each motor element predicted only one emotion. Knowledge of which specific motor elements enhance specific emotions can enable emotional self-regulation through adding some desired motor qualities to one's personal everyday movements (rather than mimicking others' specific movements) and through decreasing motor behaviors which include elements that enhance negative emotions.
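The logistic-regression step described above, predicting a reported emotion from the presence or absence of motor elements in a moved motif, can be sketched as below. This is a generic gradient-descent fit on made-up binary features, not the study's model, data, or software; all names and values are illustrative.

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit logistic regression by stochastic gradient descent.

    X: list of binary feature vectors (motor element present = 1, absent = 0);
    y: 1 if the mover reported the target emotion for that motif, else 0."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            err = p - yi                     # gradient of the log loss
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, x):
    """Probability that a motif with features x evokes the target emotion."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))
```

In this framing, a large positive learned weight on a motor element corresponds to the paper's finding that the element predicts (enhances) the emotion, while near-zero weights correspond to elements that carry no predictive signal.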
Affiliation(s)
- Tal Shafir, The Graduate School of Creative Arts Therapies, Faculty of Social Welfare and Health Sciences, University of Haifa, Haifa, Israel; The Department of Psychiatry, University of Michigan, Ann Arbor, MI, USA
- Rachelle P Tsachor, Department of Theatre, School of Theatre and Music, University of Illinois at Chicago, Chicago, IL, USA
- Kathleen B Welch, Center for Statistical Consultation and Research, University of Michigan, Ann Arbor, MI, USA
19
de Borst AW, de Gelder B. Is it the real deal? Perception of virtual characters versus humans: an affective cognitive neuroscience perspective. Front Psychol 2015; 6:576. [PMID: 26029133 PMCID: PMC4428060 DOI: 10.3389/fpsyg.2015.00576] [Citation(s) in RCA: 47] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2014] [Accepted: 04/20/2015] [Indexed: 01/30/2023] Open
Abstract
Recent developments in neuroimaging research support the increased use of naturalistic stimulus material such as film, avatars, or androids. These stimuli allow for a better understanding of how the brain processes information in complex situations while maintaining experimental control. While avatars and androids are well suited to study human cognition, they should not be equated with human stimuli. For example, the uncanny valley hypothesis theorizes that artificial agents with high human-likeness may evoke feelings of eeriness in the human observer. Here we review if, when, and how the perception of human-like avatars and androids differs from the perception of humans and consider how this influences their utilization as stimulus material in social and affective neuroimaging studies. First, we discuss how the appearance of virtual characters affects perception. When stimuli are morphed across categories from non-human to human, the most ambiguous stimuli, rather than the most human-like stimuli, show prolonged classification times and increased eeriness. Human-like to human stimuli show a positive linear relationship with familiarity. Second, we show that expressions of emotions in human-like avatars can be perceived similarly to human emotions, with corresponding behavioral, physiological, and neuronal activations, with the exception of physical dissimilarities. Subsequently, we consider if and when one perceives differences in action representation by artificial agents versus humans. Motor resonance and predictive coding models may account for empirical findings, such as an interference effect on action for observed human-like, naturally moving characters. However, the expansion of these models to explain more complex behavior, such as empathy, still needs to be investigated in more detail. Finally, we broaden our outlook to social interaction, where virtual reality stimuli can be utilized to imitate complex social situations.
Affiliation(s)
- Aline W de Borst, Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Beatrice de Gelder, Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands