1
Ngai HHT, Jin J. Emotion-Guided Attention Impacts Deliberate Multi-Evidence Emotion-Related Perceptual Decision-Making. Psychophysiology 2025; 62:e70059. [PMID: 40289354 PMCID: PMC12034915 DOI: 10.1111/psyp.70059] [Received: 12/05/2024] [Revised: 04/04/2025] [Accepted: 04/04/2025] [Indexed: 04/30/2025]
Abstract
Emotion-guided endogenous attention (e.g., attending to fear) may play a crucial role in determining how humans integrate emotional evidence from various sources when assessing the general emotional tenor of the environment. For instance, which emotion a presenter focuses on can shape their perception of the overall emotion of the room. While there is increasing interest in understanding how endogenous attention affects emotion perception, existing studies have largely focused on single-stimulus perception. There is limited understanding of how endogenous attention influences emotion evidence integration across multiple sources. To investigate this question, human participants (N = 40) were invited to judge the average emotion across an array of faces ranging from fearful to happy. Endogenous attention was manipulated by instructing participants to decide whether the face array was "fearful or not" (fear attention) or "happy or not" (happy attention). Eye movement results revealed an endogenous attention-induced sampling bias such that participants paid more attention to extreme emotional evidence congruent with the target emotion. Computational modeling revealed that endogenous attention shifted the decision criterion to be more conservative, leading to fewer target-category decisions. These findings reveal the cognitive and computational mechanisms of how endogenous attention shapes the way we gather emotional evidence and make integrative decisions, shedding light on emotion-related decision-making.
Affiliation(s)
- Hilary H. T. Ngai
- Department of Psychology, The University of Hong Kong, Hong Kong SAR, China
- Jingwen Jin
- Department of Psychology, The University of Hong Kong, Hong Kong SAR, China
- State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong SAR, China
2
Pelliet A, Nogueira M, Fagundes C, Capela S, Saraiva F, Pulcu E, Harmer CJ, Murphy SE, Capitão LP. "Invisible Dangers": Unconscious processing of angry vs fearful faces and its relationship to subjective anger. Conscious Cogn 2025; 130:103848. [PMID: 40138766 DOI: 10.1016/j.concog.2025.103848] [Received: 10/08/2024] [Revised: 03/07/2025] [Accepted: 03/09/2025] [Indexed: 03/29/2025]
Abstract
Traditional paradigms for studying the unconscious processing of threatening facial expressions face methodological limitations and have predominantly focused on fear, leaving gaps in our understanding of anger. Additionally, it is unclear how the unconscious perception of anger influences subjective anger experiences. To address this, the current study employed Continuous Flash Suppression (CFS), a robust method for studying unconscious processing, to assess suppression times for angry, fearful and happy facial expressions. Following the administration of CFS, participants underwent an anger induction paradigm, and state anger symptoms were assessed at multiple timepoints. Suppression times for angry faces were compared to those for happy and fearful faces, and their relationship with state anger symptoms post-induction was examined. Results revealed that fearful faces broke suppression significantly faster than happy faces. Anger was slower to break suppression compared to fear, but no significant differences emerged between anger and happiness. In addition, the faster emergence into awareness of fear compared to anger was linked to increased state anger after the induction, indicating that differences in the unconscious processing of these two emotions can potentially influence symptoms of subjective anger. These findings provide new insights into how angry and fearful faces are processed unconsciously, with implications for understanding the cognitive mechanisms underlying subjective anger.
Affiliation(s)
- Anna Pelliet
- Department of Psychiatry, University of Oxford, Oxford, UK
- Marlene Nogueira
- Psychological Neuroscience Laboratory, Psychology Research Centre (CIPsi), School of Psychology, University of Minho, Braga, Portugal
- Catarina Fagundes
- Psychological Neuroscience Laboratory, Psychology Research Centre (CIPsi), School of Psychology, University of Minho, Braga, Portugal
- Susana Capela
- Psychological Neuroscience Laboratory, Psychology Research Centre (CIPsi), School of Psychology, University of Minho, Braga, Portugal
- Fátima Saraiva
- Psychological Neuroscience Laboratory, Psychology Research Centre (CIPsi), School of Psychology, University of Minho, Braga, Portugal
- Erdem Pulcu
- Department of Psychiatry, University of Oxford, Oxford, UK; Oxford Health NHS Trust, Warneford Hospital, Oxford, UK
- Catherine J Harmer
- Department of Psychiatry, University of Oxford, Oxford, UK; Oxford Health NHS Trust, Warneford Hospital, Oxford, UK
- Susannah E Murphy
- Department of Psychiatry, University of Oxford, Oxford, UK; Oxford Health NHS Trust, Warneford Hospital, Oxford, UK
- Liliana P Capitão
- Psychological Neuroscience Laboratory, Psychology Research Centre (CIPsi), School of Psychology, University of Minho, Braga, Portugal
3
Keck J, Bachmann J, Zabicki A, Munzert J, Krüger B. Decoding affect in emotional body language: valence representation in the action observation network. Soc Cogn Affect Neurosci 2025; 20:nsaf021. [PMID: 39953789 PMCID: PMC11879420 DOI: 10.1093/scan/nsaf021] [Received: 05/29/2024] [Revised: 09/17/2024] [Accepted: 02/13/2025] [Indexed: 02/17/2025]
Abstract
Humans are highly adept at inferring emotional states from body movements in social interactions. Nonetheless, it is under debate how this process is facilitated by neural activations across multiple brain regions. The specific contributions of various brain areas to the perception of valence in biological motion remain poorly understood, particularly those within the action observation network (AON) and those involved in processing emotional valence. This study explores which cortical regions involved in processing emotional body language depicted by kinematic stimuli contain valence information, and whether this is reflected either in the magnitude of activation or in distinct activation patterns. Results showed that neural patterns within the AON, notably the inferior parietal lobule (IPL), exhibit a neural geometry that reflects the valence impressions of the observed stimuli. However, the representational geometry of valence-sensitive areas mirrors these impressions to a lesser degree. Our findings also reveal that the activation magnitude in both AON and valence-sensitive regions does not correlate with the perceived valence of emotional interactions. Results underscore the critical role of the AON, particularly the IPL, in interpreting the valence of emotional interactions, indicating its essential function in the perception of valence, especially when observing biological movements.
Affiliation(s)
- Johannes Keck
- Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
- Center for Mind, Brain and Behavior (CMBB), Philipps University of Marburg and Justus-Liebig-University Giessen, Marburg 35032, Germany
- Julia Bachmann
- Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
- Adam Zabicki
- Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
- Jörn Munzert
- Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
- Center for Mind, Brain and Behavior (CMBB), Philipps University of Marburg and Justus-Liebig-University Giessen, Marburg 35032, Germany
- Britta Krüger
- Nemolab, Institute of Sports Science, Justus-Liebig-University Giessen, Giessen 35394, Germany
4
Ngai HHT, Hsiao JH, Luhmann CC, Mohanty A, Jin J. How is emotional evidence from multiple sources used in perceptual decision making? Psychophysiology 2025; 62:e14727. [PMID: 39614659 DOI: 10.1111/psyp.14727] [Received: 08/04/2023] [Revised: 10/30/2024] [Accepted: 11/01/2024] [Indexed: 12/01/2024]
Abstract
Judging the emotional nature of a scene requires us to deliberately integrate pieces of evidence with varying intensities of emotion. Our existing knowledge about emotion-related perceptual decision-making is largely based on paradigms using a single stimulus and, when involving multiple stimuli, rapid decisions. Consequently, it remains unclear how we deliberately sample and integrate multiple pieces of emotional evidence to form an overall judgment. Findings from non-emotion rapid decision-making studies show humans down-sample and downweight extreme evidence. However, deliberate decision-making may rely on a different attention mode than rapid decision-making, and extreme emotional stimuli are inherently salient. Given these critical differences, it is imperative to directly examine the deliberate decision-making process for multiple emotional stimuli. In the current study, human participants (N = 33) freely viewed arrays of faces with expressions ranging from extremely fearful to extremely happy while their eye movements were tracked. They then decided whether the faces were more fearful or happier on average. In contrast to conclusions drawn from non-emotion and rapid decision-making studies, eye movement measures revealed that participants attentionally sampled extreme emotional evidence more than less extreme evidence. Computational modeling results indicated that even though participants exhibited biased attention distribution, they weighted various emotional evidence equally. These findings provide novel insights into how people sample and integrate multiple pieces of emotional evidence, contribute to a more comprehensive understanding of emotion-related decision-making, and shed light on the mechanisms of pathological affective decisions.
Affiliation(s)
- Hilary H T Ngai
- Department of Psychology, The University of Hong Kong, Hong Kong SAR, China
- Janet H Hsiao
- Division of Social Science, Hong Kong University of Science and Technology, Hong Kong SAR, China
- Christian C Luhmann
- Department of Psychology, Stony Brook University, Stony Brook, New York, USA
- Aprajita Mohanty
- Department of Psychology, Stony Brook University, Stony Brook, New York, USA
- Jingwen Jin
- Department of Psychology, The University of Hong Kong, Hong Kong SAR, China
- State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong SAR, China
5
Ren J, Zhang M, Liu S, He W, Luo W. Maintenance of Bodily Expressions Modulates Functional Connectivity Between Prefrontal Cortex and Extrastriate Body Area During Working Memory Processing. Brain Sci 2024; 14:1172. [PMID: 39766371 PMCID: PMC11674776 DOI: 10.3390/brainsci14121172] [Received: 10/20/2024] [Revised: 11/13/2024] [Accepted: 11/21/2024] [Indexed: 01/11/2025]
Abstract
Background/Objectives: As a form of visual input, bodily expressions can be maintained and manipulated in visual working memory (VWM) over a short period of time. While the prefrontal cortex (PFC) plays an indispensable role in top-down control, it remains largely unclear whether this region also modulates the VWM storage of bodily expressions during a delay period. Therefore, the two primary goals of this study were to examine whether emotional bodies would elicit heightened brain activity among areas such as the PFC and extrastriate body area (EBA) and whether the emotional effects subsequently modulate the functional connectivity patterns for active maintenance during delay periods. Methods: During functional magnetic resonance imaging (fMRI) scanning, participants performed a delayed-response task in which they were instructed to view and maintain a body stimulus in working memory before emotion categorization (happiness, anger, and neutral). If processing happy and angry bodies consumes increased cognitive demands, stronger PFC activation and its functional connectivity with perceptual areas would be observed. Results: Results based on univariate and multivariate analyses conducted on the data collected during stimulus presentation revealed enhanced processing in the left PFC and left EBA. Importantly, subsequent functional connectivity analyses performed on delayed-period data using a psychophysiological interaction model indicated that functional connectivity between the PFC and EBA increases for happy and angry bodies compared to neutral bodies. Conclusions: The emotion-modulated coupling between the PFC and EBA during maintenance deepens our understanding of the functional organization underlying the VWM processing of bodily information.
Affiliation(s)
- Jie Ren
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
- Mingming Zhang
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
- Shuaicheng Liu
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
- Weiqi He
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
- Wenbo Luo
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
6
Chen P, Zhang C, Li B, Tong L, Wang L, Ma S, Cao L, Yu Z, Yan B. An fMRI dataset in response to large-scale short natural dynamic facial expression videos. Sci Data 2024; 11:1247. [PMID: 39562568 PMCID: PMC11576863 DOI: 10.1038/s41597-024-04088-0] [Received: 04/17/2024] [Accepted: 11/06/2024] [Indexed: 11/21/2024]
Abstract
Facial expression is among the most natural ways for human beings to convey emotional information in daily life. Although the neural mechanisms of facial expression have been extensively studied employing lab-controlled images and a small number of lab-controlled video stimuli, how the human brain processes natural dynamic facial expression videos still needs to be investigated. To our knowledge, data of this type, specifically for large-scale natural facial expression videos, are currently missing. We describe here the natural Facial Expressions Dataset (NFED), an fMRI dataset including responses to 1,320 short (3-second) natural facial expression video clips. These video clips are annotated with three types of labels: emotion, gender, and ethnicity, along with accompanying metadata. We validate that the dataset has good quality within and across participants and, notably, can capture temporal and spatial stimuli features. NFED provides researchers with fMRI data for understanding the visual processing of a large number of natural facial expression videos.
Affiliation(s)
- Panpan Chen
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Chi Zhang
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Bao Li
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Li Tong
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- LinYuan Wang
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- ShuXiao Ma
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Long Cao
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- ZiYa Yu
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
- Bin Yan
- Henan Key Laboratory of Imaging and Intelligent Processing, PLA Strategic Support Force Information Engineering University, Zhengzhou, 450000, China
7
Becker C, Conduit R, Chouinard PA, Laycock R. Can deepfakes be used to study emotion perception? A comparison of dynamic face stimuli. Behav Res Methods 2024; 56:7674-7690. [PMID: 38834812 PMCID: PMC11362322 DOI: 10.3758/s13428-024-02443-y] [Accepted: 05/11/2024] [Indexed: 06/06/2024]
Abstract
Video recordings accurately capture facial expression movements; however, they are difficult for face perception researchers to standardise and manipulate. For this reason, dynamic morphs of photographs are often used, despite their lack of naturalistic facial motion. This study aimed to investigate how humans perceive emotions from faces using real videos and two different approaches to artificially generating dynamic expressions - dynamic morphs, and AI-synthesised deepfakes. Our participants perceived dynamic morphed expressions as less intense when compared with videos (all emotions) and deepfakes (fearful, happy, sad). Videos and deepfakes were perceived similarly. Additionally, they perceived morphed happiness and sadness, but not morphed anger or fear, as less genuine than other formats. Our findings support previous research indicating that social responses to morphed emotions are not representative of those to video recordings. The findings also suggest that deepfakes may offer a more suitable standardized stimulus type compared to morphs. Additionally, qualitative data were collected from participants and analysed using ChatGPT, a large language model. ChatGPT successfully identified themes in the data consistent with those identified by an independent human researcher. According to this analysis, our participants perceived dynamic morphs as less natural compared with videos and deepfakes. That participants perceived deepfakes and videos similarly suggests that deepfakes effectively replicate natural facial movements, making them a promising alternative for face perception research. The study contributes to the growing body of research exploring the usefulness of generative artificial intelligence for advancing the study of human perception.
8
Nan Y, Mehta P, Liao J, Zheng Y, Han C, Wu Y. Testosterone administration decreases sensitivity to angry facial expressions in healthy males: A computational modeling approach. Psychoneuroendocrinology 2024; 161:106948. [PMID: 38211451 DOI: 10.1016/j.psyneuen.2023.106948] [Received: 08/09/2023] [Revised: 11/10/2023] [Accepted: 12/23/2023] [Indexed: 01/13/2024]
Abstract
Previous research indicates that higher testosterone levels are related to increased aggressive and dominant behaviors, particularly in males. One possible mechanism for these hormone-behavior associations could involve threat perception. However, the causal influence of testosterone on men's recognition of threatening facial expressions remains unknown. Here, we tested the causal effect of exogenous testosterone on men's sensitivity to facial threat by combining a psychophysical task with computational modeling. We administered a single dose (150 mg) of testosterone or placebo gel to healthy young men (n = 120) in a double-blind, placebo-controlled, between-participant design. Participants were presented with morphed emotional faces mixing anger/fear and neutral expressions and made judgments about the emotional expression. Across typical regression analysis, signal detection analysis, and drift diffusion modeling, our results consistently showed that individuals who received testosterone (versus placebo) exhibited a lower perceived sensitivity to angry facial expressions. But we observed no significant effects of testosterone administration on fearful facial expressions. The findings indicate that testosterone attenuates sensitivity to facial threat, especially angry facial expressions, which could lead to a misestimation of others' dominance and an increase in one's own aggressive and dominant behaviors.
Affiliation(s)
- Yu Nan
- Department of Applied Social Sciences, Hong Kong Polytechnic University, Hong Kong Special Administrative Region of China; School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Pranjal Mehta
- Department of Experimental Psychology, University College London, London, United Kingdom
- Jiajun Liao
- School of Psychology, South China Normal University, Guangzhou, China
- Yueyuan Zheng
- Department of Psychology, University of Hong Kong, Hong Kong Special Administrative Region of China
- Chengyang Han
- Department of Psychology, Hangzhou Normal University, Hangzhou, China
- Yin Wu
- Department of Applied Social Sciences, Hong Kong Polytechnic University, Hong Kong Special Administrative Region of China; Research Institute for Sports Science and Technology, Hong Kong Polytechnic University, Hong Kong Special Administrative Region of China
9
Whitehead JC, Spiousas I, Armony JL. Individual differences in the evaluation of ambiguous visual and auditory threat-related expressions. Eur J Neurosci 2024; 59:370-393. [PMID: 38185821 DOI: 10.1111/ejn.16220] [Received: 08/10/2023] [Revised: 10/29/2023] [Accepted: 11/22/2023] [Indexed: 01/09/2024]
Abstract
This study investigated the neural correlates of the judgement of auditory and visual ambiguous threat-related information, and the influence of state anxiety on this process. Healthy subjects were scanned using a fast, high-resolution functional magnetic resonance imaging (fMRI) multiband sequence while they performed a two-alternative forced-choice emotion judgement task on faces and vocal utterances conveying explicit anger or fear, as well as ambiguous ones. Critically, the latter was specific to each subject, obtained through a morphing procedure and selected prior to scanning following a perceptual decision-making task. Behavioural results confirmed a greater task difficulty for subject-specific ambiguous stimuli and also revealed a judgement bias for visual fear, and, to a lesser extent, for auditory anger. Imaging results showed increased activity in regions of the salience and frontoparietal control networks (FPCNs) and deactivation in areas of the default mode network for ambiguous, relative to explicit, expressions. In contrast, the right amygdala (AMG) responded more strongly to explicit stimuli. Interestingly, its response to the same ambiguous stimulus depended on the subjective judgement of the expression. Finally, we found that behavioural and neural differences between ambiguous and explicit expressions decreased as a function of state anxiety scores. Taken together, our results show that behavioural and brain responses to emotional expressions are determined not only by emotional clarity but also by modality and the subjects' subjective perception of the emotion expressed, and that some of these responses are modulated by state anxiety levels.
Affiliation(s)
- Jocelyne C Whitehead
- Human Neuroscience, Douglas Mental Health University Institute, Verdun, Quebec, Canada
- BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada
- Integrated Program in Neuroscience, McGill University, Montreal, Quebec, Canada
- Ignacio Spiousas
- BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada
- Laboratorio Interdisciplinario del Tiempo y la Experiencia (LITERA), CONICET, Universidad de San Andrés, Victoria, Argentina
- Jorge L Armony
- Human Neuroscience, Douglas Mental Health University Institute, Verdun, Quebec, Canada
- BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada
- Laboratorio Interdisciplinario del Tiempo y la Experiencia (LITERA), CONICET, Universidad de San Andrés, Victoria, Argentina
- Department of Psychiatry, McGill University, Montreal, Quebec, Canada
10
Borgomaneri S, Vitale F, Battaglia S, de Vega M, Avenanti A. Task-related modulation of motor response to emotional bodies: A TMS motor-evoked potential study. Cortex 2024; 171:235-246. [PMID: 38096756 DOI: 10.1016/j.cortex.2023.10.013] [Received: 06/13/2023] [Revised: 09/19/2023] [Accepted: 10/06/2023] [Indexed: 02/12/2024]
Abstract
Exposure to emotional body postures during perceptual decision-making tasks has been linked to transient suppression of motor reactivity, supporting the monitoring of emotionally relevant information. However, it remains unclear whether this effect occurs implicitly, i.e., when emotional information is irrelevant to the task. To investigate this issue, we used single-pulse transcranial magnetic stimulation (TMS) to assess motor excitability while healthy participants were asked to categorize pictures of body expressions as emotional or neutral (emotion recognition task) or as belonging to a male or a female actor (gender recognition task) while receiving TMS over the motor cortex at 100 and 125 ms after picture onset. Results demonstrated that motor-evoked potentials (MEPs) were reduced for emotional body postures relative to neutral postures during the emotion recognition task. Conversely, MEPs increased for emotional body postures relative to neutral postures during the gender recognition task. These findings indicate that motor inhibition, contingent upon observing emotional body postures, is selectively associated with actively monitoring emotional features. In contrast, observing emotional body postures prompts motor facilitation when task-relevant features are non-emotional. These findings contribute to embodied cognition models that link emotion perception and action tendencies.
Affiliation(s)
- Sara Borgomaneri
- Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Francesca Vitale
- Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Simone Battaglia
- Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Manuel de Vega
- Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Santa Cruz de Tenerife, Spain
- Alessio Avenanti
- Centro studi e ricerche in Neuroscienze Cognitive, Dipartimento di Psicologia "Renzo Canestrari", Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy; Centro de Investigación en Neuropsicología y Neurosciencias Cognitivas, Universidad Católica Del Maule, Talca, Chile
11
Garrido MV, Godinho S. The influence of consonant wanderings and facial expressions in warmth and competence judgments. Cogn Emot 2023; 37:1272-1280. [PMID: 37675963 DOI: 10.1080/02699931.2023.2253423] [Received: 05/18/2023] [Revised: 08/18/2023] [Accepted: 08/22/2023] [Indexed: 09/08/2023]
Abstract
The preference for usernames whose oral pronunciation implies inward wandering articulatory movements over those involving outward movements - the in-out effect - has been shown to shape person perception judgments. Across three studies, we further tested the boundary conditions to this effect by combining the manipulation of the articulation direction of mock online usernames with one of the most critical cues for interpersonal judgments - facial expressions. As expected, users displaying smiling faces were rated as warmer and more competent than those displaying angry expressions. Notably, even in the presence of such diagnostic cues for social judgment, the articulatory activity involved in pronouncing a person's name still affected the impressions formed, particularly in the warmth dimension. These results show that the in-out effect did not vanish even when highly diagnostic visual information was available. Overall, the current work further emphasises the role of sensorimotor experience in person perception while providing additional evidence for the in-out effect, its boundary conditions, and potential mechanisms.
Affiliation(s)
- Margarida V Garrido
- Iscte - Instituto Universitário de Lisboa, Centro de Investigação e Intervenção Social, Lisboa, Portugal
- Sandra Godinho
- Iscte - Instituto Universitário de Lisboa, Centro de Investigação e Intervenção Social, Lisboa, Portugal
12
Tsuchiyagaito A, Sánchez SM, Misaki M, Kuplicki R, Park H, Paulus MP, Guinjoan SM. Intensity of repetitive negative thinking in depression is associated with greater functional connectivity between semantic processing and emotion regulation areas. Psychol Med 2023; 53:5488-5499. [PMID: 36043367 PMCID: PMC9973538 DOI: 10.1017/s0033291722002677] [Indexed: 11/07/2022]
Abstract
BACKGROUND Repetitive negative thinking (RNT), a cognitive process encompassing past-directed (rumination) and future-directed (worry) thoughts focused on negative experiences and the self, is a transdiagnostic construct that is especially relevant for major depressive disorder (MDD). Severe RNT often occurs in individuals with severe MDD, which makes it challenging to disambiguate the neural circuitry underlying RNT from that related to depression severity. METHODS We used a propensity score (i.e., the conditional probability of having high RNT given observed covariates) to match high- and low-RNT individuals who were similar in severity of depression, anxiety, and demographic characteristics. Of 148 individuals with MDD, we matched high- and low-RNT groups (n = 50/group) and used a data-driven whole-brain voxel-to-voxel connectivity pattern analysis to investigate resting-state functional connectivity differences between the groups. RESULTS There was an association between RNT and connectivity in the bilateral superior temporal sulcus (STS), an important region for speech processing, including inner speech. High relative to low RNT individuals showed greater connectivity between the right STS and bilateral anterior insular cortex (AI), and between the bilateral STS and left dorsolateral prefrontal cortex (DLPFC). Greater connectivity in those regions was specifically related to RNT but not to depression severity. CONCLUSIONS RNT intensity is directly related to connectivity between the STS and the AI/DLPFC. This might be a mechanism underlying the role of RNT in perceptive, cognitive, speech, and emotional processing. Future investigations will need to determine whether modifying these connectivities could be a treatment target to reduce RNT.
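The matching step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the propensity scores (probability of high RNT given the covariates) have already been estimated, e.g., by logistic regression, and the helper name `match_groups` is hypothetical.

```python
def match_groups(scores_high, scores_low):
    """Greedy 1:1 nearest-neighbour matching on propensity scores,
    without replacement: each high-RNT participant is paired with the
    as-yet-unused low-RNT participant whose score is closest."""
    unused = list(range(len(scores_low)))
    pairs = []
    for i, s in enumerate(scores_high):
        # index of the closest remaining low-RNT participant
        j = min(unused, key=lambda j: abs(scores_low[j] - s))
        unused.remove(j)
        pairs.append((i, j))
    return pairs

# toy example: two high-RNT and three low-RNT participants
pairs = match_groups([0.8, 0.6], [0.55, 0.79, 0.1])
```

In practice one would also impose a caliper (a maximum allowed score difference) and check covariate balance after matching; both are omitted here for brevity.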
Affiliation(s)
- Aki Tsuchiyagaito
- Laureate Institute for Brain Research, Tulsa, OK, USA
- The University of Tulsa, Tulsa, OK, USA
- Chiba University, Chiba, Japan
- Masaya Misaki
- Laureate Institute for Brain Research, Tulsa, OK, USA
- Heekyong Park
- Laureate Institute for Brain Research, Tulsa, OK, USA
- University of North Texas at Dallas, Dallas, TX, USA
13
Zhang M, Yu L, Zhang K, Du B, Zhan B, Jia S, Chen S, Han F, Li Y, Liu S, Yi X, Liu S, Luo W. Construction and validation of the Dalian emotional movement open-source set (DEMOS). Behav Res Methods 2023; 55:2353-2366. [PMID: 35931937] [DOI: 10.3758/s13428-022-01887-4]
Abstract
Human body movements are important for emotion recognition and social communication and have received extensive attention from researchers. In this field, emotional biological motion stimuli, as depicted by point-light displays, are widely used. However, the number of stimuli in existing material libraries is small, and standardized indicators are lacking, which limits experimental design and conduct. Therefore, based on our prior kinematic dataset, we constructed the Dalian Emotional Movement Open-source Set (DEMOS) using computational modeling. The DEMOS has three views (i.e., frontal 0°, left 45°, and left 90°) and comprises a total of 2664 high-quality videos of emotional biological motion, each displaying happiness, sadness, anger, fear, disgust, or a neutral expression. All stimuli were validated in terms of recognition accuracy, emotional intensity, and subjective movement. The objective movement for each expression was also calculated. The DEMOS can be downloaded for free from https://osf.io/83fst/. To our knowledge, this is the largest multi-view emotional biological motion set based on the whole body. The DEMOS can be applied in many fields, including affective computing, social cognition, and psychiatry.
Affiliation(s)
- Mingming Zhang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Lu Yu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Keye Zhang
- School of Social and Behavioral Sciences, Nanjing University, Nanjing, 210023, China
- Bixuan Du
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Bin Zhan
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Shuxin Jia
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shaohua Chen
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Fengxu Han
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Yiwen Li
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shuaicheng Liu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Xi Yi
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
- Shenglan Liu
- School of Innovation and Entrepreneurship, Dalian University of Technology, Dalian, 116024, China
- Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, 116029, China
- Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian, 116029, China
14
Yeung MK. The prefrontal cortex is differentially involved in implicit and explicit facial emotion processing: An fNIRS study. Biol Psychol 2023; 181:108619. [PMID: 37336356] [DOI: 10.1016/j.biopsycho.2023.108619]
Abstract
Despite extensive research, the differential roles of the prefrontal cortex (PFC) in implicit and explicit facial emotion processing remain elusive. Functional near-infrared spectroscopy (fNIRS) is a neuroimaging technique that can measure changes in both oxyhemoglobin (HbO) and deoxyhemoglobin (HbR) concentrations. Currently, how HbO and HbR change during facial emotion processing remains unclear. Here, fNIRS was used to examine and compare PFC activation during implicit and explicit facial emotion processing. Forty young adults performed a facial-matching task that required either emotion discrimination (explicit task) or age discrimination (implicit task), and the activation of their PFCs was measured by fNIRS. Participants attempted the task on two occasions to determine whether their activation patterns were maintained over time. The PFC displayed increases in HbO and/or decreases in HbR during the implicit and explicit facial emotion tasks. Importantly, there were significantly greater changes in PFC HbO during the explicit task, whereas no significant difference in HbR changes between conditions was found. Between sessions, HbO changes were reduced across tasks, but the difference in HbO changes between the implicit and explicit tasks remained unchanged. The test-retest reliability of the behavioral measures was excellent, whereas that of fNIRS measures was mostly poor to fair. Thus, the PFC plays a specific role in recognizing facial expressions, and its differential involvement in implicit and explicit facial emotion processing can be consistently captured at the group level by changes in HbO. This study demonstrates the potential of fNIRS for elucidating the neural mechanisms underlying facial emotion recognition.
Affiliation(s)
- Michael K Yeung
- Department of Psychology, The Education University of Hong Kong, Hong Kong, China; University Research Facility in Behavioral and Systems Neuroscience, The Hong Kong Polytechnic University, Hong Kong, China.
15
Sanders AFP, Harms MP, Kandala S, Marek S, Somerville LH, Bookheimer SY, Dapretto M, Thomas KM, Van Essen DC, Yacoub E, Barch DM. Age-related differences in resting-state functional connectivity from childhood to adolescence. Cereb Cortex 2023; 33:6928-6942. [PMID: 36724055] [PMCID: PMC10233258] [DOI: 10.1093/cercor/bhad011]
Abstract
The human brain is active at rest, and spontaneous fluctuations in functional MRI BOLD signals reveal an intrinsic functional architecture. During childhood and adolescence, functional networks undergo varying patterns of maturation, and measures of functional connectivity within and between networks differ as a function of age. However, many aspects of these developmental patterns (e.g. trajectory shape and directionality) remain unresolved. In the present study, we characterised age-related differences in within- and between-network resting-state functional connectivity (rsFC) and integration (i.e. participation coefficient, PC) in a large cross-sectional sample of children and adolescents (n = 628) aged 8-21 years from the Lifespan Human Connectome Project in Development. We found evidence for both linear and non-linear differences in cortical, subcortical, and cerebellar rsFC, as well as integration, that varied by age. Additionally, we found that sex moderated the relationship between age and putamen integration where males displayed significant age-related increases in putamen PC compared with females. Taken together, these results provide evidence for complex, non-linear differences in some brain systems during development.
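The integration measure used in this study, the participation coefficient, has a standard closed form: PC_i = 1 - Σ_s (k_is / k_i)², where k_i is node i's total connectivity and k_is its connectivity to module s. A minimal sketch of that formula (illustrative only; the function name and toy network are not from the paper):

```python
import numpy as np

def participation_coefficient(W, modules):
    """Participation coefficient PC_i = 1 - sum_s (k_is / k_i)^2 for each
    node of a weighted, undirected network W, given a module label per node.
    Nodes with no connections are assigned PC = 0."""
    W = np.asarray(W, dtype=float)
    modules = np.asarray(modules)
    k = W.sum(axis=1)                      # total strength of each node
    frac_sq = np.zeros_like(k)
    for s in np.unique(modules):
        k_s = W[:, modules == s].sum(axis=1)   # strength toward module s
        frac_sq += (k_s / np.maximum(k, 1e-12)) ** 2
    pc = 1.0 - frac_sq
    pc[k == 0] = 0.0
    return pc

# toy network: node 0 bridges two modules, nodes 1 and 2 connect only within
W = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])
modules = np.array([0, 0, 1])
pc = participation_coefficient(W, modules)
```

A node whose connections are spread evenly across modules approaches PC = 1, while a node confined to its own module scores 0, which is why PC serves as the "integration" index in the abstract above.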
Affiliation(s)
- Ashley F P Sanders
- Department of Psychiatry, Washington University School of Medicine, St Louis, MO 63110, USA
- Michael P Harms
- Department of Psychiatry, Washington University School of Medicine, St Louis, MO 63110, USA
- Sridhar Kandala
- Department of Psychiatry, Washington University School of Medicine, St Louis, MO 63110, USA
- Scott Marek
- Department of Radiology, Washington University School of Medicine, St Louis, MO 63119, USA
- Leah H Somerville
- Department of Psychology and Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Susan Y Bookheimer
- Department of Psychiatry and Biobehavioral Sciences, University of California Los Angeles School of Medicine, Los Angeles, CA 90095, USA
- Mirella Dapretto
- Department of Psychiatry and Biobehavioral Sciences, University of California Los Angeles School of Medicine, Los Angeles, CA 90095, USA
- Kathleen M Thomas
- Institute of Child Development, University of Minnesota, Minneapolis, MN 55455, USA
- David C Van Essen
- Department of Neuroscience, Washington University School of Medicine, St Louis, MO 63110, USA
- Essa Yacoub
- Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, MN 55455, USA
- Deanna M Barch
- Department of Psychiatry, Washington University School of Medicine, St Louis, MO 63110, USA
- Department of Psychological and Brain Sciences, Washington University, St Louis, MO 63130, USA
16
Karimi-Boroujeni M, Dajani HR, Giguère C. Perception of Prosody in Hearing-Impaired Individuals and Users of Hearing Assistive Devices: An Overview of Recent Advances. J Speech Lang Hear Res 2023; 66:775-789. [PMID: 36652704] [DOI: 10.1044/2022_jslhr-22-00125]
Abstract
PURPOSE Prosody perception is an essential component of speech communication and social interaction through which both linguistic and emotional information are conveyed. Considering the importance of the auditory system in processing prosody-related acoustic features, the aim of this review article is to review the effects of hearing impairment on prosody perception in children and adults. It also assesses the performance of hearing assistive devices in restoring prosodic perception. METHOD Following a comprehensive online database search, two lines of inquiry were targeted. The first summarizes recent attempts toward determining the effects of hearing loss and interacting factors such as age and cognitive resources on prosody perception. The second analyzes studies reporting beneficial or detrimental impacts of hearing aids, cochlear implants, and bimodal stimulation on prosodic abilities in people with hearing loss. RESULTS The reviewed studies indicate that hearing-impaired individuals vary widely in perceiving affective and linguistic prosody, depending on factors such as hearing loss severity, chronological age, and cognitive status. In addition, most of the emerging information points to limitations of hearing assistive devices in processing and transmitting the acoustic features of prosody. CONCLUSIONS The existing literature is incomplete in several respects, including the lack of a consensus on how and to what extent hearing prostheses affect prosody perception, especially the linguistic function of prosody, and a gap in assessing prosody under challenging listening situations such as noise. This review article proposes directions that future research could follow to provide a better understanding of prosody processing in those with hearing impairment, which may help health care professionals and designers of assistive technology to develop innovative diagnostic and rehabilitation tools. SUPPLEMENTAL MATERIAL https://doi.org/10.23641/asha.21809772.
Affiliation(s)
- Hilmi R Dajani
- School of Electrical Engineering and Computer Science, University of Ottawa, Ontario, Canada
- Christian Giguère
- School of Rehabilitation Sciences, University of Ottawa, Ontario, Canada
17
Li B, Solanas MP, Marrazzo G, Raman R, Taubert N, Giese M, Vogels R, de Gelder B. A large-scale brain network of species-specific dynamic human body perception. Prog Neurobiol 2023; 221:102398. [PMID: 36565985] [DOI: 10.1016/j.pneurobio.2022.102398]
Abstract
This ultrahigh-field 7 T fMRI study addressed the question of whether there exists a core network of brain areas serving different aspects of body perception. Participants viewed naturalistic videos of monkey and human faces, bodies, and objects, along with mosaic-scrambled videos to control for low-level features. Independent component analysis (ICA)-based network analysis was conducted to find body and species modulations at both the voxel and the network levels. Among the body areas, the highest species selectivity was found in the middle frontal gyrus and amygdala. Two large-scale networks were highly selective to bodies, dominated by the lateral occipital cortex and the right superior temporal sulcus (STS), respectively. The right STS network showed high species selectivity, and its significant human body-induced node connectivity was focused around the extrastriate body area (EBA), STS, temporoparietal junction (TPJ), premotor cortex, and inferior frontal gyrus (IFG). The human body-specific network discovered here may serve as a brain-wide internal model of the human body, acting as an entry point for a variety of processes that rely on body descriptions as part of their more specific categorization, action, or expression recognition functions.
Affiliation(s)
- Baichen Li
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands
- Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands
- Giuseppe Marrazzo
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands
- Rajani Raman
- Laboratory for Neuro- and Psychophysiology, Department of Neurosciences, KU Leuven Medical School, Leuven 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven 3000, Belgium
- Nick Taubert
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen 72076, Germany
- Martin Giese
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen 72076, Germany
- Rufin Vogels
- Laboratory for Neuro- and Psychophysiology, Department of Neurosciences, KU Leuven Medical School, Leuven 3000, Belgium; Leuven Brain Institute, KU Leuven, Leuven 3000, Belgium
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht 6200 MD, the Netherlands; Department of Computer Science, University College London, London WC1E 6BT, UK
18
Seinfeld S, Hortensius R, Arroyo-Palacios J, Iruretagoyena G, Zapata LE, de Gelder B, Slater M, Sanchez-Vives MV. Domestic Violence From a Child Perspective: Impact of an Immersive Virtual Reality Experience on Men With a History of Intimate Partner Violent Behavior. J Interpers Violence 2023; 38:2654-2682. [PMID: 35727942] [DOI: 10.1177/08862605221106130]
Abstract
Domestic violence has long-term negative consequences on children. In this study, men with a history of partner aggression and a control group of non-offenders were embodied in a child's body from a first-person perspective in virtual reality (VR). From this perspective, participants witnessed a scene of domestic violence where a male avatar assaulted a female avatar. We evaluated the impact of the experience on emotion recognition skills and heart rate deceleration responses. We found that the experience mainly impacted the recognition of angry facial expressions. The results also indicate that males with a history of partner aggression had larger physiological responses during an explicit violent event (when the virtual abuser threw a telephone) compared with controls, while their physiological reactions were less pronounced when the virtual abuser invaded the victim's personal space. We show that embodiment from a child's perspective during a conflict situation in VR impacts emotion recognition, physiological reactions, and attitudes towards violence. We provide initial evidence of the potential of VR in the rehabilitation and neuropsychological assessment of males with a history of domestic violence, especially in relation to children.
Affiliation(s)
- Sofia Seinfeld
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Ruud Hortensius
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Department of Psychology, Utrecht University, Utrecht, Netherlands
- Jorge Arroyo-Palacios
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Guillermo Iruretagoyena
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Luis E Zapata
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- Beatrice de Gelder
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Mel Slater
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences of the University of Barcelona, Barcelona, Spain
- Maria V Sanchez-Vives
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- EVENT Lab, Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
19
The bodily fundament of empathy: The role of action, nonaction-oriented, and interoceptive body representations. Psychon Bull Rev 2022. [PMID: 36510091] [DOI: 10.3758/s13423-022-02231-9]
Abstract
Mental representations with bodily contents, or in various bodily formats, have been suggested to play a pivotal role in social cognition, including empathy. However, there is a lack of systematic studies investigating, in the same sample of participants and using an individual differences approach, whether and to what extent sensorimotor, perceptual, and interoceptive representations of the body could play an explanatory role in empathic abilities. To address this goal, we carried out two studies in which healthy adults were given measures of interoceptive sensibility (IS), action-oriented (aBR) and nonaction-oriented body representations (NaBR), and affective, cognitive, and motor empathy. A higher tendency to be self-focused on interoceptive signals predicted higher affective, cognitive, and motor empathy levels. Better performance in tasks probing aBR and NaBR predicted, respectively, higher motor and cognitive empathy levels. These findings support the view that the various facets of the empathic response are differently grounded in the body, since they involve representations with different bodily formats. Individual differences in the focus on one's internal body-state representation can directly modulate all components of the empathic experience. Instead, a body representation used interpersonally to represent both one's own body and others' bodies, particularly in its spatial specificity, may be necessary to accurately understand other people's minds (cognitive empathy), while a sensorimotor representation of one's own and others' body actions may be fundamental for the self-awareness of feelings expressed in actions (motor empathy).
20
Dirzyte A, Antanaitis F, Patapas A. Law Enforcement Officers' Ability to Recognize Emotions: The Role of Personality Traits and Basic Needs' Satisfaction. Behav Sci (Basel) 2022; 12:bs12100351. [PMID: 36285920] [PMCID: PMC9598174] [DOI: 10.3390/bs12100351]
Abstract
Background: This study explored the role of personality traits and basic psychological needs in law enforcement officers' ability to recognize emotions: anger, joy, sadness, fear, surprise, disgust, and neutral. Analyzing law enforcement officers' emotion recognition and its contributing factors is important, as this field has been under-researched despite increased excessive force use by officers in many countries. Methods: This study applied the Big Five-2 (BFI-2), the Basic Psychological Needs Satisfaction and Frustration Scale (BPNSFS), and the Karolinska Directed Emotional Faces set of stimuli (KDEF). The data were gathered using an online questionnaire provided directly to law enforcement agencies. A total of 154 law enforcement officers participated in the study; 50.65% were females and 49.35% were males. The mean age was 41.2 (age range = 22-61). SEM and multiple linear regression methods were used to analyze the data. Results: This study analyzed variables of emotion recognition, personality traits, and needs satisfaction and confirmed that law enforcement officers' personality traits play a significant role in emotion recognition. Respondents' agreeableness significantly predicted increased overall emotion recognition; conscientiousness predicted increased anger recognition; and joy recognition was significantly predicted by extraversion, neuroticism, and agreeableness. This study also confirmed that law enforcement officers' basic psychological needs satisfaction/frustration plays a significant role in emotion recognition. Respondents' relatedness satisfaction significantly predicted increased overall emotion recognition, fear recognition, joy recognition, and sadness recognition. Relatedness frustration significantly predicted decreased anger recognition, surprise recognition, and neutral face recognition. Furthermore, this study confirmed links between law enforcement officers' personality traits, satisfaction/frustration of basic psychological needs, and emotion recognition: χ2 = 57.924; df = 41; p = 0.042; TLI = 0.929; CFI = 0.956; RMSEA = 0.042 [0.009-0.065]. Discussion: The findings suggested that agreeableness, conscientiousness, extraversion, and neuroticism play an essential role in the satisfaction and frustration of relatedness needs, which subsequently link to emotion recognition. Due to the relatively small sample size, validity/reliability issues with some instruments, and other limitations, the results of this study should be regarded with caution.
Affiliation(s)
- Aiste Dirzyte
- Institute of Psychology, Mykolas Romeris University, Ateities 20, LT-08303 Vilnius, Lithuania
- Faustas Antanaitis
- Institute of Psychology, Mykolas Romeris University, Ateities 20, LT-08303 Vilnius, Lithuania
- Aleksandras Patapas
- Institute of Public Administration, Mykolas Romeris University, Ateities 20, LT-08303 Vilnius, Lithuania
21
Suci A, Wang HC, Doong HS. Parent-like spokesperson for campaigning an anti-plastic straw movement to young adults: Is it effective? Asian J Soc Psychol 2022. [DOI: 10.1111/ajsp.12551]
Affiliation(s)
- Afred Suci
- Department of Business Administration, National Taiwan University of Science and Technology, Taipei, Taiwan
- Department of Management, Universitas Lancang Kuning, Pekanbaru, Indonesia
- Hui Chih Wang
- Department of Business Administration, National Taiwan University of Science and Technology, Taipei, Taiwan
- Her Sen Doong
- Department of Management Information System, National Chiayi University, Chiayi, Taiwan
22
Neuromodulation of facial emotion recognition in health and disease: A systematic review. Neurophysiol Clin 2022; 52:183-201. [DOI: 10.1016/j.neucli.2022.03.005]
23
Whitehead JC, Armony JL. Intra-individual Reliability of Voice- and Music-elicited Responses and their Modulation by Expertise. Neuroscience 2022; 487:184-197. [PMID: 35182696] [DOI: 10.1016/j.neuroscience.2022.02.011]
Abstract
A growing number of functional neuroimaging studies have identified regions within the temporal lobe, particularly along the planum polare and planum temporale, that respond more strongly to music than to other types of acoustic stimuli, including voice. These "music-preferred" regions have been reported using a variety of stimulus sets, paradigms, and analysis approaches, and their consistency across studies has been confirmed through meta-analyses. However, the critical question of the intra-subject reliability of these responses has received less attention. Here, we directly assessed this important issue by contrasting brain responses to musical vs. vocal stimuli in the same subjects across three consecutive fMRI runs using different types of stimuli. Moreover, we investigated whether these music- and voice-preferred responses were reliably modulated by expertise. Results demonstrated that the music-preferred activity previously reported in temporal regions, and its modulation by expertise, exhibits high intra-subject reliability. However, we also found that activity in some extra-temporal regions, such as the precentral and middle frontal gyri, did depend on the particular stimuli employed, which may explain why these are less consistently reported in the literature. Taken together, our findings confirm and extend the notion that specific regions in the brain consistently respond more strongly to certain socially relevant stimulus categories, such as faces, voices, and music, but that some of these responses depend, at least to some extent, on the specific features of the paradigm employed.
Affiliation(s)
- Jocelyne C Whitehead
- Douglas Mental Health University Institute, Verdun, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Canada; Integrated Program in Neuroscience, McGill University, Montreal, Canada.
- Jorge L Armony
- Douglas Mental Health University Institute, Verdun, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Canada; Department of Psychiatry, McGill University, Montreal, Canada
24
Ding X, Chen Y, Liu Y, Zhao J, Liu J. The automatic detection of unexpected emotion and neutral body postures: A visual mismatch negativity study. Neuropsychologia 2022; 164:108108. [PMID: 34863799] [DOI: 10.1016/j.neuropsychologia.2021.108108]
Abstract
The ability to automatically detect emotional changes in the environment is crucial for social interaction. In the visual system, expression-related mismatch negativity (EMMN) reflects the automatic processing of emotional changes in facial expression. However, body postures also carry visual emotional information that can be recognized effectively and processed automatically, although their processing mechanism remains unknown. In this study, the reverse oddball paradigm was used to investigate the mismatch responses to unexpected fearful and neutral body postures. The nonparametric cluster permutation test revealed significant fearful and neutral visual mismatch negativity (vMMN) activity, and the fear-related vMMN emerged over an earlier window (130-230 ms) than the neutral vMMN (180-230 ms). The body-sensitive N190 component may partially account for the vMMN obtained in this study. The fearful body posture evoked a greater N190 response than the neutral body posture, and N190 amplitudes were more negative in the deviant condition than in the standard condition. Additionally, the body-related visual mismatch oscillatory responses were associated with enhancement of alpha band oscillations, especially for the fearful body posture. These results expand the applicable scope of body posture cues corresponding to mismatch signals, objectively define the electrophysiological activity evoked, and reveal a processing bias toward negative emotion.
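The nonparametric cluster permutation test used in such ERP studies can be sketched as follows. This is a minimal illustration with made-up toy data; the array shapes, cluster-forming threshold, and simulated effect window are assumptions, not the study's actual parameters.

```python
import numpy as np

def cluster_masses(tvals, thresh):
    """Sum |t| within each contiguous run of supra-threshold time points."""
    masses, run = [], 0.0
    for t in tvals:
        if abs(t) > thresh:
            run += abs(t)
        elif run:
            masses.append(run)
            run = 0.0
    if run:
        masses.append(run)
    return masses

def cluster_perm_test(diff, thresh=2.09, n_perm=1000, seed=0):
    """Sign-flip cluster permutation test on subject x time difference waves."""
    rng = np.random.default_rng(seed)
    n = diff.shape[0]

    def tstat(x):
        # one-sample t-value at every time point
        return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n))

    obs = max(cluster_masses(tstat(diff), thresh), default=0.0)
    null = np.empty(n_perm)
    for i in range(n_perm):
        # under H0 the sign of each subject's difference wave is exchangeable
        flips = rng.choice([-1.0, 1.0], size=(n, 1))
        null[i] = max(cluster_masses(tstat(diff * flips), thresh), default=0.0)
    p = (1 + np.sum(null >= obs)) / (1 + n_perm)
    return obs, p

# Toy deviant-minus-standard difference waves: 20 subjects x 50 time points,
# with a simulated mismatch effect at samples 20-29
rng = np.random.default_rng(1)
diff = rng.normal(0.0, 0.5, size=(20, 50))
diff[:, 20:30] += 1.0
obs_mass, p = cluster_perm_test(diff)
```

Because the null distribution is built from the data themselves, this controls for multiple comparisons across time points without assuming a parametric noise model.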
Affiliation(s)
- Xiaobin Ding
- School of Psychology, Northwest Normal University, Lanzhou, China
- Yan Chen
- School of Psychology, Northwest Normal University, Lanzhou, China
- Yang Liu
- School of Psychology, Northwest Normal University, Lanzhou, China
- Jingjing Zhao
- School of Psychology, Shaanxi Normal University, and Key Laboratory for Behavior and Cognitive Neuroscience of Shaanxi Province, Xi'an, China.
- Jianyi Liu
- School of Psychology, Shaanxi Normal University, and Key Laboratory for Behavior and Cognitive Neuroscience of Shaanxi Province, Xi'an, China.

25
Differential beta desynchronisation responses to dynamic emotional facial expressions are attenuated in higher trait anxiety and autism. COGNITIVE, AFFECTIVE & BEHAVIORAL NEUROSCIENCE 2022; 22:1404-1420. [PMID: 35761029 PMCID: PMC9622532 DOI: 10.3758/s13415-022-01015-x] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Accepted: 05/18/2022] [Indexed: 01/27/2023]
Abstract
Daily life demands that we differentiate between a multitude of emotional facial expressions (EFEs). The mirror neuron system (MNS) is increasingly implicated as a neural network involved in understanding emotional body expressions. However, the specificity of the MNS's involvement in emotion recognition has remained largely unexplored. This study investigated whether six basic dynamic EFEs (anger, disgust, fear, happiness, sadness, and surprise) would be differentiated through event-related desynchronisation (ERD) of sensorimotor alpha and beta oscillatory activity, which indexes sensorimotor MNS activity. We found that beta ERD differentiated happy, fearful, and sad dynamic EFEs at the central region of interest, but not at occipital regions. Happy EFEs elicited significantly greater central beta ERD relative to fearful and sad EFEs within 800-2,000 ms after EFE onset. These differences were source-localised to the primary somatosensory cortex, which suggests they are likely to reflect differential sensorimotor simulation rather than differential attentional engagement. Furthermore, individuals with higher trait anxiety showed less beta ERD differentiation between happy and sad faces. Similarly, individuals with higher trait autism showed less beta ERD differentiation between happy and fearful faces. These findings suggest that the differential simulation of specific affective states is attenuated in individuals with higher trait anxiety and autism. In summary, the MNS appears to support the skills needed for emotion processing in daily life, which may be influenced by certain individual differences. This provides novel evidence for the notion that simulation-based emotional skills may underlie the emotional difficulties that accompany affective disorders, such as anxiety.
26
Mello M, Dupont L, Engelen T, Acciarino A, de Borst AW, de Gelder B. The influence of body expression, group affiliation and threat proximity on interactions in virtual reality. CURRENT RESEARCH IN BEHAVIORAL SCIENCES 2022. [DOI: 10.1016/j.crbeha.2022.100075] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022] Open
27
Quadrelli E, Roberti E, Polver S, Bulf H, Turati C. Sensorimotor Activity and Network Connectivity to Dynamic and Static Emotional Faces in 7-Month-Old Infants. Brain Sci 2021; 11:brainsci11111396. [PMID: 34827394 PMCID: PMC8615901 DOI: 10.3390/brainsci11111396] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2021] [Revised: 10/20/2021] [Accepted: 10/22/2021] [Indexed: 11/16/2022] Open
Abstract
The present study investigated whether, as in adults, 7-month-old infants’ sensorimotor brain areas are recruited in response to the observation of emotional facial expressions. Activity of the sensorimotor cortex, as indexed by µ rhythm suppression, was recorded using electroencephalography (EEG) while infants observed neutral, angry, and happy facial expressions in either a static (N = 19) or a dynamic (N = 19) condition. Graph theory analysis was used to investigate to what extent neural activity was functionally localized in specific cortical areas. Happy facial expressions elicited greater sensorimotor activation than angry faces in the dynamic experimental condition, while no difference was found between the three expressions in the static condition. Results also revealed that happy, but not angry or neutral, expressions elicited a significant right-lateralized activation in the dynamic condition. Furthermore, dynamic emotional faces generated more efficient processing, as they elicited higher global efficiency and lower network diameter compared to static faces. Overall, current results suggest that, in contrast to neutral and angry faces, happy expressions elicit sensorimotor activity at 7 months, and that dynamic emotional faces are more efficiently processed by functional brain networks. Finally, current data provide evidence of right-lateralized activity for the processing of happy facial expressions.
Affiliation(s)
- Ermanno Quadrelli
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
- Correspondence: ; Tel.: +39-026-448-3775
- Elisa Roberti
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
- Silvia Polver
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- Hermann Bulf
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
- Chiara Turati
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy

28
Vandewouw MM, Safar K, Mossad SI, Lu J, Lerch JP, Anagnostou E, Taylor MJ. Do shapes have feelings? Social attribution in children with autism spectrum disorder and attention-deficit/hyperactivity disorder. Transl Psychiatry 2021; 11:493. [PMID: 34564704 PMCID: PMC8464598 DOI: 10.1038/s41398-021-01625-y] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/10/2021] [Revised: 08/16/2021] [Accepted: 08/26/2021] [Indexed: 12/14/2022] Open
Abstract
Theory of mind (ToM) deficits are common in children with neurodevelopmental disorders (NDDs), such as autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD), which contribute to their social and cognitive difficulties. The social attribution task (SAT) involves geometrical shapes moving in patterns that depict social interactions and is known to recruit brain regions from the classic ToM network. To better understand ToM in ASD and ADHD children, we examined the neural correlates using the SAT and functional magnetic resonance imaging (fMRI) in a cohort of 200 children: ASD (N = 76), ADHD (N = 74) and typically developing (TD; N = 50) (4-19 years). In the scanner, participants were presented with SAT videos corresponding to social help, social threat, and random conditions. Contrasting social vs. random, the ASD compared with TD children showed atypical activation in ToM brain areas-the middle temporal and anterior cingulate gyri. In the social help vs. social threat condition, atypical activation of the bilateral middle cingulate and right supramarginal and superior temporal gyri was shared across the NDD children, with between-diagnosis differences only being observed in the right fusiform. Data-driven subgrouping identified two distinct subgroups spanning all groups that differed in both their clinical characteristics and brain-behaviour relations with ToM ability.
Affiliation(s)
- Marlee M Vandewouw
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, ON, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, ON, Canada
- Autism Research Center, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- Kristina Safar
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, ON, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, ON, Canada
- Sarah I Mossad
- Department of Psychology, Hospital for Sick Children, Toronto, ON, Canada
- Julie Lu
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, ON, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, ON, Canada
- Jason P Lerch
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, ON, Canada
- Wellcome Centre for Integrative Neuroimaging, FMRIB, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Department of Medical Biophysics, University of Toronto, Toronto, ON, Canada
- Evdokia Anagnostou
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, ON, Canada
- Autism Research Center, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Institute of Medical Science, University of Toronto, Toronto, ON, Canada
- Margot J Taylor
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, ON, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, ON, Canada
- Department of Medical Imaging, University of Toronto, Toronto, ON, Canada
- Department of Psychology, University of Toronto, Toronto, ON, Canada

29
Marrazzo G, Vaessen MJ, de Gelder B. Decoding the difference between explicit and implicit body expression representation in high level visual, prefrontal and inferior parietal cortex. Neuroimage 2021; 243:118545. [PMID: 34478822 DOI: 10.1016/j.neuroimage.2021.118545] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2020] [Revised: 08/30/2021] [Accepted: 08/31/2021] [Indexed: 11/28/2022] Open
Abstract
Recent studies have provided an increasing understanding of how visual object categories such as faces or bodies are represented in the brain, and have raised the question of whether category-based or more dynamic, network-inspired models are more powerful. Two important and so far sidestepped issues in this debate are, first, how major category attributes such as emotional expression directly influence category representation and, second, whether category and attribute representations are sensitive to task demands. This study investigated the impact of a crucial category attribute, emotional expression, on category area activity, and whether this varies with the participants' task. Using functional magnetic resonance imaging (fMRI), we measured BOLD responses while participants viewed whole-body expressions and performed either an explicit (emotion) or an implicit (shape) recognition task. Our results, based on multivariate methods, show that the type of task is the strongest determinant of brain activity and can be decoded in EBA, VLPFC and IPL. Brain activity was higher for the explicit task condition in VLPFC and was not emotion specific. This pattern suggests that during explicit recognition of the body expression, body category representation may be strengthened, and emotion- and action-related activity suppressed. Taken together, these results stress the importance of the task and of the role of category attributes for understanding the functional organization of high-level visual cortex.
Affiliation(s)
- Giuseppe Marrazzo
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Limburg 6200 MD, Maastricht, the Netherlands
- Maarten J Vaessen
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Limburg 6200 MD, Maastricht, the Netherlands
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Limburg 6200 MD, Maastricht, the Netherlands; Department of Computer Science, University College London, London WC1E 6BT, United Kingdom.

30
de Gelder B, Poyo Solanas M. A computational neuroethology perspective on body and expression perception. Trends Cogn Sci 2021; 25:744-756. [PMID: 34147363 DOI: 10.1016/j.tics.2021.05.010] [Citation(s) in RCA: 30] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Revised: 04/22/2021] [Accepted: 05/24/2021] [Indexed: 01/17/2023]
Abstract
Survival prompts organisms to prepare adaptive behavior in response to environmental and social threat. However, what are the specific features of the appearance of a conspecific that trigger such adaptive behaviors? For social species, the prime candidates for triggering defense systems are the visual features of the face and the body. We propose a novel approach for studying the ability of the brain to gather survival-relevant information from seeing conspecific body features. Specifically, we propose that behaviorally relevant information from bodies and body expressions is coded at the levels of midlevel features in the brain. These levels are relatively independent from higher-order cognitive and conscious perception of bodies and emotions. Instead, our approach is embedded in an ethological framework and mobilizes computational models for feature discovery.
Affiliation(s)
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands; Department of Computer Science, University College London, London WC1E 6BT, UK.
- Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands

31
Does gaze direction of fearful faces facilitate the processing of threat? An ERP study of spatial precuing effects. COGNITIVE AFFECTIVE & BEHAVIORAL NEUROSCIENCE 2021; 21:837-851. [PMID: 33846951 DOI: 10.3758/s13415-021-00890-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 03/03/2021] [Indexed: 01/22/2023]
Abstract
Eye gaze is very important for attentional orienting in social life. By adopting the event-related potential (ERP) technique, we explored whether attentional orienting of eye gaze is modulated by emotional congruency between facial expressions and the targets in a spatial cuing task. Faces with different emotional expressions (fearful/angry/happy/neutral) directing their eye gaze to the left or right were used as cues, indicating the possible location of subsequent targets. Targets were line drawings of animals, which could be either threatening or neutral. Participants indicated by choice responses whether the animal would fit inside a shoebox in real life or not. Reaction times to targets were faster after valid compared with invalid cues, showing the typical eye gaze cuing effect. Analyses of the late positive potential (LPP) elicited by targets revealed a significant modulation of the gaze cuing effect by emotional congruency. Threatening targets elicited larger LPPs when validly cued by gaze in faces with negative (fearful and angry) expressions. Similarly, neutral targets showed larger LPPs when validly cued by faces with neutral expressions. Such effects were not present after happy face cues. Source localization in the LPP time window revealed that for threatening targets, the activity of right medial frontal gyrus could be related to a larger gaze-orienting effect for the fearful than the angry condition. Our findings provide electrophysiological evidence for the modulation of gaze cuing effects by emotional congruency.
32
Styliadis C, Leung R, Özcan S, Moulton EA, Pang E, Taylor MJ, Papadelis C. Atypical spatiotemporal activation of cerebellar lobules during emotional face processing in adolescents with autism. Hum Brain Mapp 2021; 42:2099-2114. [PMID: 33528852 PMCID: PMC8046060 DOI: 10.1002/hbm.25349] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2020] [Revised: 12/07/2020] [Accepted: 01/09/2021] [Indexed: 01/17/2023] Open
Abstract
Autism spectrum disorder (ASD) is characterized by social deficits and atypical facial processing of emotional expressions. The underlying neuropathology of these abnormalities is still unclear. Recent studies implicate the cerebellum in emotional processing; other studies show cerebellar abnormalities in ASD. Here, we elucidate the spatiotemporal activation of cerebellar lobules during emotional processing of happy and angry faces in adolescents with ASD and typically developing (TD) controls. Using magnetoencephalography, we calculated dynamic statistical parametric maps across a period of 500 ms after emotional stimulus onset and determined group differences in activity to happy and angry emotions. Following happy face presentation, adolescents with ASD exhibited only left‐hemispheric cerebellar activation in a cluster extending from lobule VI to lobule V (compared to TD controls). Following angry face presentation, adolescents with ASD exhibited only midline cerebellar activation (posterior IX vermis). Our findings indicate an early (125–175 ms) overactivation in cerebellar activity only for happy faces and a later overactivation for both happy (250–450 ms) and angry (250–350 ms) faces in adolescents with ASD. The prioritized hemispheric activity (happy faces) could reflect the promotion of a more flexible and adaptive social behavior, while the latter midline activity (angry faces) may guide conforming behavior.
Affiliation(s)
- Charis Styliadis
- Laboratory of Medical Physics, School of Medicine, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Selin Özcan
- Laboratory of Children's Brain Dynamics, Division of Newborn Medicine, Boston Children's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Eric A Moulton
- Center for Pain and the Brain, Department of Anesthesiology, Critical Care and Pain Medicine, Boston Children's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Department of Ophthalmology, Boston Children's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Elizabeth Pang
- University of Toronto, Toronto, Canada
- Division of Neurology, Hospital for Sick Children, Toronto, Ontario, Canada
- Neurosciences and Mental Health Program, Research Institute, Hospital for Sick Children, Toronto, Canada
- Margot J Taylor
- University of Toronto, Toronto, Canada
- Neurosciences and Mental Health Program, Research Institute, Hospital for Sick Children, Toronto, Canada
- Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada
- Autism Research Unit, Hospital for Sick Children, Toronto, Canada
- Christos Papadelis
- Jane and John Justin Neurosciences Center, Cook Children's Health Care System, Fort Worth, Texas, USA
- Department of Bioengineering, University of Texas at Arlington, Arlington, Texas, USA
- Department of Pediatrics, TCU and UNTHSC School of Medicine, Fort Worth, Texas, USA

33
Rubin M, Telch MJ. Pupillary Response to Affective Voices: Physiological Responsivity and Posttraumatic Stress Disorder. J Trauma Stress 2021; 34:182-189. [PMID: 32969073 DOI: 10.1002/jts.22574] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/02/2019] [Revised: 04/30/2020] [Accepted: 06/02/2020] [Indexed: 01/10/2023]
Abstract
Posttraumatic stress disorder (PTSD) is related to dysfunctional emotional processing, thus motivating the search for physiological indices that can elucidate this process. Toward this aim, we compared pupillary response patterns to angry and fearful auditory stimuli among 99 adults: some with PTSD (n = 14), some trauma-exposed without PTSD (TE; n = 53), and some with no history of trauma exposure (CON; n = 32). We hypothesized that individuals with PTSD would show a greater pupillary response to angry and fearful auditory stimuli compared to those in the TE and CON groups. Among participants who had experienced a traumatic event, we explored the association between PTSD symptoms and pupillary response. Contrary to our prediction, individuals with PTSD displayed the smallest pupillary response to fearful auditory stimuli compared to those in the TE, B = -0.022, p = .077, and CON, B = -0.042, p = .002, groups, but they did not differ on angry auditory stimuli, B = 0.019, p = .118 and B = 0.006, p = .634, respectively. It is important to note that within-group analyses revealed that participants with PTSD differed significantly in their response to angry versus fearful stimuli, B = -0.032, p = .015. We also found a positive association between PTSD symptoms and pupillary response to angry stimuli. Our findings suggest that differential pupil response to anger and fear stimuli may be a promising way to understand emotional processing in PTSD.
Affiliation(s)
- Mikael Rubin
- Department of Psychology, University of Texas at Austin, Austin, Texas, USA
- Michael J Telch
- Department of Psychology, University of Texas at Austin, Austin, Texas, USA

34
Is the Putative Mirror Neuron System Associated with Empathy? A Systematic Review and Meta-Analysis. Neuropsychol Rev 2020; 31:14-57. [PMID: 32876854 DOI: 10.1007/s11065-020-09452-6] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2019] [Accepted: 08/09/2020] [Indexed: 12/18/2022]
Abstract
Theoretical perspectives suggest that the mirror neuron system (MNS) is an important neurobiological contributor to empathy, yet empirical support is mixed. Here, we adopt a summary model for empathy, consisting of motor, emotional, and cognitive components. This review provides an overview of existing empirical studies investigating the relationship between putative MNS activity and empathy in healthy populations. Fifty-two studies were identified that investigated the association between the MNS and at least one domain of empathy, representing data from 1044 participants. Our results suggest that emotional and cognitive empathy are moderately correlated with MNS activity; however, results for these domains were mixed and varied across the techniques used to measure MNS activity (TMS, EEG, and fMRI). Few studies investigated motor empathy, and of those, no significant relationships were revealed. Overall, results provide preliminary evidence for a relationship between MNS activity and empathy. However, our findings highlight methodological variability in study design as an important factor in understanding this relationship. We discuss limitations regarding these methodological variations and important implications for clinical and community translations, as well as suggestions for future research.
35
Poyo Solanas M, Vaessen M, de Gelder B. Computation-Based Feature Representation of Body Expressions in the Human Brain. Cereb Cortex 2020; 30:6376-6390. [DOI: 10.1093/cercor/bhaa196] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2020] [Revised: 06/04/2020] [Accepted: 06/26/2020] [Indexed: 01/31/2023] Open
Abstract
Humans and other primate species are experts at recognizing body expressions. To understand the underlying perceptual mechanisms, we computed postural and kinematic features from affective whole-body movement videos and related them to brain processes. Using representational similarity and multivoxel pattern analyses, we showed systematic relations between computation-based body features and brain activity. Our results revealed that postural rather than kinematic features reflect the affective category of the body movements. The feature limb contraction showed a central contribution in fearful body expression perception, differentially represented in action observation, motor preparation, and affect coding regions, including the amygdala. The posterior superior temporal sulcus differentiated fearful from other affective categories using limb contraction rather than kinematics. The extrastriate body area and fusiform body area also showed greater tuning to postural features. The discovery of midlevel body feature encoding in the brain moves affective neuroscience beyond research on high-level emotion representations and provides insights into the perceptual features that possibly drive automatic emotion perception.
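The representational similarity analysis described above can be sketched minimally as follows. The feature values and voxel patterns here are made-up toy data standing in for the study's postural/kinematic features and multivoxel activity patterns; the shapes and the linear feature-to-pattern mapping are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rsa(features, patterns):
    """Spearman correlation between a model RDM (from stimulus features)
    and a brain RDM (from voxel patterns), both as condensed pairwise
    correlation-distance vectors."""
    model_rdm = pdist(features, metric="correlation")
    brain_rdm = pdist(patterns, metric="correlation")
    return spearmanr(model_rdm, brain_rdm)

# Toy example: 12 stimuli described by 5 hypothetical body features
# (e.g., limb contraction, velocity), and 12 corresponding multivoxel
# patterns (40 voxels) generated to partly track those features
rng = np.random.default_rng(0)
features = rng.normal(size=(12, 5))
patterns = features @ rng.normal(size=(5, 40)) + 0.1 * rng.normal(size=(12, 40))
rho, pval = rsa(features, patterns)
```

Because both RDMs are reduced to rank orders, the comparison is insensitive to the different measurement scales of model features and BOLD patterns, which is the usual motivation for Spearman rather than Pearson correlation here.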
Affiliation(s)
- Marta Poyo Solanas
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Maarten Vaessen
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Beatrice de Gelder
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Limburg 6200 MD, The Netherlands
- Department of Computer Science, University College London, London WC1E 6BT, UK

36
Huang YA, Dupont P, Van de Vliet L, Jastorff J, Peeters R, Theys T, van Loon J, Van Paesschen W, Van den Stock J, Vandenbulcke M. Network level characteristics in the emotion recognition network after unilateral temporal lobe surgery. Eur J Neurosci 2020; 52:3470-3484. [PMID: 32618060 DOI: 10.1111/ejn.14849] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/11/2019] [Revised: 05/12/2020] [Accepted: 05/27/2020] [Indexed: 02/06/2023]
Abstract
The human amygdala is considered a key region for successful emotion recognition. We recently reported that temporal lobe surgery (TLS), including resection of the amygdala, does not affect emotion recognition performance (Journal of Neuroscience, 2018, 38, 9263). In the present study, we investigate the neural basis of this preserved function at the network level. We use generalized psychophysiological interaction and graph theory indices to investigate network level characteristics of the emotion recognition network in TLS patients and healthy controls. Based on conflicting emotion processing theories, we anticipated two possible outcomes: a substantial increase of the non-amygdalar connections of the emotion recognition network to compensate functionally for the loss of the amygdala, in line with basic emotion theory versus only minor changes in network level properties as predicted by psychological construction theory. We defined the emotion recognition network in the total sample and investigated group differences on five network level indices (i.e. characteristic path length, global efficiency, clustering coefficient, local efficiency and small-worldness). The results did not reveal a significant increase in the left or right temporal lobectomy group (compared to the control group) in any of the graph measures, indicating that preserved behavioural emotion recognition in TLS is not associated with a massive connectivity increase between non-amygdalar nodes at network level. We conclude that the emotion recognition network is robust and functionally able to compensate for structural damage without substantial global reorganization, in line with a psychological construction theory.
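The five network-level indices compared in this study are standard graph-theory measures and can be computed with common graph tooling. The toy graph below is an arbitrary stand-in for a thresholded emotion-recognition network, not the study's actual connectivity data.

```python
import networkx as nx

# Hypothetical toy network: two triangles joined by a single bridge edge,
# standing in for a small thresholded functional connectivity graph
G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)])

# Characteristic path length: mean shortest-path distance over node pairs
char_path_length = nx.average_shortest_path_length(G)

# Global efficiency: mean inverse shortest-path distance over node pairs
global_eff = nx.global_efficiency(G)

# Mean clustering coefficient: how interconnected each node's neighbours are
clustering = nx.average_clustering(G)

# Local efficiency: mean global efficiency of each node's neighbourhood subgraph
local_eff = nx.local_efficiency(G)

# Small-worldness can be estimated with nx.sigma(G), which compares clustering
# and path length against randomised reference graphs (slow for large networks).
```

Group differences on such indices, as in this study, would then be tested between patient and control networks rather than read off a single graph.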
Collapse
Affiliation(s)
- Yun-An Huang, Department of Neurosciences, Neuropsychiatry, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Patrick Dupont, Department of Neurosciences, Laboratory for Cognitive Neurology, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Laura Van de Vliet, Department of Neurosciences, Neuropsychiatry, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Jan Jastorff, Department of Neurosciences, Neuropsychiatry, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Ron Peeters, Department of Imaging & Pathology, Radiology, KU Leuven, Leuven, Belgium
- Tom Theys, Department of Neurosciences, Research Group Experimental Neurosurgery and Neuroanatomy, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Johannes van Loon, Department of Neurosciences, Research Group Experimental Neurosurgery and Neuroanatomy, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Wim Van Paesschen, Department of Neurosciences, Research Group Experimental Neurology, Laboratory for Epilepsy Research, Leuven Brain Institute, KU Leuven, Leuven, Belgium
- Jan Van den Stock, Department of Neurosciences, Neuropsychiatry, Leuven Brain Institute, KU Leuven, Leuven, Belgium; Geriatric Psychiatry, University Psychiatric Center KU Leuven, Leuven, Belgium
- Mathieu Vandenbulcke, Department of Neurosciences, Neuropsychiatry, Leuven Brain Institute, KU Leuven, Leuven, Belgium; Geriatric Psychiatry, University Psychiatric Center KU Leuven, Leuven, Belgium
37
Steines M, Krautheim JT, Neziroğlu G, Kircher T, Straube B. Conflicting group memberships modulate neural activation in an emotional production-perception network. Cortex 2020; 126:153-172. [DOI: 10.1016/j.cortex.2019.12.020]
38
Dricu M, Frühholz S. A neurocognitive model of perceptual decision-making on emotional signals. Hum Brain Mapp 2020; 41:1532-1556. [PMID: 31868310] [PMCID: PMC7267943] [DOI: 10.1002/hbm.24893]
Abstract
Humans make various kinds of decisions about which emotions they perceive from others. Although it might seem like a split-second phenomenon, deliberating over which emotions we perceive unfolds across several stages of decisional processing. Neurocognitive models of general perception postulate that our brain first extracts sensory information about the world, then integrates these data into a percept, and lastly interprets it. The aim of the present study was to build an evidence-based neurocognitive model of perceptual decision-making on others' emotions. We conducted a series of meta-analyses of neuroimaging data spanning 30 years on the explicit evaluation of others' emotional expressions. We find that emotion perception is rather an umbrella term for various perception paradigms, each with distinct neural structures that underlie task-related cognitive demands. Furthermore, the left amygdala was responsive across all classes of decisional paradigms, regardless of task-related demands. Based on these observations, we propose a neurocognitive model that outlines the information flow in the brain needed for a successful evaluation of and decision on other individuals' emotions. HIGHLIGHTS: Emotion classification involves heterogeneous perception and decision-making tasks. Decision-making processes on emotions are rarely covered by existing emotion theories. We propose an evidence-based neurocognitive model of decision-making on emotions. Bilateral brain processes support nonverbal decisions, whereas left-hemisphere processes support verbal decisions. The left amygdala is involved in any kind of decision on emotions.
Affiliation(s)
- Mihai Dricu, Department of Psychology, University of Bern, Bern, Switzerland
- Sascha Frühholz, Cognitive and Affective Neuroscience Unit, Department of Psychology, University of Zurich, Zurich, Switzerland; Neuroscience Center Zurich (ZNZ), University of Zurich and ETH Zurich, Zurich, Switzerland; Center for Integrative Human Physiology (ZIHP), University of Zurich, Zurich, Switzerland
39
Amygdala functional connectivity in the acute aftermath of trauma prospectively predicts severity of posttraumatic stress symptoms. Neurobiol Stress 2020; 12:100217. [PMID: 32435666] [PMCID: PMC7231977] [DOI: 10.1016/j.ynstr.2020.100217]
Abstract
Understanding the neural mechanisms that confer risk for posttraumatic stress disorder (PTSD) is critical for earlier intervention, yet longitudinal work has been sparse. The amygdala is part of a core network consistently implicated in PTSD symptomology. Most neural models of PTSD have focused on the amygdala's interactions with the dorsal anterior cingulate cortex, ventromedial prefrontal cortex, and hippocampus. However, an increasing number of studies have linked PTSD symptoms to aberrations in amygdala functional connections with other brain regions involved in emotional information processing, self-referential processing, somatosensory processing, visual processing, and motor control. In the current study, trauma-exposed individuals (N = 54) recruited from the emergency department completed a resting-state fMRI scan and a script-driven trauma-recall fMRI task scan two weeks post-trauma, along with demographic, PTSD, and other clinical symptom questionnaires two weeks and six months post-trauma. We examined whether amygdala-whole brain functional connectivity (FC) during rest and task could predict six-month post-trauma PTSD symptoms. More negative amygdala-cerebellum and amygdala-postcentral gyrus FC during rest, as well as more negative amygdala-postcentral gyrus and amygdala-midcingulate cortex FC during recall of the trauma memory, predicted six-month post-trauma PTSD after controlling for scanner type. Follow-up multiple regression sensitivity analyses controlling for several other relevant predictors of PTSD symptoms revealed that amygdala-cerebellum FC during rest and amygdala-postcentral gyrus FC during trauma recall were particularly robust predictors of six-month PTSD symptoms. The results extend cross-sectional studies implicating abnormal FC of the amygdala with other brain regions involved in somatosensory processing, motor control, and emotional information processing in PTSD to the prospective prediction of risk for chronic PTSD. This work may contribute to earlier identification of at-risk individuals and elucidate potential intervention targets.
40
Lima Portugal LC, Alves RDCS, Junior OF, Sanchez TA, Mocaiber I, Volchan E, Smith Erthal F, David IA, Kim J, Oliveira L, Padmala S, Chen G, Pessoa L, Pereira MG. Interactions between emotion and action in the brain. Neuroimage 2020; 214:116728. [PMID: 32199954] [PMCID: PMC7485650] [DOI: 10.1016/j.neuroimage.2020.116728]
Abstract
A growing literature supports the existence of interactions between emotion and action in the brain, and the central participation of the anterior midcingulate cortex (aMCC) in this regard. In the present functional magnetic resonance imaging study, we sought to investigate the role of self-relevance during such interactions by varying the context in which threatening pictures were presented (with guns pointed towards or away from the observer). Participants performed a simple visual detection task following exposure to such stimuli. Except for voxelwise tests, we adopted a Bayesian analysis framework that evaluated evidence for the hypotheses of interest, given the data, in a continuous fashion. Behaviorally, our results demonstrated a valence-by-context interaction, such that responses to targets tended to be faster after viewing threat pictures directed towards the participant. In the brain, interaction patterns that paralleled those observed behaviorally were observed most notably in the middle temporal gyrus, supplementary motor area, precentral gyrus, and anterior insula. In these regions, activity was overall greater during threat conditions relative to neutral ones, and this effect was enhanced in the directed-towards context. A valence-by-context interaction was also observed in the aMCC, where we additionally observed a correlation (across participants) between evoked responses and reaction times. Taken together, our study revealed the context-sensitive engagement of motor-related areas during emotional perception, thus supporting the idea that emotion and action interact in important ways in the brain.
Affiliation(s)
- Liana Catarina Lima Portugal, Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
- Rita de Cássia Soares Alves, Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
- Orlando Fernandes Junior, Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
- Tiago Arruda Sanchez, Laboratory of Neuroimaging and Psychophysiology, Department of Radiology, Faculty of Medicine, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
- Izabela Mocaiber, Laboratory of Cognitive Psychophysiology, Department of Natural Sciences, Institute of Humanities and Health, Federal Fluminense University, Rio das Ostras, RJ, Brazil
- Eliane Volchan, Laboratory of Neurobiology II, Institute of Biophysics Carlos Chagas Filho, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
- Fátima Smith Erthal, Laboratory of Neurobiology II, Institute of Biophysics Carlos Chagas Filho, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, Brazil
- Isabel Antunes David, Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
- Jongwan Kim, Department of Psychology, University of Maryland, College Park, MD, USA
- Leticia Oliveira, Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
- Gang Chen, Scientific and Statistical Computing Core, National Institute of Mental Health, USA
- Luiz Pessoa, Department of Psychology, University of Maryland, College Park, MD, USA; Maryland Neuroimaging Center, University of Maryland, College Park, MD, USA
- Mirtes Garcia Pereira, Department of Physiology and Pharmacology, Laboratory of Neurophysiology of Behavior, Biomedical Institute, Federal Fluminense University, Niterói, RJ, Brazil
41
Ross P, Atkinson AP. Expanding Simulation Models of Emotional Understanding: The Case for Different Modalities, Body-State Simulation Prominence, and Developmental Trajectories. Front Psychol 2020; 11:309. [PMID: 32194476] [PMCID: PMC7063097] [DOI: 10.3389/fpsyg.2020.00309]
Abstract
Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.
Affiliation(s)
- Paddy Ross, Department of Psychology, Durham University, Durham, United Kingdom
|
42
|
Jessen S, Fiedler L, Münte TF, Obleser J. Quantifying the individual auditory and visual brain response in 7-month-old infants watching a brief cartoon movie. Neuroimage 2019; 202:116060. [PMID: 31362048] [DOI: 10.1016/j.neuroimage.2019.116060]
Abstract
Electroencephalography (EEG) continues to be the most popular method to investigate cognitive brain mechanisms in young children and infants. Most infant studies rely on the well-established and easy-to-use event-related brain potential (ERP). As a severe disadvantage, ERP computation requires a large number of repetitions of items from the same stimulus category, compromising both ERPs' reliability and their ecological validity in infant research. We here explore a way to investigate infant continuous EEG responses to an ongoing, engaging signal (i.e., "neural tracking") by using multivariate temporal response functions (mTRFs), an approach increasingly popular in adult EEG research. N = 52 infants watched a 5-min episode of an age-appropriate cartoon while the EEG signal was recorded. We estimated and validated forward encoding models of auditory-envelope and visual-motion features. We compared individual and group-based ('generic') models of the infant brain response to comparison data from N = 28 adults. The generic model yielded clearly defined response functions for both the auditory and the motion regressors. Importantly, this response profile was also present at the individual level, albeit with lower precision of the estimate but above-chance predictive accuracy for the modelled individual brain responses. In sum, we demonstrate that mTRFs are a feasible way of analyzing continuous EEG responses in infants. We observe robust response estimates both across and within participants from only 5 min of recorded EEG signal. Our results open ways for incorporating more engaging and more ecologically valid stimulus materials when probing cognitive, perceptual, and affective processes in infants and young children.
Affiliation(s)
- Sarah Jessen, Department of Neurology, University of Lübeck, Lübeck, Germany
- Lorenz Fiedler, Department of Psychology, University of Lübeck, Lübeck, Germany
- Thomas F Münte, Department of Neurology, University of Lübeck, Lübeck, Germany
- Jonas Obleser, Department of Psychology, University of Lübeck, Lübeck, Germany
|
43
|
Addabbo M, Vacaru SV, Meyer M, Hunnius S. 'Something in the way you move': Infants are sensitive to emotions conveyed in action kinematics. Dev Sci 2019; 23:e12873. [PMID: 31144771] [DOI: 10.1111/desc.12873]
Abstract
Body movements, as well as faces, communicate emotions. Research in adults has shown that the perception of action kinematics plays a crucial role in understanding others' emotional experiences. Still, little is known about infants' sensitivity to bodily emotional expressions, since most research in infancy has focused on faces. While there is some first evidence that infants can recognize emotions conveyed in whole-body postures, it remains an open question whether they can extract emotional information from action kinematics. We measured electromyographic (EMG) activity over the muscles involved in happy (zygomaticus major, ZM), angry (corrugator supercilii, CS) and fearful (frontalis, F) facial expressions, while 11-month-old infants observed the same action performed with either happy or angry kinematics. Results demonstrate that infants responded to angry and happy kinematics with matching facial reactions. In particular, ZM activity increased while CS activity decreased in response to happy kinematics, and vice versa for angry kinematics. Our results show for the first time that infants can rely on kinematic information to pick up on the emotional content of an action. Thus, from very early in life, action kinematics represent a fundamental and powerful source of information in revealing others' emotional states.
Affiliation(s)
- Margaret Addabbo, Department of Psychology, University of Milano-Bicocca, Milano, Italy
- Stefania V Vacaru, Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands
- Marlene Meyer, Department of Psychology, University of Chicago, Chicago, Illinois
- Sabine Hunnius, Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands
|
44
|
Quadrelli E, Conte S, Macchi Cassia V, Turati C. Emotion in motion: Facial dynamics affect infants' neural processing of emotions. Dev Psychobiol 2019; 61:843-858. [DOI: 10.1002/dev.21860]
Affiliation(s)
- Ermanno Quadrelli, Department of Psychology, University of Milano-Bicocca, Milano, Italy; NeuroMI, Milan Center for Neuroscience, Milan, Italy
- Stefania Conte, Department of Psychology, University of Milano-Bicocca, Milano, Italy; NeuroMI, Milan Center for Neuroscience, Milan, Italy
- Viola Macchi Cassia, Department of Psychology, University of Milano-Bicocca, Milano, Italy; NeuroMI, Milan Center for Neuroscience, Milan, Italy
- Chiara Turati, Department of Psychology, University of Milano-Bicocca, Milano, Italy; NeuroMI, Milan Center for Neuroscience, Milan, Italy
|
45
|
Vetter P, Badde S, Phelps EA, Carrasco M. Emotional faces guide the eyes in the absence of awareness. eLife 2019; 8:43467. [PMID: 30735123] [PMCID: PMC6382349] [DOI: 10.7554/elife.43467]
Abstract
The ability to act quickly in response to a threat is a key survival skill. When perceived with awareness, threat-related emotional information, such as an angry or fearful face, not only confers perceptual advantages but also guides rapid actions such as eye movements. Emotional information that is suppressed from awareness still confers perceptual and attentional benefits. However, it is unknown whether suppressed emotional information can directly guide actions, or whether emotional information has to enter awareness to do so. We suppressed emotional faces from awareness using continuous flash suppression and tracked eye gaze position. Under successful suppression, as indicated by objective and subjective measures, gaze moved towards fearful faces but away from angry faces. Our findings reveal that: (1) threat-related emotional stimuli can guide eye movements in the absence of visual awareness; and (2) threat-related emotional face information guides distinct oculomotor actions depending on the type of threat conveyed by the emotional expression.
Affiliation(s)
- Petra Vetter, Department of Psychology, Center for Neural Science, New York University, New York, United States; Department of Psychology, Royal Holloway, University of London, Egham, United Kingdom
- Stephanie Badde, Department of Psychology, Center for Neural Science, New York University, New York, United States
- Elizabeth A Phelps, Department of Psychology, Center for Neural Science, New York University, New York, United States; Department of Psychology, Harvard University, Cambridge, United States
- Marisa Carrasco, Department of Psychology, Center for Neural Science, New York University, New York, United States
|
46
|
Whitehead JC, Armony JL. Multivariate fMRI pattern analysis of fear perception across modalities. Eur J Neurosci 2019; 49:1552-1563. [DOI: 10.1111/ejn.14322]
Affiliation(s)
- Jocelyne C. Whitehead, Douglas Mental Health University Institute, Verdun, Quebec, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada; Integrated Program in Neuroscience, McGill University, Montreal, Quebec, Canada
- Jorge L. Armony, Douglas Mental Health University Institute, Verdun, Quebec, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music, Montreal, Quebec, Canada; Department of Psychiatry, McGill University, Montreal, Quebec, Canada
|
47
|
Emotional Contrast and Psychological Function Impact Response Inhibition to Threatening Faces. Motivation and Emotion 2018; 42:920-930. [PMID: 30581242] [DOI: 10.1007/s11031-018-9709-z]
Abstract
Poor inhibitory control over negative emotional information has been identified as a possible contributor to affective disorders, but the distinct effects of emotional contrast and of fearful versus angry faces on response inhibition remain unknown. In the present study, young adults completed an emotional go/no-go task involving happy, neutral, and either fearful or angry faces. Results did not reveal differences in accuracy or speed between angry and fearful face conditions. However, responses were slower and indicated poorer inhibition in blocks where threatening faces were paired with happy, versus neutral, faces. Results may reflect the cognitive load of emotional valence contrast, such that higher-contrast blocks (threatening paired with happy faces) produced more conflict and required more processing than lower-contrast blocks (threatening with neutral faces). Preliminary findings also revealed that higher anxiety and depression symptoms corresponded with slower responses and worse accuracy, consistent with patterns of adverse impacts of anxiety and depression on response inhibition to threatening faces, even at subclinical levels of symptomatology.
48
Chen C, Martínez RM, Cheng Y. The Developmental Origins of the Social Brain: Empathy, Morality, and Justice. Front Psychol 2018; 9:2584. [PMID: 30618998] [PMCID: PMC6302010] [DOI: 10.3389/fpsyg.2018.02584]
Abstract
The social brain is the cornerstone that allows us to effectively negotiate and navigate complex social environments and relationships. When mature, these social abilities facilitate interaction and cooperation with others. Empathy, morality, and justice, among others, are closely intertwined, yet the relationships between them are quite complex. They are fundamental components of our human nature, and shape the landscape of our social lives. The various facets of empathy, including affective arousal/emotional sharing, empathic concern, and perspective taking, make unique contributions as subcomponents of morality. This review examines how basic forms of empathy, morality, and justice are substantiated in early ontogeny. It provides valuable information for gaining new insights into the underlying neurobiological precursors of the social brain, enabling future translation toward therapeutic and medical interventions.
Affiliation(s)
- Chenyi Chen, Department of Physical Medicine and Rehabilitation, National Yang-Ming University Hospital, Yilan, Taiwan; Graduate Institute of Injury Prevention and Control, College of Public Health, Taipei Medical University, Taipei, Taiwan; Research Center of Brain and Consciousness, Shuang Ho Hospital, Taipei Medical University, New Taipei City, Taiwan; Institute of Humanities in Medicine, Taipei Medical University, Taipei, Taiwan
- Róger Marcelo Martínez, Department of Physical Medicine and Rehabilitation, National Yang-Ming University Hospital, Yilan, Taiwan; Institute of Neuroscience and Brain Research Center, National Yang-Ming University, Taipei, Taiwan
- Yawei Cheng, Department of Physical Medicine and Rehabilitation, National Yang-Ming University Hospital, Yilan, Taiwan; Institute of Neuroscience and Brain Research Center, National Yang-Ming University, Taipei, Taiwan; Department of Education and Research, Taipei City Hospital, Taipei, Taiwan
|
49
|
Reply to Crivelli et al.: The different faces of fear and threat. Evolutionary and cultural insights. J Hum Evol 2018; 125:193-197. [DOI: 10.1016/j.jhevol.2017.11.006]
50
Bachmann J, Munzert J, Krüger B. Neural Underpinnings of the Perception of Emotional States Derived From Biological Human Motion: A Review of Neuroimaging Research. Front Psychol 2018; 9:1763. [PMID: 30298036] [PMCID: PMC6160569] [DOI: 10.3389/fpsyg.2018.01763]
Abstract
Research on the perception of biological human motion shows that people are able to infer emotional states by observing body movements. This article reviews the methodology applied in fMRI research on the neural representation of such emotion perception. Specifically, we ask how different stimulus qualities of bodily expressions, individual emotional valence, and task instructions may affect the neural representation of an emotional scene. The review demonstrates the involvement of a variety of brain areas, thereby indicating how well the human brain is adjusted to navigate in multiple social situations. All stimulus categories (i.e., full-light body displays, point-light displays, and avatars) can induce an emotional percept and are associated with increased activation in an extensive neural network. This network seems to be organized around areas belonging to the so-called action observation network (PMC, IFG, and IPL) and the mentalizing network (TPJ, TP, dmPFC, and lOFC) as well as areas processing body form and motion (e.g., EBA, FBA, and pSTS). Furthermore, emotion-processing brain sites such as the amygdala and the hypothalamus seem to play an important role during the observation of emotional body expressions. Whereas most brain regions clearly display an increased response to emotional body movements in general, some structures respond selectively to negative valence. Moreover, neural activation seems to depend on task characteristics, indicating that certain structures are activated even when attention is shifted away from emotional body movements.
Affiliation(s)
- Julia Bachmann, Neuromotor Behavior Laboratory, Department of Psychology and Sport Science, Justus-Liebig-University Giessen, Giessen, Germany
- Jörn Munzert, Neuromotor Behavior Laboratory, Department of Psychology and Sport Science, Justus-Liebig-University Giessen, Giessen, Germany
- Britta Krüger, Neuromotor Behavior Laboratory, Department of Psychology and Sport Science, Justus-Liebig-University Giessen, Giessen, Germany