1. Xing Z, Guo T, Ren L, Schwieter JW, Liu H. Spatiotemporal evidence uncovers differential neural activity patterns in cognitive and affective conflict control. Behav Brain Res 2023; 451:114522. PMID: 37268253. DOI: 10.1016/j.bbr.2023.114522.
Abstract
Studies have shown that there are overlapping neural bases for cognitive and affective conflict control, but whether the neural activity patterns caused by the two types of conflict are similar remains to be explored. The present study used electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) to analyze the temporal and spatial differences between cognitive and affective conflict control. We employed a semantic conflict task that included blocks of cognitive and affective judgments primed by conflicting and non-conflicting contexts. The results showed a typical neural conflict effect in the cognitive judgment blocks, reflected by greater amplitudes of the P2, N400, and late positive potential (LPP), as well as greater activation of the left pre-supplementary motor area (pre-SMA) and the right inferior frontal gyrus (IFG), in the conflict condition relative to the non-conflict condition. These patterns did not emerge in the affective judgments, which instead showed reversed effects for the LPP and in the left SMA. Taken together, these findings suggest that cognitive and affective conflict control produce different neural activity patterns.
Affiliation(s)
- Zehui Xing
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, 116029 Dalian, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province 116029, China
- Tingting Guo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, 116029 Dalian, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province 116029, China
- Lanlan Ren
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, 116029 Dalian, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province 116029, China
- John W Schwieter
- Language Acquisition, Multilingualism, and Cognition Laboratory / Bilingualism Matters @ Wilfrid Laurier University, Canada; Department of Linguistics and Languages, McMaster University, Canada
- Huanhuan Liu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, 116029 Dalian, China; Key Laboratory of Brain and Cognitive Neuroscience, Dalian, Liaoning Province 116029, China
2. Sun S, Yu H, Wang S, Yu R. Cognitive and neural bases of visual-context-guided decision-making. Neuroimage 2023; 275:120170. PMID: 37192677. PMCID: PMC10868706. DOI: 10.1016/j.neuroimage.2023.120170.
Abstract
Humans adjust their behavioral strategies based on feedback, a process that may depend on intrinsic preferences and contextual factors such as visual salience. In this study, we hypothesized that decision-making based on visual salience is influenced by habitual and goal-directed processes, which can be evidenced by changes in attention and subjective valuation systems. To test this hypothesis, we conducted a series of studies to investigate the behavioral and neural mechanisms underlying visual salience-driven decision-making. We first established the baseline behavioral strategy without salience in Experiment 1 (n = 21). We then highlighted the utility or performance dimension of the chosen outcome using colors in Experiment 2 (n = 30). We demonstrated that the difference in staying frequency increased along the salient dimension, confirming a salience effect. Furthermore, the salience effect was abolished when directional information was removed in Experiment 3 (n = 28), suggesting that the salience effect is feedback-specific. To generalize our findings, we replicated the feedback-specific salience effects using eye-tracking and text emphasis. The fixation differences between the chosen and unchosen values were enhanced along the feedback-specific salient dimension in Experiment 4 (n = 48) but unchanged after removing feedback-specific information in Experiment 5 (n = 32). Moreover, the staying frequency was correlated with fixation properties, confirming that salience guides attention deployment. Lastly, our neuroimaging study (Experiment 6, n = 25) showed that striatum subregions encoded salience-based outcome evaluation, while the vmPFC encoded salience-based behavioral adjustments. vmPFC-ventral striatum connectivity accounted for individual differences in utility-driven behavioral adjustments, whereas vmPFC-dmPFC connectivity accounted for performance-driven adjustments. Together, our results provide a neurocognitive account of how task-irrelevant visual salience drives decision-making by involving attention and the frontal-striatal valuation systems.

PUBLIC SIGNIFICANCE STATEMENT: Humans may use the current outcome to make behavioral adjustments. How this occurs may depend on stable individual preferences and contextual factors, such as visual salience. Under the hypothesis that visual salience determines attention and subsequently modulates subjective valuation, we investigated the behavioral and neural bases of visual-context-guided outcome evaluation and behavioral adjustments. Our findings suggest that the reward system is orchestrated by visual context and highlight the critical role of attention and the frontal-striatal neural circuit in visual-context-guided decision-making that may involve habitual and goal-directed processes.
Affiliation(s)
- Sai Sun
- Frontier Research Institute for Interdisciplinary Sciences, Tohoku University, 6-3 Aramaki Aoba, Aoba-ku, Sendai, 980-8578, Japan; Research Institute of Electrical Communication, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai, 980-8577, Japan
- Hongbo Yu
- Department of Psychological & Brain Sciences, University of California Santa Barbara, Santa Barbara, CA 93106, USA
- Shuo Wang
- Department of Radiology, Washington University in St. Louis, MO 63110, USA
- Rongjun Yu
- Department of Management, Marketing, and Information Systems, Hong Kong Baptist University, Kowloon Tong, HKSAR, Hong Kong
3. Pinheiro AP, Sarzedas J, Roberto MS, Kotz SA. Attention and emotion shape self-voice prioritization in speech processing. Cortex 2023; 158:83-95. PMID: 36473276. DOI: 10.1016/j.cortex.2022.10.006.
Abstract
Both the self-voice and emotional speech are salient signals that are prioritized in perception. Surprisingly, self-voice perception has been investigated far less than self-face perception. It therefore remains to be clarified whether self-voice prioritization is boosted by emotion, and whether self-relevance and emotion interact differently when attention is focused on who is speaking vs. what is being said. Thirty participants listened to 210 prerecorded words, spoken in their own or an unfamiliar voice and differing in emotional valence, in two tasks that manipulated the attentional focus on either speaker identity or speech emotion. Event-related potentials (ERPs) of the electroencephalogram (EEG) tracked the temporal dynamics of self-relevance, emotion, and attention effects. Words spoken in one's own voice elicited a larger N1 and late positive potential (LPP), but a smaller N400. Identity and emotion interactively modulated the P2 (self-positivity bias) and LPP (self-negativity bias). Attention to speaker identity more strongly modulated ERP responses within 600 ms post-word onset (N1, P2, N400), whereas attention to speech emotion altered the late component (LPP). However, attention did not modulate the interaction of self-relevance and emotion. These findings suggest that the self-voice is prioritized for neural processing at early sensory stages, and that both emotion and attention shape self-voice prioritization in speech processing. They also confirm involuntary processing of salient signals (self-relevance and emotion), even in situations in which attention is deliberately directed away from those cues. These findings have important implications for a better understanding of symptoms thought to arise from aberrant self-voice monitoring, such as auditory verbal hallucinations.
Affiliation(s)
- Ana P Pinheiro
- CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal; Basic and Applied NeuroDynamics Lab, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
- João Sarzedas
- CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal
- Magda S Roberto
- CICPSI, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal
- Sonja A Kotz
- Basic and Applied NeuroDynamics Lab, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
4. Martins I, Lima CF, Pinheiro AP. Enhanced salience of musical sounds in singers and instrumentalists. Cogn Affect Behav Neurosci 2022; 22:1044-1062. PMID: 35501427. DOI: 10.3758/s13415-022-01007-x.
Abstract
Music training has been linked to facilitated processing of emotional sounds. However, most studies have focused on speech, and less is known about musicians' brain responses to other emotional sounds and in relation to instrument-specific experience. The current study combined behavioral and EEG methods to address two novel questions related to the perception of auditory emotional cues: whether and how long-term music training relates to a distinct emotional processing of nonverbal vocalizations and music; and whether distinct training profiles (vocal vs. instrumental) modulate brain responses to emotional sounds from early to late processing stages. Fifty-eight participants completed an EEG implicit emotional processing task, in which musical and vocal sounds differing in valence were presented as nontarget stimuli. After this task, participants explicitly evaluated the same sounds regarding the emotion being expressed, their valence, and arousal. Compared with nonmusicians, musicians displayed enhanced salience detection (P2), attention orienting (P3), and elaborative processing (late positive potential) of musical (vs. vocal) sounds in event-related potential (ERP) data. The explicit evaluation of musical sounds was also distinct in musicians: accuracy in the emotional recognition of musical sounds was similar across valence types in musicians, who also judged musical sounds to be more pleasant and more arousing than nonmusicians did. Specific profiles of music training (singers vs. instrumentalists) did not relate to differences in the processing of vocal vs. musical sounds. Together, these findings reveal that music has a privileged status in the auditory system of long-term musically trained listeners, irrespective of their instrument-specific experience.
Affiliation(s)
- Inês Martins
- CICPSI, Faculdade de Psicologia, Universidade de Lisboa, 1649-013, Lisbon, Portugal
- César F Lima
- Instituto Universitário de Lisboa (ISCTE-IUL), Lisbon, Portugal
- Ana P Pinheiro
- CICPSI, Faculdade de Psicologia, Universidade de Lisboa, 1649-013, Lisbon, Portugal
5. The time course of emotional authenticity detection in nonverbal vocalizations. Cortex 2022; 151:116-132. DOI: 10.1016/j.cortex.2022.02.016.
6. Haigh SM, Brosseau P, Eack SM, Leitman DI, Salisbury DF, Behrmann M. Hyper-sensitivity to pitch and poorer prosody processing in adults with autism: an ERP study. Front Psychiatry 2022; 13:844830. PMID: 35693971. PMCID: PMC9174755. DOI: 10.3389/fpsyt.2022.844830.
Abstract
Individuals with autism typically experience a range of symptoms, including abnormal sensory sensitivities. However, there are conflicting reports on the sensory profiles that characterize the sensory experience in autism, which often depend on the type of stimulus. Here, we examine early auditory processing of simple changes in pitch and later auditory processing of more complex emotional utterances. We measured electroencephalography in 24 adults with autism and 28 controls. First, tones (1046.5 Hz/C6, 1108.7 Hz/C#6, or 1244.5 Hz/D#6) were repeated three or nine times before the pitch changed. Second, utterances of delight or frustration were repeated three or six times before the emotion changed. In response to the simple pitched tones, the autism group exhibited larger mismatch negativity (MMN) after nine standards compared to controls and produced greater trial-to-trial variability (TTV). In response to the prosodic utterances, the autism group showed smaller P3 responses when delight changed to frustration compared to controls. There was no significant correlation between ERPs to pitch and ERPs to prosody. Together, this suggests that early auditory processing is hyper-sensitive in autism whereas later processing of prosodic information is hypo-sensitive. Understanding the impact these different sensory profiles have on perceptual experience in autism may be key to identifying behavioral treatments that reduce symptoms.
Affiliation(s)
- Sarah M Haigh
- Department of Psychology and Institute for Neuroscience, University of Nevada, Reno, NV, United States; Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, United States
- Pat Brosseau
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, United States
- Shaun M Eack
- School of Social Work, University of Pittsburgh, Pittsburgh, PA, United States
- David I Leitman
- Division of Translational Research, National Institute of Mental Health, Bethesda, MD, United States
- Dean F Salisbury
- Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, PA, United States
- Marlene Behrmann
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, United States; Neuroscience Institute, Carnegie Mellon University, Pittsburgh, PA, United States
7. Castiajo P, Pinheiro AP. Attention to voices is increased in non-clinical auditory verbal hallucinations irrespective of salience. Neuropsychologia 2021; 162:108030. PMID: 34563552. DOI: 10.1016/j.neuropsychologia.2021.108030.
Abstract
Alterations in the processing of vocal emotions have been associated with both clinical and non-clinical auditory verbal hallucinations (AVH), suggesting that changes in the mechanisms underpinning voice perception contribute to AVH. These alterations seem to be more pronounced in psychotic patients with AVH when attention demands increase. However, it remains to be clarified how attention modulates the processing of vocal emotions in individuals without clinical diagnoses who report hearing voices but no related distress. Using an active auditory oddball task, the current study clarified how emotion and attention interact during voice processing as a function of AVH proneness, and examined the contributions of stimulus valence and intensity. Participants with vs. without non-clinical AVH were presented with target vocalizations differing in valence (neutral; positive; negative) and intensity (55 decibels (dB); 75 dB). The P3b amplitude was larger in response to louder (vs. softer) vocal targets irrespective of valence, and in response to negative (vs. neutral) vocal targets irrespective of intensity. Of note, the P3b amplitude was globally increased in response to vocal targets in participants reporting AVH, and failed to be modulated by valence and intensity in these participants. These findings suggest enhanced voluntary attention to changes in vocal expressions but reduced discrimination of salient and non-salient cues. A decreased sensitivity to salience cues of vocalizations could contribute to increased cognitive control demands, setting the stage for an AVH.
Affiliation(s)
- Paula Castiajo
- Psychological Neuroscience Laboratory, CIPsi, School of Psychology, University of Minho, Braga, Portugal
- Ana P Pinheiro
- Faculdade de Psicologia, CICPSI, Universidade de Lisboa, Lisboa, Portugal; Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
8. Pinheiro AP, Schwartze M, Kotz SA. Cerebellar circuitry and auditory verbal hallucinations: an integrative synthesis and perspective. Neurosci Biobehav Rev 2020; 118:485-503. DOI: 10.1016/j.neubiorev.2020.08.004.
9. Where sounds occur matters: context effects influence processing of salient vocalisations. Brain Sci 2020; 10:429. PMID: 32640750. PMCID: PMC7407900. DOI: 10.3390/brainsci10070429.
Abstract
The social context in which a salient human vocalisation is heard shapes the affective information it conveys. However, few studies have investigated how visual contextual cues lead to differential processing of such vocalisations. The prefrontal cortex (PFC) is implicated in processing contextual information and evaluating the saliency of vocalisations. Using functional near-infrared spectroscopy (fNIRS), we investigated PFC responses of young adults (N = 18) to emotive infant and adult vocalisations while they passively viewed scenes from two categories of environmental context: a domestic environment (DE) and an outdoors environment (OE). Compared to a home setting (DE), which is associated with a fixed mental representation (e.g., expecting to see a living room in a typical house), the outdoor setting (OE) is more variable and less predictable, and thus might demand greater processing effort. Our previous study (Azhari et al., 2018), which employed the same experimental paradigm, found that the OE context elicited greater physiological arousal than the DE context. Accordingly, we hypothesised that greater PFC activation would be observed when salient vocalisations were paired with the OE rather than the DE condition. Our finding supported this hypothesis: the left rostrolateral PFC, an area of the brain that facilitates relational integration, exhibited greater activation in the OE than in the DE condition, suggesting that greater cognitive resources are required to process outdoor situational information together with salient vocalisations. These results deepen our understanding of how contextual information differentially modulates the processing of salient vocalisations.
10. Top-down effects on empathy for pain in adults with autistic traits. Sci Rep 2019; 9:8022. PMID: 31142776. PMCID: PMC6541648. DOI: 10.1038/s41598-019-44400-2.
Abstract
While empathic responses of individuals with autism-spectrum disorder have been reported to be modulated by top-down attention, it remains unclear whether empathy for pain in typically developing individuals with autistic traits also involves such top-down modulation mechanisms. This study employed the autism-spectrum quotient (AQ) to quantify autistic traits in a group of 1,231 healthy adults. Two subset groups (High-AQ and Low-AQ) were randomly selected from the highest and lowest 10% of AQ scores, respectively. We explored whether participants in the two groups would differ in their responses to others' pain when their attention was directed toward (A-P tasks) or away from (A-N tasks) pain cues, in auditory and visual experimental modalities. Compared to Low-AQ individuals, High-AQ individuals exhibited more suppressed N1 and P2 amplitudes in response to painful vocal cues in the auditory A-N task. This suggests suppressed attentional and emotional processing of empathy for pain when High-AQ individuals have their attention directed away from others' pain cues. No significant difference was found between the groups in the auditory A-P task, nor in the visual A-P and A-N tasks. These results suggest that top-down attentional modulation of cortical empathic responses to others' vocal pain is influenced by autistic traits.
11. Altered attentional processing of happy prosody in schizophrenia. Schizophr Res 2019; 206:217-224. PMID: 30554811. DOI: 10.1016/j.schres.2018.11.024.
Abstract
BACKGROUND: Abnormalities in emotional prosody processing have been consistently reported in schizophrenia. Emotionally salient changes in vocal expressions attract attention in social interactions. However, it remains to be clarified how attention and emotion interact during voice processing in schizophrenia. The current study addressed this question by examining the P3b event-related potential (ERP) component.
METHOD: The P3b was elicited with a modified oddball task, in which frequent (p = .84) neutral stimuli were intermixed with infrequent (p = .16) task-relevant emotional (happy or angry) targets. Prosodic speech was presented in two conditions: with intelligible semantic content (semantic content condition, SCC) or with unintelligible semantic content (prosody-only condition, POC). Fifteen chronic schizophrenia patients and 15 healthy controls were instructed to silently count the target vocal sounds.
RESULTS: Compared to controls, P3b amplitude was specifically reduced for happy prosodic stimuli in schizophrenia, irrespective of semantic status. Groups did not differ in the processing of neutral standards or angry targets.
DISCUSSION: The selectively reduced P3b for happy prosody in schizophrenia suggests that top-down attentional resources were less strongly engaged by positive relative to negative prosody, reflecting alterations in the evaluation of the emotional salience of the voice. These results highlight the role played by higher-order processes in emotional prosody dysfunction in schizophrenia.
12. Liu Y, Meng J, Yao M, Ye Q, Fan B, Peng W. Hearing other's pain is associated with sensitivity to physical pain: an ERP study. Biol Psychol 2019; 145:150-158. PMID: 30914209. DOI: 10.1016/j.biopsycho.2019.03.011.
Abstract
Numerous studies have demonstrated an overlap between the processing of self-pain and others' pain, which suggests that psychological and neural representations are shared between the perception of physical pain and empathy for pain. As hearing emotional exclamations is a common way in which we regularly perceive and empathize with others' pain, the present study aimed to investigate the link between sensitivity to physical pain and the sounds made by others in pain. We recorded event-related potential (ERP) responses to another person's vocalizations (neutral or painful intonation) and identified electrophysiological responses associated with the processing of painful sounds. Additionally, individual pain sensitivity was characterized by a stimulus-response function that described the relationship between objective stimulus intensity and subjective pain intensity. Results showed that compared with hearing others' neutral sounds, hearing others' sounds of pain elicited more positive frontal-central N1 and N2 responses as well as more positive central-parietal P3 and late positive potential responses. These electrophysiological responses to hearing others' pain replicated electrophysiological responses to observing pictures and video clips of people in pain. Importantly, the neural responses to hearing others in pain were associated with physical pain sensitivity that was indexed by stimulus-response characteristics. The identified link between perception of one's own physical pain and the sounds of others in pain further supports the shared common psychological computations between processing one's own pain and empathizing with others' pain.
Affiliation(s)
- Yang Liu
- College of Psychology and Sociology, Shenzhen Key Laboratory of Affective and Social Cognitive Science, Shenzhen University, Shenzhen, China
- Jing Meng
- Key Laboratory of Applied Psychology, Chongqing Normal University, Chongqing, China
- Manlin Yao
- College of Psychology and Sociology, Shenzhen Key Laboratory of Affective and Social Cognitive Science, Shenzhen University, Shenzhen, China
- Qian Ye
- College of Psychology and Sociology, Shenzhen Key Laboratory of Affective and Social Cognitive Science, Shenzhen University, Shenzhen, China
- Bi Fan
- College of Management, Shenzhen University, Shenzhen, China
- Weiwei Peng
- College of Psychology and Sociology, Shenzhen Key Laboratory of Affective and Social Cognitive Science, Shenzhen University, Shenzhen, China
13. Burra N, Kerzel D, Munoz Tord D, Grandjean D, Ceravolo L. Early spatial attention deployment toward and away from aggressive voices. Soc Cogn Affect Neurosci 2019; 14:73-80. PMID: 30418635. PMCID: PMC6318470. DOI: 10.1093/scan/nsy100.
Abstract
Salient vocalizations, especially aggressive voices, are believed to attract attention due to an automatic threat detection system. However, studies assessing the temporal dynamics of auditory spatial attention to aggressive voices are missing. Using event-related potential markers of auditory spatial attention (N2ac and LPCpc), we show that attentional processing of threatening vocal signals is enhanced at two different stages of auditory processing. As early as 200 ms post-stimulus onset, attentional orienting/engagement is enhanced for threatening as compared to happy vocal signals. Subsequently, as early as 400 ms post-stimulus onset, the reorienting of auditory attention to the center of the screen (or disengagement from the target) is enhanced. This latter effect is consistent with the need to optimize perception by balancing the intake of stimulation from left and right auditory space. Our results extend the scope of theories from the visual to the auditory modality by showing that threatening stimuli also bias early spatial attention in the auditory modality. Attentional enhancement was only present in female and not in male participants.
Affiliation(s)
- Nicolas Burra
- Faculté de Psychologie et des Sciences de l'Education, University of Geneva, Geneva, Switzerland
- Dirk Kerzel
- Faculté de Psychologie et des Sciences de l'Education, University of Geneva, Geneva, Switzerland
- David Munoz Tord
- Faculté de Psychologie et des Sciences de l'Education, University of Geneva, Geneva, Switzerland
- Didier Grandjean
- Faculté de Psychologie et des Sciences de l'Education, University of Geneva, Geneva, Switzerland; Neuroscience of Emotion and Affective Dynamics Lab, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Leonardo Ceravolo
- Faculté de Psychologie et des Sciences de l'Education, University of Geneva, Geneva, Switzerland; Neuroscience of Emotion and Affective Dynamics Lab, University of Geneva, Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
14. Lima CF, Anikin A, Monteiro AC, Scott SK, Castro SL. Automaticity in the recognition of nonverbal emotional vocalizations. Emotion 2018; 19:219-233. PMID: 29792444. DOI: 10.1037/emo0000429.
Abstract
The ability to perceive the emotions of others is crucial for everyday social interactions. Important aspects of visual socioemotional processing, such as the recognition of facial expressions, are known to depend on largely automatic mechanisms. However, whether and how properties of automaticity extend to the auditory domain remains poorly understood. Here we ask whether nonverbal auditory emotion recognition is a controlled, deliberate process or an automatic, efficient one, using vocalizations such as laughter, crying, and screams. In a between-subjects design (N = 112) covering eight emotions (four positive), we determined whether emotion recognition accuracy (a) is improved when participants actively deliberate about their responses (compared with when they respond as fast as possible) and (b) is impaired when they respond under low and high levels of cognitive load (a concurrent task involving memorizing sequences of six or eight digits, respectively). Response latencies were also measured. Mixed-effects models revealed that recognition accuracy was high across emotions and only minimally affected by deliberation and cognitive load; the benefits of deliberation and costs of cognitive load were significant mostly for positive emotions, notably amusement/laughter, and smaller or absent for negative ones; response latencies did not suffer under low or high cognitive load; and high recognition accuracy (approximately 90%) could be reached within 500 ms of stimulus onset, with performance exceeding chance level already between 300 and 360 ms. These findings indicate that key features of automaticity, namely fast and efficient/effortless processing, might be a modality-independent component of emotion recognition.
Affiliation(s)
- César F Lima, Faculty of Psychology and Education Sciences, University of Porto
- Andrey Anikin, Division of Cognitive Science, Department of Philosophy, Lund University
- Sophie K Scott, Institute of Cognitive Neuroscience, University College London
- São Luís Castro, Faculty of Psychology and Education Sciences, University of Porto
15
Schirmer A, Gunter TC. Temporal signatures of processing voiceness and emotion in sound. Soc Cogn Affect Neurosci 2018; 12:902-909. [PMID: 28338796] [PMCID: PMC5472162] [DOI: 10.1093/scan/nsx020]
Abstract
This study explored the temporal course of vocal and emotional sound processing. Participants detected rare repetitions in a stimulus stream comprising neutral and surprised non-verbal exclamations and spectrally rotated control sounds. Spectral rotation preserved some acoustic and emotional properties of the vocal originals. Event-related potentials elicited to unrepeated sounds revealed effects of voiceness and emotion. Relative to non-vocal sounds, vocal sounds elicited a larger centro-parietally distributed N1. This effect was followed by greater positivity to vocal relative to non-vocal sounds beginning with the P2 and extending throughout the recording epoch (N4, late positive potential) with larger amplitudes in female than in male listeners. Emotion effects overlapped with the voiceness effects but were smaller and differed topographically. Voiceness and emotion interacted only for the late positive potential, which was greater for vocal-emotional as compared with all other sounds. Taken together, these results point to a multi-stage process in which voiceness and emotionality are represented independently before being integrated in a manner that biases responses to stimuli with socio-emotional relevance.
Affiliation(s)
- Annett Schirmer, Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Department of Psychology, Chinese University of Hong Kong, Hong Kong
- Thomas C Gunter, Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
16
Amorim M, Pinheiro AP. Is the sunny side up and the dark side down? Effects of stimulus type and valence on a spatial detection task. Cogn Emot 2018; 33:346-360. [PMID: 29564964] [DOI: 10.1080/02699931.2018.1452718]
Abstract
In verbal communication, affective information is commonly conveyed to others through spatial terms (e.g. in "I am feeling down", negative affect is associated with a lower spatial location). This study used a target location discrimination task with neutral, positive and negative stimuli (words, facial expressions, and vocalizations) to test the automaticity of the emotion-space association in both the vertical and horizontal spatial axes. The effects of stimulus type on emotion-space representations were also probed. A congruency effect (reflected in reaction times) was observed in the vertical axis: detection of upper targets preceded by positive stimuli was faster. This effect occurred for all stimulus types, indicating that the emotion-space association does not depend on sensory modality or on the verbal content of affective stimuli.
Affiliation(s)
- Maria Amorim, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal; School of Psychology, University of Minho, Braga, Portugal
- Ana P Pinheiro, Faculdade de Psicologia, Universidade de Lisboa, Lisboa, Portugal; School of Psychology, University of Minho, Braga, Portugal
17
Pinheiro AP, Barros C, Dias M, Kotz SA. Laughter catches attention! Biol Psychol 2017; 130:11-21. [PMID: 28942367] [DOI: 10.1016/j.biopsycho.2017.09.012]
Abstract
In social interactions, emotionally salient and sudden changes in vocal expressions attract attention. However, only a few studies examined how emotion and attention interact in voice processing. We investigated neutral, happy (laughs) and angry (growls) vocalizations in a modified oddball task. Participants silently counted the targets in each block and rated the valence and arousal of the vocalizations. A combined event-related potential and time-frequency analysis focused on the P3 and pre-stimulus alpha power to capture attention effects in response to unexpected events. Whereas an early differentiation between emotionally salient and neutral vocalizations was reflected in the P3a response, the P3b was selectively enhanced for happy voices. The P3b modulation was predicted by pre-stimulus frontal alpha desynchronization, and by the perceived pleasantness of the targets. These findings indicate that vocal emotions may be differently processed based on task relevance and valence. Increased anticipation and attention to positive vocal cues (laughter) may reflect their high social relevance.
Affiliation(s)
- Ana P Pinheiro, Universidade de Lisboa, Faculdade de Psicologia, CICPSI, Lisboa, Portugal; Neuropsychophysiology Laboratory, School of Psychology, University of Minho, Braga, Portugal
- Carla Barros, Neuropsychophysiology Laboratory, School of Psychology, University of Minho, Braga, Portugal
- Marcelo Dias, Neuropsychophysiology Laboratory, School of Psychology, University of Minho, Braga, Portugal
- Sonja A Kotz, Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Faculty of Psychology and Neuroscience, Department of Neuropsychology & Psychopharmacology, Maastricht University, The Netherlands
18
Pinheiro AP, Barros C, Dias M, Niznikiewicz M. Does emotion change auditory prediction and deviance detection? Biol Psychol 2017; 127:123-133. [PMID: 28499839] [DOI: 10.1016/j.biopsycho.2017.05.007]
Abstract
In recent decades, a growing number of studies have provided compelling evidence supporting the interplay of cognitive and affective processes. However, it remains to be clarified whether and how an emotional context affects the prediction and detection of change in unattended sensory events. In an event-related potential (ERP) study, we probed the modulatory role of pleasant, unpleasant and neutral visual contexts on the brain response underlying automatic detection of change in spectral (intensity) vs. temporal (duration) sound features. Twenty participants performed a passive auditory oddball task. Additionally, we tested the relationship between ERPs and self-reported mood. Participants reported more negative mood after the negative block. The P2 amplitude elicited by standards was increased in a positive context. Mismatch negativity (MMN) amplitude was decreased in the negative relative to the neutral and positive contexts, and was associated with self-reported mood. These findings suggest that the detection of regularities in the auditory stream was facilitated in a positive context, whereas a negative visual context interfered with prediction error elicitation through associated mood changes. Both ERP and behavioral effects highlight the intricate links between emotion, perception and cognitive processes.
Affiliation(s)
- Ana P Pinheiro, Neuropsychophysiology Lab, School of Psychology, University of Minho, Braga, Portugal; Faculty of Psychology, University of Lisbon, Lisbon, Portugal
- Carla Barros, Neuropsychophysiology Lab, School of Psychology, University of Minho, Braga, Portugal
- Marcelo Dias, Neuropsychophysiology Lab, School of Psychology, University of Minho, Braga, Portugal
- Margaret Niznikiewicz, VA Boston Healthcare System, Department of Psychiatry, Harvard Medical School, Boston MA, USA
19
What is the Melody of That Voice? Probing Unbiased Recognition Accuracy with the Montreal Affective Voices. Journal of Nonverbal Behavior 2017. [DOI: 10.1007/s10919-017-0253-4]