1. Chen C, Chan CW, Cheng Y. Test-Retest Reliability of Mismatch Negativity (MMN) to Emotional Voices. Front Hum Neurosci 2018;12:453. [PMID: 30498437] [PMCID: PMC6249375] [DOI: 10.3389/fnhum.2018.00453]
Abstract
A voice from a kin species conveys indispensable social and affective signals from both phylogenetic and ontogenetic standpoints. The processing of emotional voices, beyond their low-level acoustic features, engages a chain that proceeds from the auditory pathway to brain structures implicated in cognition and emotion. Using a passive auditory oddball paradigm with emotional voices, this study investigated the test-retest reliability of the emotional mismatch negativity (MMN), showing that deviants of positively (happily) and negatively (angrily) spoken syllables, compared with neutral standards, trigger MMN as a response to the automatic discrimination of emotional salience. The neurophysiological estimates of MMN to positive and negative deviants were highly reproducible irrespective of the subjects' attentional disposition, whether they were watching a silent movie or performing a working memory task. A negativity bias was evident: threatening vocalizations consistently induced larger MMN amplitudes than positive ones, regardless of the day or the time of day. These findings indicate that the emotional MMN offers a stable platform for detecting subtle shifts in emotional state.
Affiliation(s)
- Chenyi Chen
- Department of Physical Medicine and Rehabilitation, National Yang-Ming University Hospital, Yilan, Taiwan; Graduate Institute of Injury Prevention and Control, Taipei Medical University, Taipei, Taiwan; Institute of Humanities in Medicine, Taipei Medical University, Taipei, Taiwan; Research Center of Brain and Consciousness, Shuang Ho Hospital, Taipei Medical University, Taipei, Taiwan
- Chia-Wen Chan
- Graduate Institute of Injury Prevention and Control, Taipei Medical University, Taipei, Taiwan
- Yawei Cheng
- Department of Physical Medicine and Rehabilitation, National Yang-Ming University Hospital, Yilan, Taiwan; Institute of Neuroscience and Brain Research Center, National Yang-Ming University, Taipei, Taiwan; Department of Research and Education, Taipei City Hospital, Taipei, Taiwan
2. Zhang X, Xu C, Xue W, Hu J, He Y, Gao M. Emotion Recognition Based on Multichannel Physiological Signals with Comprehensive Nonlinear Processing. Sensors 2018;18:3886. [PMID: 30423894] [PMCID: PMC6263611] [DOI: 10.3390/s18113886]
Abstract
Multichannel physiological datasets used for emotion recognition are usually nonlinear and separable. Many researchers have applied linear or only partially nonlinear processing for feature reduction and classification, but these approaches have not worked well. This paper therefore proposed a comprehensive nonlinear method. On the one hand, because traditional feature reduction may discard significant amounts of feature information, Kernel Principal Component Analysis (KPCA) with a radial basis function (RBF) kernel was introduced to map the data into a high-dimensional space, extract the nonlinear information in the features, and then reduce the dimensionality; this yields many features that carry information about the structure of the physiological dataset. On the other hand, given its predictive power and its ability to select features from a large candidate set, Gradient Boosting Decision Tree (GBDT) was used as a nonlinear ensemble classifier to improve recognition accuracy. The comprehensive nonlinear processing method performed well on our physiological dataset: classification accuracy for four emotions across 29 participants reached 93.42%.
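The pipeline described in this abstract, RBF-kernel KPCA for nonlinear feature reduction followed by a gradient-boosted tree classifier, can be sketched with scikit-learn. This is a minimal illustration under assumed inputs, not the authors' code: the feature matrix, number of retained components, and GBDT hyperparameters below are placeholders.

```python
# Minimal sketch of an RBF-KPCA + GBDT pipeline (illustrative only;
# feature matrix and hyperparameters are assumed, not from the paper).
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: physiological features per trial (n_trials x n_features), y: emotion labels (4 classes)
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 64))        # placeholder multichannel physiological features
y = rng.integers(0, 4, size=400)      # placeholder labels for four emotions

pipeline = make_pipeline(
    StandardScaler(),                                       # scale features before the kernel map
    KernelPCA(n_components=20, kernel="rbf", gamma=0.05),   # nonlinear feature reduction
    GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1),
)

# Cross-validated accuracy as a rough analogue of the reported recognition rate
scores = cross_val_score(pipeline, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```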
Affiliation(s)
- Xingxing Zhang
- College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
- Chao Xu
- College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
- Wanli Xue
- College of Intelligence and Computing, Tianjin University, Tianjin 300350, China; School of Computer Science and Engineering, Tianjin University of Technology, Tianjin 300384, China
- Jing Hu
- College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
- Yongchuan He
- Shenzhen Graduate School, Peking University, Shenzhen 518055, China
- Mengxin Gao
- Department of Economics, Pennsylvania State University, State College, PA 16803, USA
3. Li J, Hu H, Chen N, Jones JA, Wu D, Liu P, Liu H. Aging and Sex Influence Cortical Auditory-Motor Integration for Speech Control. Front Neurosci 2018;12:749. [PMID: 30386204] [PMCID: PMC6199396] [DOI: 10.3389/fnins.2018.00749]
Abstract
It is well known that the acoustics of speech production are subject to age-related decline. How aging alters cortical sensorimotor integration for speech control, however, remains poorly understood. The present event-related potential study examined the behavioral and neural effects of aging and sex on the auditory-motor processing of voice pitch errors. Behaviorally, older adults produced significantly larger vocal compensations for pitch perturbations than young adults across the sexes, whereas sex had no effect on vocal compensation in either young or older adults. At the cortical level, there was a significant interaction between age and sex on the N1-P2 complex: older males produced significantly smaller P2 amplitudes than young males, while young males produced significantly larger N1 and P2 amplitudes than young females. In addition, females produced faster N1 responses than males regardless of age, while young adults produced faster P2 responses than older adults across the sexes. These findings provide the first neurobehavioral evidence of an aging influence on auditory feedback control of speech production and highlight the importance of sex in understanding age-related changes in the neuromotor control of speech.
Affiliation(s)
- Jingting Li
- Department of Rehabilitation Medicine, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China
- Huijing Hu
- Guangdong Work Injury Rehabilitation Center, Guangzhou, China
- Na Chen
- Department of Rehabilitation Medicine, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China
- Jeffery A Jones
- Department of Psychology and Laurier Centre for Cognitive Neuroscience, Wilfrid Laurier University, Waterloo, ON, Canada
- Dan Wu
- Department of Rehabilitation Medicine, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China
- Peng Liu
- Department of Rehabilitation Medicine, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China
- Hanjun Liu
- Department of Rehabilitation Medicine, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China; Guangdong Province Key Laboratory of Brain Function and Disease, Zhongshan School of Medicine, Sun Yat-sen University, Guangzhou, China
4. Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review. Appl Sci (Basel) 2017. [DOI: 10.3390/app7121239]
5. Sen A, Isaacowitz D, Schirmer A. Age differences in vocal emotion perception: on the role of speaker age and listener sex. Cogn Emot 2017;32:1189-1204. [DOI: 10.1080/02699931.2017.1393399]
Affiliation(s)
- Antarika Sen
- Neurobiology and Aging Programme, National University of Singapore, Singapore
- Annett Schirmer
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong; The Mind and Brain Institute, The Chinese University of Hong Kong, Hong Kong; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
6. Zhu X, Niu Y, Li W, Zhang Z, Liu P, Chen X, Liu H. Menstrual Cycle Phase Modulates Auditory-Motor Integration for Vocal Pitch Regulation. Front Neurosci 2016;10:600. [PMID: 28082863] [PMCID: PMC5187373] [DOI: 10.3389/fnins.2016.00600]
Abstract
In adult females, previous work has demonstrated that changes in auditory function and vocal motor behavior may accompany changes in gonadal steroids. Less is known, however, about the influence of gonadal steroids on auditory-motor integration for voice control in humans. The present event-related potential (ERP) study examined the interaction between gonadal steroids and auditory feedback-based vocal pitch regulation across the menstrual cycle. Participants produced sustained vowels while hearing their voice unexpectedly pitch-shifted during the menstrual, follicular, and luteal phases of the menstrual cycle. Vocal and cortical responses to pitch feedback perturbations were measured, and estradiol and progesterone levels were assessed, in all three phases. Behaviorally, the menstrual phase (when estradiol levels are low) was associated with larger vocal response magnitudes than the follicular and luteal phases (when estradiol levels are high), and there was a significant negative correlation between vocal response magnitude and estradiol level. At the cortical level, ERP P2 responses were smaller during the luteal phase (when progesterone levels were high) than during the menstrual and follicular phases (when progesterone levels were low). These findings provide neurobehavioral evidence for the modulation of auditory-motor integration for vocal pitch regulation across the menstrual cycle and offer important insights into the neural mechanisms and functional outcomes of gonadal steroid influences on speech motor control in adult women.
Affiliation(s)
- Xiaoxia Zhu
- Department of Rehabilitation Medicine, The Sixth Affiliated Hospital of Sun Yat-sen University, Guangzhou, China
- Yang Niu
- Department of Rehabilitation Medicine, Anhui No. 2 Province People's Hospital, Hefei, China
- Weifeng Li
- Department of Rehabilitation Medicine, The First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China
- Zhou Zhang
- Department of Rehabilitation Medicine, The First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China
- Peng Liu
- Department of Rehabilitation Medicine, The First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China
- Xi Chen
- Department of Rehabilitation Medicine, The First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China
- Hanjun Liu
- Department of Rehabilitation Medicine, The First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China; Guangdong Provincial Key Laboratory of Brain Function and Disease, Zhongshan School of Medicine, Sun Yat-sen University, Guangzhou, China
7. Chen X, Pan Z, Wang P, Zhang L, Yuan J. EEG oscillations reflect task effects for the change detection in vocal emotion. Cogn Neurodyn 2014;9:351-358. [PMID: 25972983] [DOI: 10.1007/s11571-014-9326-9]
Abstract
How task focus affects the recognition of change in vocal emotion remains under debate. In this study, we investigated the role of task focus in change detection for emotional prosody by measuring changes in event-related electroencephalogram (EEG) power. EEG was recorded for prosodies with and without emotion change while subjects performed an emotion change detection task (explicit) and a visual probe detection task (implicit). We found that vocal emotion change induced theta event-related synchronization during 100-600 ms regardless of task focus. More importantly, vocal emotion change induced significant beta event-related desynchronization during 400-750 ms under the explicit but not the implicit task condition. These findings suggest that the detection of emotional change is independent of task focus, whereas the task focus effect on the neural processing of vocal emotion change is specific to the integration of emotional deviations.
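Event-related (de)synchronization of the kind reported here is conventionally quantified as the percentage change in band-limited power relative to a pre-stimulus baseline. The sketch below illustrates that computation for a single channel under assumed sampling and epoch parameters; it is a generic illustration, not the authors' analysis pipeline.

```python
# Illustrative ERS/ERD computation for one EEG channel (assumed parameters,
# not the authors' pipeline): band-pass filter, power via the Hilbert envelope,
# then percentage change relative to a pre-stimulus baseline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                                   # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1.0 / fs)         # epoch from -200 to 800 ms
epochs = np.random.randn(100, t.size)      # placeholder data: trials x samples

def band_power(data, low, high, fs):
    """Band-limited instantaneous power via the Hilbert envelope."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=-1)
    return np.abs(hilbert(filtered, axis=-1)) ** 2

theta_power = band_power(epochs, 4, 7, fs).mean(axis=0)    # average over trials

baseline = theta_power[t < 0].mean()                       # pre-stimulus baseline power
erd_ers = 100.0 * (theta_power - baseline) / baseline      # % change; >0 = ERS, <0 = ERD

window = (t >= 0.1) & (t <= 0.6)                           # 100-600 ms window from the abstract
print(f"mean theta ERS/ERD in 100-600 ms: {erd_ers[window].mean():.1f}%")
```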
Affiliation(s)
- Xuhai Chen
- Key Laboratory of Behavior and Cognitive Psychology in Shaanxi Province, School of Psychology, Shaanxi Normal University, 199 South Chang'an Road, Xi'an 710062, China; Key Laboratory of Modern Teaching Technology, Ministry of Education, Shaanxi Normal University, Xi'an 710062, China
- Zhihui Pan
- Key Laboratory of Behavior and Cognitive Psychology in Shaanxi Province, School of Psychology, Shaanxi Normal University, 199 South Chang'an Road, Xi'an 710062, China
- Ping Wang
- Key Laboratory of Behavior and Cognitive Psychology in Shaanxi Province, School of Psychology, Shaanxi Normal University, 199 South Chang'an Road, Xi'an 710062, China
- Lijie Zhang
- Key Laboratory of Behavior and Cognitive Psychology in Shaanxi Province, School of Psychology, Shaanxi Normal University, 199 South Chang'an Road, Xi'an 710062, China
- Jiajin Yuan
- Key Laboratory of Cognition and Personality of Ministry of Education, School of Psychology, Southwest University, Chongqing 400715, China
8. Loui P, Bachorik JP, Li HC, Schlaug G. Effects of voice on emotional arousal. Front Psychol 2013;4:675. [PMID: 24101908] [PMCID: PMC3787249] [DOI: 10.3389/fpsyg.2013.00675]
Abstract
Music is a powerful medium capable of eliciting a broad range of emotions. Although the relationship between language and music is well documented, relatively little is known about the effects of lyrics and the voice on the emotional processing of music and on listeners' preferences. In the present study, we investigated the effects of vocals in music on participants' perceived valence and arousal in songs. Participants (N = 50) made valence and arousal ratings for familiar songs that were presented with and without the voice. We observed robust effects of vocal content on perceived arousal. Furthermore, we found that the effect of the voice on enhancing arousal ratings is independent of familiarity of the song and differs across genders and age: females were more influenced by vocals than males; furthermore these gender effects were enhanced among older adults. Results highlight the effects of gender and aging in emotion perception and are discussed in terms of the social roles of music.
Affiliation(s)
- Psyche Loui
- Department of Neurology, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA, USA; Department of Psychology, Wesleyan University, Middletown, CT, USA
- Justin P. Bachorik
- Department of Neurology, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA, USA
- H. Charles Li
- Department of Neurology, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA, USA
- Gottfried Schlaug
- Department of Neurology, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA, USA
9. Vocal emotions influence verbal memory: Neural correlates and interindividual differences. Cogn Affect Behav Neurosci 2012;13:80-93. [DOI: 10.3758/s13415-012-0132-8]
10. Stevens JS, Hamann S. Sex differences in brain activation to emotional stimuli: a meta-analysis of neuroimaging studies. Neuropsychologia 2012;50:1578-1593. [PMID: 22450197] [DOI: 10.1016/j.neuropsychologia.2012.03.011]
Abstract
Substantial sex differences in emotional responses and perception have been reported in previous psychological and psychophysiological studies. For example, women have been found to respond more strongly to negative emotional stimuli, a sex difference that has been linked to an increased risk of depression and anxiety disorders. The extent to which such sex differences are reflected in corresponding differences in regional brain activation remains a largely unresolved issue, however, in part because relatively few neuroimaging studies have addressed this issue. Here, by conducting a quantitative meta-analysis of neuroimaging studies, we were able to substantially increase statistical power to detect sex differences relative to prior studies, by combining emotion studies which explicitly examined sex differences with the much larger number of studies that examined only women or men. We used an activation likelihood estimation approach to characterize sex differences in the likelihood of regional brain activation elicited by emotional stimuli relative to non-emotional stimuli. We examined sex differences separately for negative and positive emotions, in addition to examining all emotions combined. Sex differences varied markedly between negative and positive emotion studies. The majority of sex differences favoring women were observed for negative emotion, whereas the majority of the sex differences favoring men were observed for positive emotion. This valence-specificity was particularly evident for the amygdala. For negative emotion, women exhibited greater activation than men in the left amygdala, as well as in other regions including the left thalamus, hypothalamus, mammillary bodies, left caudate, and medial prefrontal cortex. In contrast, for positive emotion, men exhibited greater activation than women in the left amygdala, as well as greater activation in other regions including the bilateral inferior frontal gyrus and right fusiform gyrus. These meta-analysis findings indicate that the amygdala, a key region for emotion processing, exhibits valence-dependent sex differences in activation to emotional stimuli. The greater left amygdala response to negative emotion for women accords with previous reports that women respond more strongly to negative emotional stimuli, as well as with hypothesized links between increased neurobiological reactivity to negative emotion and increased prevalence of depression and anxiety disorders in women. The finding of greater left amygdala activation for positive emotional stimuli in men suggests that greater amygdala responses reported previously for men for specific types of positive stimuli may also extend to positive stimuli more generally. In summary, this study extends efforts to characterize sex differences in brain activation during emotion processing by providing the largest and most comprehensive quantitative meta-analysis to date, and for the first time examining sex differences as a function of positive vs. negative emotional valence. The current findings highlight the importance of considering sex as a potential factor modulating emotional processing and its underlying neural mechanisms, and more broadly, the need to consider individual differences in understanding the neurobiology of emotion.
Affiliation(s)
- Jennifer S Stevens
- Department of Psychology, Emory University, 36 Eagle Row, Atlanta, GA 30322, USA
11. Leitman DI, Sehatpour P, Garidis C, Gomez-Ramirez M, Javitt DC. Preliminary Evidence of Pre-Attentive Distinctions of Frequency-Modulated Tones that Convey Affect. Front Hum Neurosci 2011;5:96. [PMID: 22053152] [PMCID: PMC3205480] [DOI: 10.3389/fnhum.2011.00096]
Abstract
Recognizing emotion is an evolutionary imperative. An early stage of auditory scene analysis involves the perceptual grouping of acoustic features, which can be based on both temporal coincidence and spectral features such as perceived pitch. Perceived pitch, or fundamental frequency (F0), is an especially salient cue for differentiating affective intent through speech intonation (prosody). We hypothesized that: (1) simple frequency-modulated tone abstractions, based on the parameters of actual prosodic stimuli, would be reliably classified as representing differing emotional categories; and (2) that such differences would yield significant mismatch negativities (MMNs) – an index of pre-attentive deviance detection within the auditory environment. We constructed a set of FM tones that approximated the F0 mean and variation of reliably recognized happy and neutral prosodic stimuli. These stimuli were presented to 13 subjects using a passive listening oddball paradigm. We additionally included stimuli with no frequency modulation (FM) and FM tones with identical carrier frequencies but differing modulation depths as control conditions. Following electrophysiological recording, subjects were asked to identify the sounds they heard as happy, sad, angry, or neutral. We observed that FM tones abstracted from happy and no-expression speech stimuli elicited MMNs. Post hoc behavioral testing revealed that subjects reliably identified the FM tones in a consistent manner. Finally, we also observed that FM tones and no-FM tones elicited equivalent MMNs. MMNs to FM tones that differentiate affect suggests that these abstractions may be sufficient to characterize prosodic distinctions, and that these distinctions can be represented in pre-attentive auditory sensory memory.
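As an illustration of the stimulus construction described above, the sketch below synthesizes a sinusoidally frequency-modulated tone whose carrier frequency stands in for the F0 mean of a prosodic stimulus and whose modulation depth stands in for its F0 variation. The specific parameter values are assumptions for demonstration, not the values used in the study.

```python
# Sketch of an FM tone abstraction: carrier frequency ~ F0 mean, modulation
# depth ~ F0 variation. All parameter values are illustrative, not from the study.
import numpy as np

def fm_tone(carrier_hz, mod_rate_hz, mod_depth_hz, duration_s, fs=44100):
    """Sinusoidal FM tone: instantaneous frequency oscillates around the carrier."""
    t = np.arange(0, duration_s, 1.0 / fs)
    # Phase = 2*pi * integral of f(t), with f(t) = carrier + depth * sin(2*pi*rate*t)
    phase = 2 * np.pi * carrier_hz * t - (mod_depth_hz / mod_rate_hz) * np.cos(2 * np.pi * mod_rate_hz * t)
    return np.sin(phase)

# "Happy-like" abstraction: higher carrier (F0 mean) and larger modulation depth (F0 variation)
happy_like = fm_tone(carrier_hz=220.0, mod_rate_hz=5.0, mod_depth_hz=40.0, duration_s=0.4)
# "Neutral-like" abstraction: lower carrier and smaller modulation depth
neutral_like = fm_tone(carrier_hz=180.0, mod_rate_hz=5.0, mod_depth_hz=10.0, duration_s=0.4)
```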
Affiliation(s)
- David I Leitman
- Neuropsychiatry Section, Department of Psychiatry, University of Pennsylvania School of Medicine, Philadelphia, PA, USA
12. Sex-dependent modulation of activity in the neural networks engaged during emotional speech comprehension. Brain Res 2011;1390:108-117. [DOI: 10.1016/j.brainres.2011.03.043]
13. Knott V, Heenan A, Shah D, Bolton K, Fisher D, Villeneuve C. Electrophysiological evidence of nicotine's distracter-filtering properties in non-smokers. J Psychopharmacol 2011;25:239-248. [PMID: 19939874] [DOI: 10.1177/0269881109348158]
Abstract
Nicotine-enhanced attentional function is thought to underlie improvements in behavioral performance on cognitive tasks, but it is unclear whether these effects involve selective attention or attentional control under conditions of distraction. Behavioral and event-related potential measures were used to examine the effects of nicotine on distractibility in 21 non-smokers who were instructed to ignore task-irrelevant auditory stimuli while performing a visual discrimination task. In a randomized, double-blind, placebo-controlled cross-over design, nicotine gum (6 mg) shortened overall reaction times but failed to prevent the increased response slowing and errors caused by deviant sounds. Relative to placebo, nicotine did not modulate the early pre-attentive detection of deviants, as reflected in the mismatch negativity, but it attenuated the amplitude of the deviant-elicited P3a, an event-related potential component indexing the involuntary shifting of attention. Nicotine also enhanced attentional re-focusing on task-relevant stimuli following distraction, as evidenced by an increased amplitude of the re-orienting negativity. These findings and the behavioral-electrophysiological dissociation seen with nicotine are discussed in relation to theories of attention and smoking motivation.
Affiliation(s)
- Verner Knott
- Clinical Neuroelectrophysiological and Cognitive Research Laboratory, University of Ottawa Institute of Mental Health Research, Royal Ottawa Mental Health Centre, Ottawa, ON, Canada
14. Schirmer A, Soh YH, Penney TB, Wyse L. Perceptual and conceptual priming of environmental sounds. J Cogn Neurosci 2011;23:3241-3253. [PMID: 21281092] [DOI: 10.1162/jocn.2011.21623]
Abstract
It is still unknown whether sonic environments influence the processing of individual sounds in a similar way as discourse or sentence context influences the processing of individual words. One obstacle to answering this question has been the failure to dissociate perceptual (i.e., how similar are sonic environment and target sound?) and conceptual (i.e., how related are sonic environment and target?) priming effects. In this study, we dissociate these effects by creating prime-target pairs with a purely perceptual or both a perceptual and conceptual relationship. Perceptual prime-target pairs were derived from perceptual-conceptual pairs (i.e., meaningful environmental sounds) by shuffling the spectral composition of primes and targets so as to preserve their perceptual relationship while making them unrecognizable. Hearing both original and shuffled targets elicited a more positive N1/P2 complex in the ERP when targets were related to a preceding prime as compared with unrelated. Only related original targets reduced the N400 amplitude. Related shuffled targets tended to decrease the amplitude of a late temporo-parietal positivity. Taken together, these effects indicate that sonic environments influence first the perceptual and then the conceptual processing of individual sounds. Moreover, the influence on conceptual processing is comparable to the influence linguistic context has on the processing of individual words.
15. When neurons do not mirror the agent's intentions: Sex differences in neural coding of goal-directed actions. Neuropsychologia 2010;48:1454-1463. [DOI: 10.1016/j.neuropsychologia.2010.01.015]
16. Leung S, Croft RJ, Guille V, Scholes K, O'Neill BV, Phan KL, Nathan PJ. Acute dopamine and/or serotonin depletion does not modulate mismatch negativity (MMN) in healthy human participants. Psychopharmacology (Berl) 2010;208:233-244. [PMID: 20012022] [DOI: 10.1007/s00213-009-1723-0]
Abstract
RATIONALE: Schizophrenia is commonly associated with impairments in pre-attentive change detection, as reflected in reduced mismatch negativity (MMN). While the neurochemical basis of MMN has been linked to N-methyl-D-aspartic acid (NMDA) receptor function, the roles of the dopaminergic and serotonergic systems have not been fully explored in humans. OBJECTIVES: The aim of the present study was to investigate the effects on MMN of acutely depleting dopamine (DA) and serotonin (5-hydroxytryptamine, 5-HT), alone or simultaneously, by depleting their amino acid precursors in healthy participants. METHODS: Sixteen healthy male subjects participated in a double-blind, placebo-controlled, cross-over design in which each subject's duration MMN was assessed under four acute treatment conditions separated by a 5-day washout period: balanced amino acid control (no depletion), tyrosine/phenylalanine depletion (to reduce DA neurotransmission), tryptophan depletion (to reduce 5-HT neurotransmission), and tryptophan/tyrosine/phenylalanine depletion (to reduce DA and 5-HT neurotransmission simultaneously). RESULTS: Acute depletion of DA and 5-HT, either alone or simultaneously, had no effect on MMN. CONCLUSIONS: These findings suggest that acute modulation of the dopaminergic and serotonergic systems does not lead to changes in MMN.
Affiliation(s)
- Sumie Leung
- Brain Sciences Institute, Faculty of Life and Social Sciences, Swinburne University of Technology, P.O. Box 218, John Street, Hawthorn, 3122, Melbourne, VIC, Australia
17. Shangguan F, Shi J. Puberty timing and fluid intelligence: a study of correlations between testosterone and intelligence in 8- to 12-year-old Chinese boys. Psychoneuroendocrinology 2009;34:983-988. [PMID: 19249158] [DOI: 10.1016/j.psyneuen.2009.01.012]
Abstract
Sex hormones such as testosterone have recently been recognized as important contributors to spatial cognition and intelligence during development, but the relationship between puberty timing and intelligence, especially in children, is largely unknown. In this study, we investigated the potential relationship between salivary sex hormone levels and fluid intelligence in 8- to 12-year-old Chinese boys. Fluid intelligence was measured with Cattell's Culture Fair Intelligence Test. A total of 1600 children aged 8-12 years completed the test, and saliva samples were then collected from 166 boys with a normal intelligence distribution (49, 54, and 63 boys in the 8-, 10-, and 12-year-old groups, respectively). Salivary testosterone and estradiol levels were measured with an enzyme immunoassay, and BMI and age data were collected. The relationship between salivary sex hormone levels and fluid intelligence was analyzed with correlation tests. There was no significant correlation between salivary testosterone level and fluid intelligence in 8-year-old boys, whereas the correlation was significantly positive in 10-year-old boys and significantly negative in 12-year-old boys. To verify these correlations, we performed stepwise multivariate linear regression and discriminant analyses that also considered the age and BMI of the boys and their parents and the salivary estradiol level. The results showed that testosterone level and intelligence were correlated, and that the correlation was much stronger when the salivary testosterone level exceeded 14 pg/ml. In summary, the study suggests that the relationship between testosterone and intelligence changes from late childhood to early adolescence and that puberty timing is closely related to fluid intelligence.