1. Kelemen DE, Burnsworth C, Chubb C, Centanni TM. Complex Pitch Perception Deficits in Dyslexia Persist Regardless of Previous Musical Experiences. J Speech Lang Hear Res 2025:1-14. [PMID: 40366888; DOI: 10.1044/2025_jslhr-24-00883]
Abstract
PURPOSE: Pitch perception is important for speech sound learning, and reading acquisition requires integration of speech sounds and written letters. Many individuals with dyslexia exhibit auditory perception deficits that may contribute to their reading impairment, given that complex pitch perception is crucial for categorizing speech sounds. Given rising interest in music training as a reading intervention, understanding associations between prior music experiences and pitch perception is important. This study explored the relationship between pitch perception skills and reading ability in young adults with and without dyslexia who had varying levels of musical experience. METHOD: Young adults (18-35 years old) with (N = 43) and without (N = 105) dyslexia completed two pitch perception tasks, reading assessments, and a survey reporting formal music training and childhood home music environment (HME). RESULTS: Participants with dyslexia performed worse than typically developing peers on both pitch perception tasks. Single-word reading was related to pitch perception in the typically developing group only. Childhood HME positively correlated with mode categorization and simple pitch discrimination in both groups. Formal music training was associated with performance on both pitch perception tasks in the typically developing group, and with simple pitch discrimination in the dyslexia group. CONCLUSIONS: Pitch perception deficits may interfere with complex acoustic categorization and persist in some individuals with dyslexia despite prior music experiences. Future research should investigate the link between pitch perception and phonological awareness in dyslexia and assess whether music interventions targeting these skills improve reading.
Affiliation(s)
- Delaney E Kelemen
- Department of Psychology, Texas Christian University, Fort Worth
- Department of Speech, Language and Hearing Sciences, University of Florida, Gainesville
- Charles Chubb
- Department of Cognitive Sciences, University of California, Irvine
- Tracy M Centanni
- Department of Psychology, Texas Christian University, Fort Worth
- Department of Speech, Language and Hearing Sciences, University of Florida, Gainesville
2. Yang Z, Su Q, Xie J, Su H, Huang T, Han C, Zhang S, Zhang K, Xu G. Music tempo modulates emotional states as revealed through EEG insights. Sci Rep 2025; 15:8276. [PMID: 40065030; PMCID: PMC11893886; DOI: 10.1038/s41598-025-92679-1]
Abstract
Music can effectively influence human emotions, with different melodies and rhythms eliciting varying emotional responses. Among these, tempo is one of the most important parameters affecting emotions. This study explores the impact of music tempo on emotional states and the associated brain functional networks. A total of 26 participants with no history of neurological or psychiatric disorders and no music training took part in the experiment, using classical piano music clips at different tempi (56, 106, 156 bpm) as stimuli. The study was conducted using emotional scales and electroencephalogram (EEG) analysis. The results showed that the valence of emotions significantly increased with music tempo, while arousal exhibited a V-shaped relationship. EEG analysis revealed significant changes in brainwave signals across different frequency bands under different tempi: for instance, slow tempo induced higher theta and alpha power in the frontal region, while fast tempo increased beta and gamma band power. Moreover, fast tempo enhanced the average connectivity strength in the frontal, temporal, and occipital regions and increased the phase-locking value (PLV) between the frontal and parietal regions, whereas slow tempo increased the PLV between the occipital and parietal regions. The findings of this study elucidate the effects of music tempo on the brain functional networks related to emotion regulation, providing a theoretical basis for music-assisted diagnosis and treatment of mood disorders. Furthermore, these results suggest potential applications in emotion robotics, emotion-based human-computer interaction, and emotion-based intelligent control.
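For readers unfamiliar with the measures named in this abstract, band power and phase-locking can be sketched in a few lines. The code below is an illustrative reconstruction on synthetic signals, not the authors' pipeline; the sampling rate, band limits, and channel names are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, welch

def band_power(x, fs, lo, hi):
    """Mean spectral power of one EEG channel within [lo, hi] Hz (Welch's method)."""
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

def plv(x, y, fs, lo, hi, order=4):
    """Phase-locking value between two channels in a band: band-pass both signals,
    take instantaneous phases via the Hilbert transform, and average the unit
    phasors of the phase difference (1 = perfect locking, 0 = none)."""
    b, a = butter(order, [lo, hi], btype="band", fs=fs)
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic frontal and parietal channels sharing a 10 Hz (alpha) component.
fs = 250.0
t = np.arange(0, 10, 1 / fs)
frontal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
parietal = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
print("frontal alpha power:", band_power(frontal, fs, 8, 13))
print("frontal-parietal alpha PLV:", plv(frontal, parietal, fs, 8, 13))
```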
Affiliation(s)
- Zengyao Yang
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, 710049, China
- Qiruo Su
- Joint School of Design and Innovation, Xi'an Jiaotong University, Xi'an, 710049, China
- Jieren Xie
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, 710049, China
- Hechong Su
- Joint School of Design and Innovation, Xi'an Jiaotong University, Xi'an, 710049, China
- Tianrun Huang
- Joint School of Design and Innovation, Xi'an Jiaotong University, Xi'an, 710049, China
- Chengcheng Han
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, 710049, China
- Sicong Zhang
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, 710049, China
- Kai Zhang
- Faillace Department of Psychiatry and Behavioral Sciences, McGovern Medical School, University of Texas Health Science Center at Houston, Houston, TX, USA
- Guanghua Xu
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, 710049, China.
- State Key Laboratory for Manufacturing Systems Engineering, Xi'an Jiaotong University, Xi'an, 710049, China.
- The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, China.
3. Carraturo G, Pando-Naude V, Costa M, Vuust P, Bonetti L, Brattico E. The major-minor mode dichotomy in music perception. Phys Life Rev 2025; 52:80-106. [PMID: 39721138; DOI: 10.1016/j.plrev.2024.11.017]
Abstract
In Western tonal music, major and minor modes are recognized as the primary musical features in eliciting emotional responses. The underlying correlates of this dichotomy in music perception have been extensively investigated through decades of psychological and neuroscientific research, yielding plentiful yet often discordant results that highlight the complexity and individual differences in how these modes are perceived. This variability suggests that a deeper understanding of major-minor mode perception in music is still needed. We present the first comprehensive systematic review and meta-analysis, providing both qualitative and quantitative syntheses of major-minor mode perception and its behavioural and neural correlates. The qualitative synthesis includes 70 studies, revealing significant diversity in how the major-minor dichotomy has been empirically investigated. Most studies focused on adults, considered participants' expertise, used real-life musical stimuli, conducted behavioural evaluations, and were predominantly performed with Western listeners. Meta-analyses of behavioural, electroencephalography, and neuroimaging data (37 studies) consistently show that major and minor modes elicit distinct neural and emotional responses, though these differences are heavily influenced by subjective perception. Based on our findings, we propose a framework to describe a Major-Minor Mode(l) of music perception and its correlates, incorporating individual factors such as age, expertise, cultural background, and emotional disorders. Moreover, this work explores the cultural and historical implications of the major-minor dichotomy in music, examining its origins, universality, and emotional associations across both Western and non-Western contexts. By considering individual differences and acoustic characteristics, we contribute to a broader understanding of how musical frameworks develop across cultures. Limitations, implications, and suggestions for future research are discussed, including potential clinical applications for mood regulation and emotional disorders, alongside recommendations for experimental paradigms in investigating major-minor modes.
Affiliation(s)
- Giulio Carraturo
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music, Aarhus/Aalborg, Aarhus, Denmark; Department of Education, Psychology, Communication, University of Bari Aldo Moro, Italy; Department of Psychology, University of Bologna, Italy
- Victor Pando-Naude
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music, Aarhus/Aalborg, Aarhus, Denmark
- Marco Costa
- Department of Psychology, University of Bologna, Italy
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music, Aarhus/Aalborg, Aarhus, Denmark
- Leonardo Bonetti
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music, Aarhus/Aalborg, Aarhus, Denmark; Department of Psychology, University of Bologna, Italy; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, United Kingdom; Department of Psychiatry, University of Oxford, United Kingdom
- Elvira Brattico
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music, Aarhus/Aalborg, Aarhus, Denmark; Department of Education, Psychology, Communication, University of Bari Aldo Moro, Italy.
4. Kim SG. On the encoding of natural music in computational models and human brains. Front Neurosci 2022; 16:928841. [PMID: 36203808; PMCID: PMC9531138; DOI: 10.3389/fnins.2022.928841]
Abstract
This article discusses recent developments and advances in the neuroscience of music to understand the nature of musical emotion. In particular, it highlights how system identification techniques and computational models of music have advanced our understanding of how the human brain processes the textures and structures of music and how the processed information evokes emotions. Musical models relate physical properties of stimuli to internal representations called features, and predictive models relate features to neural or behavioral responses and test their predictions against independent unseen data. The new frameworks do not require orthogonalized stimuli in controlled experiments to establish reproducible knowledge, which has opened up a new wave of naturalistic neuroscience. The current review focuses on how this trend has transformed the domain of the neuroscience of music.
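The feature-to-response predictive modelling described here is typically implemented as a regularized linear encoding model fitted on part of the data and evaluated on held-out data. The sketch below illustrates that idea with simulated features and responses; the model choice, data shapes, and variable names are illustrative and not taken from the review.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Simulated stimulus features (time points x acoustic descriptors) and a neural
# response that depends linearly on them; shapes and noise level are arbitrary.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20))
true_w = rng.standard_normal(20)
y = X @ true_w + 0.5 * rng.standard_normal(1000)

# Fit the encoding model on one portion of the data only.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = Ridge(alpha=1.0).fit(X_train, y_train)

# Test predictions against independent, unseen data: the correlation between
# predicted and observed responses is a common accuracy metric.
r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
print(f"held-out prediction accuracy r = {r:.2f}")
```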
5. Liu Y, Zhao X, Tang Q, Li W, Liu G. Dynamic functional network connectivity associated with musical emotions evoked by different tempi. Brain Connect 2021; 12:584-597. [PMID: 34309409; DOI: 10.1089/brain.2021.0069]
Abstract
BACKGROUND: Music tempo has strong clinical maneuverability and positive emotional effects in music therapy; it can directly evoke multiple emotions and dynamic neural changes across the whole brain. However, the precise relationship between music tempo and its emotional effects remains unclear. The present study aimed to investigate the dynamic functional network connectivity (dFNC) associated with emotions elicited by music at different tempi. METHODS: We obtained emotion ratings of fast- (155-170 bpm), middle- (90 bpm), and slow-tempo (50-60 bpm) piano music from 40 participants both during and after functional magnetic resonance imaging (fMRI). Group independent component analysis (ICA), sliding time window correlations, and k-means clustering were used to assess the dFNC of the fMRI data. Paired t-tests were conducted to compare differences between the neural networks. RESULTS: (1) Fast music was associated with higher ratings of emotional valence and arousal, accompanied by increasing dFNC between the somatomotor (SM) and cingulo-opercular (CO) networks and decreasing dFNC between the fronto-parietal and SM networks. (2) Even with stronger activation in auditory (AUD) networks, slow music was associated with weaker emotion than fast music, with decreasing FNC across the brain and participation of the default mode (DM) network. (3) Middle-tempo music elicited moderate emotional activation with the most stable dFNC across the whole brain. CONCLUSION: Faster music increases neural activity in the SM and CO regions, increasing the intensity of the emotional experience. In contrast, slower music was associated with decreasing engagement of AUD and stable engagement of DM, resulting in a weak emotional experience. These findings suggest that the time-varying aspects of functional connectivity can help to uncover the dynamic neural substrates of tempo-evoked emotion while listening to music.
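The dFNC pipeline summarized in the METHODS (sliding-window correlations over network time courses followed by k-means clustering into recurring connectivity states) can be sketched as follows. The window length, step, number of networks, and number of clusters are illustrative assumptions, and the group ICA step that would produce the time courses is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

def sliding_window_fnc(timecourses, win_len, step):
    """Correlation matrix of network time courses in each sliding window.
    `timecourses` has shape (n_timepoints, n_networks); producing such time
    courses (e.g., via group ICA) is outside this sketch."""
    n_t = timecourses.shape[0]
    mats = [np.corrcoef(timecourses[s:s + win_len], rowvar=False)
            for s in range(0, n_t - win_len + 1, step)]
    return np.array(mats)                      # (n_windows, n_net, n_net)

rng = np.random.default_rng(1)
tc = rng.standard_normal((300, 7))             # e.g., 300 volumes x 7 networks
fnc = sliding_window_fnc(tc, win_len=30, step=1)

# Vectorize the upper triangle of each windowed matrix and cluster the windows
# into a small number of recurring connectivity "states".
iu = np.triu_indices(tc.shape[1], k=1)
features = fnc[:, iu[0], iu[1]]
states = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print("dFNC state assigned to the first 20 windows:", states[:20])
```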
Affiliation(s)
- Ying Liu
- School of Mathematics and Statistics, Southwest University, Chongqing, China; School of Music, Southwest University, Chongqing, China
- Xingcong Zhao
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Qingting Tang
- Faculty of Psychology, Southwest University, Chongqing, China
- Wenhui Li
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Guangyuan Liu
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
6. Meng Q, Jiang J, Liu F, Xu X. Effects of the Musical Sound Environment on Communicating Emotion. Int J Environ Res Public Health 2020; 17:E2499. [PMID: 32268523; PMCID: PMC7177471; DOI: 10.3390/ijerph17072499]
Abstract
The acoustic environment is one of the factors influencing emotion; however, existing research has mainly focused on the effects of noise on emotion and on music therapy, while the acoustic and psychological effects of music on interactive behaviour have been neglected. Therefore, this study aimed to investigate the effects of music on communicating emotion, including evaluation of the music and d-values of pleasure, arousal, and dominance (PAD), in terms of sound pressure level (SPL), musical emotion, and tempo. Based on acoustic environment measurements and a questionnaire survey with 52 participants in a normal classroom in Harbin, China, the following results were found. First, SPL was significantly correlated with musical evaluation of communication: average musical-evaluation scores decreased sharply from 1.31 to -2.13 when SPL rose from 50 dBA to 60 dBA, while they varied only between 0.88 and 1.31 from 40 dBA to 50 dBA. Arousal increased with increases in musical SPL in the negative-evaluation group. Second, musical emotions had significant effects on musical evaluation of communication, among which the effect of joyful-sounding music was the strongest; in general, joyful- and stirring-sounding music enhanced pleasure and arousal efficiently. Third, musical tempo had a significant effect on musical evaluation and communicating emotion: faster music enhanced arousal and pleasure efficiently. Finally, in terms of social characteristics, familiarity, gender combination, and number of participants affected communicating emotion; for instance, in the positive-evaluation group, dominance was much higher in the single-gender groups. This study shows that some music factors, such as SPL, musical emotion, and tempo, can be used to enhance communicating emotion.
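A minimal sketch of the kinds of quantities reported here, assuming for illustration that the d-values are change scores in the pleasure-arousal-dominance ratings and that evaluation scores are correlated against SPL; all numbers and variable names below are invented, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical PAD ratings (pleasure, arousal, dominance) before and after a
# conversation held under background music; rows are participants.
pad_before = np.array([[5, 4, 5], [6, 5, 4], [4, 4, 5], [5, 6, 5]], dtype=float)
pad_after = np.array([[7, 6, 5], [6, 6, 5], [3, 4, 4], [6, 7, 6]], dtype=float)
d_values = pad_after - pad_before          # change scores for P, A, D

# Hypothetical relation between sound pressure level and music evaluation scores.
spl = np.array([40.0, 45.0, 50.0, 60.0])              # dBA
evaluation = np.array([0.9, 1.2, 1.3, -2.1])           # invented scores
r, p = pearsonr(spl, evaluation)
print("pleasure d-values:", d_values[:, 0])
print(f"r(SPL, evaluation) = {r:.2f}, p = {p:.2f}")
```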
Affiliation(s)
- Qi Meng
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, 66 West Dazhi Street, Nan Gang District, Harbin 150001, China; (Q.M.); (J.J.)
- Jiani Jiang
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, 66 West Dazhi Street, Nan Gang District, Harbin 150001, China; (Q.M.); (J.J.)
- Fangfang Liu
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, 66 West Dazhi Street, Nan Gang District, Harbin 150001, China; (Q.M.); (J.J.)
- Xiaoduo Xu
- UCL The Bartlett School of Architecture, University College London (UCL), London WC1H 0QB, UK
7. Liu Y, Liu G, Wei D, Li Q, Yuan G, Wu S, Wang G, Zhao X. Effects of Musical Tempo on Musicians' and Non-musicians' Emotional Experience When Listening to Music. Front Psychol 2018; 9:2118. [PMID: 30483173; PMCID: PMC6243583; DOI: 10.3389/fpsyg.2018.02118]
Abstract
Tempo is an important musical element that affects listeners' emotional processes when listening to music. However, it remains unclear how tempo and training affect individuals' emotional experience of music. To explore the neural underpinnings of the effects of tempo on music-evoked emotion, music with fast, medium, and slow tempi was collected, and functional magnetic resonance imaging (fMRI) was used to compare differences in emotional responses and neural activity between musicians and non-musicians. Behaviorally, musicians perceived higher valence in fast music than did non-musicians. The main effects of group (musicians vs. non-musicians) and tempo were significant, and a near-significant interaction between group and tempo was found. In the arousal dimension, the mean score of medium-tempo music was the highest among the three kinds; in the valence dimension, the mean scores decreased in order from fast to medium to slow music. Functional analyses revealed that neural activation was stronger in musicians than in non-musicians in the left inferior parietal lobe (IPL). A comparison of tempi showed stronger activation for fast music than for slow music in the bilateral superior temporal gyrus (STG), providing corresponding neural evidence for the highest valence reported by participants for fast music. Medium music showed stronger activation than slow music in the right Heschl's gyrus (HG), right middle temporal gyrus (MTG), right posterior cingulate cortex (PCC), right precuneus, right IPL, and left STG. Importantly, this study confirmed and explained the connection between music tempo and emotional experiences, and their interaction with individuals' musical training.
Affiliation(s)
- Ying Liu
- Faculty of Psychology, Southwest University, Chongqing, China
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- Guangyuan Liu
- Faculty of Psychology, Southwest University, Chongqing, China
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- School of Electronic and Information Engineering of Southwest University, Chongqing, China
- Chongqing Key Laboratory of Non-linear Circuit and Intelligent Information Processing, Southwest University, Chongqing, China
- Dongtao Wei
- Faculty of Psychology, Southwest University, Chongqing, China
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- Qiang Li
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- School of Electronic and Information Engineering of Southwest University, Chongqing, China
- Guangjie Yuan
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- School of Electronic and Information Engineering of Southwest University, Chongqing, China
- Shifu Wu
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- School of Electronic and Information Engineering of Southwest University, Chongqing, China
- Gaoyuan Wang
- School of Music, Southwest University, Chongqing, China
- Xingcong Zhao
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China
- Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- School of Electronic and Information Engineering of Southwest University, Chongqing, China
8. Smart environment architecture for emotion detection and regulation. J Biomed Inform 2016; 64:55-73. [PMID: 27678301; DOI: 10.1016/j.jbi.2016.09.015]
Abstract
This paper introduces an architecture as a proof-of-concept for emotion detection and regulation in smart health environments. The aim of the proposal is to detect the patient's emotional state by analysing his/her physiological signals, facial expression and behaviour. Then, the system provides the best-tailored actions in the environment to regulate these emotions towards a positive mood when possible. The current state-of-the-art in emotion regulation through music and colour/light is implemented with the final goal of enhancing the quality of life and care of the subject. The paper describes the three main parts of the architecture, namely "Emotion Detection", "Emotion Regulation" and "Emotion Feedback Control". "Emotion Detection" works with the data captured from the patient, whereas "Emotion Regulation" offers him/her different musical pieces and colour/light settings. "Emotion Feedback Control" performs as a feedback control loop to assess the effect of emotion regulation over emotion detection. We are currently testing the overall architecture and the intervention in real environments to achieve our final goal.
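A minimal, hypothetical sketch of the closed loop formed by the three modules named above; the class and method names are invented for illustration, since the paper itself does not publish code.

```python
import random

class EmotionDetection:
    def estimate(self):
        # Stand-in for fusing physiological signals, facial expression and
        # behaviour into a single valence estimate in [-1, 1].
        return random.uniform(-1.0, 1.0)

class EmotionRegulation:
    def intervene(self, valence):
        # Pick music and colour/light settings intended to move a negative
        # state towards a positive mood.
        return "calming music + warm light" if valence < 0 else "keep current setting"

class EmotionFeedbackControl:
    """Closed loop: detect, regulate, then detect again to assess the effect."""
    def __init__(self):
        self.detector = EmotionDetection()
        self.regulator = EmotionRegulation()

    def run(self, steps=5):
        for _ in range(steps):
            valence = self.detector.estimate()          # Emotion Detection
            action = self.regulator.intervene(valence)  # Emotion Regulation
            print(f"valence={valence:+.2f} -> {action}")
            # The next iteration re-estimates the state, closing the feedback loop.

EmotionFeedbackControl().run()
```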
9. Fernández-Sotos A, Fernández-Caballero A, Latorre JM. Influence of Tempo and Rhythmic Unit in Musical Emotion Regulation. Front Comput Neurosci 2016; 10:80. [PMID: 27536232; PMCID: PMC4971092; DOI: 10.3389/fncom.2016.00080]
Abstract
This article is based on the assumption that music has the power to change the listener's mood. The paper studies the outcome of two experiments on the regulation of emotional states in a series of participants who listened to different musical excerpts. The present research focuses on note value, an important musical cue related to rhythm. The influence of two concepts linked to note value is analyzed separately and discussed together. The two musical cues under investigation are tempo and rhythmic unit. The participants are asked to label music fragments by using opposite meaningful words belonging to four semantic scales, namely "Tension" (ranging from Relaxing to Stressing), "Expressiveness" (Expressionless to Expressive), "Amusement" (Boring to Amusing), and "Attractiveness" (Pleasant to Unpleasant). The participants also have to indicate how much they feel certain basic emotions while listening to each music excerpt. The rated emotions are "Happiness," "Surprise," and "Sadness." This study makes it possible to draw some interesting conclusions about the associations between note value and emotions.
Affiliation(s)
- Antonio Fernández-Caballero
- Departamento de Sistemas Informáticos, Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha, Albacete, Spain
- José M. Latorre
- Facultad de Medicina de Albacete, Universidad de Castilla-La Mancha, Albacete, Spain
10. Discrimination of tonal and atonal music in congenital amusia: The advantage of implicit tasks. Neuropsychologia 2016; 85:10-8. [DOI: 10.1016/j.neuropsychologia.2016.02.027]
11. Chang YH, Lee YY, Liang KC, Chen IP, Tsai CG, Hsieh S. Experiencing affective music in eyes-closed and eyes-open states: an electroencephalography study. Front Psychol 2015; 6:1160. [PMID: 26300835; PMCID: PMC4528089; DOI: 10.3389/fpsyg.2015.01160]
Abstract
In real life, listening to music may be associated with an eyes-closed or eyes-open state. The effect of eye state on listeners’ reaction to music has attracted some attention, but its influence on brain activity has not been fully investigated. The present study aimed to evaluate the electroencephalographic (EEG) markers for the emotional valence of music in different eye states. Thirty participants listened to musical excerpts with different emotional content in the eyes-closed and eyes-open states. The results showed that participants rated the music as more pleasant or with more positive valence under an eyes-open state. In addition, we found that the alpha asymmetry indices calculated on the parietal and temporal sites reflected emotion valence in the eyes-closed and eyes-open states, respectively. The theta power in the frontal area significantly increased while listening to emotional-positive music compared to emotional-negative music under the eyes-closed condition. These effects of eye states on EEG markers are discussed in terms of brain mechanisms underlying attention and emotion.
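The alpha asymmetry index mentioned here is conventionally computed as the difference in log alpha power between homologous right and left electrodes; the sketch below shows that computation on synthetic data. The electrode pair, band limits, and sampling rate are illustrative assumptions, not the study's settings.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(x, fs, lo=8.0, hi=13.0):
    """Alpha-band power of one EEG channel via Welch's method."""
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

def asymmetry_index(right, left, fs):
    """Asymmetry index: ln(alpha power, right) - ln(alpha power, left)."""
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))

# Synthetic homologous parietal pair (e.g., P4 vs P3) with unequal alpha amplitude.
fs = 500.0
t = np.arange(0, 8, 1 / fs)
p4 = 1.2 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
p3 = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print("parietal alpha asymmetry index:", asymmetry_index(p4, p3, fs))
```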
Affiliation(s)
- Yun-Hsuan Chang
- Department of Psychology, College of Medical and Health Science, Asia University, Taichung, Taiwan; Department of Psychiatry, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Institute of Allied Health Sciences, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- You-Yun Lee
- Cognitive Electrophysiology Laboratory, Department of Psychology, National Cheng Kung University, Tainan, Taiwan
- Keng-Chen Liang
- Department of Psychology, National Taiwan University, Taipei, Taiwan
- I-Ping Chen
- Institute of Applied Arts, National Chiao Tung University, Hsinchu, Taiwan
- Chen-Gia Tsai
- Graduate Institute of Musicology, National Taiwan University, Taipei, Taiwan; Neurobiology and Cognitive Science Center, National Taiwan University, Taipei, Taiwan
- Shulan Hsieh
- Institute of Allied Health Sciences, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Cognitive Electrophysiology Laboratory, Department of Psychology, National Cheng Kung University, Tainan, Taiwan
12. Liégeois-Chauvel C, Bénar C, Krieg J, Delbé C, Chauvel P, Giusiano B, Bigand E. How functional coupling between the auditory cortex and the amygdala induces musical emotion: a single case study. Cortex 2014; 60:82-93. [PMID: 25023618; DOI: 10.1016/j.cortex.2014.06.002]
Abstract
Music is a sound structure of remarkable acoustical and temporal complexity. Although it cannot denote specific meaning, it is one of the most potent and universal stimuli for inducing mood. How the auditory and limbic systems interact, and whether this interaction is lateralized when feeling emotions related to music, remains unclear. We studied the functional correlation between the auditory cortex (AC) and amygdala (AMY) through intracerebral recordings from both hemispheres in a single patient while she listened attentively to musical excerpts, which we compared to passive listening of a sequence of pure tones. While the left primary and secondary auditory cortices (PAC and SAC) showed larger increases in gamma-band responses than the right side, only the right side showed emotion-modulated gamma oscillatory activity. An intra- and inter-hemisphere correlation was observed between the auditory areas and AMY during the delivery of a sequence of pure tones. In contrast, a strikingly right-lateralized functional network between the AC and the AMY was observed to be related to the musical excerpts the patient experienced as happy, sad and peaceful. Interestingly, excerpts experienced as angry, which the patient disliked, were associated with widespread de-correlation between all the structures. These results suggest that the right auditory-limbic interactions result from the formation of oscillatory networks that bind the activities of the network nodes into coherence patterns, resulting in the emergence of a feeling.
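One common way to quantify this kind of band-limited coupling between two recording sites is to correlate their gamma-band amplitude envelopes; the sketch below illustrates that general approach on synthetic traces and is an assumption about the technique, not the paper's exact correlation measure.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_envelope(x, fs, lo=30.0, hi=80.0, order=4):
    """Band-pass a signal to the gamma range and take its Hilbert amplitude envelope."""
    b, a = butter(order, [lo, hi], btype="band", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, x)))

def envelope_correlation(x, y, fs):
    """Pearson correlation between the gamma-band envelopes of two sites."""
    return np.corrcoef(gamma_envelope(x, fs), gamma_envelope(y, fs))[0, 1]

# Synthetic auditory-cortex and amygdala traces sharing a common gamma burst.
fs = 1000.0
t = np.arange(0, 5, 1 / fs)
burst = np.sin(2 * np.pi * 50 * t) * np.exp(-((t - 2.5) ** 2) / 0.5)
ac = burst + 0.2 * np.random.randn(t.size)
amy = 0.8 * burst + 0.2 * np.random.randn(t.size)
print("AC-AMY gamma envelope correlation:", envelope_correlation(ac, amy, fs))
```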
Affiliation(s)
- Christian Bénar
- INS, INSERM UMR 1106, Marseille, France; Aix-Marseille Université, 13005 Marseille, France
- Julien Krieg
- INS, INSERM UMR 1106, Marseille, France; Aix-Marseille Université, 13005 Marseille, France
- Charles Delbé
- LEAD UMR 5022 CNRS, Université de Bourgogne, 21065 Dijon, France
- Patrick Chauvel
- INS, INSERM UMR 1106, Marseille, France; Aix-Marseille Université, 13005 Marseille, France; Hôpitaux de Marseille, Hôpital de la Timone, 13005 Marseille, France
- Bernard Giusiano
- INS, INSERM UMR 1106, Marseille, France; Aix-Marseille Université, 13005 Marseille, France; Hôpitaux de Marseille, Hôpital de la Timone, 13005 Marseille, France
- Emmanuel Bigand
- LEAD UMR 5022 CNRS, Université de Bourgogne, 21065 Dijon, France