1
Neural decoding of music from the EEG. Sci Rep 2023; 13:624. [PMID: 36635340] [PMCID: PMC9837107] [DOI: 10.1038/s41598-022-27361-x] [Received: 12/21/2021] [Accepted: 12/30/2022]
Abstract
Neural decoding models can be used to decode neural representations of visual, acoustic, or semantic information. Recent studies have demonstrated neural decoders that are able to decode acoustic information from a variety of neural signal types, including electrocorticography (ECoG) and the electroencephalogram (EEG). In this study we explore how functional magnetic resonance imaging (fMRI) can be combined with EEG to develop an acoustic decoder. Specifically, we first used a joint EEG-fMRI paradigm to record brain activity while participants listened to music. We then used fMRI-informed EEG source localisation and a bi-directional long short-term memory deep learning network, first to extract neural information from the EEG related to music listening and then to decode and reconstruct the individual pieces of music an individual was listening to. We further validated our decoding model by evaluating its performance on a separate dataset of EEG-only recordings. We were able to reconstruct music, via our fMRI-informed EEG source analysis approach, with a mean rank accuracy of 71.8%. Using only EEG data, without participant-specific fMRI-informed source analysis, we were able to identify the music a participant was listening to with a mean rank accuracy of 59.2%. This demonstrates that our decoding model can use fMRI-informed source analysis to aid EEG-based decoding and reconstruction of acoustic information from brain activity, and makes a step towards building EEG-based neural decoders for other complex information domains, such as other acoustic, visual, or semantic information.
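The abstract reports decoding performance as "mean rank accuracy" without defining it. A common definition ranks the similarity between a reconstruction and the true stimulus against all candidate stimuli, so 100% means the true piece always ranks first and 50% is chance. A minimal sketch under that assumption (the `similarity` matrix and how it would be computed are hypothetical, not taken from the paper):

```python
import numpy as np

def rank_accuracy(similarity: np.ndarray) -> float:
    """Mean rank accuracy for a stimulus decoder.

    similarity[i, j] = similarity between the reconstruction of
    stimulus i and candidate stimulus j. For each reconstruction,
    the true stimulus (j == i) is ranked against the other
    candidates; 1.0 = always ranked first, 0.5 = chance level.
    """
    n = similarity.shape[0]
    accs = []
    for i in range(n):
        # fraction of the n-1 competitors that the true stimulus beats
        beaten = np.sum(similarity[i, i] > np.delete(similarity[i], i))
        accs.append(beaten / (n - 1))
    return float(np.mean(accs))
```

A perfect decoder yields 1.0 (the diagonal dominates every row); a decoder whose reconstruction always matches the wrong candidate yields 0.0.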
2
Weineck K, Wen OX, Henry MJ. Neural synchronization is strongest to the spectral flux of slow music and depends on familiarity and beat salience. eLife 2022; 11:e75515. [PMID: 36094165] [PMCID: PMC9467512] [DOI: 10.7554/elife.75515] [Received: 11/12/2021] [Accepted: 07/25/2022]
Abstract
Neural activity in the auditory system synchronizes to sound rhythms, and brain–environment synchronization is thought to be fundamental to successful auditory perception. Sound rhythms are often operationalized in terms of the sound's amplitude envelope. We hypothesized that, especially for music, the envelope might not best capture the complex spectro-temporal fluctuations that give rise to beat perception and synchronized neural activity. This study investigated (1) neural synchronization to different musical features, (2) tempo-dependence of neural synchronization, and (3) dependence of synchronization on familiarity, enjoyment, and ease of beat perception. In this electroencephalography study, 37 human participants listened to tempo-modulated music (1–4 Hz). Independent of whether the analysis approach was based on temporal response functions (TRFs) or reliable components analysis (RCA), the spectral flux of music, as opposed to the amplitude envelope, evoked the strongest neural synchronization. Moreover, music with slower beat rates, high familiarity, and easy-to-perceive beats elicited the strongest neural responses. Our results demonstrate the importance of spectro-temporal fluctuations in music for driving neural synchronization, and highlight its sensitivity to musical tempo, familiarity, and beat salience.

When we listen to a melody, the activity of our neurons synchronizes to the music: in fact, it is likely that the closer the match, the better we can perceive the piece. However, it remains unclear exactly which musical features our brain cells synchronize to. Previous studies, which have often used 'simplified' music, have highlighted that the amplitude envelope (how the intensity of the sounds changes over time) could be involved in this phenomenon, alongside factors such as musical training, attention, familiarity with the piece or even enjoyment.
Whether differences in neural synchronization could explain why musical tastes vary between people is also still a matter of debate. In their study, Weineck et al. aimed to better understand what drives neuronal synchronization to music. A technique known as electroencephalography was used to record brain activity in 37 volunteers listening to instrumental music whose tempo ranged from 60 to 240 beats per minute. The tunes varied across an array of features such as familiarity, enjoyment and how easy the beat was to perceive. Two different approaches were then used to calculate neural synchronization, and they yielded converging results. The analyses revealed that three types of factors were associated with strong neural synchronization. First, amongst the various cadences, a tempo of 60–120 beats per minute elicited the strongest match with neuronal activity. Interestingly, this beat is commonly found in Western pop music, is usually preferred by listeners, and often matches spontaneous body rhythms such as walking pace. Second, synchronization was linked to variations in pitch and sound quality (known as 'spectral flux') rather than in the amplitude envelope. And finally, familiarity and perceived beat saliency, but not enjoyment or musical expertise, were connected to stronger synchronization. These findings help to better understand how our brains allow us to perceive and connect with music. The work conducted by Weineck et al. should help other researchers to investigate this field; in particular, it shows how important it is to consider spectral flux rather than amplitude envelope in experiments that use actual music.
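Spectral flux, the feature this study found to drive the strongest synchronization, is conventionally computed as the half-wave-rectified frame-to-frame change in short-time spectral magnitude. A minimal sketch of that standard definition (the window and hop sizes are arbitrary placeholders, not the study's analysis parameters):

```python
import numpy as np

def spectral_flux(x: np.ndarray, win: int = 1024, hop: int = 512) -> np.ndarray:
    """Half-wave-rectified spectral flux of a mono signal.

    For each STFT frame, sums the positive changes in magnitude
    across frequency bins relative to the previous frame.
    Assumes len(x) >= win.
    """
    window = np.hanning(win)
    n_frames = 1 + (len(x) - win) // hop
    mags = np.empty((n_frames, win // 2 + 1))
    for t in range(n_frames):
        frame = x[t * hop : t * hop + win] * window
        mags[t] = np.abs(np.fft.rfft(frame))
    diff = np.diff(mags, axis=0)          # change between consecutive frames
    return np.sum(np.maximum(diff, 0.0), axis=1)  # keep only increases
```

Because only magnitude increases are kept, the flux peaks at note onsets and spectral changes, which is why it tracks beat structure more closely than the raw amplitude envelope.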
Affiliation(s)
- Kristin Weineck
- Research Group "Neural and Environmental Rhythms", Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Goethe University Frankfurt, Institute for Cell Biology and Neuroscience, Frankfurt am Main, Germany
- Olivia Xin Wen
- Research Group "Neural and Environmental Rhythms", Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Molly J Henry
- Research Group "Neural and Environmental Rhythms", Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Department of Psychology, Toronto Metropolitan University, Toronto, Canada
3
Jiang J, Meng Q, Ji J. Combining Music and Indoor Spatial Factors Helps to Improve College Students' Emotion During Communication. Front Psychol 2021; 12:703908. [PMID: 34594267] [PMCID: PMC8476911] [DOI: 10.3389/fpsyg.2021.703908] [Received: 05/18/2021] [Accepted: 08/10/2021]
Abstract
Against the background of weakening face-to-face social interaction, the mental health of college students deserves attention. Few existing studies address the impact of audiovisual interaction on interactive behaviour, especially emotional perception, in specific spaces. This study aims to determine whether the perception of one's music environment influences college students' emotions during communication under different indoor conditions, including spatial function, visual and sound atmospheres, and interior furnishings. The three-dimensional pleasure-arousal-dominance (PAD) emotional model was used to evaluate changes in emotion before and after communication. An acoustic environmental measurement was performed, and evaluations of emotion during communication were investigated through a questionnaire survey with 331 participants at six experimental sites: a classroom (CR), a learning corridor (LC), a coffee shop (CS), a fast food restaurant (FFR), a dormitory (DT), and a living room (LR). The following results were found. Firstly, the results in different functional spaces showed no significant effect of music on communication or on emotional states during communication. Secondly, the average score of the musical evaluation was 1.09 higher in the warm-toned space than in the cold-toned space. Thirdly, the differences in the effects of music on emotion during communication across different sound environments were significant, and pleasure, arousal, and dominance could be efficiently enhanced by music in the quiet space. Fourthly, dominance was 0.63 higher in the minimally furnished space. Finally, we also investigated the influence of social characteristics on the effect of music on communication in different indoor spaces, in terms of intimacy level, gender combination, and group size. For instance, when there are more than two communicators in the dining space, pleasure and arousal can be efficiently enhanced by music.
This study shows that combining the sound environment with spatial factors (for example, the visual and sound atmosphere) and the interior furnishings can be an effective design strategy for promoting social interaction in indoor spaces.
Affiliation(s)
- Jiani Jiang
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, Harbin, China
- Qi Meng
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, Harbin, China
- Jingtao Ji
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, Harbin, China
4
Liu Y, Lian W, Zhao X, Tang Q, Liu G. Spatial Connectivity and Temporal Dynamic Functional Network Connectivity of Musical Emotions Evoked by Dynamically Changing Tempo. Front Neurosci 2021; 15:700154. [PMID: 34421523] [PMCID: PMC8375772] [DOI: 10.3389/fnins.2021.700154] [Received: 04/25/2021] [Accepted: 07/07/2021]
Abstract
Music tempo is closely connected to listeners' musical emotion and to multifunctional neural activities. Music with an increasing tempo evokes higher emotional responses, while music with a decreasing tempo enhances relaxation. However, the neural substrate of emotion evoked by dynamically changing tempo is still unclear. To investigate the spatial connectivity and temporal dynamic functional network connectivity (dFNC) of musical emotion evoked by dynamically changing tempo, we collected dynamic emotional ratings and conducted group independent component analysis (ICA), sliding-time-window correlations, and k-means clustering to assess the FNC of emotion evoked by music with a decreasing tempo (180-65 bpm) and an increasing tempo (60-180 bpm). Music with a decreasing tempo (with more stable dynamic valence) evoked higher valence than music with an increasing tempo, along with stronger independent components (ICs) in the default mode network (DMN) and sensorimotor network (SMN). The dFNC analysis showed that, with time-decreasing FNC across the whole brain, emotion evoked by decreasing-tempo music was associated with strong spatial connectivity within the DMN and SMN. Meanwhile, it was associated with strong FNC between the DMN and the frontoparietal network (FPN), and between the DMN and the cingulate-opercular network (CON). A paired t-test showed that music with a decreasing tempo evokes stronger activation of ICs within the DMN and SMN than music with an increasing tempo, which indicated that faster music is more likely to enhance listeners' emotions through multifunctional brain activities even when the tempo is slowing down. With increasing FNC across the whole brain, music with an increasing tempo was associated with strong connectivity within the FPN; time-decreasing connectivity was found within the CON, SMN, and visual network (VIS), and between the CON and SMN, which explained its unstable valence during the dynamic valence rating.
Overall, the FNC can help uncover the spatial and temporal neural substrates of musical emotions evoked by dynamically changing tempi.
Affiliation(s)
- Ying Liu
- School of Mathematics and Statistics, Southwest University, Chongqing, China
- School of Music, Southwest University, Chongqing, China
- Weili Lian
- College of Preschool Education, Chongqing Youth Vocational and Technical College, Chongqing, China
- Xingcong Zhao
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Qingting Tang
- Faculty of Psychology, Southwest University, Chongqing, China
- Guangyuan Liu
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
5
Liu Y, Zhao X, Tang Q, Li W, Liu G. Dynamic functional network connectivity associated with musical emotions evoked by different tempi. Brain Connect 2021; 12:584-597. [PMID: 34309409] [DOI: 10.1089/brain.2021.0069]
Abstract
Background: Music tempo has strong clinical maneuverability and a positive emotional effect in music therapy; it can directly evoke multiple emotions and dynamic neural changes across the whole brain. However, the precise relationship between music tempo and its emotional effects remains unclear. The present study aimed to investigate the dynamic functional network connectivity (dFNC) associated with emotions elicited by music at different tempi. Methods: We obtained emotion ratings of fast- (155-170 bpm), middle- (90 bpm), and slow-tempo (50-60 bpm) piano music from 40 participants both during and after functional magnetic resonance imaging (fMRI). Group independent component analysis (ICA), sliding-time-window correlations, and k-means clustering were used to assess the dFNC of the fMRI data. Paired t-tests were conducted to compare differences between neural networks. Results: (1) Fast music was associated with higher ratings of emotional valence and arousal, accompanied by increasing dFNC between the somatomotor (SM) and cingulo-opercular (CO) networks and decreasing dFNC between the fronto-parietal and SM networks. (2) Even with stronger activation in the auditory (AUD) network, slow music was associated with weaker emotion than fast music, with decreasing FNC across the brain and the participation of the default mode (DM) network. (3) Middle-tempo music elicited moderate emotional activation with the most stable dFNC in the whole brain. Conclusion: Faster music increases neural activity in the SM and CO regions, increasing the intensity of the emotional experience. In contrast, slower music was associated with decreasing engagement of the AUD network and stable engagement of the DM network, resulting in a weak emotional experience. These findings suggest that the time-varying aspects of functional connectivity can help uncover the dynamic neural substrates of tempo-evoked emotion while listening to music.
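The dFNC pipeline described here and in the preceding entry (ICA component time courses, then sliding-window correlations, then k-means clustering into connectivity states) can be sketched in a few lines. The window length, step, and number of states below are placeholders, not the values used in the papers:

```python
import numpy as np
from sklearn.cluster import KMeans

def dfnc_states(ts: np.ndarray, win_len: int = 30, step: int = 1,
                k: int = 4, seed: int = 0):
    """Sliding-window dynamic functional network connectivity.

    ts: (n_timepoints, n_components) component time courses,
        e.g. from group ICA of fMRI data.
    Returns the vectorised windowed connectivity matrices and a
    k-means state label for each window.
    """
    n_t, n_c = ts.shape
    iu = np.triu_indices(n_c, k=1)           # unique component pairs
    feats = []
    for start in range(0, n_t - win_len + 1, step):
        c = np.corrcoef(ts[start:start + win_len].T)  # window FNC matrix
        feats.append(c[iu])                           # keep upper triangle
    feats = np.asarray(feats)
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(feats)
    return feats, labels
```

Each row of `feats` is one window's connectivity pattern; the cluster labels give the sequence of recurring connectivity "states", whose occupancy and transitions can then be compared between conditions (e.g. fast vs. slow music).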
Affiliation(s)
- Ying Liu
- School of Mathematics and Statistics, Southwest University, Chongqing, China; School of Music, Southwest University, Chongqing, China
- Xingcong Zhao
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Qingting Tang
- Faculty of Psychology, Southwest University, Chongqing, China
- Wenhui Li
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Guangyuan Liu
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
6
Neural and physiological data from participants listening to affective music. Sci Data 2020; 7:177. [PMID: 32541806] [PMCID: PMC7295758] [DOI: 10.1038/s41597-020-0507-6] [Received: 09/26/2019] [Accepted: 05/07/2020]
Abstract
Music provides a means of communicating affective meaning. However, the neurological mechanisms by which music induces affect are not fully understood. Our project sought to investigate this through a series of experiments into how humans react to affective musical stimuli and how physiological and neurological signals recorded from those participants change in accordance with self-reported changes in affect. In this paper, the datasets recorded over the course of this project are presented, including details of the musical stimuli, participant reports of their felt changes in affective states as they listened to the music, and concomitant recordings of physiological and neurological activity. We also include non-identifying metadata on our participant populations for purposes of further exploratory analysis. These data provide a large and valuable novel resource for researchers investigating emotion, music, and how they affect our neural and physiological activity.
7
Meng Q, Jiang J, Liu F, Xu X. Effects of the Musical Sound Environment on Communicating Emotion. Int J Environ Res Public Health 2020; 17:E2499. [PMID: 32268523] [PMCID: PMC7177471] [DOI: 10.3390/ijerph17072499] [Received: 01/16/2020] [Revised: 03/30/2020] [Accepted: 04/03/2020]
Abstract
The acoustic environment is one of the factors influencing emotion; however, existing research has mainly focused on the effects of noise on emotion and on music therapy, while the acoustic and psychological effects of music on interactive behaviour have been neglected. Therefore, this study aimed to investigate the effects of music on communicating emotion, including the evaluation of music and the d-values of pleasure, arousal, and dominance (PAD), in terms of sound pressure level (SPL), musical emotion, and tempo. Based on acoustic environment measurements and a questionnaire survey with 52 participants in a normal classroom in Harbin, China, the following results were found. First, SPL was significantly correlated with the musical evaluation of communication: average scores of musical evaluation decreased sharply from 1.31 to -2.13 when the SPL rose from 50 dBA to 60 dBA, while they fluctuated between 0.88 and 1.31 from 40 dBA to 50 dBA. Arousal increased with increases in musical SPL in the negative evaluation group. Second, musical emotions had significant effects on the musical evaluation of communication, among which the effect of joyful-sounding music was the strongest; in general, joyful- and stirring-sounding music could enhance pleasure and arousal efficiently. Third, musical tempo had a significant effect on musical evaluation and communicating emotion; faster music could enhance arousal and pleasure efficiently. Finally, in terms of social characteristics, familiarity, gender combination, and the number of participants affected communicating emotion. For instance, in the positive evaluation group, dominance was much higher in the single-gender groups. This study shows that music factors such as SPL, musical emotion, and tempo can be used to enhance communicating emotion.
Affiliation(s)
- Qi Meng
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, 66 West Dazhi Street, Nan Gang District, Harbin 150001, China
- Jiani Jiang
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, 66 West Dazhi Street, Nan Gang District, Harbin 150001, China
- Fangfang Liu
- Key Laboratory of Cold Region Urban and Rural Human Settlement Environment Science and Technology, Ministry of Industry and Information Technology, School of Architecture, Harbin Institute of Technology, 66 West Dazhi Street, Nan Gang District, Harbin 150001, China
- Xiaoduo Xu
- The Bartlett School of Architecture, University College London (UCL), London WC1H 0QB, UK
8
Daly I, Williams D, Hwang F, Kirke A, Miranda ER, Nasuto SJ. Electroencephalography reflects the activity of sub-cortical brain regions during approach-withdrawal behaviour while listening to music. Sci Rep 2019; 9:9415. [PMID: 31263113] [PMCID: PMC6603018] [DOI: 10.1038/s41598-019-45105-2] [Received: 07/24/2018] [Accepted: 06/03/2019]
Abstract
The ability of music to evoke activity changes in the core brain structures that underlie the experience of emotion suggests that it has the potential to be used in therapies for emotion disorders. A large volume of research has identified a network of sub-cortical brain regions underlying music-induced emotions. Additionally, separate evidence from electroencephalography (EEG) studies suggests that prefrontal asymmetry in the EEG reflects the approach-withdrawal response to music-induced emotion. However, fMRI and EEG measure quite different brain processes, and we do not have a detailed understanding of the functional relationships between them in relation to music-induced emotion. We employ a joint EEG-fMRI paradigm to explore how EEG-based neural correlates of the approach-withdrawal response to music reflect activity changes in the sub-cortical emotional response network. The neural correlates examined are asymmetry in the prefrontal EEG, and the degree of disorder in that asymmetry over time, as measured by entropy. Participants' EEG and fMRI were recorded simultaneously while the participants listened to music that had been specifically generated to target the elicitation of a wide range of affective states. While listening to this music, participants also continuously reported their felt affective states. Here we report on co-variations in the dynamics of these self-reports, the EEG, and the sub-cortical brain activity. We find that a set of sub-cortical brain regions in the emotional response network exhibits activity that significantly relates to prefrontal EEG asymmetry. Specifically, EEG in the prefrontal cortex reflects not only cortical activity, but also changes in activity in the amygdala, posterior temporal cortex, and cerebellum.
We also find that, while the magnitude of the asymmetry reflects activity in parts of the limbic and paralimbic systems, the entropy of that asymmetry reflects activity in parts of the autonomic response network, such as the auditory cortex. This suggests that the asymmetry magnitude reflects affective responses to music, while the asymmetry entropy reflects autonomic responses. Thus, we demonstrate that it is possible to infer activity in the limbic and paralimbic systems from prefrontal EEG asymmetry. These results show how EEG can be used to measure and monitor changes in the limbic and paralimbic systems. Specifically, they suggest that EEG asymmetry acts as an indicator of sub-cortical changes in activity induced by music. This shows that EEG may be used as a measure of the effectiveness of music therapy at evoking changes in activity in the sub-cortical emotion response network. This is also the first time that the activity of sub-cortical regions, normally considered "invisible" to EEG, has been shown to be characterisable directly from EEG dynamics measured during music listening.
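The two EEG correlates examined here, prefrontal asymmetry magnitude and the entropy of that asymmetry over time, can be illustrated with a simple sketch. The abstract does not specify the frequency band, window length, or entropy estimator, so this assumes the conventional alpha band (8-13 Hz), fixed two-second windows, and a histogram-based Shannon entropy:

```python
import numpy as np

def alpha_asymmetry(left: np.ndarray, right: np.ndarray, sr: int,
                    band=(8.0, 13.0), win_s: float = 2.0) -> np.ndarray:
    """Windowed asymmetry index ln(right alpha power) - ln(left alpha power)."""
    win = int(win_s * sr)
    freqs = np.fft.rfftfreq(win, 1.0 / sr)
    mask = (freqs >= band[0]) & (freqs <= band[1])   # alpha-band bins
    out = []
    for s in range(0, min(len(left), len(right)) - win + 1, win):
        pl = np.sum(np.abs(np.fft.rfft(left[s:s + win]))[mask] ** 2)
        pr = np.sum(np.abs(np.fft.rfft(right[s:s + win]))[mask] ** 2)
        out.append(np.log(pr) - np.log(pl))
    return np.asarray(out)

def shannon_entropy(x: np.ndarray, bins: int = 16) -> float:
    """Shannon entropy (bits) of the asymmetry time series."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())
```

The asymmetry series gives the magnitude measure; its entropy quantifies how disordered the asymmetry is over time, the second correlate the study relates to sub-cortical activity.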
Affiliation(s)
- Ian Daly
- Brain-Computer Interfacing and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Colchester, CO4 3SQ, UK
- Duncan Williams
- Digital Creativity Labs, Department of Computer Science, University of York, Heslington, YO10 5RG, UK
- Faustina Hwang
- Brain Embodiment Laboratory, Biomedical Sciences and Biomedical Engineering Division, School of Biological Sciences, University of Reading, Reading, RG6 6AY, UK
- Alexis Kirke
- Interdisciplinary Centre for Computer Music Research, University of Plymouth, Plymouth, PL4 8AA, UK
- Eduardo R Miranda
- Interdisciplinary Centre for Computer Music Research, University of Plymouth, Plymouth, PL4 8AA, UK
- Slawomir J Nasuto
- Brain Embodiment Laboratory, Biomedical Sciences and Biomedical Engineering Division, School of Biological Sciences, University of Reading, Reading, RG6 6AY, UK
9
Liu Y, Liu G, Wei D, Li Q, Yuan G, Wu S, Wang G, Zhao X. Effects of Musical Tempo on Musicians' and Non-musicians' Emotional Experience When Listening to Music. Front Psychol 2018; 9:2118. [PMID: 30483173] [PMCID: PMC6243583] [DOI: 10.3389/fpsyg.2018.02118] [Received: 05/25/2018] [Accepted: 10/15/2018]
Abstract
Tempo is an important musical element that affects humans' emotional processes when listening to music. However, it remains unclear how tempo and musical training affect individuals' emotional experience of music. To explore the neural underpinnings of the effects of tempo on music-evoked emotion, musical excerpts with fast, medium, and slow tempi were collected, and differences in emotional responses were compared between musicians and non-musicians using functional magnetic resonance imaging (fMRI) of neural activity. Behaviorally, musicians perceived higher valence in fast music than did non-musicians. The main effects of group (musicians vs. non-musicians) and tempo were significant, and a near-significant interaction between group and tempo was found. In the arousal dimension, the mean score of medium-tempo music was the highest among the three kinds; in the valence dimension, the mean scores decreased in order from fast to medium to slow music. Functional analyses revealed that neural activation was stronger in musicians than in non-musicians in the left inferior parietal lobe (IPL). A comparison of tempi showed stronger activation for fast music than slow music in the bilateral superior temporal gyrus (STG), providing corresponding neural evidence for the highest valence reported by participants for fast music. Medium music showed stronger activation than slow music in the right Heschl's gyrus (HG), right middle temporal gyrus (MTG), right posterior cingulate cortex (PCC), right precuneus, right IPL, and left STG. Importantly, this study confirmed and explained the connection between music tempo and emotional experience, and their interaction with individuals' musical training.
Affiliation(s)
- Ying Liu
- Faculty of Psychology, Southwest University, Chongqing, China; Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- Guangyuan Liu
- Faculty of Psychology, Southwest University, Chongqing, China; Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China; School of Electronic and Information Engineering, Southwest University, Chongqing, China; Chongqing Key Laboratory of Non-linear Circuit and Intelligent Information Processing, Southwest University, Chongqing, China
- Dongtao Wei
- Faculty of Psychology, Southwest University, Chongqing, China; Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China
- Qiang Li
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China; School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Guangjie Yuan
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China; School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Shifu Wu
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China; School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Gaoyuan Wang
- School of Music, Southwest University, Chongqing, China
- Xingcong Zhao
- Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China; Chongqing Brain Science Collaborative Innovation Center, Southwest University, Chongqing, China; School of Electronic and Information Engineering, Southwest University, Chongqing, China
10
Adamos DA, Laskaris NA, Micheloyannis S. Harnessing functional segregation across brain rhythms as a means to detect EEG oscillatory multiplexing during music listening. J Neural Eng 2018; 15:036012. [DOI: 10.1088/1741-2552/aaac36]