1
Li P, Lin R, Huang W, Tang H, Liu K, Qiu N, Xu P, Tian Y, Li C. Crucial rhythms and subnetworks for emotion processing extracted by an interpretable deep learning framework from EEG networks. Cereb Cortex 2024; 34:bhae477. PMID: 39707986. DOI: 10.1093/cercor/bhae477.
Abstract
Electroencephalogram (EEG) brain networks describe the driving and synchronous relationships among multiple brain regions and can be used to identify different emotional states. However, methods for extracting interpretable structural features from brain networks are still lacking. In the current study, a novel deep learning structure comprising both an attention mechanism and a domain-adversarial strategy is proposed to extract discriminative and interpretable features from brain networks. Specifically, the attention mechanism enhances the contribution of crucial rhythms and subnetworks to emotion recognition, whereas the domain-adversarial module improves the generalization performance of the proposed model on cross-subject tasks. We validated the effectiveness of the proposed method on subject-independent emotion recognition tasks with the SJTU Emotion EEG Dataset (SEED) and with EEGs recorded in our laboratory. The experimental results showed that the proposed method effectively improves the classification accuracy of different emotions compared with commonly used methods such as domain adversarial neural networks. On the basis of the extracted network features, we also revealed crucial rhythms and subnetwork structures for emotion processing that are consistent with those found in previous studies. Our proposed method not only improves the classification performance of brain networks but also provides a novel tool for revealing emotion-processing mechanisms.
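The attention mechanism described here reweights rhythm-band network features before classification. A minimal sketch of band-level soft attention, assuming a simple learned scoring vector `w` (a hypothetical parameterization for illustration; the paper's exact attention architecture is not reproduced here):

```python
import numpy as np

def rhythm_attention(band_features, w):
    """Soft attention over EEG rhythm bands (e.g. delta, theta, alpha, beta, gamma).

    band_features: (n_bands, d) array, one network-feature vector per rhythm.
    w: (d,) hypothetical learned scoring vector.
    Returns the softmax attention weights and the attention-weighted fusion.
    """
    scores = band_features @ w                 # one relevance score per band
    alpha = np.exp(scores - scores.max())      # numerically stable softmax
    alpha /= alpha.sum()
    fused = alpha @ band_features              # weighted sum of band features
    return alpha, fused
```

After training, the learned weights `alpha` can be inspected directly, which is the sense in which such an attention layer makes the crucial rhythms interpretable.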
Affiliation(s)
- Peiyang Li
- School of Life Health Information Science and Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Chongqing Institute for Brain and Intelligence, Guangyang Bay Laboratory, Chongqing 400074, China
- Institute for Advanced Sciences, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Ruiting Lin
- School of Life Health Information Science and Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Chongqing Institute for Brain and Intelligence, Guangyang Bay Laboratory, Chongqing 400074, China
- Institute for Advanced Sciences, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Weijie Huang
- School of Life Health Information Science and Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Chongqing Institute for Brain and Intelligence, Guangyang Bay Laboratory, Chongqing 400074, China
- Institute for Advanced Sciences, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Hao Tang
- School of Life Health Information Science and Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Chongqing Institute for Brain and Intelligence, Guangyang Bay Laboratory, Chongqing 400074, China
- Institute for Advanced Sciences, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Ke Liu
- Chongqing Key Laboratory of Computational Intelligence, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Nan Qiu
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation and School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu 611731, China
- The Fourth People's Hospital of Chengdu, Chengdu 610031, China
- Peng Xu
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation and School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu 611731, China
- Yin Tian
- School of Life Health Information Science and Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Chongqing Institute for Brain and Intelligence, Guangyang Bay Laboratory, Chongqing 400074, China
- Institute for Advanced Sciences, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Cunbo Li
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation and School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu 611731, China
2
Xie S, Lei L, Sun J, Xu J. [Research on emotion recognition method based on IWOA-ELM algorithm for electroencephalogram]. Sheng Wu Yi Xue Gong Cheng Xue Za Zhi 2024; 41:1-8. PMID: 38403598. PMCID: PMC10894732. DOI: 10.7507/1001-5515.202303010.
Abstract
Emotion is a crucial physiological attribute of humans, and emotion recognition technology can significantly assist individuals in self-awareness. To address the challenge of large differences in electroencephalogram (EEG) signals among subjects, we introduce a novel mechanism into the traditional whale optimization algorithm (WOA) to accelerate its optimization and convergence. The improved whale optimization algorithm (IWOA) was then applied to search for the optimal training configuration of an extreme learning machine (ELM) model, encompassing the best feature set, training parameters, and EEG channels. By testing 24 common EEG emotion features, we found that the optimal features exhibited a certain level of subject specificity while also showing some commonality across subjects. The proposed method achieved an average recognition accuracy of 92.19% in EEG emotion recognition, significantly reducing the manual tuning workload and offering higher accuracy with shorter training times than the control method. It outperformed existing methods, providing a novel perspective for decoding EEG signals and contributing to the field of emotion research from EEG signals.
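The ELM at the core of this pipeline is what makes a wrapper optimizer like IWOA feasible: hidden-layer weights are drawn at random and only the output weights are solved in closed form, so each candidate feature/channel subset is cheap to evaluate. A minimal sketch (hidden size and activation are illustrative choices, not the paper's settings):

```python
import numpy as np

def elm_train(X, Y, n_hidden, rng):
    """Train an extreme learning machine: random hidden layer, closed-form output."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                  # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

In an IWOA-ELM setup, the optimizer would repeatedly call `elm_train` on candidate feature sets, channels, and hidden sizes, scoring each candidate by validation accuracy.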
Affiliation(s)
- Songyun Xie
- School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710129, P. R. China
- Lingjun Lei
- Medical Research Institute, Northwestern Polytechnical University, Xi'an 710129, P. R. China
- Jiang Sun
- School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710129, P. R. China
- Jian Xu
- School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710129, P. R. China
3
Gao X, Huang W, Liu Y, Zhang Y, Zhang J, Li C, Chelangat Bore J, Wang Z, Si Y, Tian Y, Li P. A novel robust Student’s t-based Granger causality for EEG based brain network analysis. Biomed Signal Process Control 2023. DOI: 10.1016/j.bspc.2022.104321.
4
Zhang J, Zhang X, Chen G, Huang L, Sun Y. EEG emotion recognition based on cross-frequency granger causality feature extraction and fusion in the left and right hemispheres. Front Neurosci 2022; 16:974673. PMID: 36161187. PMCID: PMC9491730. DOI: 10.3389/fnins.2022.974673.
Abstract
EEG emotion recognition based on Granger causality (GC) brain networks has mainly focused on EEG signals in the same frequency band; however, causal relationships also exist between EEG signals in different frequency bands. Considering the functional asymmetry of the left and right hemispheres in emotional responses, this paper proposes an EEG emotion recognition scheme based on cross-frequency GC feature extraction and fusion in the left and right hemispheres. First, we calculate the GC relationships of EEG signals by frequency band and hemisphere, focusing on the causality of cross-frequency EEG signals in the left and right hemispheres. Then, to remove redundant connections from the GC brain network, an adaptive two-stage decorrelation feature extraction scheme is proposed that maintains the best emotion recognition performance. Finally, a multi-GC feature fusion scheme is designed to balance the recognition accuracy and feature count of each GC feature, comprehensively considering both recognition accuracy and computational complexity. Experimental results on the DEAP emotion dataset show that the proposed scheme achieves an average accuracy of 84.91% for four-class classification, improving accuracy by up to 8.43% compared with traditional same-frequency-band GC features.
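The pairwise GC computation underlying such features compares a restricted autoregressive model (the target signal's own past) against a full model that also includes the source's past; a drop in residual variance indicates causal influence. A minimal same-frequency sketch using plain least-squares AR fits (an illustration, not the paper's pipeline; the cross-frequency variant would first band-pass the two signals into different rhythms):

```python
import numpy as np

def lag_matrix(sig, order):
    """Columns are sig[t-1], ..., sig[t-order] for t = order .. len(sig)-1."""
    n = len(sig)
    return np.column_stack([sig[order - k: n - k] for k in range(1, order + 1)])

def granger_causality(source, target, order=2):
    """GC index ln(var_restricted / var_full); values > 0 suggest source drives target."""
    y = target[order:]
    restricted = lag_matrix(target, order)                     # target's own past
    full = np.hstack([restricted, lag_matrix(source, order)])  # plus the source's past
    res_r = y - restricted @ np.linalg.lstsq(restricted, y, rcond=None)[0]
    res_f = y - full @ np.linalg.lstsq(full, y, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())
```

Computing this index for every ordered channel pair yields the directed adjacency matrix from which the paper's hemispheric and cross-frequency features are then extracted.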