1.
Anytime collaborative brain-computer interfaces for enhancing perceptual group decision-making. Sci Rep 2021;11:17008. PMID: 34417494; PMCID: PMC8379268; DOI: 10.1038/s41598-021-96434-0.
Abstract
In this paper we present, and test in two realistic environments, collaborative brain-computer interfaces (cBCIs) that can significantly increase both the speed and the accuracy of perceptual group decision-making. The key distinguishing features of this work are: (1) our cBCIs combine behavioural, physiological and neural data so as to provide a group decision at any time after the quickest team member casts their vote, with the quality of a cBCI-assisted decision improving monotonically the longer the group decision can wait; (2) we apply our cBCIs to two realistic scenarios of military relevance (patrolling a dark corridor, and manning an outpost at night where users need to identify any unidentified characters that appear), in which decisions are based on information conveyed through video feeds; and (3) our cBCIs exploit event-related potentials (ERPs) elicited in brain activity by the appearance of potential threats, but, uniquely, the appearance time is estimated automatically by the system rather than being unrealistically provided to it. As a result of these elements, in both test environments, groups assisted by our cBCIs make decisions that are both more accurate and faster than those obtained by integrating individual decisions in more traditional ways.
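The "anytime" decision rule described in this abstract can be illustrated with a minimal sketch. This is not the authors' implementation (which fuses behavioural, physiological and neural data); it only shows the core idea under a simplifying assumption: each member's vote is weighted by an estimated confidence, and a group decision can be read out after any subset of members has responded, typically improving as more weighted votes accumulate.

```python
# Illustrative sketch of an anytime, confidence-weighted group decision.
# Votes are +1 (threat) or -1 (no threat); confidences are assumed to lie
# in [0, 1] and stand in for the paper's behavioural/neural estimates.

def anytime_group_decision(votes):
    """votes: list of (vote, confidence) pairs for the members who have
    responded so far. Returns the current group decision, +1 or -1
    (ties broken toward +1)."""
    if not votes:
        raise ValueError("at least one member must have voted")
    score = sum(v * c for v, c in votes)
    return 1 if score >= 0 else -1

# A decision is available as soon as the quickest member votes, but it can
# flip once slower, more confident members weigh in.
early = anytime_group_decision([(-1, 0.3)])
late = anytime_group_decision([(-1, 0.3), (+1, 0.8), (+1, 0.7)])
```

Here the early single-vote decision is "no threat", while the fuller weighted tally reverses it, mirroring how waiting longer improves decision quality.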
2.
Fernandez-Vargas J, Tremmel C, Valeriani D, Bhattacharyya S, Cinel C, Citi L, Poli R. Subject- and task-independent neural correlates and prediction of decision confidence in perceptual decision making. J Neural Eng 2021;18. PMID: 33780913; DOI: 10.1088/1741-2552/abf2e4.
Abstract
Objective. In many real-world decision tasks, the information available to the decision maker is incomplete. To account for this uncertainty, we associate a degree of confidence with every decision, representing the likelihood of that decision being correct. In this study, we analyse electroencephalography (EEG) data from 68 participants undertaking eight different perceptual decision-making experiments. Our goals are to investigate (1) whether subject- and task-independent neural correlates of decision confidence exist, and (2) to what degree it is possible to build brain-computer interfaces that can estimate confidence on a trial-by-trial basis. The experiments cover a wide range of perceptual tasks, which allowed us to separate the task-related decision-making features from the task-independent ones.
Approach. Our systems train artificial neural networks to predict the confidence in each decision from EEG data and response times. We compare the decoding performance of three training approaches: (1) single-subject, where training and testing data were acquired from the same person; (2) multi-subject, where all the data pertained to the same task but the training and testing data came from different users; and (3) multi-task, where the training and testing data came from different tasks and subjects. Finally, we validated our multi-task approach using data from two additional experiments in which confidence was not reported.
Main results. We found significant differences in the EEG data between confidence levels in both stimulus-locked and response-locked epochs. All our approaches predicted confidence between 15% and 35% better than the corresponding reference baselines.
Significance. Our results suggest that confidence in perceptual decision-making tasks can be reconstructed from neural signals even when using transfer-learning approaches. These confidence estimates are based on the decision-making process rather than just the confidence-reporting process.
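The trial-by-trial confidence decoding described here can be sketched in miniature. The paper trains artificial neural networks on full EEG epochs plus response times; the toy below is only an assumed, simplified analogue: a tiny logistic model trained on two hypothetical per-trial features (a response time and a single summary EEG amplitude) to classify high- versus low-confidence trials, using synthetic data.

```python
# Illustrative sketch, NOT the paper's network: logistic regression by
# per-sample gradient descent, predicting binary confidence (1 = high)
# from two made-up features: response time and a summary "ERP" amplitude.
import math
import random

def train_confidence_model(trials, labels, epochs=500, lr=0.5):
    """trials: list of (response_time, eeg_feature); labels: 1 = high conf."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (rt, amp), y in zip(trials, labels):
            p = 1 / (1 + math.exp(-(w[0] * rt + w[1] * amp + b)))
            g = p - y                      # gradient of the log loss
            w[0] -= lr * g * rt
            w[1] -= lr * g * amp
            b -= lr * g
    return w, b

def predict(model, trial):
    w, b = model
    p = 1 / (1 + math.exp(-(w[0] * trial[0] + w[1] * trial[1] + b)))
    return 1 if p >= 0.5 else 0

# Synthetic pattern: fast responses with large amplitudes are assumed to be
# high-confidence trials; slow, low-amplitude trials are low-confidence.
random.seed(0)
trials = [(random.uniform(0.2, 0.6), random.uniform(0.5, 1.0)) for _ in range(20)] \
       + [(random.uniform(0.8, 1.5), random.uniform(0.0, 0.4)) for _ in range(20)]
labels = [1] * 20 + [0] * 20
model = train_confidence_model(trials, labels)
accuracy = sum(predict(model, t) == y for t, y in zip(trials, labels)) / len(trials)
```

The single-subject, multi-subject and multi-task regimes in the abstract differ only in which trials populate the training versus testing sets; the decoding step itself looks like the above in all three.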
Affiliation(s)
- Jacobo Fernandez-Vargas
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Essex, United Kingdom
- Christoph Tremmel
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Essex, United Kingdom
- Davide Valeriani
- Department of Otolaryngology - Head and Neck Surgery, Massachusetts Eye and Ear, Boston, MA, United States of America; Department of Otolaryngology - Head and Neck Surgery, Harvard Medical School, Boston, MA, United States of America
- Saugat Bhattacharyya
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Essex, United Kingdom; School of Computing, Engineering & Intelligent Systems, Ulster University, Londonderry, United Kingdom
- Caterina Cinel
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Essex, United Kingdom
- Luca Citi
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Essex, United Kingdom
- Riccardo Poli
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Essex, United Kingdom
3.
Zheng L, Sun S, Zhao H, Pei W, Chen H, Gao X, Zhang L, Wang Y. A Cross-Session Dataset for Collaborative Brain-Computer Interfaces Based on Rapid Serial Visual Presentation. Front Neurosci 2020;14:579469. PMID: 33192265; PMCID: PMC7642747; DOI: 10.3389/fnins.2020.579469.
Abstract
Brain-computer interfaces (BCIs) based on rapid serial visual presentation (RSVP) have been widely used to categorize target and non-target images. However, detecting single-trial event-related potentials (ERPs) in electroencephalography (EEG) signals remains challenging. Moreover, the variability of EEG signals over time can make calibration difficult in long-term system use. Recently, collaborative BCIs have been proposed to improve overall BCI performance by fusing brain activity acquired from multiple subjects. For both individual and collaborative BCIs, feature-extraction and classification algorithms that transfer across sessions can significantly simplify system calibration. Although open datasets are highly valuable for developing such algorithms, datasets for collaborative RSVP-based BCIs are still lacking. This paper presents a cross-session EEG dataset of a collaborative RSVP-based BCI system from 14 subjects, divided into seven groups. In the collaborative BCI experiments, two subjects performed the same target-image detection task synchronously. All subjects completed the same experiment twice, with an average interval of ∼23 days. The evaluation results indicate that adequate signal-processing algorithms can greatly enhance cross-session BCI performance in both individual and collaborative conditions. In addition, compared with individual BCIs, collaborative methods that fuse information from multiple subjects achieve significantly better BCI performance. This dataset can be used to develop more efficient algorithms that enhance the performance and practicality of collaborative RSVP-based BCI systems.
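The multi-subject fusion this abstract evaluates can be sketched at the score level. This is an assumed, illustrative rule rather than the dataset's actual evaluation code: each subject's single-trial ERP classifier produces a target probability, and averaging the two subjects' scores before thresholding tends to cancel uncorrelated single-subject noise.

```python
# Illustrative score-level fusion for a two-person collaborative RSVP BCI.
# score_a and score_b stand in for each subject's single-trial target
# probability; the 0.5 threshold is an assumed default.

def fuse_scores(score_a, score_b, threshold=0.5):
    """Average two subjects' target probabilities and threshold the mean."""
    return (score_a + score_b) / 2 >= threshold

# Pooling evidence can recover a target that one subject alone would miss:
single_b = 0.42 >= 0.5            # subject B alone misses the target
fused = fuse_scores(0.66, 0.42)   # mean score 0.54 crosses the threshold
```

Other fusion schemes (e.g. feature concatenation or weighting subjects by their calibration accuracy) follow the same pattern of combining evidence before the final decision.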
Affiliation(s)
- Li Zheng
- State Key Laboratory on Integrated Optoelectronics, Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China; School of Future Technology, University of Chinese Academy of Sciences, Beijing, China
- Sen Sun
- Department of Control Engineering, School of Information Science and Engineering, East China University of Science and Technology, Shanghai, China
- Hongze Zhao
- State Key Laboratory on Integrated Optoelectronics, Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China
- Weihua Pei
- State Key Laboratory on Integrated Optoelectronics, Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China; School of Future Technology, University of Chinese Academy of Sciences, Beijing, China
- Hongda Chen
- State Key Laboratory on Integrated Optoelectronics, Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China
- Xiaorong Gao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Lijian Zhang
- Beijing Machine and Equipment Institute, Beijing, China
- Yijun Wang
- State Key Laboratory on Integrated Optoelectronics, Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China; School of Future Technology, University of Chinese Academy of Sciences, Beijing, China