1
Hoogerbrugge AJ, Strauch C, Oláh ZA, Dalmaijer ES, Nijboer TCW, Van der Stigchel S. Seeing the Forrest through the trees: Oculomotor metrics are linked to heart rate. PLoS One 2022; 17:e0272349. [PMID: 35917377 PMCID: PMC9345484 DOI: 10.1371/journal.pone.0272349]
Abstract
Fluctuations in a person's arousal accompany mental states such as drowsiness, mental effort, or motivation, and have a profound effect on task performance. Here, we investigated the link between two central measures affected by arousal levels: heart rate and eye movements. In contrast to heart rate, eye movements can be inferred remotely and unobtrusively, and there is evidence that oculomotor metrics (i.e., fixations and saccades) are indicators of aspects of arousal that go hand in hand with changes in mental effort, motivation, or task type. Gaze data and heart rate of 14 participants during film viewing were used in Random Forest models, the results of which show that blink rate and duration, together with the movement aspects of oculomotor metrics (i.e., velocities and amplitudes), are linked to heart rate more strongly than the number or duration of fixations and saccades. We discuss that eye movements are not only linked to heart rate, but that both may be similarly influenced by a common underlying arousal system. These findings provide new pathways for the remote measurement of arousal and its link to psychophysiological features.
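The kind of analysis this abstract implies can be sketched briefly: regress heart rate on per-window oculomotor features with a Random Forest and inspect feature importances. The sketch below assumes scikit-learn and uses synthetic data with illustrative feature names (blink rate, saccade velocity, etc.); it is not the authors' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows = 500  # hypothetical number of analysis windows across participants

# Illustrative per-window oculomotor features (placeholders, not real data).
feature_names = [
    "blink_rate", "blink_duration",
    "saccade_velocity", "saccade_amplitude",
    "fixation_count", "fixation_duration",
]
X = rng.normal(size=(n_windows, len(feature_names)))
# Synthetic heart rate (bpm) with a planted dependence on blink rate.
y = 70 + 5 * X[:, 0] + rng.normal(scale=2.0, size=n_windows)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print("mean CV R^2:", cross_val_score(model, X, y, cv=5).mean())

# Rank features by importance; in a setup like the study's, this ranking is
# what would single out blink and movement metrics over fixation counts.
model.fit(X, y)
for name, imp in sorted(zip(feature_names, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:>18}: {imp:.3f}")
```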
Affiliation(s)
- Alex J. Hoogerbrugge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Christoph Strauch
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Zoril A. Oláh
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Edwin S. Dalmaijer
- School of Psychological Science, University of Bristol, Bristol, United Kingdom
- Tanja C. W. Nijboer
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Center of Excellence for Rehabilitation Medicine, UMC Utrecht Brain Center, University Medical Center Utrecht, De Hoogstraat Rehabilitation, Utrecht, Netherlands
- Department of Rehabilitation, Physical Therapy Science & Sports, UMC Utrecht Brain Center, University Medical Center Utrecht, Utrecht, Netherlands
- Stefan Van der Stigchel
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
2
Brouwer VHEW, Stuit S, Hoogerbrugge A, Ten Brink AF, Gosselt IK, Van der Stigchel S, Nijboer TCW. Applying machine learning to dissociate between stroke patients and healthy controls using eye movement features obtained from a virtual reality task. Heliyon 2022; 8:e09207. [PMID: 35399377 PMCID: PMC8991384 DOI: 10.1016/j.heliyon.2022.e09207]
Abstract
Conventional neuropsychological tests do not represent the complex and dynamic situations encountered in daily life. Immersive virtual reality simulations can be used to simulate dynamic and interactive situations in a controlled setting. Adding eye tracking to such simulations may provide highly detailed outcome measures, and has great potential for neuropsychological assessment. Here, participants (83 stroke patients and 103 healthy controls) were instructed to find either 3 or 7 items from a shopping list in a virtual supermarket environment while their eye movements were recorded. Using Logistic Regression and Support Vector Machine models, we aimed to predict each participant's task and whether they belonged to the stroke or the control group. With a limited number of eye movement features, our models achieved an average Area Under the Curve (AUC) of .76 in predicting whether each participant was assigned a short or long shopping list (3 or 7 items). Classifying participants as either stroke patients or healthy controls led to an AUC of .64. In both classification tasks, the frequency with which aisles were revisited was the most dissociating feature. As such, eye movement data obtained from a virtual reality simulation contain a rich set of signatures for detecting cognitive deficits, opening the door to potential clinical applications.
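As a rough illustration of this classification setup, the sketch below trains logistic regression and an SVM on a handful of eye movement features and scores both with cross-validated AUC. The data, group proportions, and the four features are synthetic placeholders, not the study's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 186  # nominal, mirroring 83 patients + 103 controls; data are synthetic
X = rng.normal(size=(n, 4))                 # e.g. aisle revisits, fixation stats
y = (rng.random(n) < 83 / 186).astype(int)  # 1 = stroke patient (synthetic label)

for name, clf in [
    ("logistic regression", LogisticRegression(max_iter=1000)),
    ("SVM", SVC(kernel="rbf", probability=True)),
]:
    # Standardize features so both classifiers see comparable scales.
    pipe = make_pipeline(StandardScaler(), clf)
    proba = cross_val_predict(pipe, X, y, cv=5, method="predict_proba")[:, 1]
    print(f"{name}: AUC = {roc_auc_score(y, proba):.2f}")
```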
Affiliation(s)
- Veerle H E W Brouwer
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, Netherlands
- Sjoerd Stuit
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, Netherlands
- Alex Hoogerbrugge
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, Netherlands
- Antonia F Ten Brink
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, Netherlands
- Isabel K Gosselt
- Center of Excellence for Rehabilitation Medicine, UMC Utrecht Brain Center, University Medical Center Utrecht, De Hoogstraat Rehabilitation, Heidelberglaan 100, 3584 CX, Utrecht, Netherlands
- Stefan Van der Stigchel
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, Netherlands
- Tanja C W Nijboer
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, Netherlands
- Center of Excellence for Rehabilitation Medicine, UMC Utrecht Brain Center, University Medical Center Utrecht, De Hoogstraat Rehabilitation, Heidelberglaan 100, 3584 CX, Utrecht, Netherlands
- Department of Rehabilitation, Physical Therapy Science & Sports, UMC Utrecht Brain Center, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX, Utrecht, Netherlands
3
Lim JZ, Mountstephens J, Teo J. Eye-Tracking Feature Extraction for Biometric Machine Learning. Front Neurorobot 2022; 15:796895. [PMID: 35177973 PMCID: PMC8843826 DOI: 10.3389/fnbot.2021.796895]
Abstract
CONTEXT Eye tracking is a technology for measuring and recording an individual's eye movements and eye positions, typically with an eye tracker. Eye-tracking data offer unprecedented insights into human actions and environments, digitize how people communicate with computers, and provide novel opportunities for passive biometric-based classification such as emotion prediction. The objective of this article is to review which machine learning features can be extracted from eye-tracking data for classification tasks. METHODS We performed a systematic literature review (SLR) covering eye-tracking classification studies published from 2016 to the present. The search used four electronic databases: IEEE Xplore, the ACM Digital Library, ScienceDirect, and Google Scholar. The selection process followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) strategy to choose the relevant articles. RESULTS Of the 420 articles returned by our initial search query, 37 were deemed directly relevant to our research question and were included in the qualitative synthesis. CONCLUSION The features that could be extracted from eye-tracking data included pupil size, saccades, fixations, velocity, blinks, pupil position, electrooculogram (EOG), and gaze point. Fixation was the most commonly used feature among the studies found.
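To make the feature list concrete, the toy sketch below derives a few of the reviewed features (blinks, velocity, fixations) from a raw gaze trace using a simple velocity-threshold (I-VT) rule. The sampling rate, noise model, and 30 deg/s threshold are illustrative assumptions, not values prescribed by the review.

```python
import numpy as np

fs = 250.0                                        # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(scale=0.05, size=1000))  # horizontal gaze (deg)
y = np.cumsum(rng.normal(scale=0.05, size=1000))  # vertical gaze (deg)
x[300:320] = np.nan                               # a blink shows up as missing samples
y[300:320] = np.nan

# Point-to-point gaze speed in deg/s; NaNs propagate through the blink.
velocity = np.hypot(np.diff(x), np.diff(y)) * fs
blink_samples = int(np.isnan(x).sum())
valid = ~np.isnan(velocity)
# I-VT: samples slower than the threshold count as fixation samples.
fixation_share = float((velocity[valid] < 30.0).mean())

print("blink samples:", blink_samples)
print("share of fixation samples:", round(fixation_share, 2))
print("median velocity (deg/s):", round(float(np.median(velocity[valid])), 1))
```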
Affiliation(s)
- Jia Zheng Lim
- Evolutionary Computing Laboratory, Faculty of Computing and Informatics, Universiti Malaysia Sabah, Kota Kinabalu, Malaysia
- James Mountstephens
- Faculty of Computing and Informatics, Universiti Malaysia Sabah, Kota Kinabalu, Malaysia
- Jason Teo
- Faculty of Computing and Informatics, Universiti Malaysia Sabah, Kota Kinabalu, Malaysia
4
Strauch C, Hirzle T, Van der Stigchel S, Bulling A. Decoding binary decisions under differential target probabilities from pupil dilation: A random forest approach. J Vis 2021; 21:6. [PMID: 34259827 PMCID: PMC8288052 DOI: 10.1167/jov.21.7.6]
Abstract
Although our pupils slightly dilate when we look at an intended target, they do not when we look at irrelevant distractors. This finding suggests that it may be possible to decode the intention of an observer, understood as the outcome of implicit covert binary decisions, from pupillary dynamics over time. However, few previous works have investigated the feasibility of this approach, and those that did failed to control for possible confounds such as motor execution, changes in brightness, or target and distractor probability. We report on our efforts to decode intentions from pupil dilation, obtained under strict experimental control, on a single-trial basis using a machine learning approach. The basis for our analyses is data from 69 participants who looked at letters that needed to be selected, with stimulus probabilities that varied systematically in a blockwise manner (n = 19,417 trials). We confirm earlier findings that pupil dilation is indicative of intentions and show that these can be decoded with a classification performance of up to 76% area under the receiver operating characteristic curve if targets are rarer than distractors. To better understand which characteristics of the pupillary signal are most informative, we finally compare relative feature importances. The first derivative of pupil size changes was found to be most relevant, allowing us to decode intention within only about 800 ms of trial onset. Taken together, our results provide credible insights into the potential of decoding intentions from pupil dilation and may soon form the basis for new applications in visual search, gaze-based interaction, or human-robot interaction.
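A minimal sketch of this decoding scheme, under assumed values for sampling rate and window length: a random forest classifies target versus distractor trials from synthetic pupil traces, with the first derivative of pupil size (via np.gradient) appended as an additional feature set and performance scored by cross-validated AUC.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n_trials, n_samples = 400, 40                 # ~800 ms at an assumed 50 Hz
y = rng.integers(0, 2, size=n_trials)         # 1 = intended target (synthetic)
pupil = rng.normal(scale=0.2, size=(n_trials, n_samples))
# Plant a slight ramping dilation on target trials, mimicking the effect.
pupil[y == 1] += np.linspace(0.0, 0.5, n_samples)

deriv = np.gradient(pupil, axis=1)            # first derivative of pupil size
X = np.hstack([pupil, deriv])                 # raw trace + derivative features

clf = RandomForestClassifier(n_estimators=300, random_state=0)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print(f"single-trial decoding AUC = {roc_auc_score(y, proba):.2f}")
```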
Affiliation(s)
- Christoph Strauch
- Experimental Psychology, Helmholtz Institute, Utrecht University, the Netherlands
- Teresa Hirzle
- Institute of Media Informatics, Ulm University, Germany
- Stefan Van der Stigchel
- Experimental Psychology, Helmholtz Institute, Utrecht University, the Netherlands
- Andreas Bulling
- Institute for Visualisation and Interactive Systems, University of Stuttgart, Germany