1
Wilkinson KM, Brittlebank S, Barwise A, Zimmerman TO, Light J. Visual fixation patterns to AAC displays are significantly correlated with motor selection for individuals with Down syndrome or individuals on the autism spectrum. Augment Altern Commun 2024:1-13. PMID: 38786201. DOI: 10.1080/07434618.2024.2325065.
Abstract
Eye tracking research technologies are often used to study how individuals attend visually to different types of AAC displays (e.g. visual scene displays, grid displays). The assumption is that efficiency of visual search may relate to efficiency of motor selection necessary for communication via aided AAC; however, this assumption has not received direct empirical study. We examined the relation between speed of visual search and speed of motor selection of symbols. Ten individuals on the autism spectrum (AS; Study 1) and nine with Down syndrome (DS; Study 2) participated in a search task using simulated AAC displays with a main visual scene display (VSD) and a navigation bar of thumbnail VSDs. Participants were given an auditory prompt to find one of four thumbnail VSDs in the navigation bar. Eye tracking technologies measured how long it took participants to fixate visually on the thumbnail VSD, and recorded how long it took participants to select the thumbnail VSD with a finger. A statistically significant relationship emerged between visual fixation and selection latencies, confirming the positive relationship between visual processing and motor selection for both groups of participants. Eye tracking data may serve as a useful proxy measure for evaluating how display design influences selection of AAC symbols, especially when individuals are unwilling or unable to comply with traditional behaviorally based assessment tasks.
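The central statistic here is the correlation between per-trial visual fixation latency and motor selection latency. A minimal sketch of that computation, using hypothetical latencies rather than the study's data:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-trial latencies (ms): time to first fixation on the
# target thumbnail VSD, and time to select it with a finger.
fixation_ms = [620, 540, 880, 700, 760, 500, 940, 660]
selection_ms = [1450, 1320, 1910, 1550, 1700, 1280, 2050, 1500]

print(f"r = {pearson_r(fixation_ms, selection_ms):.2f}")
```

A strong positive r on such paired latencies is what would support using gaze data as a proxy for selection behavior.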
Affiliation(s)
- Allison Barwise
- The Pennsylvania State University, University Park, Pennsylvania, USA
- Janice Light
- The Pennsylvania State University, University Park, Pennsylvania, USA
2
Soberanis-Mukul RD, Puentes PR, Acar A, Gupta I, Bhowmick J, Li Y, Ghazi A, Wu JY, Unberath M. Cognitive load in tele-robotic surgery: a comparison of eye tracker designs. Int J Comput Assist Radiol Surg 2024. PMID: 38704792. DOI: 10.1007/s11548-024-03150-x.
Abstract
Purpose: Eye gaze tracking and pupillometry are evolving areas within the field of tele-robotic surgery, particularly in the context of estimating cognitive load (CL). However, this is a recent field, and current solutions for gaze and pupil tracking in robotic surgery require assessment. Because reliable cognitive load estimation depends on stable pupillometry signals, we compared the accuracy of three eye trackers, including head-mounted and console-mounted designs.
Methods: We conducted a user study with the da Vinci Research Kit (dVRK) to compare the three designs. We collected eye tracking and dVRK video data while participants observed nine markers distributed over the dVRK screen, and computed and analyzed pupil detection stability and gaze prediction accuracy for the three designs.
Results: Head-worn devices showed better stability and accuracy of gaze prediction and pupil detection than console-mounted systems. Tracking stability across the field of view varied between trackers, with gaze predictions reported in invalid zones of the image with high confidence.
Conclusion: While head-worn solutions show benefits in confidence and stability, our results demonstrate the need to improve eye tracker performance regarding pupil detection, stability, and gaze accuracy in tele-robotic scenarios.
Affiliation(s)
- Iris Gupta
- Johns Hopkins University, Baltimore, MD, USA
- Yizhou Li
- Vanderbilt University, Nashville, TN, USA
- Ahmed Ghazi
- Department of Urology, Johns Hopkins Medical Institute, Baltimore, MD, USA
3
Gundler C, Temmen M, Gulberti A, Pötter-Nerger M, Ückert F. Improving Eye-Tracking Data Quality: A Framework for Reproducible Evaluation of Detection Algorithms. Sensors (Basel) 2024; 24:2688. PMID: 38732794. PMCID: PMC11085612. DOI: 10.3390/s24092688.
Abstract
High-quality eye-tracking data are crucial in behavioral sciences and medicine. Even with a solid understanding of the literature, selecting the most suitable algorithm for a specific research project poses a challenge. Empowering applied researchers to choose the best-fitting detector for their research needs is the primary contribution of this paper. We developed a framework to systematically assess and compare the effectiveness of 13 state-of-the-art algorithms through a unified application interface. Hence, we more than double the number of algorithms that are currently usable within a single software package and allow researchers to identify the best-suited algorithm for a given scientific setup. Our framework validation on retrospective data underscores its suitability for algorithm selection. Through a detailed and reproducible step-by-step workflow, we hope to contribute towards significantly improved data quality in scientific experiments.
Affiliation(s)
- Christopher Gundler
- Institute for Applied Medical Informatics, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Alessandro Gulberti
- Department of Neurology, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Monika Pötter-Nerger
- Department of Neurology, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Frank Ückert
- Institute for Applied Medical Informatics, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
4
Thorsson M, Galazka MA, Åsberg Johnels J, Hadjikhani N. Influence of autistic traits and communication role on eye contact behavior during face-to-face interaction. Sci Rep 2024; 14:8162. PMID: 38589489. PMCID: PMC11001951. DOI: 10.1038/s41598-024-58701-8.
Abstract
Eye contact is a central component in face-to-face interactions. It is important in structuring communicative exchanges and offers critical insights into others' interests and intentions. To better understand eye contact in face-to-face interactions, we applied a novel, non-intrusive deep-learning-based dual-camera system and investigated associations between eye contact and autistic traits as well as self-reported eye contact discomfort during a referential communication task, where participants and the experimenter had to guess, in turn, a word known by the other individual. Corroborating previous research, we found that participants' eye gaze and mutual eye contact were inversely related to autistic traits. In addition, our findings revealed different behaviors depending on the role in the dyad: listening and guessing were associated with increased eye contact compared with describing words. In the listening and guessing condition, only a subgroup who reported eye contact discomfort had a lower amount of eye gaze and eye contact. When describing words, higher autistic traits were associated with reduced eye gaze and eye contact. Our data indicate that eye contact is inversely associated with autistic traits when describing words, and that eye gaze is modulated by the communicative role in a conversation.
Affiliation(s)
- Max Thorsson
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Martyna A Galazka
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Division of Cognition and Communication, Department of Applied Information Technology, University of Gothenburg, Gothenburg, Sweden
- Jakob Åsberg Johnels
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Section of Speech and Language Pathology, Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Nouchine Hadjikhani
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
5
Byrne SA, Nyström M, Maquiling V, Kasneci E, Niehorster DC. Precise localization of corneal reflections in eye images using deep learning trained on synthetic data. Behav Res Methods 2024; 56:3226-3241. PMID: 38114880. PMCID: PMC11133043. DOI: 10.3758/s13428-023-02297-w.
Abstract
We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) that was trained solely on synthetic data. Using only synthetic data has the benefit of completely sidestepping the time-consuming manual annotation that supervised training on real eye images requires. To systematically evaluate the accuracy of our method, we first tested it on images with synthetic CRs placed on different backgrounds and embedded in varying levels of noise. Second, we tested the method on two datasets consisting of high-quality videos captured from real eyes. Our method outperformed state-of-the-art algorithmic methods on real eye images, improving spatial precision by 3-41.5% across datasets, and performed on par with the state of the art on synthetic images in terms of spatial accuracy. We conclude that our method provides precise CR center localization and offers a solution to the data availability problem, one of the important roadblocks in the development of deep learning models for gaze estimation. Due to the superior CR center localization and ease of application, our method has the potential to improve the accuracy and precision of CR-based eye trackers.
Affiliation(s)
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Virmarie Maquiling
- Human-Centered Technologies for Learning, Technical University of Munich, Munich, Germany
- Enkelejda Kasneci
- Human-Centered Technologies for Learning, Technical University of Munich, Munich, Germany
- Diederick C Niehorster
- MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy
- Department of Psychology, Lund University, Lund, Sweden
6
Niehorster DC, Hessels RS, Benjamins JS, Nyström M, Hooge ITC. GlassesValidator: A data quality tool for eye tracking glasses. Behav Res Methods 2024; 56:1476-1484. PMID: 37326770. PMCID: PMC10991001. DOI: 10.3758/s13428-023-02105-5.
Abstract
According to the proposal for a minimum reporting guideline for an eye tracking study by Holmqvist et al. (2022), the accuracy (in degrees) of eye tracking data should be reported. Currently, there is no easy way to determine accuracy for wearable eye tracking recordings. To enable determining the accuracy quickly and easily, we have produced a simple validation procedure using a printable poster and accompanying Python software. We tested the poster and procedure with 61 participants using one wearable eye tracker. In addition, the software was tested with six different wearable eye trackers. We found that the validation procedure can be administered within a minute per participant and provides measures of accuracy and precision. Calculating the eye-tracking data quality measures can be done offline on a simple computer and requires no advanced computer skills.
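The two data-quality measures this validation procedure reports, accuracy (mean angular offset from a known target) and precision (commonly RMS sample-to-sample distance), can be sketched as follows. This is an illustrative computation on hypothetical samples, not the GlassesValidator implementation:

```python
import math

def accuracy_deg(gaze, target):
    """Mean angular offset (deg) between gaze samples and a known target.
    Gaze and target are (azimuth, elevation) in degrees; a small-angle
    approximation treats offsets as Euclidean distances."""
    offsets = [math.hypot(gx - target[0], gy - target[1]) for gx, gy in gaze]
    return sum(offsets) / len(offsets)

def precision_rms_s2s(gaze):
    """Root-mean-square sample-to-sample distance (deg), a common precision measure."""
    d2 = [(x2 - x1) ** 2 + (y2 - y1) ** 2
          for (x1, y1), (x2, y2) in zip(gaze, gaze[1:])]
    return math.sqrt(sum(d2) / len(d2))

# Hypothetical fixation samples recorded while looking at a poster target at (0, 0) deg.
samples = [(0.4, 0.1), (0.5, 0.2), (0.45, 0.15), (0.5, 0.1), (0.42, 0.18)]
print(f"accuracy  = {accuracy_deg(samples, (0.0, 0.0)):.2f} deg")
print(f"precision = {precision_rms_s2s(samples):.3f} deg RMS-S2S")
```

Both measures need only the gaze samples and the target positions, which is why such a procedure can run offline on a simple computer.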
Affiliation(s)
- Diederick C Niehorster
- Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
- Jeroen S Benjamins
- Experimental Psychology, Helmholtz Institute & Social, Health and Organisational Psychology, Utrecht University, Utrecht, Netherlands
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, Netherlands
7
Spaeth AM, Koenig S, Everaert J, Glombiewski JA, Kube T. Are depressive symptoms linked to a reduced pupillary response to novel positive information? An eye tracking proof-of-concept study. Front Psychol 2024; 15:1253045. PMID: 38464618. PMCID: PMC10920252. DOI: 10.3389/fpsyg.2024.1253045.
Abstract
Introduction: Depressive symptoms have been linked to difficulties in revising established negative beliefs in response to novel positive information. Recent predictive processing accounts have suggested that this bias in belief updating may be related to a blunted processing of positive prediction errors at the neural level. In this proof-of-concept study, pupil dilation in response to unexpected positive emotional information was examined as a psychophysiological marker of an attenuated processing of positive prediction errors associated with depressive symptoms.
Methods: Participants (N = 34) completed a modified version of the emotional Bias Against Disconfirmatory Evidence (BADE) task in which scenarios initially suggest negative interpretations that are later either confirmed or disconfirmed by additional information. Pupil dilation in response to the confirmatory and disconfirmatory information was recorded.
Results: Behavioral results showed that depressive symptoms were related to difficulties in revising negative interpretations despite disconfirmatory positive information. The eye tracking results pointed to a reduced pupil response to unexpected positive information among people with elevated depressive symptoms.
Discussion: Altogether, the present study demonstrates that the adapted emotional BADE task can be appropriate for examining psychophysiological aspects such as changes in pupil size along with behavioral responses. Furthermore, the results suggest that depression may be characterized by deviations in both behavioral (i.e., reduced updating of negative beliefs) and psychophysiological (i.e., decreased pupil dilation) responses to unexpected positive information. Future work should focus on a larger sample including clinically depressed patients to further explore these findings.
Affiliation(s)
- Alexandra M. Spaeth
- Department of Psychology, University of Kaiserslautern-Landau, Landau, Germany
- Stephan Koenig
- Department of Psychology, University of Kaiserslautern-Landau, Landau, Germany
- Jonas Everaert
- Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands
- Research Group of Quantitative Psychology and Individual Differences, KU Leuven, Leuven, Belgium
- Tobias Kube
- Department of Psychology, University of Kaiserslautern-Landau, Landau, Germany
8
Lidle LR, Schmitz J. Assessing Visual Avoidance of Faces During Real-Life Social Stress in Children with Social Anxiety Disorder: A Mobile Eye-Tracking Study. Child Psychiatry Hum Dev 2024; 55:24-35. PMID: 35708796. PMCID: PMC10796484. DOI: 10.1007/s10578-022-01383-y.
Abstract
This study measured visual attention (fixation count, dwell time) during two real-life social stress tasks using mobile eye-tracking glasses in children (9-13 years) diagnosed with social anxiety disorder (SAD; n = 25) and a healthy control group (HC; n = 30). The influence of state anxiety on attention allocation and negative self-evaluation biases regarding gaze behavior were also examined. Compared to the HC group, children with SAD showed visual avoidance (i.e., fewer fixations) of the faces of interaction partners during the second social stress task. While visual avoidance in HC children decreased with declining state anxiety from the first to the second social stress task, no such effect was found in children with SAD. A negative self-evaluation bias regarding gaze behavior in children with SAD was not found. In sum, measuring visual attention during real-life social situations may help enhance our understanding of social attention in childhood SAD.
Affiliation(s)
- Leonie Rabea Lidle
- Department for Clinical Child and Adolescent Psychology, Institute of Psychology, Leipzig University, Neumarkt 9-19, 04109 Leipzig, Germany
- Leipzig Research Centre for Early Child Development, Leipzig University, Leipzig, Germany
- Julian Schmitz
- Department for Clinical Child and Adolescent Psychology, Institute of Psychology, Leipzig University, Neumarkt 9-19, 04109 Leipzig, Germany
- Leipzig Research Centre for Early Child Development, Leipzig University, Leipzig, Germany
9
Huang Z, Duan X, Zhu G, Zhang S, Wang R, Wang Z. Assessing the data quality of AdHawk MindLink eye-tracking glasses. Behav Res Methods 2024. PMID: 38168041. DOI: 10.3758/s13428-023-02310-2.
Abstract
Most commercially available eye-tracking devices rely on video cameras and image processing algorithms to track gaze. Nevertheless, emerging technologies are entering the field, making high-speed, cameraless eye-tracking more accessible. In this study, a series of tests was conducted to compare the data quality of MEMS-based eye-tracking glasses (AdHawk MindLink) with three widely used camera-based eye-tracking devices (EyeLink Portable Duo, Tobii Pro Glasses 2, and SMI Eye Tracking Glasses 2). The data quality measures assessed in these tests included accuracy, precision, data loss, and system latency. The results suggest that, overall, the data quality of the eye-tracking glasses was lower than that of the desktop EyeLink Portable Duo eye tracker. Among the eye-tracking glasses, the accuracy and precision of the MindLink eye-tracking glasses were either higher than or on par with those of Tobii Pro Glasses 2 and SMI Eye Tracking Glasses 2. The system latency of MindLink was approximately 9 ms, significantly lower than that of camera-based eye-tracking devices found in VR goggles. These results suggest that the MindLink eye-tracking glasses show promise for research applications where high sampling rates and low latency are preferred.
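Of the four data-quality measures named above, data loss is the simplest to define: the fraction of expected samples that are missing or invalid. A minimal sketch of that computation on a hypothetical recording (not the benchmark code used in the study):

```python
import math

def data_loss(samples, duration_s, nominal_hz):
    """Fraction of expected samples that are missing or invalid.
    A sample (x, y) counts as invalid if either coordinate is NaN."""
    expected = round(duration_s * nominal_hz)
    valid = sum(1 for x, y in samples if not (math.isnan(x) or math.isnan(y)))
    return 1.0 - valid / expected

nan = float("nan")
# Hypothetical 1-second recording at a nominal 10 Hz with two invalid samples.
rec = [(0.1, 0.2)] * 8 + [(nan, nan)] * 2
print(f"data loss = {data_loss(rec, 1.0, 10):.0%}")
```

Counting against the expected sample total (duration times nominal rate) catches both flagged-invalid samples and samples the device silently dropped.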
Affiliation(s)
- Zehao Huang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Xiaoting Duan
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Gancheng Zhu
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Shuai Zhang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Rong Wang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Zhiguo Wang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
10
Velisar A, Shanidze NM. Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles. Behav Res Methods 2024; 56:53-79. PMID: 37369939. PMCID: PMC11062346. DOI: 10.3758/s13428-023-02150-0.
Abstract
Head-mounted, video-based eye tracking is becoming increasingly common and has promise in a range of applications. Here, we provide a practical and systematic assessment of the sources of measurement uncertainty for one such device - the Pupil Core - in three eye-tracking domains: (1) the 2D scene camera image; (2) the physical rotation of the eye relative to the scene camera 3D space; and (3) the external projection of the estimated gaze point location onto the target plane or in relation to world coordinates. We also assess eye camera motion during active tasks relative to the eye and the scene camera, an important consideration as the rigid arrangement of eye and scene camera is essential for proper alignment of the detected gaze. We find that eye camera motion, improper gaze point depth estimation, and erroneous eye models can all lead to added noise that must be considered in the experimental design. Further, while calibration accuracy and precision estimates can help assess data quality in the scene camera image, they may not be reflective of errors and variability in gaze point estimation. These findings support the importance of eye model constancy for comparisons across experimental conditions and suggest additional assessments of data reliability may be warranted for experiments that require the gaze point or measure eye movements relative to the external world.
Affiliation(s)
- Anca Velisar
- The Smith-Kettlewell Eye Research Institute, 2318 Fillmore Street, San Francisco, CA 94115, USA
- Natela M Shanidze
- The Smith-Kettlewell Eye Research Institute, 2318 Fillmore Street, San Francisco, CA 94115, USA
11
Hooge ITC, Niehorster DC, Hessels RS, Benjamins JS, Nyström M. How robust are wearable eye trackers to slow and fast head and body movements? Behav Res Methods 2023; 55:4128-4142. PMID: 36326998. PMCID: PMC10700439. DOI: 10.3758/s13428-022-02010-3.
Abstract
How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping). The accuracy became worse as movement got wilder. During skipping and jumping, the biggest error was 5.8°. However, most errors were smaller than 3°. We discuss the implications of decreased accuracy in the context of different research scenarios.
Affiliation(s)
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Diederick C Niehorster
- Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Jeroen S Benjamins
- Experimental Psychology, Helmholtz Institute, and Social, Health and Organisational Psychology, Utrecht University, Utrecht, The Netherlands
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
12
Bradshaw J, Fu X, Yurkovic-Harding J, Abney D. Infant embodied attention in context: Feasibility of home-based head-mounted eye tracking in early infancy. Dev Cogn Neurosci 2023; 64:101299. PMID: 37748360. PMCID: PMC10522938. DOI: 10.1016/j.dcn.2023.101299.
Abstract
Social communication emerges from dynamic, embodied social interactions during which infants coordinate attention to caregivers and objects. Yet many studies of infant attention are constrained to a laboratory setting, neglecting how attention is nested within social contexts where caregivers dynamically scaffold infant behavior in real time. This study evaluates the feasibility and acceptability of the novel use of head-mounted eye tracking (HMET) in the home with N = 40 infants aged 4 and 8 months who are typically developing and at an elevated genetic liability for autism spectrum disorder (ASD). Results suggest that HMET with young infants with limited independent motor abilities and at an elevated likelihood for atypical development is highly feasible and deemed acceptable by caregivers. Feasibility and acceptability did not differ by age or ASD likelihood. Data quality was also acceptable, albeit with younger infants showing slightly lower accuracy, allowing for preliminary analysis of developmental trends in infant gaze behavior. This study provides new evidence for the feasibility of using in-home HMET with young infants during a critical developmental period when more complex interactions with the environment and social partners are emerging. Future research can apply this technology to illuminate atypical developmental trajectories of embodied social attention in infancy.
Affiliation(s)
- Jessica Bradshaw
- University of South Carolina, 1800 Gervais St., Columbia, SC 29201, USA
- Carolina Autism and Neurodevelopment Research Center, University of South Carolina, USA
- Xiaoxue Fu
- University of South Carolina, 1800 Gervais St., Columbia, SC 29201, USA
- Carolina Autism and Neurodevelopment Research Center, University of South Carolina, USA
- Julia Yurkovic-Harding
- University of South Carolina, 1800 Gervais St., Columbia, SC 29201, USA
- Carolina Autism and Neurodevelopment Research Center, University of South Carolina, USA
- Drew Abney
- University of Georgia, 125 Baldwin St., Athens, GA 30602, USA
13
Takahashi M, Veale R. Pathways for Naturalistic Looking Behavior in Primate I: Behavioral Characteristics and Brainstem Circuits. Neuroscience 2023; 532:133-163. PMID: 37776945. DOI: 10.1016/j.neuroscience.2023.09.009.
Abstract
Organisms control their visual worlds by moving their eyes, heads, and bodies. This control of "gaze" or "looking" is key to survival and intelligence, but our investigation of the underlying neural mechanisms in natural conditions is hindered by technical limitations. Recent advances have enabled measurement of both brain and behavior in freely moving animals in complex environments, expanding on historical head-fixed laboratory investigations. We juxtapose looking behavior as traditionally measured in the laboratory against looking behavior in naturalistic conditions, finding that behavior changes when animals are free to move or when stimuli have depth or sound. We specifically focus on the brainstem circuits driving gaze shifts and gaze stabilization. The overarching goal of this review is to reconcile the historical understanding of differential neural circuits for different "classes" of gaze shift with two inconvenient truths: (1) "classes" of gaze behavior are artificial, and (2) the neural circuits historically identified to control each "class" of behavior do not operate in isolation during natural behavior. Instead, multiple pathways combine adaptively and non-linearly depending on individual experience. While the neural circuits for reflexive and voluntary gaze behaviors traverse somewhat independent brainstem and spinal cord circuits, both can be modulated by feedback, meaning that most gaze behaviors are learned rather than hardcoded. Despite this flexibility, there are broadly enumerable neural pathways commonly adopted among primate gaze systems. Parallel pathways which carry simultaneous evolutionary and homeostatic drives converge in the superior colliculus, a layered midbrain structure which integrates and relays these volitional signals to brainstem gaze-control circuits.
Affiliation(s)
- Mayu Takahashi
- Department of Systems Neurophysiology, Graduate School of Medical and Dental Sciences, Tokyo Medical and Dental University, Japan
- Richard Veale
- Department of Neurobiology, Graduate School of Medicine, Kyoto University, Japan
14
Kaduk T, Goeke C, Finger H, König P. Webcam eye tracking close to laboratory standards: Comparing a new webcam-based system and the EyeLink 1000. Behav Res Methods 2023. PMID: 37821751. DOI: 10.3758/s13428-023-02237-8.
Abstract
This paper aims to compare a new webcam-based eye-tracking system, integrated into the Labvanced platform for online experiments, to a "gold standard" lab-based eye tracker (EyeLink 1000 - SR Research). Specifically, we simultaneously recorded data with both eye trackers in five different tasks, analyzing their real-time performance. These tasks were a subset of a standardized test battery for eye trackers, including a Large Grid task, Smooth Pursuit eye movements, viewing natural images, and two Head Movements tasks (roll, yaw). The results show that the webcam-based system achieved an overall accuracy of 1.4° and a precision of 1.1° (standard deviation (SD) across subjects), an error of about 0.5° larger than the EyeLink system. Interestingly, both accuracy (1.3°) and precision (0.9°) were slightly better for centrally presented targets, the region of interest in many psychophysical experiments. Remarkably, the correlation of raw gaze samples between the EyeLink and the webcam-based system was about 90% for the Large Grid task and about 80% for Free View and Smooth Pursuit. Overall, these results put the performance of the webcam-based system roughly on par with mobile eye-tracking devices (Ehinger et al. PeerJ, 7, e7086, 2019; Tonsen et al., 2020) and demonstrate substantial improvement compared to existing webcam eye-tracking solutions (Papoutsaki et al., 2017).
Affiliation(s)
- Tobiasz Kaduk
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany.
- Research and Development Division, Scicovery GmbH, Paderborn, Germany.
- Caspar Goeke
- Research and Development Division, Scicovery GmbH, Paderborn, Germany
- Holger Finger
- Research and Development Division, Scicovery GmbH, Paderborn, Germany
- Peter König
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
15
Faraji Y, van Rijn JW, van Nispen RMA, van Rens GHMB, Melis-Dankers BJM, Koopman J, van Rijn LJ. A toolkit for wide-screen dynamic area of interest measurements using the Pupil Labs Core Eye Tracker. Behav Res Methods 2023; 55:3820-3830. [PMID: 36253600] [PMCID: PMC10616213] [DOI: 10.3758/s13428-022-01991-5]
Abstract
Eye-tracking measurements taken while watching a wide-field screen are challenging to perform; commercially available remote eye trackers typically do not measure more than 35 degrees of eccentricity. Analysis software was developed using Pupil Core eye-tracking data to analyze viewing behavior under circumstances as natural as possible, on a 1.55-m-wide screen allowing free head movements. Additionally, dynamic area of interest (AOI) analyses were performed on data from participants viewing traffic scenes. A toolkit was created including software for simple allocation of dynamic AOIs (semi-automatically and manually), measurement of parameters such as dwell times and time to first entry, and overlaying gaze and AOIs on video. Participants (n = 11) were asked to look at 13 dynamic AOIs in traffic scenes from appearance to disappearance in order to validate the setup and software. Different AOI margins were explored for the included objects. The median ratio between dwell time and total appearance time was about 90% for most objects when appropriate margins were chosen. This validated open-source toolkit is readily available for researchers who want to perform dynamic AOI analyses with the Pupil Core eye tracker, especially when measurements are desired on a wide screen, in various fields such as psychology, transportation, and low-vision research.
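Dwell-time-to-appearance-time ratios of the kind reported above can be computed per AOI once each video frame carries a gaze position and an (optionally padded) bounding box. A rough sketch, assuming frame-aligned data (names and data layout are hypothetical, not the toolkit's API):

```python
import numpy as np

def dwell_ratio(gaze_xy, aoi_boxes, margin=0.0):
    """Fraction of an AOI's on-screen time during which gaze falls inside it.

    gaze_xy: (N, 2) gaze positions, one row per video frame.
    aoi_boxes: (N, 4) per-frame boxes (x_min, y_min, x_max, y_max);
               rows of NaN mean the object is not on screen that frame.
    margin: padding added around each box, in the same units as gaze.
    """
    gaze = np.asarray(gaze_xy, dtype=float)
    boxes = np.asarray(aoi_boxes, dtype=float)
    visible = ~np.isnan(boxes).any(axis=1)      # frames where the AOI exists
    inside = (
        visible
        & (gaze[:, 0] >= boxes[:, 0] - margin)
        & (gaze[:, 0] <= boxes[:, 2] + margin)
        & (gaze[:, 1] >= boxes[:, 1] - margin)
        & (gaze[:, 1] <= boxes[:, 3] + margin)
    )
    return inside.sum() / visible.sum()
```

Sweeping `margin` over a range of values reproduces the kind of margin exploration the abstract describes.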
Affiliation(s)
- Yasmin Faraji
- Amsterdam UMC location Vrije Universiteit Amsterdam, Ophthalmology, Amsterdam, The Netherlands
- Amsterdam Public Health, Quality of Care, Societal Participation & Health, Mental Health, Aging and Later Life, Amsterdam, The Netherlands
- Joris W van Rijn
- Amsterdam UMC location Vrije Universiteit Amsterdam, Ophthalmology, Amsterdam, The Netherlands
- Ruth M A van Nispen
- Amsterdam UMC location Vrije Universiteit Amsterdam, Ophthalmology, Amsterdam, The Netherlands
- Amsterdam Public Health, Quality of Care, Societal Participation & Health, Mental Health, Aging and Later Life, Amsterdam, The Netherlands
- Ger H M B van Rens
- Amsterdam UMC location Vrije Universiteit Amsterdam, Ophthalmology, Amsterdam, The Netherlands
- Amsterdam Public Health, Quality of Care, Societal Participation & Health, Mental Health, Aging and Later Life, Amsterdam, The Netherlands
- Bart J M Melis-Dankers
- Royal Dutch Visio, Centre of Expertise for Blind and Partially Sighted People, Huizen, The Netherlands
- Jan Koopman
- Royal Dutch Visio, Centre of Expertise for Blind and Partially Sighted People, Huizen, The Netherlands
- Laurentius J van Rijn
- Amsterdam UMC location Vrije Universiteit Amsterdam, Ophthalmology, Amsterdam, The Netherlands
- Department of Ophthalmology, Onze Lieve Vrouwe Gasthuis, Amsterdam, The Netherlands
- Amsterdam Neuroscience, Systems & Network Neurosciences, Amsterdam, The Netherlands
16
Nebe S, Reutter M, Baker DH, Bölte J, Domes G, Gamer M, Gärtner A, Gießing C, Gurr C, Hilger K, Jawinski P, Kulke L, Lischke A, Markett S, Meier M, Merz CJ, Popov T, Puhlmann LMC, Quintana DS, Schäfer T, Schubert AL, Sperl MFJ, Vehlen A, Lonsdorf TB, Feld GB. Enhancing precision in human neuroscience. eLife 2023; 12:e85980. [PMID: 37555830] [PMCID: PMC10411974] [DOI: 10.7554/elife.85980]
Abstract
Human neuroscience has always pushed the boundaries of what is measurable. During the last decade, concerns about statistical power and replicability - in science in general, but also specifically in human neuroscience - have fueled an extensive debate. One important insight from this discourse is the need for larger samples, which naturally increases statistical power. An alternative is to increase the precision of measurements, which is the focus of this review. This option is often overlooked, even though statistical power benefits from increasing precision as much as from increasing sample size. Nonetheless, precision has always been at the heart of good scientific practice in human neuroscience, with researchers relying on lab traditions or rules of thumb to ensure sufficient precision for their studies. In this review, we encourage a more systematic approach to precision. We start by introducing measurement precision and its importance for well-powered studies in human neuroscience. We then elaborate on determinants of precision across a range of neuroscientific methods (MRI, M/EEG, EDA, eye tracking, and endocrinology). We end by discussing how a more systematic evaluation of precision and the application of respective insights can lead to an increase in reproducibility in human neuroscience.
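The review's core point, that measurement precision buys statistical power much as sample size does, follows from the textbook two-level variance decomposition of a group mean. A small illustration with made-up numbers (not taken from the paper):

```python
import math

def se_group_mean(sd_between, sd_error, n_subjects, k_trials):
    """Standard error of a group mean when each subject contributes k noisy trials.

    The variance of a single subject's mean is sd_between^2 + sd_error^2 / k,
    so averaging more trials (i.e., measuring more precisely) shrinks the
    group-level standard error just as recruiting more subjects does.
    """
    var_subject_mean = sd_between ** 2 + sd_error ** 2 / k_trials
    return math.sqrt(var_subject_mean / n_subjects)

# When trial-level noise dominates, adding trials rivals adding subjects:
base = se_group_mean(1.0, 4.0, n_subjects=20, k_trials=10)
more_trials = se_group_mean(1.0, 4.0, n_subjects=20, k_trials=40)
more_subjects = se_group_mean(1.0, 4.0, n_subjects=80, k_trials=10)
```

Both `more_trials` and `more_subjects` shrink the standard error relative to `base`; which helps more depends on the ratio of trial noise to between-subject variability.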
Affiliation(s)
- Stephan Nebe
- Zurich Center for Neuroeconomics, Department of Economics, University of Zurich, Zurich, Switzerland
- Mario Reutter
- Department of Psychology, Julius-Maximilians-University, Würzburg, Germany
- Daniel H Baker
- Department of Psychology and York Biomedical Research Institute, University of York, York, United Kingdom
- Jens Bölte
- Institute for Psychology, University of Münster, Otto-Creuzfeldt Center for Cognitive and Behavioral Neuroscience, Münster, Germany
- Gregor Domes
- Department of Biological and Clinical Psychology, University of Trier, Trier, Germany
- Institute for Cognitive and Affective Neuroscience, Trier, Germany
- Matthias Gamer
- Department of Psychology, Julius-Maximilians-University, Würzburg, Germany
- Anne Gärtner
- Faculty of Psychology, Technische Universität Dresden, Dresden, Germany
- Carsten Gießing
- Biological Psychology, Department of Psychology, School of Medicine and Health Sciences, Carl von Ossietzky University of Oldenburg, Oldenburg, Germany
- Caroline Gurr
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital, Goethe University, Frankfurt, Germany
- Brain Imaging Center, Goethe University, Frankfurt, Germany
- Kirsten Hilger
- Department of Psychology, Julius-Maximilians-University, Würzburg, Germany
- Department of Psychology, Psychological Diagnostics and Intervention, Catholic University of Eichstätt-Ingolstadt, Eichstätt, Germany
- Philippe Jawinski
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Louisa Kulke
- Department of Developmental with Educational Psychology, University of Bremen, Bremen, Germany
- Alexander Lischke
- Department of Psychology, Medical School Hamburg, Hamburg, Germany
- Institute of Clinical Psychology and Psychotherapy, Medical School Hamburg, Hamburg, Germany
- Sebastian Markett
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Maria Meier
- Department of Psychology, University of Konstanz, Konstanz, Germany
- University Psychiatric Hospitals, Child and Adolescent Psychiatric Research Department (UPKKJ), University of Basel, Basel, Switzerland
- Christian J Merz
- Department of Cognitive Psychology, Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr University Bochum, Bochum, Germany
- Tzvetan Popov
- Department of Psychology, Methods of Plasticity Research, University of Zurich, Zurich, Switzerland
- Lara MC Puhlmann
- Leibniz Institute for Resilience Research, Mainz, Germany
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Daniel S Quintana
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- NevSom, Department of Rare Disorders & Disabilities, Oslo University Hospital, Oslo, Norway
- KG Jebsen Centre for Neurodevelopmental Disorders, University of Oslo, Oslo, Norway
- Norwegian Centre for Mental Disorders Research (NORMENT), University of Oslo, Oslo, Norway
- Tim Schäfer
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital, Goethe University, Frankfurt, Germany
- Brain Imaging Center, Goethe University, Frankfurt, Germany
- Matthias FJ Sperl
- Department of Clinical Psychology and Psychotherapy, University of Giessen, Giessen, Germany
- Center for Mind, Brain and Behavior, Universities of Marburg and Giessen, Giessen, Germany
- Antonia Vehlen
- Department of Biological and Clinical Psychology, University of Trier, Trier, Germany
- Tina B Lonsdorf
- Department of Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Department of Psychology, Biological Psychology and Cognitive Neuroscience, University of Bielefeld, Bielefeld, Germany
- Gordon B Feld
- Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Department of Psychology, Heidelberg University, Heidelberg, Germany
- Department of Addiction Behavior and Addiction Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
17
Onkhar V, Dodou D, de Winter JCF. Evaluating the Tobii Pro Glasses 2 and 3 in static and dynamic conditions. Behav Res Methods 2023. [PMID: 37550466] [DOI: 10.3758/s13428-023-02173-7]
Abstract
Over the past few decades, there have been significant developments in eye-tracking technology, particularly in the domain of mobile, head-mounted devices. Nevertheless, questions remain regarding the accuracy of these eye-trackers during static and dynamic tasks. In light of this, we evaluated the performance of two widely used devices: Tobii Pro Glasses 2 and Tobii Pro Glasses 3. A total of 36 participants engaged in tasks under three dynamicity conditions. In the "seated with a chinrest" trial, only the eyes could be moved; in the "seated without a chinrest" trial, both the head and the eyes were free to move; and during the walking trial, participants walked along a straight path. During the seated trials, participants' gaze was directed towards dots on a wall by means of audio instructions, whereas in the walking trial, participants maintained their gaze on a bullseye while walking towards it. Eye-tracker accuracy was determined using computer vision techniques to identify the target within the scene camera image. The findings showed that Tobii 3 outperformed Tobii 2 in terms of accuracy during the walking trials. Moreover, the results suggest that employing a chinrest in the case of head-mounted eye-trackers is counterproductive, as it necessitates larger eye eccentricities for target fixation, thereby compromising accuracy compared to not using a chinrest, which allows for head movement. Lastly, it was found that participants who reported higher workload demonstrated poorer eye-tracking accuracy. The current findings may be useful in the design of experiments that involve head-mounted eye-trackers.
Affiliation(s)
- V Onkhar
- Department of Cognitive Robotics, Delft University of Technology, Delft, The Netherlands
- D Dodou
- Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
- J C F de Winter
- Department of Cognitive Robotics, Delft University of Technology, Delft, The Netherlands
18
Dunn MJ, Alexander RG, Amiebenomo OM, Arblaster G, Atan D, Erichsen JT, Ettinger U, Giardini ME, Gilchrist ID, Hamilton R, Hessels RS, Hodgins S, Hooge ITC, Jackson BS, Lee H, Macknik SL, Martinez-Conde S, Mcilreavy L, Muratori LM, Niehorster DC, Nyström M, Otero-Millan J, Schlüssel MM, Self JE, Singh T, Smyrnis N, Sprenger A. Minimal reporting guideline for research involving eye tracking (2023 edition). Behav Res Methods 2023. [PMID: 37507649] [DOI: 10.3758/s13428-023-02187-1]
Abstract
A guideline is proposed that comprises the minimum items to be reported in research studies involving an eye tracker and human or non-human primate participant(s). This guideline was developed over a 3-year period using a consensus-based process via an open invitation to the international eye tracking community. This guideline will be reviewed at maximum intervals of 4 years.
Affiliation(s)
- Matt J Dunn
- School of Optometry and Vision Sciences, Cardiff University, Cardiff, UK.
- Robert G Alexander
- Departments of Ophthalmology, Neurology, and Physiology/Pharmacology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Gemma Arblaster
- Health Sciences School, University of Sheffield, Sheffield, UK
- Orthoptic Department, Sheffield Teaching Hospitals NHS Foundation Trust, Sheffield, UK
- Denize Atan
- Bristol Medical School, University of Bristol, Bristol, UK
- Mario E Giardini
- Department of Biomedical Engineering, University of Strathclyde, Glasgow, UK
- Iain D Gilchrist
- School of Psychological Science, University of Bristol, Bristol, UK
- Ruth Hamilton
- Department of Clinical Physics & Bioengineering, Royal Hospital for Children, NHS Greater Glasgow & Clyde, Glasgow, UK
- College of Medical, Veterinary & Life Sciences, University of Glasgow, Glasgow, UK
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Brooke S Jackson
- Department of Psychology, University of Georgia, Athens, GA, USA
- Helena Lee
- Clinical and Experimental Sciences, University of Southampton, Southampton, UK
- Stephen L Macknik
- Departments of Ophthalmology, Neurology, and Physiology/Pharmacology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Susana Martinez-Conde
- Departments of Ophthalmology, Neurology, and Physiology/Pharmacology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Lee Mcilreavy
- School of Optometry and Vision Sciences, Cardiff University, Cardiff, UK
- Lisa M Muratori
- Department of Physical Therapy, School of Health Professions, Stony Brook University, Stony Brook, NY, USA
- Diederick C Niehorster
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Department of Psychology, Lund University, Lund, Sweden
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Jorge Otero-Millan
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Department of Neurology, Johns Hopkins University, Baltimore, MD, USA
- Michael M Schlüssel
- UK EQUATOR Centre, Centre for Statistics in Medicine (CSM), Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), University of Oxford, Oxford, UK
- Jay E Self
- Clinical and Experimental Sciences, University of Southampton, Southampton, UK
- Tarkeshwar Singh
- Department of Kinesiology, Pennsylvania State University, University Park, PA, USA
- Nikolaos Smyrnis
- 2nd Department of Psychiatry, National and Kapodistrian University of Athens, Medical School, General University Hospital Attikon, Athens, Greece
- Andreas Sprenger
- Department of Neurology and Institute of Psychology II, Center of Brain, Behavior and Metabolism (CBBM), University of Luebeck, Luebeck, Germany
19
Bagot P, Fournier JF, Kerivel T, Bossard C, Kermarrec G, Martinent G, Bernier M. Visual Search Strategies of Elite Fencers: An Exploratory Study in Ecological Competitive Situation. J Funct Morphol Kinesiol 2023; 8:106. [PMID: 37606401] [PMCID: PMC10443368] [DOI: 10.3390/jfmk8030106]
Abstract
This study investigates the visual activity of fencers in conditions resembling official competitions. Previous research in experimental conditions has shown that experts focus on specific areas of the torso and the armed arm to control movement initiation. Eight right-handed fencers (epee: two males, one female; foil: one male; sabre: two males, two females) participated in a simulated competition, wearing an eye tracker during one bout. The findings showed that the main fixation in foil and sabre is the upper torso, while in epee, it is the lower torso. In epee and sabre, the upper torso is viewed about 50% of the time, with three other areas also observed, while in foil, the fixation is totally directed to the upper torso. Additionally, two new areas of interest were identified: the score machine and an area involving fixations other than the opponent. The study found no direct link between visual activity and performance. The visual search strategy varies among weapons, with foil using a gaze anchor or foveal spot and epee and sabre utilizing a visual pivot due to the discipline's inherent rules. The study also emphasizes that competition-like conditions can disrupt visual activity with external stimuli, possibly affecting performance.
Affiliation(s)
- Pierre Bagot
- Centre de Recherche sur l’Éducation, l’Apprentissage et la Didactique, University Brest, F-29200 Brest, France
- Jean F. Fournier
- Laboratoire Interdisciplinaire en Neurosciences, Physiologie et Psychologie: Activité Physique, Santé et Apprentissages, University Paris Nanterre, F-92001 Nanterre, France
- Thibault Kerivel
- Centre de Recherche sur l’Éducation, l’Apprentissage et la Didactique, University Brest, F-29200 Brest, France
- Cyril Bossard
- Centre de Recherche sur l’Éducation, l’Apprentissage et la Didactique, University Brest, F-29200 Brest, France
- Gilles Kermarrec
- Centre de Recherche sur l’Éducation, l’Apprentissage et la Didactique, University Brest, F-29200 Brest, France
- Guillaume Martinent
- Laboratoire sur les Vulnérabilités et l’Innovation dans le Sport, University Lyon 1, F-69622 Lyon, France
- Marjorie Bernier
- Centre de Recherche sur l’Éducation, l’Apprentissage et la Didactique, University Brest, F-29200 Brest, France
20
Miljković N, Sodnik J. Effectiveness of a time to fixate for fitness to drive evaluation in neurological patients. Behav Res Methods 2023. [PMID: 37488465] [DOI: 10.3758/s13428-023-02177-3]
Abstract
We present a method to automatically calculate time to fixate (TTF) from eye-tracker data in subjects with neurological impairment using a driving simulator. TTF is the time interval it takes a person to notice a stimulus after its first occurrence. Specifically, we measured the time from the moment a child started to cross the street until the driver directed their gaze to the child. Of the 108 neurological patients recruited for the study, TTF analysis was performed in 56 patients to assess fit-, unfit-, and conditionally-fit-to-drive groups. The results showed that the proposed method, based on the YOLO (you only look once) object detector, is efficient for computing TTFs from eye-tracker data. We obtained discriminative results for fit-to-drive patients by applying Tukey's honest significant difference post hoc test (p < 0.01), while no difference was observed between the conditionally-fit and unfit-to-drive groups (p = 0.542). Moreover, we show that time-to-collision (TTC), initial gaze distance (IGD) from pedestrians, and speed at hazard onset did not influence the result; the only significant interaction on TTF is among fitness, IGD, and TTC. The obtained TTFs are also compared with perception response times (PRT) calculated independently of the eye-tracker data and YOLO. Having reached statistically significant results that support possible application of the method for assessing fitness to drive, we provide detailed directions for future driving-simulation-based evaluation and propose a processing workflow to secure reliable TTF calculation and its possible application in, for example, psychology and neuroscience.
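The TTF measure described above reduces to finding the first gaze sample that lands on the hazard after its onset. A minimal sketch (the detection callback stands in for per-frame object-detector output such as YOLO bounding boxes; names are illustrative, not the paper's code):

```python
def time_to_fixate(gaze, hazard_onset, in_hazard_box):
    """First time the gaze lands on the hazard, relative to its onset.

    gaze: iterable of (timestamp, x, y) eye-tracker samples, time-ordered.
    hazard_onset: timestamp at which the hazard (e.g., a pedestrian
        starting to cross) first appears.
    in_hazard_box: callable (t, x, y) -> bool, e.g., a lookup into
        per-frame detected bounding boxes.
    Returns TTF in the same time units, or None if the hazard is never fixated.
    """
    for t, x, y in gaze:
        if t >= hazard_onset and in_hazard_box(t, x, y):
            return t - hazard_onset
    return None
```

In practice one would also require a minimum fixation duration inside the box rather than a single sample, to avoid counting a saccade passing through.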
Affiliation(s)
- Nadica Miljković
- University of Belgrade - School of Electrical Engineering, Bulevar kralja Aleksandra 73, 11000, Belgrade, Serbia.
- Faculty of Electrical Engineering, University of Ljubljana, Tržaška cesta 25, 1000, Ljubljana, Slovenia.
- Jaka Sodnik
- Faculty of Electrical Engineering, University of Ljubljana, Tržaška cesta 25, 1000, Ljubljana, Slovenia
21
Nishizono R, Saijo N, Kashino M. Highly reproducible eyeblink timing during formula car driving. iScience 2023; 26:106803. [PMID: 37378324] [PMCID: PMC10291330] [DOI: 10.1016/j.isci.2023.106803]
Abstract
How do humans blink while driving a vehicle? Although gaze control patterns have previously been reported in relation to successful steering, eyeblinks, which disrupt vision, are believed to be randomly distributed during driving or are simply ignored. Herein, we demonstrate that eyeblink timing shows reproducible patterns during real formula car racing and is related to car control. We studied three top-level racing drivers, whose eyeblinks and driving behavior were acquired during practice sessions. The results revealed that the drivers blinked at surprisingly similar positions on the courses. We identified three factors underlying the eyeblink patterns: each driver's individual blink count; lap pace, associated with how strictly they followed their pattern on each lap; and car acceleration, associated with when and where they blinked at a given moment. These findings suggest that eyeblink patterns reflect cognitive states during in-the-wild driving and that experts appear to change such cognitive states continuously and dynamically.
Affiliation(s)
- Ryota Nishizono
- NTT Communication Science Laboratories, Morinosato Wakamiya 3-1, Atsugi, Kanagawa 243-0198, Japan
- Naoki Saijo
- NTT Communication Science Laboratories, Morinosato Wakamiya 3-1, Atsugi, Kanagawa 243-0198, Japan
- Makio Kashino
- NTT Communication Science Laboratories, Morinosato Wakamiya 3-1, Atsugi, Kanagawa 243-0198, Japan
22
Kredel R, Hernandez J, Hossner EJ, Zahno S. Eye-tracking technology and the dynamics of natural gaze behavior in sports: an update 2016-2022. Front Psychol 2023; 14:1130051. [PMID: 37359890] [PMCID: PMC10286576] [DOI: 10.3389/fpsyg.2023.1130051]
Abstract
Updating and complementing a previous review on eye-tracking technology and the dynamics of natural gaze behavior in sports, this short review focuses on the progress concerning researched sports tasks, applied methods of gaze data collection and analysis, as well as derived gaze measures for the time interval of 2016-2022. To that end, a systematic review according to the PRISMA guidelines was conducted, searching Web of Science, PubMed Central, SPORTDiscus, and ScienceDirect for the keywords: eye tracking, gaze behavio*r, eye movement, and visual search. Thirty-one studies were identified for the review. On the one hand, a generally increased research interest and a wider range of researched sports, with a particular increase in studies of officials' gaze behavior, were diagnosed. On the other hand, a general lack of progress concerning sample sizes, numbers of trials, employed eye-tracking technology, and gaze analysis procedures must be acknowledged. Nevertheless, first attempts at automated gaze-cue-allocation (GCA) in mobile eye-tracking studies were seen, potentially enhancing objectivity and alleviating the burden of manual workload inherently associated with conventional gaze analyses. Reinforcing the claims of the previous review, this review concludes by describing four distinct technological approaches to automating GCA, some of which are specifically suited to tackle the validity and generalizability issues associated with the current limitations of mobile eye-tracking studies on natural gaze behavior in sports.
23
Park SY, Holmqvist K, Niehorster DC, Huber L, Virányi Z. How to improve data quality in dog eye tracking. Behav Res Methods 2023; 55:1513-1536. [PMID: 35680764] [PMCID: PMC10250523] [DOI: 10.3758/s13428-022-01788-6]
Abstract
Pupil-corneal reflection (P-CR) eye tracking has gained a prominent role in studying dog visual cognition, despite methodological challenges that often lead to lower-quality data than when recording from humans. In the current study, we investigated if and how the morphology of dogs might interfere with tracking by P-CR systems, and to what extent such interference, possibly in combination with dog-unique eye-movement characteristics, may undermine data quality and affect eye-movement classification when processed through algorithms. To this aim, we conducted an eye-tracking experiment with dogs and humans, investigated incidences of tracking interference, compared how they blinked, and examined how the differential quality of dog and human data affected the detection and classification of eye-movement events. Our results show that the morphology of dogs' faces and eyes can interfere with the systems' tracking methods, and that dogs blink less often but their blinks are longer. Importantly, the lower quality of dog data led to larger differences in how two different event detection algorithms classified fixations, indicating that the results of key dependent variables are more susceptible to the choice of algorithm in dog than in human data. Further, two measures of the Nyström & Holmqvist (Behavior Research Methods, 42(4), 188-204, 2010) algorithm showed that dog fixations are less stable and dog data have more trials with extreme levels of noise. Our findings call for analyses better adjusted to the characteristics of dog eye-tracking data, and our recommendations help future dog eye-tracking studies acquire quality data to enable robust comparisons of visual cognition between dogs and humans.
Affiliation(s)
- Soon Young Park
- Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna, Vienna, Austria.
- Medical University Vienna, Vienna, Austria.
- University of Vienna, Vienna, Austria.
- Kenneth Holmqvist
- Institute of Psychology, Nicolaus Copernicus University in Torun, Torun, Poland
- Department of Psychology, Regensburg University, Regensburg, Germany
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Diederick C Niehorster
- Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden
- Ludwig Huber
- Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna, Vienna, Austria
- Medical University Vienna, Vienna, Austria
- University of Vienna, Vienna, Austria
- Zsófia Virányi
- Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna, Vienna, Austria
- Medical University Vienna, Vienna, Austria
- University of Vienna, Vienna, Austria
24
Thorsson M, Galazka MA, Åsberg Johnels J, Hadjikhani N. A novel end-to-end dual-camera system for eye gaze synchrony assessment in face-to-face interaction. Atten Percept Psychophys 2023. [PMID: 37099200] [DOI: 10.3758/s13414-023-02679-4]
Abstract
Quantification of face-to-face interaction can provide highly relevant information in cognitive and psychological science research. Current commercial glint-dependent solutions suffer from several disadvantages and limitations when applied in face-to-face interaction, including data loss, parallax errors, the inconvenience and distracting effect of wearables, and/or the need for several cameras to capture each person. Here we present a novel eye-tracking solution, consisting of a dual-camera system used in conjunction with an individually optimized deep learning approach that aims to overcome some of these limitations. Our data show that this system can accurately classify gaze location within different areas of the face of two interlocutors, and capture subtle differences in interpersonal gaze synchrony between two individuals during a (semi-)naturalistic face-to-face interaction.
Affiliation(s)
- Max Thorsson
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden.
- Martyna A Galazka
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Jakob Åsberg Johnels
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Section of Speech and Language Pathology, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Nouchine Hadjikhani
- Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
| |
25
Holmqvist K, Örbom SL, Hooge ITC, Niehorster DC, Alexander RG, Andersson R, Benjamins JS, Blignaut P, Brouwer AM, Chuang LL, Dalrymple KA, Drieghe D, Dunn MJ, Ettinger U, Fiedler S, Foulsham T, van der Geest JN, Hansen DW, Hutton SB, Kasneci E, Kingstone A, Knox PC, Kok EM, Lee H, Lee JY, Leppänen JM, Macknik S, Majaranta P, Martinez-Conde S, Nuthmann A, Nyström M, Orquin JL, Otero-Millan J, Park SY, Popelka S, Proudlock F, Renkewitz F, Roorda A, Schulte-Mecklenbeck M, Sharif B, Shic F, Shovman M, Thomas MG, Venrooij W, Zemblys R, Hessels RS. Eye tracking: empirical foundations for a minimal reporting guideline. Behav Res Methods 2023; 55:364-416. [PMID: 35384605] [PMCID: PMC9535040] [DOI: 10.3758/s13428-021-01762-8] [Accepted: 11/29/2021] [Indexed: 11/08/2022]
Abstract
In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").
Affiliation(s)
- Kenneth Holmqvist
- Department of Psychology, Nicolaus Copernicus University, Torun, Poland.
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa.
- Department of Psychology, Regensburg University, Regensburg, Germany.
- Saga Lee Örbom
- Department of Psychology, Regensburg University, Regensburg, Germany
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Diederick C Niehorster
- Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden
- Robert G Alexander
- Department of Ophthalmology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Jeroen S Benjamins
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Social, Health and Organizational Psychology, Utrecht University, Utrecht, The Netherlands
- Pieter Blignaut
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Lewis L Chuang
- Department of Ergonomics, Leibniz Institute for Working Environments and Human Factors, Dortmund, Germany
- Institute of Informatics, LMU Munich, Munich, Germany
- Denis Drieghe
- School of Psychology, University of Southampton, Southampton, UK
- Matt J Dunn
- School of Optometry and Vision Sciences, Cardiff University, Cardiff, UK
- Susann Fiedler
- Vienna University of Economics and Business, Vienna, Austria
- Tom Foulsham
- Department of Psychology, University of Essex, Essex, UK
- Dan Witzner Hansen
- Machine Learning Group, Department of Computer Science, IT University of Copenhagen, Copenhagen, Denmark
- Enkelejda Kasneci
- Human-Computer Interaction, University of Tübingen, Tübingen, Germany
- Paul C Knox
- Department of Eye and Vision Science, Institute of Life Course and Medical Sciences, University of Liverpool, Liverpool, UK
- Ellen M Kok
- Department of Education and Pedagogy, Division Education, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands
- Department of Online Learning and Instruction, Faculty of Educational Sciences, Open University of the Netherlands, Heerlen, The Netherlands
- Helena Lee
- University of Southampton, Southampton, UK
- Joy Yeonjoo Lee
- School of Health Professions Education, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Jukka M Leppänen
- Department of Psychology and Speech-Language Pathology, University of Turku, Turku, Finland
- Stephen Macknik
- Department of Ophthalmology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Päivi Majaranta
- TAUCHI Research Center, Computing Sciences, Faculty of Information Technology and Communication Sciences, Tampere University, Tampere, Finland
- Susana Martinez-Conde
- Department of Ophthalmology, SUNY Downstate Health Sciences University, Brooklyn, NY, USA
- Antje Nuthmann
- Institute of Psychology, University of Kiel, Kiel, Germany
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Jacob L Orquin
- Department of Management, Aarhus University, Aarhus, Denmark
- Center for Research in Marketing and Consumer Psychology, Reykjavik University, Reykjavik, Iceland
- Jorge Otero-Millan
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Soon Young Park
- Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna, Medical University of Vienna, Vienna, Austria
- Stanislav Popelka
- Department of Geoinformatics, Palacký University Olomouc, Olomouc, Czech Republic
- Frank Proudlock
- The University of Leicester Ulverscroft Eye Unit, Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, UK
- Frank Renkewitz
- Department of Psychology, University of Erfurt, Erfurt, Germany
- Austin Roorda
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Bonita Sharif
- School of Computing, University of Nebraska-Lincoln, Lincoln, Nebraska, USA
- Frederick Shic
- Center for Child Health, Behavior and Development, Seattle Children's Research Institute, Seattle, WA, USA
- Department of General Pediatrics, University of Washington School of Medicine, Seattle, WA, USA
- Mark Shovman
- Eyeviation Systems, Herzliya, Israel
- Department of Industrial Design, Bezalel Academy of Arts and Design, Jerusalem, Israel
- Mervyn G Thomas
- The University of Leicester Ulverscroft Eye Unit, Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, UK
- Ward Venrooij
- Electrical Engineering, Mathematics and Computer Science (EEMCS), University of Twente, Enschede, The Netherlands
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
26
Ban S, Lee YJ, Kim KR, Kim JH, Yeo WH. Advances in Materials, Sensors, and Integrated Systems for Monitoring Eye Movements. Biosensors 2022; 12:1039. [PMID: 36421157] [PMCID: PMC9688058] [DOI: 10.3390/bios12111039] [Received: 10/20/2022] [Revised: 11/11/2022] [Accepted: 11/13/2022] [Indexed: 06/16/2023]
Abstract
Eye movements show primary responses that reflect humans' voluntary intention and conscious selection. Because visual perception is one of the fundamental sensory interactions in the brain, eye movements contain critical information regarding physical/psychological health, perception, intention, and preference. With the advancement of wearable device technologies, the performance of eye-movement monitoring has improved significantly, leading to myriad applications for assisting and augmenting human activities. Among them, electrooculograms, measured by skin-mounted electrodes, have been widely used to track eye motions accurately. In addition, eye trackers that detect reflected optical signals offer alternative ways without using wearable sensors. This paper outlines a systematic summary of the latest research on various materials, sensors, and integrated systems for monitoring eye movements and enabling human-machine interfaces. Specifically, we summarize recent developments in soft materials, biocompatible materials, manufacturing methods, sensor functions, systems' performances, and their applications in eye tracking. Finally, we discuss the remaining challenges and suggest research directions for future studies.
Affiliation(s)
- Seunghyeb Ban
- School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Yoon Jae Lee
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Ka Ram Kim
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Jong-Hoon Kim
- School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
- Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
- Woon-Hong Yeo
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Wallace H. Coulter Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta, GA 30332, USA
- Neural Engineering Center, Institute for Materials, Institute for Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30332, USA
27
Sidenmark L, Parent M, Wu CH, Chan J, Glueck M, Wigdor D, Grossman T, Giordano M. Weighted Pointer: Error-aware Gaze-based Interaction through Fallback Modalities. IEEE Trans Vis Comput Graph 2022; 28:3585-3595. [PMID: 36048981] [DOI: 10.1109/tvcg.2022.3203096] [Indexed: 06/15/2023]
Abstract
Gaze-based interaction is a fast and ergonomic type of hands-free interaction that is often used with augmented and virtual reality when pointing at targets. Such interaction, however, can be cumbersome whenever user, tracking, or environmental factors cause eye tracking errors. Recent research has suggested that fallback modalities could be leveraged to ensure stable interaction irrespective of the current level of eye tracking error. This work thus presents Weighted Pointer interaction, a collection of error-aware pointing techniques that determine whether pointing should be performed by gaze, a fallback modality, or a combination of the two, depending on the level of eye tracking error that is present. These techniques enable users to accurately point at targets whether eye tracking is accurate or inaccurate. A virtual reality target selection study demonstrated that Weighted Pointer techniques performed better and were preferred over techniques that required manual modality switching.
28
Eye gaze and visual attention as a window into leadership and followership: A review of empirical insights and future directions. Leadersh Q 2022. [DOI: 10.1016/j.leaqua.2022.101654] [Indexed: 11/05/2022]
29
Hunt R, Blackmore T, Mills C, Dicks M. Evaluating the integration of eye-tracking and motion capture technologies: Quantifying the accuracy and precision of gaze measures. Iperception 2022; 13:20416695221116652. [PMID: 36186610] [PMCID: PMC9516427] [DOI: 10.1177/20416695221116652] [Received: 02/10/2022] [Accepted: 07/10/2022] [Indexed: 11/18/2022] Open
Abstract
Integrating mobile eye tracking and optoelectronic motion capture enables point of gaze to be expressed within the laboratory co-ordinate system and presents a method not commonly applied during research examining dynamic behaviors, such as locomotion. This paper examines the quality of gaze data collected through the integration. Based on research suggesting that increased viewing distances are associated with reduced data quality, the accuracy and precision of gaze data as participants (N = 11) viewed floor-based targets at distances of 1–6 m was investigated. A mean accuracy of 2.55 ± 1.12° was identified; however, accuracy and precision measures (relative to targets) were significantly (p < .05) reduced at greater viewing distances. We then consider whether signal processing techniques may improve accuracy and precision, and overcome issues associated with missing data. A 4th-order Butterworth lowpass filter with cut-off frequencies determined via autocorrelation did not significantly improve data quality; however, interpolation via quintic spline was sufficient to overcome gaps of up to 0.1 s. We conclude that the integration of gaze and motion capture presents a viable methodology in the study of human behavior and presents advantages for data collection, treatment, and analysis. We provide considerations for the collection, analysis, and treatment of gaze data that may help inform future methodological decisions.
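The gap-filling and filtering steps described in this abstract can be sketched as follows; this is an illustrative reconstruction rather than the authors' code, and the 120 Hz sampling rate, simulated signal, and fixed 10 Hz cut-off are assumptions (the study derived cut-off frequencies per signal via autocorrelation).

```python
import numpy as np
from scipy.interpolate import InterpolatedUnivariateSpline
from scipy.signal import butter, filtfilt

fs = 120.0  # assumed eye-tracker sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic 1-D gaze trace: slow pursuit plus measurement noise
gaze = np.sin(2 * np.pi * 0.5 * t) + 0.05 * rng.normal(size=t.size)

# Simulate a ~0.1 s tracking dropout (12 samples at 120 Hz), flagged as NaN
gaze[100:112] = np.nan

# Fill the gap with a quintic (k=5) spline fitted to the valid samples
valid = ~np.isnan(gaze)
spline = InterpolatedUnivariateSpline(t[valid], gaze[valid], k=5)
filled = np.where(valid, gaze, spline(t))

# 4th-order Butterworth lowpass applied forward and backward (zero phase lag);
# the 10 Hz cut-off is illustrative only
b, a = butter(4, 10.0, btype="low", fs=fs)
smoothed = filtfilt(b, a, filled)
```

Zero-phase (forward-backward) filtering avoids introducing a temporal lag into the gaze signal, which matters when gaze is later synchronized with motion capture.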
Affiliation(s)
- Rhys Hunt
- School of Sport, Health and Exercise Science, University of Portsmouth, Portsmouth, UK
- Tim Blackmore
- School of Sport, Health and Exercise Science, University of Portsmouth, Portsmouth, UK
- Chris Mills
- School of Sport, Health and Exercise Science, University of Portsmouth, Portsmouth, UK
- Matt Dicks
- School of Sport, Health and Exercise Science, University of Portsmouth, Portsmouth, UK
30
Eye contact avoidance in crowds: A large wearable eye-tracking study. Atten Percept Psychophys 2022; 84:2623-2640. [PMID: 35996058] [PMCID: PMC9630249] [DOI: 10.3758/s13414-022-02541-z] [Accepted: 07/08/2022] [Indexed: 11/30/2022]
Abstract
Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route. Half of the participants were further instructed to avoid eye contact. We report that humans can flexibly allocate their gaze while navigating crowds and avoid eye contact primarily by orienting their head and eyes towards the floor. We discuss implications for crowd navigation and gaze behavior. In addition, we address a number of issues encountered in such field studies with regard to data quality, control of the environment, and participant adherence to instructions. We stress that methodological innovation and scientific progress are strongly interrelated.
31
Relationship between Cervicocephalic Kinesthetic Sensibility Measured during Dynamic Unpredictable Head Movements and Eye Movement Control or Postural Balance in Neck Pain Patients. Int J Environ Res Public Health 2022; 19:8405. [PMID: 35886255] [PMCID: PMC9317579] [DOI: 10.3390/ijerph19148405] [Received: 05/10/2022] [Revised: 07/05/2022] [Accepted: 07/07/2022] [Indexed: 01/27/2023]
Abstract
Cervical afferent input is believed to affect postural balance and oculomotor control in neck pain patients, but its relationship to cervicocephalic kinesthesia, describing movement sense, has not yet been studied. The aim of this study was to analyze the relationship of two aspects of cervicocephalic kinesthesia to postural balance and oculomotor control in neck torsion positions. Forty-three idiopathic neck pain patients referred from orthopedic outpatient clinics and forty-two asymptomatic controls were enrolled in the study. A force plate was used to measure center-of-pressure movements during parallel stances under neutral and neck torsion maneuvers. Video-oculography was used to assess eye movements during the smooth pursuit neck torsion test (SPNTT), while kinesthetic awareness was measured using the Butterfly test and the head-to-neutral relocation test. Multiple regression was used to describe relationships between tests. Body sway in the anterior-posterior direction was related to Butterfly parameters but less to the head-to-neutral test. A medium relationship between Butterfly parameters and gain during SPNTT, with less SPNT-difference, was observed, but not for the head-to-neutral test. It can be concluded that a specific aspect of neck kinesthetic function (i.e., movement sense) contributes substantially to oculomotor and balance control; this is more evident under neck torsion positions in neck pain patients and less pronounced in asymptomatic individuals.
32
Lee G, Hwang S, Lee D. Improvements of Warning Signs for Black Ice Based on Driving Simulator Experiments. Int J Environ Res Public Health 2022; 19:7549. [PMID: 35742797] [PMCID: PMC9224529] [DOI: 10.3390/ijerph19127549] [Received: 04/28/2022] [Revised: 06/11/2022] [Accepted: 06/17/2022] [Indexed: 12/03/2022]
Abstract
Black ice is one of the main causes of traffic accidents in winter, and warning signs for black ice are generally ineffective because of the lack of credible information. To overcome this limitation, new warning signs for black ice were developed using materials that change color in response to different temperatures. The performance and effects of the new signs were investigated by conducting driver behavior analysis. To this end, driving simulator experiments were conducted with 37 participants for two different rural highway sections, i.e., a curve and a tangent. The analysis of the driving behavior and visual behavior experiments showed that the conventional signs had insufficient performance in terms of inducing safer driving behavior. Meanwhile, the new signs, actuated by weather conditions, offered a statistically significant performance improvement. Typically, drivers decelerated almost twice as much when they fixated on the new weather-actuated warning sign (12.80 km/h) as on the conventional warning sign (6.84 km/h) in the curve segment. Accordingly, this study concluded that the new weather-actuated warning signs for black ice are more effective than the conventional ones for accident reduction in winter.
Affiliation(s)
- Ghangshin Lee
- Department of Smart Cities in Graduate School, University of Seoul, Seoul 02504, Korea; (G.L.); (S.H.)
- Sooncheon Hwang
- Department of Smart Cities in Graduate School, University of Seoul, Seoul 02504, Korea; (G.L.); (S.H.)
- Dongmin Lee
- Department of Transportation Engineering & Smart Cities, University of Seoul, Seoul 02504, Korea
- Correspondence: ; Tel.: +82-2-6490-6010
33
High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems. Sensors 2022; 22:4357. [PMID: 35746135] [PMCID: PMC9231356] [DOI: 10.3390/s22124357] [Received: 05/18/2022] [Revised: 06/04/2022] [Accepted: 06/07/2022] [Indexed: 11/16/2022]
Abstract
The problem of 3D gaze estimation can be viewed as inferring the visual axes from eye images. It remains a challenge especially for the head-mounted gaze tracker (HMGT) with a simple camera setup due to the complexity of the human visual system. Although the mainstream regression-based methods could establish the mapping relationship between eye image features and the gaze point to calculate the visual axes, it may lead to inadequate fitting performance and appreciable extrapolation errors. Moreover, regression-based methods suffer from a degraded user experience because of the increased burden of recalibration procedures when slippage occurs between the HMGT and the head. To address these issues, a high-accuracy 3D gaze estimation method along with an efficient recalibration approach is proposed in this paper, together with head pose tracking. The two key parameters, eyeball center and camera optical center, are estimated in the head frame with a geometry-based method, so that a mapping relationship between two direction features is proposed to calculate the direction of the visual axis. As the direction features are formulated with the accurately estimated parameters, the complexity of the mapping relationship can be reduced and a better fitting performance can be achieved. To prevent noticeable extrapolation errors, direction features with uniform angular intervals for fitting the mapping are retrieved over the human field of view. Additionally, an efficient single-point recalibration method is proposed with an updated eyeball coordinate system, which reduces the burden of calibration procedures significantly. Our experiment results show that the calibration and recalibration methods could improve the gaze estimation accuracy by 35 percent (from a mean error of 2.00 degrees to 1.31 degrees) and 30 percent (from a mean error of 2.00 degrees to 1.41 degrees), respectively, compared with the state-of-the-art methods.
34
Oculomotor performance in patients with neck pain: Does it matter which angle of neck torsion is used in smooth pursuit eye movement test and is the agreement between angles dependent on target movement amplitude and velocity? Musculoskelet Sci Pract 2022; 59:102535. [PMID: 35278834] [DOI: 10.1016/j.msksp.2022.102535] [Received: 07/14/2021] [Revised: 02/15/2022] [Accepted: 02/18/2022] [Indexed: 11/20/2022]
Abstract
BACKGROUND The neck torsion manoeuvre is thought to affect eye movement control via afferent sensory drive in patients with neck pain disorders. The literature reports inconsistencies regarding the angle of neck torsion most commonly used across studies. OBJECTIVES The goal of this study was to determine the level of agreement in oculomotor performance between the two most commonly used neck torsion angles during the smooth pursuit neck torsion (SPNT) test. DESIGN A cross-sectional design was used in thirty-two neck pain patients and thirty-two healthy individuals. METHOD Gain and SPNTdiff were measured during the SPNT test at 30° and 45° of neck torsion angle, at 30°, 40° and 50° of target movement amplitude and three different target movement velocities (20°s-1, 30°s-1 and 40°s-1) using an eye tracking device. Bland-Altman plots and correlation analysis were used to study the agreement between the two angles. RESULTS Small to medium correlations and wide bias confidence intervals suggest a medium level of agreement in gain or SPNTdiff between the two neck torsion angles for chronic neck pain patients, but higher agreement in healthy individuals. Higher agreement in gain was observed at larger target movement amplitudes and at slower target movement velocities; however, this trend was not observed for SPNTdiff. CONCLUSION The level of agreement between the two angles in the SPNT test depends on the amplitude and velocity of the moving target. In cases when subjects within the same study are not able to perform 45° of neck torsion, a 50° amplitude and 20°s-1 velocity of target movement are more suitable to reach higher agreement between the angles.
35
Holmqvist K, Örbom SL, Zemblys R. Small head movements increase and colour noise in data from five video-based P-CR eye trackers. Behav Res Methods 2022; 54:845-863. [PMID: 34357538] [PMCID: PMC8344338] [DOI: 10.3758/s13428-021-01648-9] [Accepted: 06/06/2021] [Indexed: 11/08/2022]
Abstract
We empirically investigate the role of small, almost imperceptible balance and breathing movements of the head on the level and colour of noise in data from five commercial video-based P-CR eye trackers. By comparing noise from recordings with completely static artificial eyes to noise from recordings where the artificial eyes are worn by humans, we show that very small head movements increase levels and colouring of the noise in data recorded from all five eye trackers in this study. This increase of noise levels is seen not only in the gaze signal, but also in the P and CR signals of the eye trackers that provide these camera image features. The P and CR signals of the SMI eye trackers correlate strongly during small head movements, but less so or not at all when the head is completely still, indicating that head movements are registered by the P and CR images in the eye camera. By recording with artificial eyes, we can also show that the pupil size artefact has no major role in increasing and colouring noise. Our findings add to and replicate the observation by Niehorster et al. (2021) that lowpass filters in video-based P-CR eye trackers colour the data. Irrespective of source, filters or head movements, coloured noise can be confused for oculomotor drift. We also find that usage of the default head restriction in the EyeLink 1000+, the EyeLink II and the HiSpeed240 results in noisier data compared to less head restriction. Researchers investigating data quality in eye trackers should consider not using the Gen 2 artificial eye from SR Research / EyeLink. Data recorded with this artificial eye are much noisier than data recorded with other artificial eyes, on average 2.2-14.5 times worse for the five eye trackers.
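Noise level and noise colour of the kind measured here are commonly quantified as the RMS of sample-to-sample deviations and as the slope of the log-log power spectrum. The sketch below illustrates both metrics on synthetic white and integrated ("red") noise; it is a generic illustration, not the paper's analysis pipeline, and the sampling rate and signal length are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1000.0  # assumed sampling rate (Hz)
n = 4096

white = rng.normal(size=n)           # white noise: flat power spectrum
red = np.cumsum(rng.normal(size=n))  # integrated noise: strongly coloured

def rms_s2s(x):
    """RMS of sample-to-sample differences, a standard precision measure."""
    return np.sqrt(np.mean(np.diff(x) ** 2))

def spectral_slope(x, fs):
    """Slope of log power vs. log frequency: ~0 for white noise,
    increasingly negative for coloured (~-2 for red/Brownian) noise."""
    f = np.fft.rfftfreq(x.size, 1 / fs)[1:]  # drop the DC bin
    p = np.abs(np.fft.rfft(x))[1:] ** 2
    return np.polyfit(np.log(f), np.log(p), 1)[0]
```

Note that the two metrics dissociate: a signal can have a low RMS-S2S yet be strongly coloured, which is why coloured noise can masquerade as slow oculomotor drift.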
Affiliation(s)
- Kenneth Holmqvist
- Institute of Psychology, Nicolaus Copernicus University in Torun, Torun, Poland
- Department of Psychology, Regensburg University, Regensburg, Germany
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Saga Lee Örbom
- Department of Psychology, Regensburg University, Regensburg, Germany
36
Is Altered Oculomotor Control during Smooth Pursuit Neck Torsion Test Related to Subjective Visual Complaints in Patients with Neck Pain Disorders? Int J Environ Res Public Health 2022; 19:3788. [PMID: 35409472] [PMCID: PMC8997387] [DOI: 10.3390/ijerph19073788] [Received: 02/04/2022] [Revised: 03/17/2022] [Accepted: 03/19/2022] [Indexed: 02/04/2023]
Abstract
Subjective visual complaints are commonly reported in patients with neck pain, but their relation to objectively measured oculomotor functions during smooth pursuit neck torsion tests (SPNTs) has not yet been investigated. The aim of the study was to analyse classification accuracy of visual symptom intensity and frequency based on SPNT results. Forty-three patients with neck pain were referred from orthopaedic outpatient clinics and were required to fill out a 16-item proforma of visual complaints. Infrared video-oculography was used to measure smooth pursuit eye movements during neutral and neck torsion positions. Parameters of gain and SPNT difference (SPNTdiff) were entered into the Naïve Bayes model as predictors, while intensity and frequency of visual symptoms were taken as the predicted class. Intensity and, to a lesser degree, frequency of visual symptoms previously associated with neck pain or focal vision disorders (computer vision syndrome) showed better classification accuracy using gain at the neck torsion position, indicating cervically driven visual disturbances. Moreover, SPNTdiff presented with slightly lower classification accuracy as compared to gain at the neck torsion position. Our study confirmed the relationship between cervically driven oculomotor deficits and some visual complaints (concentrating to read, words moving on the page, blurred vision, difficulty judging distance, sore eyes, heavy eyes, red eyes, and eye strain).
37
Beyond screen time: Using head-mounted eye tracking to study natural behavior. Adv Child Dev Behav 2022; 62:61-91. [PMID: 35249686] [DOI: 10.1016/bs.acdb.2021.11.001] [Indexed: 11/20/2022]
Abstract
Head-mounted eye tracking is a new method that allows researchers to catch a glimpse of what infants and children see during naturalistic activities. In this chapter, we review how mobile, wearable eye trackers improve the construct validity of important developmental constructs, such as visual object experiences and social attention, in ways that would be impossible using screen-based eye tracking. Head-mounted eye tracking improves ecological validity by allowing researchers to present more realistic and complex visual scenes, create more interactive experimental situations, and examine how the body influences what infants and children see. As with any new method, there are difficulties to overcome. Accordingly, we identify what aspects of head-mounted eye-tracking study design affect the measurement quality, interpretability of the results, and efficiency of gathering data. Moreover, we provide a summary of best practices aimed at allowing researchers to make well-informed decisions about whether and how to apply head-mounted eye tracking to their own research questions.
38
Vehlen A, Standard W, Domes G. How to choose the size of facial areas of interest in interactive eye tracking. PLoS One 2022; 17:e0263594. [PMID: 35120188] [PMCID: PMC8815978] [DOI: 10.1371/journal.pone.0263594] [Received: 07/29/2021] [Accepted: 01/21/2022] [Indexed: 11/18/2022] Open
Abstract
Advances in eye tracking technology have enabled the development of interactive experimental setups to study social attention. Since these setups differ substantially from the eye tracker manufacturer's test conditions, validation is essential with regard to the quality of gaze data and other factors potentially threatening the validity of this signal. In this study, we evaluated the impact of accuracy and areas of interest (AOIs) size on the classification of simulated gaze (fixation) data. We defined AOIs of different sizes using the Limited-Radius Voronoi-Tessellation (LRVT) method, and simulated gaze data for facial target points with varying accuracy. As hypothesized, we found that accuracy and AOI size had strong effects on gaze classification. In addition, these effects were not independent and differed in falsely classified gaze inside AOIs (Type I errors; false alarms) and falsely classified gaze outside the predefined AOIs (Type II errors; misses). Our results indicate that smaller AOIs generally minimize false classifications as long as accuracy is good enough. For studies with lower accuracy, Type II errors can still be compensated to some extent by using larger AOIs, but at the cost of more probable Type I errors. Proper estimation of accuracy is therefore essential for making informed decisions regarding the size of AOIs in eye tracking research.
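The accuracy/AOI-size trade-off reported here can be illustrated with a small simulation: fixations aimed at each target are perturbed by Gaussian accuracy error and classified to the nearest AOI within a fixed radius. The target layout, accuracy values, and circular AOIs below are illustrative assumptions; the study itself used the Limited-Radius Voronoi-Tessellation (LRVT) method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical facial landmark targets (x, y in degrees of visual angle)
targets = np.array([[-1.5, 1.0], [1.5, 1.0], [0.0, -1.5]])

def classify(fix, centers, radius):
    """Assign a fixation to the nearest AOI centre if within `radius`, else None."""
    d = np.linalg.norm(centers - fix, axis=1)
    i = int(np.argmin(d))
    return i if d[i] <= radius else None

def miss_rate(accuracy_sd, radius, n=2000):
    """Fraction of simulated fixations whose true target is not recovered
    (Type II errors) for a given accuracy (Gaussian error SD) and AOI radius."""
    misses = 0
    for k, c in enumerate(targets):
        for f in c + rng.normal(scale=accuracy_sd, size=(n, 2)):
            misses += classify(f, targets, radius) != k
    return misses / (n * len(targets))
```

In this toy setup, with 0.5° accuracy, shrinking the AOI radius from 1.5° to 0.5° raises the miss (Type II) rate from roughly 1% to over 50%, mirroring the reported interaction between accuracy and AOI size.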
Affiliation(s)
- Antonia Vehlen
- Department of Psychology, Biological and Clinical Psychology, University of Trier, Trier, Germany
- William Standard
- Department of Psychology, Biological and Clinical Psychology, University of Trier, Trier, Germany
- Gregor Domes
- Department of Psychology, Biological and Clinical Psychology, University of Trier, Trier, Germany
39
Majcen Rosker Z, Rosker J, Vodicar M, Kristjansson E. The influence of neck torsion and sequence of cycles on intra-trial reliability of smooth pursuit eye movement test in patients with neck pain disorders. Exp Brain Res 2022; 240:763-771. [PMID: 35034178 DOI: 10.1007/s00221-021-06288-1]
Abstract
The sensory mismatch commonly observed in patients with neck pain disorders could alter intra-trial reliability in simple implicit smooth pursuit eye movement tasks. This could be more pronounced when the neck is in a torsioned position, as in the smooth pursuit neck torsion (SPNT) test. The aim of this study was to explore the effects of neck torsion, target movement velocity, and amplitude on the intra-trial reliability of smooth pursuit eye movements in patients with neck pain disorders and healthy individuals. The SPNT test was evaluated in 32 chronic neck pain patients and 32 healthy controls. Ten cycles were performed using video-oculography at three different velocities (20°/s, 30°/s, and 40°/s) and at three different amplitudes (30°, 40°, and 50°) of target movement. Intra-trial reliability and differences between average gain and SPNT difference from the second to fifth cycle and from the sixth to ninth cycle were assessed using ICC3.1 and factorial analysis of variance, respectively. Intra-trial reliability for gain and SPNT difference at all target movement amplitudes and velocities proved to be good to excellent in both groups. Patients with neck pain disorders showed a trend toward inferior gain performance between the sixth and ninth cycle at 30°/s of target movement compared to healthy individuals, which was only evident when the neck was in a torsioned position. Although the intra-trial reliability of the smooth pursuit neck torsion test is good to excellent, the effects of learning are not as pronounced in patients with neck pain disorders.
Affiliation(s)
- Jernej Rosker
- Faculty of Health Sciences, University of Primorska, Izola, Slovenia
- Miha Vodicar
- Department of Orthopaedic Surgery, University Medical Centre Ljubljana, Ljubljana, Slovenia
40
Schweizer T, Wyss T, Gilgen-Ammann R. Detecting Soldiers' Fatigue Using Eye-Tracking Glasses: Practical Field Applications and Research Opportunities. Mil Med 2021; 187:e1330-e1337. [PMID: 34915554 PMCID: PMC10100772 DOI: 10.1093/milmed/usab509]
Abstract
INTRODUCTION Objectively determining soldiers' fatigue levels could help prevent injuries or accidents resulting from inattention or decreased alertness. Eye-tracking technologies, such as optical eye tracking (OET) and electrooculography (EOG), are often used to monitor fatigue. Eyeblinks, especially blink frequency and blink duration, are easily observable and valid biomarkers of fatigue. Various eye trackers (i.e., eye-tracking glasses) using either OET or EOG technology are currently available on the market. These wearable eye trackers offer several advantages, including unobtrusive functionality, practicality, and low cost. However, several challenges and limitations must be considered when implementing these technologies in the field to monitor fatigue levels. This review investigates the feasibility of eye tracking in the field, focusing on practical applications in military operational environments. MATERIALS AND METHOD This paper summarizes the existing literature on eyeblink dynamics and available wearable eye-tracking technologies, exposes challenges and limitations, and discusses practical recommendations for improving the feasibility of eye tracking in the field. RESULTS So far, no eye-tracking glasses can be recommended for use in a demanding work environment. First, eyeblink dynamics are influenced by multiple factors; therefore, environments, situations, and individual behavior must be taken into account. Second, the glasses' placement, sunlight, facial or body movements, vibrations, and sweat can drastically decrease measurement accuracy. The placement of the eye cameras for OET and of the electrodes for EOG must be chosen deliberately, the sampling rate must be at least 200 Hz, and software and hardware must be robust against any factors that interfere with eye tracking.
CONCLUSION Monitoring the physiological and psychological readiness of soldiers, as well as of other civil professionals who face higher risks when their attention is impaired or reduced, is necessary. However, improvements to eye-tracking devices' hardware, calibration methods, sampling rates, and algorithms are needed to accurately monitor fatigue levels in the field.
Affiliation(s)
- Theresa Schweizer
- Monitoring, Swiss Federal Institute of Sport Magglingen (SFISM), Macolin 2532, Switzerland
- Thomas Wyss
- Monitoring, Swiss Federal Institute of Sport Magglingen (SFISM), Macolin 2532, Switzerland
- Rahel Gilgen-Ammann
- Monitoring, Swiss Federal Institute of Sport Magglingen (SFISM), Macolin 2532, Switzerland
41
Holleman GA, Hooge ITC, Huijding J, Deković M, Kemner C, Hessels RS. Gaze and speech behavior in parent–child interactions: The role of conflict and cooperation. Curr Psychol 2021. [DOI: 10.1007/s12144-021-02532-7]
Abstract
A primary mode of human social behavior is face-to-face interaction. In this study, we investigated the characteristics of gaze and its relation to speech behavior during video-mediated face-to-face interactions between parents and their preadolescent children. Eighty-one parent–child dyads engaged in conversations about cooperative and conflictive family topics. We used a dual-eye tracking setup that is capable of concurrently recording eye movements, frontal video, and audio from two conversational partners. Our results show that children spoke more in the cooperation scenario whereas parents spoke more in the conflict scenario. Parents gazed slightly more at the eyes of their children in the conflict scenario compared to the cooperation scenario. Both parents and children looked more at the other's mouth region while listening compared to while speaking. Results are discussed in terms of the roles that parents and children take during cooperative and conflictive interactions and how gaze behavior may support and coordinate such interactions.
42
The application of noninvasive, restraint-free eye-tracking methods for use with nonhuman primates. Behav Res Methods 2021; 53:1003-1030. [PMID: 32935327 DOI: 10.3758/s13428-020-01465-6]
Abstract
Over the past 50 years there has been a strong interest in applying eye-tracking techniques to study a myriad of questions related to human and nonhuman primate psychological processes. Eye movements and fixations can provide qualitative and quantitative insights into cognitive processes of nonverbal populations such as nonhuman primates, clarifying the evolutionary, physiological, and representational underpinnings of human cognition. While early attempts at nonhuman primate eye tracking were relatively crude, later, more sophisticated and sensitive techniques required invasive protocols and the use of restraint. In the past decade, technology has advanced to a point where noninvasive eye-tracking techniques, developed for use with human participants, can be applied for use with nonhuman primates in a restraint-free manner. Here we review the corpus of recent studies (N=32) that take such an approach. Despite the growing interest in eye-tracking research, there is still little consensus on "best practices," both in terms of deploying test protocols or reporting methods and results. Therefore, we look to advances made in the field of developmental psychology, as well as our own collective experiences using eye trackers with nonhuman primates, to highlight key elements that researchers should consider when designing noninvasive restraint-free eye-tracking research protocols for use with nonhuman primates. Beyond promoting best practices for research protocols, we also outline an ideal approach for reporting such research and highlight future directions for the field.
43
Wang FS, Wolf J, Farshad M, Meboldt M, Lohmeyer Q. Object-Gaze Distance: Quantifying Near-Peripheral Gaze Behavior in Real-World Applications. J Eye Mov Res 2021; 14. [PMID: 34122747 PMCID: PMC8189527 DOI: 10.16910/jemr.14.1.5]
Abstract
Eye tracking (ET) has been shown to reveal the wearer's cognitive processes via measurement of the central point of foveal vision. However, traditional ET evaluation methods have not been able to take into account the wearer's use of the peripheral field of vision. We propose an algorithmic enhancement to a state-of-the-art ET analysis method, the Object-Gaze Distance (OGD), which additionally allows the quantification of near-peripheral gaze behavior in complex real-world environments. The algorithm uses machine learning for area of interest (AOI) detection and computes the minimal 2D Euclidean pixel distance to the gaze point, creating a continuous gaze-based time series. Based on an evaluation of two AOIs in a real surgical procedure, the results show that incorporating the near-peripheral field of vision increased interpretable fixation data considerably, from 23.8% to 78.3% for the AOI screw and from 4.5% to 67.2% for the AOI screwdriver. Additionally, the evaluation of a multi-OGD time-series representation has shown the potential to reveal novel gaze patterns, which may provide a more accurate depiction of human gaze behavior in multi-object environments.
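The distance computation at the core of the OGD can be sketched in a few lines. This illustration assumes a rectangular AOI bounding box in pixel coordinates with hypothetical values (the paper itself uses machine-learning AOI detection, not hand-drawn boxes): the function returns 0 for a classical AOI hit and the minimal Euclidean pixel distance otherwise, yielding a continuous gaze-based time series.

```python
import math

def object_gaze_distance(gaze, box):
    """Minimal 2D Euclidean pixel distance from a gaze point to an AOI
    bounding box (x_min, y_min, x_max, y_max). Zero means the gaze is
    inside the AOI, i.e. a classical AOI hit; positive values quantify
    near-peripheral viewing of the object."""
    gx, gy = gaze
    x_min, y_min, x_max, y_max = box
    dx = max(x_min - gx, 0.0, gx - x_max)  # horizontal shortfall outside the box
    dy = max(y_min - gy, 0.0, gy - y_max)  # vertical shortfall outside the box
    return math.hypot(dx, dy)

screw_aoi = (100, 100, 140, 160)  # hypothetical AOI detection, in pixels
gaze_trace = [(120, 130), (150, 130), (90, 90)]
ogd_series = [object_gaze_distance(g, screw_aoi) for g in gaze_trace]
print(ogd_series)  # first sample is a direct hit, the others are near-peripheral
```

Thresholding such a series at some near-peripheral radius, instead of at zero, is what lets the method reclassify gaze just outside the AOI as interpretable fixation data.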
44
Miller HL, Zurutuza IR, Fears NE, Polat SO, Nielsen RD. Post-processing integration and semi-automated analysis of eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration. Proc Eye Tracking Res Appl Symp 2021; 2021:17. [PMID: 34263270 PMCID: PMC8276594 DOI: 10.1145/3450341.3458881]
Abstract
Mobile eye-tracking and motion-capture techniques yield rich, precisely quantifiable data that can inform our understanding of the relationship between visual and motor processes during task performance. However, these systems are rarely used in combination, in part because of the significant time and human resources required for post-processing and analysis. Recent advances in computer vision have opened the door for more efficient processing and analysis solutions. We developed a post-processing pipeline to integrate mobile eye-tracking and full-body motion-capture data. These systems were used simultaneously to measure visuomotor integration in an immersive virtual environment. Our approach enables calculation of a 3D gaze vector that can be mapped to the participant's body position and objects in the virtual environment using a uniform coordinate system. This approach is generalizable to other configurations, and enables more efficient analysis of eye, head, and body movements together during visuomotor tasks administered in controlled, repeatable environments.
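Mapping a head-centered gaze direction into a shared coordinate system, as the abstract describes, amounts to applying the head pose to the local gaze vector. This is a deliberately minimal sketch with a yaw-only rotation, assumed axis conventions, and hypothetical values; a full pipeline would apply the complete 3D head-pose rotation from the motion-capture system.

```python
import math

def gaze_to_world(head_pos, head_yaw_deg, gaze_dir_local):
    """Express a head-centered gaze direction in the shared world frame.
    Yaw-only rotation for brevity. Assumed axis convention: x right,
    y up, z forward."""
    yaw = math.radians(head_yaw_deg)
    x, y, z = gaze_dir_local
    world_dir = (math.cos(yaw) * x + math.sin(yaw) * z,
                 y,
                 -math.sin(yaw) * x + math.cos(yaw) * z)
    # Return a gaze ray: origin at the head position, direction in world frame.
    return head_pos, world_dir

# Looking "straight ahead" in head coordinates while the head is turned
# 90 degrees yields a world-frame gaze ray along +x.
origin, direction = gaze_to_world((0.0, 1.6, 0.0), 90.0, (0.0, 0.0, 1.0))
print(origin, direction)
```

Intersecting such world-frame rays with tracked objects is what allows eye, head, and body movements to be analyzed together in one coordinate system.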
45
Eye-tracking glasses in face-to-face interactions: Manual versus automated assessment of areas-of-interest. Behav Res Methods 2021; 53:2037-2048. [PMID: 33742418 PMCID: PMC8516759 DOI: 10.3758/s13428-021-01544-2]
Abstract
The assessment of gaze behaviour is essential for understanding the psychology of communication. Mobile eye-tracking glasses are useful to measure gaze behaviour during dynamic interactions. Eye-tracking data can be analysed by using manually annotated areas-of-interest. Computer vision algorithms may alternatively be used to reduce the amount of manual effort, but also the subjectivity and complexity of these analyses. Using additional re-identification (Re-ID) algorithms, different participants in the interaction can be distinguished. The aim of this study was to compare the results of manual annotation of mobile eye-tracking data with the results of a computer vision algorithm. We selected the first minute of seven randomly selected eye-tracking videos of consultations between physicians and patients in a Dutch Internal Medicine out-patient clinic. Three human annotators and a computer vision algorithm annotated mobile eye-tracking data, after which interrater reliability was assessed between the areas-of-interest annotated by the annotators and the computer vision algorithm. Additionally, we explored interrater reliability when using lengthy videos and different area-of-interest shapes. In total, we analysed more than 65 min of eye-tracking videos manually and with the algorithm. Overall, the absolute normalized difference between the manual and the algorithm annotations of face-gaze was less than 2%. Our results show high interrater agreements between human annotators and the algorithm with Cohen’s kappa ranging from 0.85 to 0.98. We conclude that computer vision algorithms produce comparable results to those of human annotators. Analyses by the algorithm are not subject to annotator fatigue or subjectivity and can therefore advance eye-tracking analyses.
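The interrater agreement reported above can, in principle, be reproduced with a standard Cohen's kappa computation over per-frame AOI labels. This is a generic sketch with hypothetical labels, not the study's actual annotation data:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two sequences of
    categorical labels over the same frames (e.g. human vs. algorithm AOI
    annotations)."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of frames with identical labels.
    p_observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement under independence of the two annotators.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical per-frame AOI labels, not data from the study:
human = ["face", "face", "other", "face", "other", "face"]
algo = ["face", "face", "other", "other", "other", "face"]
print(round(cohens_kappa(human, algo), 3))
```

Kappa is preferred over raw percent agreement here because face-gaze dominates consultation videos, so two annotators would agree often even by chance.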
46
Schweizer T, Wyss T, Gilgen-Ammann R. Eyeblink Detection in the Field: A Proof of Concept Study of Two Mobile Optical Eye-Trackers. Mil Med 2021; 187:e404-e409. [PMID: 33564826 PMCID: PMC9244949 DOI: 10.1093/milmed/usab032]
Abstract
Introduction: High physical and cognitive strain, high pressure, and sleep deficit are part of daily life for military professionals and civilians working in physiologically demanding environments. As a result, cognitive and physical capacities decline and the risk of illness, injury, or accidents increases. Such unfortunate outcomes could be prevented by tracking real-time physiological information, revealing individuals' objective fatigue levels. Oculometrics, and especially eyeblinks, have been shown to be promising biomarkers that reflect fatigue development. Head-mounted optical eye-trackers are a common method to monitor these oculometrics. However, studies measuring eyeblink detection in real-life settings have been lacking in the literature. Therefore, this study aims to validate two current mobile optical eye-trackers in an unrestrained military training environment.
Materials and Method: Three male participants (age 20.0 ± 1.0) of the Swiss Armed Forces participated in this study by wearing three optical eye-trackers, two VPS16s (Viewpointsystem GmbH, Vienna, Austria) and one Pupil Core (Pupil Labs GmbH, Berlin, Germany), during four military training events: healthcare education, orienteering, shooting, and military marching. Software outputs were analyzed against a visual inspection (VI) of the video recordings of participants' eyes via the respective software. Absolute and relative blink numbers were provided. Each blink detected by the software was classified as a "true blink" (TB) when it occurred in the software output and the VI at the same time, as a "false blink" (FB) when it occurred in the software but not in the VI, and as a "missed blink" (MB) when the software failed to detect a blink that occurred in the VI. The FBs were further examined for causes of the incorrect recordings and divided into four categories: "sunlight," "movements," "lost pupil," and "double-counted." Blink frequency (i.e., blinks per minute) was also analyzed.
Results: Overall, 49.3% and 72.5% of registered eyeblinks were classified as TBs for the VPS16 and Pupil Core, respectively. The VPS16 recorded 50.7% FBs and accounted for 8.5% MBs, while the Pupil Core recorded 27.5% FBs and accounted for 55.5% MBs. The majority of FBs (45.5% and 73.9% for the VPS16 and Pupil Core, respectively) were erroneously recorded due to participants' eye movements while looking up, down, or to one side. For the blink frequency analysis, systematic biases (±limits of agreement) stood at 23.3 (±43.5) and −4.87 (±14.1) blinks per minute for the VPS16 and Pupil Core, respectively. Significant differences in systematic bias between the devices and the respective VIs were found for nearly all activities (P < .05).
Conclusion: Objective physiological monitoring of fatigue is necessary for soldiers as well as for civil professionals who are exposed to higher risks when their cognitive or physical capacities weaken. However, optical eye-trackers' accuracy has not been specified under field conditions, especially not for monitoring fatigue. The significant overestimation by the VPS16 and underestimation by the Pupil Core demonstrate the general difficulty of blink detection in the field.
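The "systematic bias (±limits of agreement)" figures reported in the Results are the standard Bland-Altman statistics for method comparison. A minimal sketch of how they are computed, using hypothetical blink-rate data rather than the study's measurements:

```python
import statistics

def bland_altman(device, reference):
    """Systematic bias (mean difference) and 95% limits-of-agreement
    half-width between a device and a reference measurement; the
    statistics behind the "bias (±LoA)" figures for blinks per minute."""
    diffs = [d - r for d, r in zip(device, reference)]
    bias = statistics.mean(diffs)
    # 95% limits of agreement: bias ± 1.96 * SD of the differences.
    half_width = 1.96 * statistics.stdev(diffs)
    return bias, half_width

# Hypothetical per-interval blink rates (blinks/min), not the study's data:
software_rate = [32.0, 41.0, 28.0, 55.0, 36.0]
inspection_rate = [20.0, 22.0, 25.0, 24.0, 21.0]
bias, loa = bland_altman(software_rate, inspection_rate)
print(f"bias {bias:+.1f} (±{loa:.1f}) blinks/min")  # positive bias = overcounting
```

A positive bias corresponds to the software overestimating the blink rate relative to visual inspection, as reported for the VPS16; a negative bias corresponds to underestimation, as reported for the Pupil Core.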
Affiliation(s)
- Theresa Schweizer
- Monitoring, Swiss Federal Institute of Sport Magglingen (SFISM), Magglingen/Macolin 2532, Switzerland
- Thomas Wyss
- Monitoring, Swiss Federal Institute of Sport Magglingen (SFISM), Magglingen/Macolin 2532, Switzerland
- Rahel Gilgen-Ammann
- Monitoring, Swiss Federal Institute of Sport Magglingen (SFISM), Magglingen/Macolin 2532, Switzerland
47
Sipatchin A, Wahl S, Rifai K. Eye-Tracking for Clinical Ophthalmology with Virtual Reality (VR): A Case Study of the HTC Vive Pro Eye's Usability. Healthcare (Basel) 2021; 9:180. [PMID: 33572072 PMCID: PMC7914806 DOI: 10.3390/healthcare9020180]
Abstract
BACKGROUND A case study is proposed to empirically test and discuss the current hardware capabilities and limitations of an off-the-shelf virtual reality (VR) headset with embedded eye tracking for at-home, ready-to-go online usability in ophthalmology applications. METHODS The status-quo eye-tracking data quality of the HTC Vive Pro Eye was investigated with novel testing specific to objective online VR perimetry. Testing was done across a wide visual field of the head-mounted display's (HMD) screen and in two different movement conditions. A new automatic and low-cost Raspberry Pi system is introduced for VR temporal precision testing, to assess the usability of the HTC Vive Pro Eye as an online assistance tool for visual loss. RESULTS The target position on the screen and head movement revealed limitations of the eye tracker's capabilities as a perimetry assessment tool. Temporal precision testing showed a system latency of 58.1 milliseconds (ms), indicating good potential as a ready-to-go online assistance tool for visual loss. CONCLUSIONS The test of eye-tracking data quality provides novel analyses useful for evaluating upcoming VR headsets with embedded eye tracking and opens a discussion about expanding the future introduction of these HMDs into patients' homes for low-vision clinical use.
Affiliation(s)
- Alexandra Sipatchin
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; (S.W.); (K.R.)
- Siegfried Wahl
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; (S.W.); (K.R.)
- Carl Zeiss Vision International GmbH, 73430 Aalen, Germany
- Katharina Rifai
- Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; (S.W.); (K.R.)
- Carl Zeiss Vision International GmbH, 73430 Aalen, Germany
48
Abstract
The magnitude of variation in the gaze position signals recorded by an eye tracker, also known as its precision, is an important aspect of an eye tracker’s data quality. However, data quality of eye-tracking signals is still poorly understood. In this paper, we therefore investigate the following: (1) How do the various available measures characterizing eye-tracking data during fixation relate to each other? (2) How are they influenced by signal type? (3) What type of noise should be used to augment eye-tracking data when evaluating eye-movement analysis methods? To support our analysis, this paper presents new measures to characterize signal type and signal magnitude based on RMS-S2S and STD, two established measures of precision. Simulations are performed to investigate how each of these measures depends on the number of gaze position samples over which they are calculated, and to reveal how RMS-S2S and STD relate to each other and to measures characterizing the temporal spectrum composition of the recorded gaze position signal. Further empirical investigations were performed using gaze position data recorded with five eye trackers from human and artificial eyes. We found that although the examined eye trackers produce gaze position signals with different characteristics, the relations between precision measures derived from simulations are borne out by the data. We furthermore conclude that data with a range of signal type values should be used to assess the robustness of eye-movement analysis methods. We present a method for generating artificial eye-tracker noise of any signal type and magnitude.
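The two precision measures the abstract builds on can be computed directly from a window of gaze samples. This sketch (with hypothetical sample data) also shows why the two must be considered together: reordering the same samples changes RMS-S2S but not STD, which is exactly the sensitivity to signal type discussed above.

```python
import math

def rms_s2s(samples):
    """RMS of sample-to-sample distances: weights the high-frequency
    component of the positional noise."""
    step_sq = [(x2 - x1) ** 2 + (y2 - y1) ** 2
               for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    return math.sqrt(sum(step_sq) / len(step_sq))

def std_precision(samples):
    """Standard deviation of gaze positions about their centroid: weights
    the overall spread and ignores sample order."""
    n = len(samples)
    cx = sum(x for x, _ in samples) / n
    cy = sum(y for _, y in samples) / n
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in samples) / n)

# Two signals with identical spread but different temporal structure
# (hypothetical data): STD cannot tell them apart, RMS-S2S can.
jitter = [(0, 0), (1, 0), (0, 0), (1, 0)]  # fast alternation ("white"-like noise)
drift = [(0, 0), (0, 0), (1, 0), (1, 0)]   # slow drift
print(std_precision(jitter), std_precision(drift))  # equal
print(rms_s2s(jitter), rms_s2s(drift))              # jitter is larger
```

The ratio of the two measures is one simple way to characterize signal type, and generating artificial noise with a chosen ratio is the kind of augmentation the paper proposes for evaluating eye-movement analysis methods.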
49
Yen C, Chiang MC. Examining the effect of online advertisement cues on human responses using eye-tracking, EEG, and MRI. Behav Brain Res 2021; 402:113128. [PMID: 33460680 DOI: 10.1016/j.bbr.2021.113128]
Abstract
This study sought to emphasize how disciplines such as neuroscience and marketing can be applied to advertising and consumer behavior. The application of neuroscience methods to analyzing and understanding human behavior related to the Elaboration Likelihood Model (ELM) and brain activity has recently garnered attention. This study examines brain processes while participants attempted to form preferences for a product, and demonstrates factors that influence consumer behavior using eye tracking, electroencephalography (EEG), and magnetic resonance imaging (MRI) from a neuroscience approach. We designed two online advertising conditions: peripheral cues without argument and central cues with argument strength. Thirty respondents participated in the experiment, which used eye-tracking, EEG, and MRI instruments to explore brain activity in the central cue condition. We investigated whether diffusion tensor imaging (DTI) analysis could detect regional brain changes. Using eye tracking, we found that responses in the central cue condition appeared mainly in the mean fixation duration, number of fixations, mean saccade duration, and number of saccades. Moreover, the findings show that the fusiform gyrus and frontal cortex are significantly associated with building a relationship by inferring central cues in the EEG assay. The MRI images show that the fusiform gyrus and frontal cortex are significantly active in the central cue condition. DTI analysis indicates that the corpus callosum changed in the central cue condition. We used eye tracking, EEG, MRI, and DTI to show that these connections may capture responses when viewing advertisements, especially in the fusiform gyrus, frontal cortex, and corpus callosum.
Affiliation(s)
- Chiahui Yen
- Department of International Business, Ming Chuan University, Taipei 111, Taiwan
- Ming-Chang Chiang
- Department of Life Science, College of Science and Engineering, Fu Jen Catholic University, New Taipei City 242, Taiwan.
50
Abstract
There is a long history of interest in looking behavior during human interaction. With the advance of (wearable) video-based eye trackers, it has become possible to measure gaze during many different interactions. We outline the different types of eye-tracking setups that currently exist to investigate gaze during interaction. The setups differ mainly with regard to the nature of the eye-tracking signal (head- or world-centered) and the freedom of movement allowed for the participants. These features place constraints on the research questions that can be answered about human interaction. We end with a decision tree to help researchers judge the appropriateness of specific setups.