1. Zhang X, Wang L, He Y, Mou Z, Cao Y. High-speed eye tracking based on a synchronized imaging mechanism by a dual-ring infrared lighting source. Applied Optics 2024; 63:4293-4302. [PMID: 38856606] [DOI: 10.1364/ao.521840]
Abstract
It is challenging for conventional monocular-camera, single-light-source eye-tracking methods to achieve high-speed eye tracking. In this work, a dual-ring infrared lighting source was designed to produce bright and dark pupil images at high speed. The eye-tracking method used the dual-ring infrared lighting source with triggers synchronized to the even and odd camera frames, so that alternating frames captured bright and dark pupils. The pupillary corneal reflex was calculated from the center coordinates of the Purkinje spot and the pupil. A mapping function was established between the pupillary corneal reflex and gaze points, and gaze coordinates were calculated from this mapping function. The average detection time for each gaze point was 3.76 ms.
2. Sadeghi R, Ressmeyer R, Yates J, Otero-Millan J. OpenIris - An Open Source Framework for Video-Based Eye-Tracking Research and Development. bioRxiv 2024:2024.02.27.582401. [PMID: 38463977] [PMCID: PMC10925248] [DOI: 10.1101/2024.02.27.582401]
Abstract
Eye-tracking is an essential tool in many fields, yet existing solutions are often too costly or too inflexible for customized applications. We present OpenIris, an adaptable and user-friendly open-source framework for video-based eye-tracking. OpenIris is developed in C# with a modular design that allows extension and customization through plugins for different hardware systems, tracking, and calibration pipelines. It can be remotely controlled via a network interface from other devices or programs. Eye movements can be recorded online from a camera stream or offline by post-processing recorded videos. Example plugins have been developed to track eye motion in 3-D, including torsion. The currently implemented binocular pupil-tracking pipelines can achieve frame rates of more than 500 Hz. With the OpenIris framework, we aim to fill a gap in the research tools available for high-precision and high-speed eye-tracking, especially in environments that require custom solutions not currently well served by commercial eye-trackers.
Affiliation(s)
- Roksana Sadeghi
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, California, USA
- Ryan Ressmeyer
- Bioengineering, University of Washington, Seattle, Washington, USA
- Jacob Yates
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, California, USA
- Jorge Otero-Millan
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, California, USA
- Department of Neurology, Johns Hopkins University, Baltimore, Maryland, USA
3. Byrne SA, Nyström M, Maquiling V, Kasneci E, Niehorster DC. Precise localization of corneal reflections in eye images using deep learning trained on synthetic data. Behav Res Methods 2024; 56:3226-3241. [PMID: 38114880] [PMCID: PMC11133043] [DOI: 10.3758/s13428-023-02297-w]
Abstract
We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) that was trained solely on synthetic data. Using only synthetic data has the benefit of completely sidestepping the time-consuming process of manual annotation that is required for supervised training on real eye images. To systematically evaluate the accuracy of our method, we first tested it on images with synthetic CRs placed on different backgrounds and embedded in varying levels of noise. Second, we tested the method on two datasets consisting of high-quality videos captured from real eyes. Our method outperformed state-of-the-art algorithmic methods on real eye images, improving spatial precision by 3-41.5% across datasets, and performed on par with the state of the art on synthetic images in terms of spatial accuracy. We conclude that our method provides precise CR center localization and offers a solution to the data availability problem, one of the important common roadblocks in the development of deep learning models for gaze estimation. Due to its superior CR center localization and ease of application, our method has the potential to improve the accuracy and precision of CR-based eye trackers.
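The key idea above — training on synthetic images so that the ground-truth CR center is known by construction, with no manual annotation — can be illustrated with a toy image generator. This is a hypothetical sketch, not the authors' rendering pipeline: it models the CR as a Gaussian spot at a random sub-pixel position on a noisy background, and returns the exact center as the training label.

```python
import numpy as np

def synthetic_cr_image(size=64, sigma=1.5, noise_sd=0.05, rng=None):
    """Render one synthetic corneal-reflection training image.

    Returns the image and the ground-truth sub-pixel centre (cx, cy),
    which serves as the regression label for a CNN.
    """
    rng = np.random.default_rng(rng)
    # Random sub-pixel CR centre, kept away from the image border
    cx, cy = rng.uniform(size * 0.25, size * 0.75, 2)
    yy, xx = np.mgrid[0:size, 0:size]
    # Gaussian spot model of the corneal reflection
    spot = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))
    # Additive sensor noise on the background
    img = spot + rng.normal(0, noise_sd, (size, size))
    return img.astype(np.float32), (cx, cy)
```

Varying `sigma`, `noise_sd`, and the background (as the paper does systematically) produces arbitrarily large labeled training sets for free.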
Affiliation(s)
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Virmarie Maquiling
- Human-Centered Technologies for Learning, Technical University of Munich, Munich, Germany
- Enkelejda Kasneci
- Human-Centered Technologies for Learning, Technical University of Munich, Munich, Germany
- Diederick C Niehorster
- MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy
- Department of Psychology, Lund University, Lund, Sweden
4. Huang Z, Duan X, Zhu G, Zhang S, Wang R, Wang Z. Assessing the data quality of AdHawk MindLink eye-tracking glasses. Behav Res Methods 2024:10.3758/s13428-023-02310-2. [PMID: 38168041] [DOI: 10.3758/s13428-023-02310-2]
Abstract
Most commercially available eye-tracking devices rely on video cameras and image processing algorithms to track gaze. Nevertheless, emerging technologies are entering the field, making high-speed, cameraless eye tracking more accessible. In this study, a series of tests were conducted to compare the data quality of MEMS-based eye-tracking glasses (AdHawk MindLink) with three widely used camera-based eye-tracking devices (EyeLink Portable Duo, Tobii Pro Glasses 2, and SMI Eye Tracking Glasses 2). The data quality measures assessed in these tests included accuracy, precision, data loss, and system latency. The results suggest that, overall, the data quality of the eye-tracking glasses was lower than that of the desktop EyeLink Portable Duo eye tracker. Among the eye-tracking glasses, the accuracy and precision of the MindLink were higher than or on par with those of the Tobii Pro Glasses 2 and SMI Eye Tracking Glasses 2. The system latency of the MindLink was approximately 9 ms, significantly lower than that of camera-based eye-tracking devices found in VR goggles. These results suggest that the MindLink eye-tracking glasses show promise for research applications where high sampling rates and low latency are preferred.
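The data quality measures compared in this study have standard operationalizations in eye-tracking research: accuracy as the mean angular offset from a known target, and precision as the root-mean-square of sample-to-sample distances. A minimal sketch of these two common definitions (an assumption about the metrics' form, not the authors' code):

```python
import numpy as np

def accuracy_deg(gaze, target):
    """Accuracy: mean angular offset between gaze samples and the known
    target location, both expressed in degrees of visual angle.

    gaze:   (N, 2) gaze samples during a fixation on the target
    target: (2,) target position
    """
    return float(np.mean(np.linalg.norm(gaze - target, axis=1)))

def precision_rms_s2s(gaze):
    """Precision: RMS of the sample-to-sample angular distances (deg).
    Lower values mean less noise in the gaze signal."""
    d = np.diff(gaze, axis=0)  # consecutive-sample displacement vectors
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))
```

Data loss would typically be the fraction of invalid samples, and system latency requires an external timing reference, so neither is sketched here.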
Affiliation(s)
- Zehao Huang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Xiaoting Duan
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Gancheng Zhu
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Shuai Zhang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Rong Wang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
- Zhiguo Wang
- Center for Psychological Sciences, Zhejiang University, 148 Tianmushan Rd., Hangzhou, 310028, China
5. Lima DDS, Ventura DF. A review of experimental task design in psychophysical eye tracking research. Front Hum Neurosci 2023; 17:1112769. [PMID: 37662635] [PMCID: PMC10469886] [DOI: 10.3389/fnhum.2023.1112769]
Abstract
While eye tracking is a technique commonly used in the experimental study of higher-level perceptual processes such as visual search, working memory, reading, and scene exploration, its use for the quantification of basic visual functions (visual acuity, contrast sensitivity, color vision, motion detection) is less explored. The use of eye movement features as dependent variables in a psychophysical investigation can serve multiple roles. They can be central in studies with neurological patients or infants that cannot comply with verbal instructions, understand task demands, and/or emit manual responses. The technique may also serve a complementary role, determining the conditions under which a manual or verbal response is given, such as stimulus position in the visual field, or it can afford the analysis of new dependent variables, such as the time interval between oculomotor and manual responses. Our objective is to review the literature that applied the eye tracking technique to psychophysical problems. The two questions our review raises are: can eye movements (reflex or voluntary) be an objective index of stimulus detection in psychophysical tasks? If so, under what conditions, and how does it compare with traditional paradigms requiring manual responses? Our (non-systematic) methodological review selected studies that used video-oculography as the technique of choice and had a basic visual function as their primary object of investigation. Studies satisfying those criteria were then categorized into four broad classes reflecting their main research interest: (1) stimulus detection and threshold estimation, (2) the effects of stimulus properties on fixational eye movements, (3) the effects of eye movements on perception, and (4) visual field assessment. The reviewed studies support the idea that eye tracking is a valuable technique for the study of basic perceptual processes. 
We discuss methodological characteristics within each of the proposed classification areas, with the objective of informing future task design.
Affiliation(s)
- Diego da Silva Lima
- Laboratory of Clinical Visual Psychophysics and Electrophysiology, University of São Paulo, São Paulo, Brazil
6. Barbosa J, Stein H, Zorowitz S, Niv Y, Summerfield C, Soto-Faraco S, Hyafil A. A practical guide for studying human behavior in the lab. Behav Res Methods 2023; 55:58-76. [PMID: 35262897] [DOI: 10.3758/s13428-022-01793-9]
Abstract
In the last few decades, the field of neuroscience has witnessed major technological advances that have allowed researchers to measure and control neural activity with great detail. Yet, behavioral experiments in humans remain an essential approach to investigate the mysteries of the mind. Their relatively modest technological and economic requisites make behavioral research an attractive and accessible experimental avenue for neuroscientists with very diverse backgrounds. However, like any experimental enterprise, it has its own inherent challenges that may pose practical hurdles, especially to less experienced behavioral researchers. Here, we aim at providing a practical guide for a steady walk through the workflow of a typical behavioral experiment with human subjects. This primer concerns the design of an experimental protocol, research ethics, and subject care, as well as best practices for data collection, analysis, and sharing. The goal is to provide clear instructions for both beginners and experienced researchers from diverse backgrounds in planning behavioral experiments.
Affiliation(s)
- Joao Barbosa
- Brain Circuits & Behavior lab, IDIBAPS, Barcelona, Spain
- Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Supérieure - PSL Research University, 75005, Paris, France
- Heike Stein
- Brain Circuits & Behavior lab, IDIBAPS, Barcelona, Spain
- Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Supérieure - PSL Research University, 75005, Paris, France
- Sam Zorowitz
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Yael Niv
- Princeton Neuroscience Institute, Princeton University, Princeton, USA
- Department of Psychology, Princeton University, Princeton, USA
- Salvador Soto-Faraco
- Multisensory Research Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain, and Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
7. Ban S, Lee YJ, Kim KR, Kim JH, Yeo WH. Advances in Materials, Sensors, and Integrated Systems for Monitoring Eye Movements. Biosensors 2022; 12:1039. [PMID: 36421157] [PMCID: PMC9688058] [DOI: 10.3390/bios12111039]
Abstract
Eye movements are primary responses that reflect humans' voluntary intention and conscious selection. Because visual perception is one of the brain's fundamental sensory interactions, eye movements contain critical information regarding physical and psychological health, perception, intention, and preference. With the advancement of wearable device technologies, eye-tracking performance has improved significantly, leading to a myriad of applications for assisting and augmenting human activities. Among these, electrooculograms, measured by skin-mounted electrodes, have been widely used to track eye motion accurately. In addition, eye trackers that detect reflected optical signals offer alternatives that do not require wearable sensors. This paper provides a systematic summary of the latest research on materials, sensors, and integrated systems for monitoring eye movements and enabling human-machine interfaces. Specifically, we summarize recent developments in soft materials, biocompatible materials, manufacturing methods, sensor functions, system performance, and applications in eye tracking. Finally, we discuss the remaining challenges and suggest research directions for future studies.
Affiliation(s)
- Seunghyeb Ban
- School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Yoon Jae Lee
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Ka Ram Kim
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Jong-Hoon Kim
- School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
- Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
- Woon-Hong Yeo
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Wallace H. Coulter Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta, GA 30332, USA
- Neural Engineering Center, Institute for Materials, Institute for Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30332, USA
8. Sensor Technology and Intelligent Systems in Anorexia Nervosa: Providing Smarter Healthcare Delivery Systems. BioMed Research International 2022; 2022:1955056. [PMID: 36193321] [PMCID: PMC9526573] [DOI: 10.1155/2022/1955056]
Abstract
Ubiquitous technology, big data, more efficient electronic health records, and predictive analytics are now at the core of smart healthcare systems supported by artificial intelligence. In the present narrative review, we focus on sensing technologies for the healthcare of Anorexia Nervosa (AN). We employed a framework inspired by the Interpersonal Neurobiology Theory (IPNB), which posits that human experience is characterized by a flow of energy and information both within us (within our whole body), and between us (in the connections we have with others and with nature). In line with this framework, we focused on sensors designed to evaluate bodily processes (body sensors such as implantable sensors, epidermal sensors, and wearable and portable sensors), human social interaction (sociometric sensors), and the physical environment (indoor and outdoor ambient sensors). There is a myriad of man-made sensors as well as nature-based sensors such as plants that can be used to design and deploy intelligent systems for human monitoring and healthcare. In conclusion, sensing technologies and intelligent systems can be employed for smarter healthcare of AN and help to relieve the burden of health professionals. However, there are technical, ethical, and environmental sustainability issues that must be considered prior to implementing these systems. A joint collaboration of professionals and other members of the society involved in the healthcare of individuals with AN can help in the development of these systems. The evolution of cyberphysical systems should also be considered in these collaborations.
9. Holmqvist K, Örbom SL, Zemblys R. Small head movements increase and colour noise in data from five video-based P-CR eye trackers. Behav Res Methods 2022; 54:845-863. [PMID: 34357538] [PMCID: PMC8344338] [DOI: 10.3758/s13428-021-01648-9]
Abstract
We empirically investigate the role of small, almost imperceptible balance and breathing movements of the head on the level and colour of noise in data from five commercial video-based P-CR eye trackers. By comparing noise from recordings with completely static artificial eyes to noise from recordings where the artificial eyes are worn by humans, we show that very small head movements increase the level and colouring of the noise in data recorded from all five eye trackers in this study. This increase in noise level is seen not only in the gaze signal, but also in the P and CR signals of the eye trackers that provide these camera-image features. The P and CR signals of the SMI eye trackers correlate strongly during small head movements, but less so or not at all when the head is completely still, indicating that head movements are registered by the P and CR images in the eye camera. By recording with artificial eyes, we can also show that the pupil-size artefact has no major role in increasing and colouring noise. Our findings add to and replicate the observation by Niehorster et al. (2021) that lowpass filters in video-based P-CR eye trackers colour the data. Irrespective of source, filters or head movements, coloured noise can be confused for oculomotor drift. We also find that usage of the default head restriction in the EyeLink 1000+, the EyeLink II and the HiSpeed240 results in noisier data compared to less head restriction. Researchers investigating data quality in eye trackers should consider not using the Gen 2 artificial eye from SR Research / EyeLink. Data recorded with this artificial eye are much noisier than data recorded with other artificial eyes, on average 2.2-14.5 times worse for the five eye trackers.
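One common way to quantify the "colour" of noise discussed in this abstract is the slope of the signal's power spectral density in log-log space: roughly 0 for white noise, around -2 for Brownian (random-walk-like) noise such as oculomotor drift. The following is an illustrative sketch of that estimator (a periodogram with a log-log line fit), not the analysis pipeline used in the paper.

```python
import numpy as np

def psd_slope(signal, fs):
    """Estimate noise colour as the log-log slope of the power spectral
    density (0 ~ white noise, about -2 ~ Brownian / drift-like noise).

    signal: 1-D gaze signal (e.g. horizontal position in degrees)
    fs:     sampling frequency in Hz
    """
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)[1:]  # drop the DC bin
    # Raw periodogram of the mean-removed signal
    psd = np.abs(np.fft.rfft(signal - np.mean(signal)))[1:] ** 2
    # Straight-line fit in log-log coordinates; slope = noise colour
    slope, _ = np.polyfit(np.log10(freqs), np.log10(psd + 1e-30), 1)
    return float(slope)
```

A Welch-averaged PSD would give a less noisy slope estimate than the raw periodogram; the raw version keeps the sketch short.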
Affiliation(s)
- Kenneth Holmqvist
- Institute of Psychology, Nicolaus Copernicus University in Torun, Torun, Poland
- Department of Psychology, Regensburg University, Regensburg, Germany
- Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa
- Saga Lee Örbom
- Department of Psychology, Regensburg University, Regensburg, Germany
10. Azami H, Chang Z, Arnold SE, Sapiro G, Gupta AS. Detection of Oculomotor Dysmetria From Mobile Phone Video of the Horizontal Saccades Task Using Signal Processing and Machine Learning Approaches. IEEE Access 2022; 10:34022-34031. [PMID: 36339795] [PMCID: PMC9632643] [DOI: 10.1109/access.2022.3156964]
Abstract
Eye movement assessments have the potential to help in the diagnosis and tracking of neurological disorders. Cerebellar ataxias cause profound and characteristic abnormalities in smooth pursuit, saccades, and fixation. Oculomotor dysmetria (i.e., hypermetric and hypometric saccades) is a common finding in individuals with cerebellar ataxia. In this study, we evaluated a scalable approach for detecting and quantifying oculomotor dysmetria. Eye movement data were extracted from iPhone video recordings of the horizontal saccade task (a standard clinical task in ataxia) and combined with signal processing and machine learning approaches to quantify saccade abnormalities. Entropy-based measures of eye movements during saccades were significantly different in 72 individuals with ataxia with dysmetria compared with 80 ataxia and Parkinson's participants without dysmetria. A template-matching analysis demonstrated that saccadic eye movements in patients without dysmetria were more similar to the ideal saccade template. A support vector machine was then trained and tested on multiple signal processing features in combination to distinguish individuals with and without oculomotor dysmetria. The model achieved 78% accuracy (sensitivity = 80%, specificity = 76%). These results show that the combination of signal processing and machine learning approaches applied to iPhone video of saccades allows for extraction of information pertaining to oculomotor dysmetria in ataxia. Overall, this inexpensive and scalable approach for capturing important oculomotor information may be a useful component of a screening tool for ataxia and could allow frequent at-home assessments of oculomotor function in natural history studies and clinical trials.
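The abstract does not specify which entropy-based measures were used, but permutation entropy is one standard ordinal-pattern complexity measure applied to physiological time series and illustrates the idea: a dysmetric, irregular saccade trace yields higher entropy than a smooth, stereotyped one. The sketch below is illustrative only and should not be taken as the paper's actual feature.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalised permutation entropy of a 1-D signal.

    Counts the relative frequency of ordinal patterns (rank orderings of
    `order` samples spaced `delay` apart). Returns a value in [0, 1]:
    0 for a perfectly regular signal, ~1 for a maximally irregular one.
    """
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        pattern = tuple(np.argsort(window))  # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n
    h = -np.sum(probs * np.log(probs))       # Shannon entropy of patterns
    return float(h / np.log(factorial(order)))  # normalise by max entropy
```

Applied per saccade, such a feature could feed the SVM alongside template-matching scores, as the combined-feature classification in the paper suggests.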
Affiliation(s)
- Hamed Azami
- Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA 02129, USA
- Zhuoqing Chang
- Department of Electrical and Computer Engineering, Duke University, Durham, NC 27707, USA
- Steven E Arnold
- Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA 02129, USA
- Guillermo Sapiro
- Department of Electrical and Computer Engineering, Duke University, Durham, NC 27707, USA
- Department of Computer Science, Duke University, Durham, NC 27707, USA
- Department of Biomedical Engineering, Duke University, Durham, NC 27707, USA
- Department of Mathematics, Duke University, Durham, NC 27707, USA
- Anoopum S Gupta
- Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA 02114, USA
11. Zandi B, Lode M, Herzog A, Sakas G, Khanh TQ. PupilEXT: Flexible Open-Source Platform for High-Resolution Pupillometry in Vision Research. Front Neurosci 2021; 15:676220. [PMID: 34220432] [PMCID: PMC8249868] [DOI: 10.3389/fnins.2021.676220]
Abstract
Human pupil behavior has gained increased attention due to the discovery of the intrinsically photosensitive retinal ganglion cells and the afferent pupil control path's role as a biomarker for cognitive processes. Diameter changes in the range of 10⁻² mm are of interest, requiring reliable and characterized measurement equipment to accurately detect neurocognitive effects on the pupil. Mostly, commercial solutions are used as measurement devices in pupillometry, which is associated with high investment costs. Moreover, commercial systems rely on closed software, restricting conclusions about the pupil-tracking algorithms used. Here, we developed an open-source pupillometry platform consisting of hardware and software competitive with high-end commercial stereo eye-tracking systems. Our goal was to make a professional remote pupil measurement pipeline for laboratory conditions accessible to everyone. This work's core outcome is an integrated cross-platform (macOS, Windows and Linux) pupillometry software called PupilEXT, featuring a user-friendly graphical interface covering the relevant requirements of professional pupil response research. We offer a selection of six state-of-the-art open-source pupil detection algorithms (Starburst, Swirski, ExCuSe, ElSe, PuRe and PuReST) to perform the pupil measurement. A 120-fps pupillometry demo system achieved a calibration accuracy of 0.003 mm and an average temporal pupil measurement detection accuracy of 0.0059 mm in stereo mode. The PupilEXT software has extended features in pupil detection, measurement validation, image acquisition, data acquisition, offline pupil measurement, camera calibration, stereo vision, data visualization and system independence, all combined in a single open-source interface, available at https://github.com/openPupil/Open-PupilEXT.
Affiliation(s)
- Babak Zandi
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Moritz Lode
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Alexander Herzog
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Georgios Sakas
- Interactive Graphic Systems, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany
- Tran Quoc Khanh
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
12. Ivanchenko D, Rifai K, Hafed ZM, Schaeffel F. A low-cost, high-performance video-based binocular eye tracker for psychophysical research. J Eye Mov Res 2021; 14. [PMID: 34122750] [PMCID: PMC8190563] [DOI: 10.16910/jemr.14.3.3]
Abstract
We describe a high-performance, pupil-based binocular eye tracker that approaches the performance of a well-established commercial system, but at a fraction of the cost. The eye tracker is built from standard hardware components, and its software (written in Visual C++) can be easily implemented. Because of its fast and simple linear calibration scheme, the eye tracker performs best in the central 10 degrees of the visual field. The eye tracker possesses a number of useful features: (1) automated calibration simultaneously in both eyes while subjects fixate four fixation points sequentially on a computer screen, (2) automated real-time continuous analysis of measurement noise, (3) automated blink detection, and (4) real-time analysis of pupil centration artifacts. This last feature is critical because it is known that pupil diameter changes can be erroneously registered by pupil-based trackers as a change in eye position. We evaluated the performance of our system against that of a well-established commercial system using simultaneous measurements in 10 participants. We propose our low-cost eye tracker as a promising resource for studies of binocular eye movements.
13. Dai L, Liu J, Ju Z, Gao Y. Attention Mechanism based Real Time Gaze Tracking in Natural Scenes with Residual Blocks. IEEE Trans Cogn Dev Syst 2021. [DOI: 10.1109/tcds.2021.3064280]
14. Pavlatos E, Huang D, Li Y. Eye motion correction algorithm for OCT-based corneal topography. Biomedical Optics Express 2020; 11:7343-7356. [PMID: 33409001] [PMCID: PMC7747916] [DOI: 10.1364/boe.412209]
Abstract
With its sequential image acquisition, OCT-based corneal topography is often susceptible to measurement errors due to eye motion. We have developed a novel algorithm to detect eye motion and minimize its impact on OCT topography maps. We applied the eye motion correction algorithm to corneal topographic scans acquired using a 70 kHz spectral-domain OCT device. OCT corneal topographic measurements were compared to those from a rotating Scheimpflug camera topographer. The motion correction algorithm provided a 2-4 fold improvement in the repeatability of OCT topography and its agreement with the standard Scheimpflug topographer. The repeatability of OCT Zernike-based corneal mean power, cardinal astigmatism, and oblique astigmatism after motion correction was 0.14 D, 0.28 D, and 0.24 D, respectively. The average differences between the two devices were 0.19 D for simulated keratometry-based corneal mean power, 0.23 D for cardinal astigmatism, and 0.25 D for oblique astigmatism. Our eye motion detection method can be applied to any OCT device, and it therefore represents a powerful tool for improving OCT topography.