1. Agarwala R, Severitt BR, Reichel FF, Hosp BW, Wahl S. Performance of focus-tunable presbyopia correction lenses operated using gaze-tracking and LIDAR. Biomed Opt Express 2025;16:883-893. PMID: 40109521; PMCID: PMC11919338; DOI: 10.1364/boe.543807
Abstract
Presbyopia is an age-related loss of the eye's accommodation ability, which affects an individual's capacity to focus on closer objects. With the advent of tunable lens technologies, various algorithms have been developed to tune such lenses for presbyopia correction in older populations. In this study, we assessed a gaze- and LIDAR-based feedback mechanism with electronically tunable lenses for use as presbyopia correction lenses. The tunable lens prototype was evaluated in 15 healthy young participants wearing their corrected sphero-cylindrical refraction by comparing their performance on a dynamic matching task under two conditions: (1) natural accommodation, and (2) emulated presbyopia, using cycloplegic drops to paralyse accommodation while focusing through the developed visual demonstrator prototype. The participants performed the matching task on three screens placed at multiple distances. We demonstrated that gaze can be used in conjunction with LIDAR to tune the lenses in the wearable visual demonstrator prototype, enabling participants to achieve fast and accurate responses in the matching task.
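The control principle behind distance-driven tunable lenses is simple: the vergence demand of a target at distance d metres is 1/d diopters, and the lens must supply whatever the (cyclopleged or presbyopic) eye cannot. A minimal sketch of that mapping, with hypothetical parameter names and an assumed lens range (the paper's actual controller is not reproduced here):

```python
def required_add_power(distance_m, residual_accommodation_d=0.0, max_add_d=3.0):
    """Lens power (diopters) needed to focus at distance_m.

    Vergence demand is 1/d; subtract any accommodation the wearer can
    still supply, then clamp to the tunable lens's range (illustrative
    3 D limit, not a value from the paper).
    """
    demand = 1.0 / distance_m              # vergence demand in diopters
    add = demand - residual_accommodation_d
    return min(max(add, 0.0), max_add_d)

# A 40 cm reading distance demands 2.5 D; a fully cyclopleged eye
# needs all of it from the lens.
print(required_add_power(0.40))
# A wearer with 1.5 D of residual accommodation needs only 1.0 D.
print(required_add_power(0.40, 1.5))
```

In a gaze-plus-LIDAR system, `distance_m` would come from sampling the LIDAR depth map at the tracked gaze point; here it is passed in directly.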
Affiliation(s)
- Rajat Agarwala
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Lindenstr. 6, Tübingen, Germany
- Björn R Severitt
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Lindenstr. 6, Tübingen, Germany
- Felix F Reichel
  - University Eye Hospital, Centre for Ophthalmology, University Hospital Tübingen, Tübingen, Germany
- Benedikt W Hosp
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Lindenstr. 6, Tübingen, Germany
- Siegfried Wahl
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Lindenstr. 6, Tübingen, Germany
  - Carl Zeiss Vision International GmbH, Turnstr. 27, Aalen, Germany
2. Ebner C, Plopski A, Schmalstieg D, Kalkofen D. Gaze-Contingent Layered Optical See-Through Displays with a Confidence-Driven View Volume. IEEE Trans Vis Comput Graph 2024;30:7203-7213. PMID: 39255112; DOI: 10.1109/tvcg.2024.3456204
Abstract
The vergence-accommodation conflict (VAC) presents a major perceptual challenge for head-mounted displays with a fixed image plane. Varifocal and layered display designs can mitigate the VAC. However, the image quality of varifocal displays is affected by imprecise eye tracking, whereas layered displays suffer from reduced image contrast as the distance between layers increases. Combined designs support a larger workspace and tolerate some eye-tracking error. However, any layered design with a fixed layer spacing restricts the amount of error compensation and limits the in-focus contrast. We extend previous hybrid designs by introducing confidence-driven volume control, which adjusts the size of the view volume at runtime. We use the eye tracker's confidence to control the spacing of display layers and optimize the trade-off between the display's view volume and the amount of eye-tracking error the display can compensate. In the case of high-quality focus-point estimation, our approach provides high in-focus contrast, whereas low-quality eye tracking increases the view volume to tolerate the error. We describe our design, present its implementation as an optical see-through head-mounted display using a multiplicative layer combination, and present an evaluation comparing our design with previous approaches.
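The core control idea reduces to a monotone mapping from tracker confidence to layer spacing: high confidence pulls the layers tight around the estimated focus depth (best in-focus contrast), low confidence spreads them apart so the true focus depth still lands inside the view volume. A sketch with illustrative spacing values in diopters (not taken from the paper):

```python
def layer_spacing(confidence, min_spacing_d=0.3, max_spacing_d=1.2):
    """Map eye-tracker confidence in [0, 1] to display-layer spacing (diopters).

    Linear interpolation between a wide, error-tolerant spacing
    (confidence 0) and a tight, high-contrast spacing (confidence 1).
    Spacing bounds are illustrative assumptions.
    """
    c = min(max(confidence, 0.0), 1.0)   # clamp out-of-range confidence
    return max_spacing_d - c * (max_spacing_d - min_spacing_d)

print(layer_spacing(1.0))   # tightest spacing: best in-focus contrast
print(layer_spacing(0.0))   # widest spacing: most eye-tracking error tolerated
```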
3. Hosp BW, Dechant M, Sauer Y, Severitt B, Agarwala R, Wahl S. VisionaryVR: An Optical Simulation Tool for Evaluating and Optimizing Vision Correction Solutions in Virtual Reality. Sensors (Basel) 2024;24:2458. PMID: 38676074; PMCID: PMC11053766; DOI: 10.3390/s24082458
Abstract
In the rapidly advancing field of vision science, traditional research approaches struggle to accurately simulate and evaluate vision correction methods, leading to time-consuming evaluations with limited scope and flexibility. To overcome these challenges, we introduce 'VisionaryVR', a virtual reality (VR) simulation framework designed to enhance optical simulation fidelity and broaden experimental capabilities. VisionaryVR leverages a versatile VR environment to support dynamic vision tasks and integrates comprehensive eye-tracking functionality. Its experiment manager's scene-loading feature fosters a scalable and flexible research platform. Preliminary validation through an empirical study has demonstrated VisionaryVR's effectiveness in replicating a wide range of visual impairments and providing a robust platform for evaluating vision correction solutions. Key findings indicate a significant improvement in evaluating vision correction methods and user experience, underscoring VisionaryVR's potential to transform vision science research by bridging the gap between theoretical concepts and their practical applications. This validation underscores VisionaryVR's contribution to overcoming traditional methodological limitations and establishing a foundational framework for research innovation in vision science.
Affiliation(s)
- Benedikt W. Hosp
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Linden Straße 6, 72076 Tübingen, Germany
- Martin Dechant
  - Interaction Centre, University College London, 66-72 Gower Street, London WC1E 6EA, UK
- Yannick Sauer
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Linden Straße 6, 72076 Tübingen, Germany
  - Carl Zeiss Vision International GmbH, Turnstraße 27, 73430 Aalen, Germany
- Björn Severitt
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Linden Straße 6, 72076 Tübingen, Germany
- Rajat Agarwala
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Linden Straße 6, 72076 Tübingen, Germany
- Siegfried Wahl
  - ZEISS Vision Science Lab, Institute for Ophthalmic Research, University of Tübingen, Maria-von-Linden Straße 6, 72076 Tübingen, Germany
  - Carl Zeiss Vision International GmbH, Turnstraße 27, 73430 Aalen, Germany
4. Qiu T, An Q, Wang J, Wang J, Qiu CW, Li S, Lv H, Cai M, Wang J, Cong L, Qu S. Vision-driven metasurfaces for perception enhancement. Nat Commun 2024;15:1631. PMID: 38388545; PMCID: PMC10883922; DOI: 10.1038/s41467-024-45296-x
Abstract
Metasurfaces have exhibited an unprecedented degree of freedom in manipulating electromagnetic (EM) waves and thus provide powerful front-end interfaces for smart systems. Here we show a framework for perception enhancement based on vision-driven metasurfaces. Human eye movements are matched with microwave radiation to extend the human perception spectrum. By this means, our eyes can "sense" both visual information and invisible microwave information. Several experimental demonstrations are given for specific implementations, including a physiological-signal-monitoring system, an "X-ray glasses" system, a "glimpse-and-forget" tracking system and a speech reception system for deaf people. Both simulation and experimental results verify evident advantages in perception enhancement and information acquisition efficiency. This framework can be readily integrated into healthcare systems to monitor physiological signals and to offer assistance to people with disabilities. This work provides an alternative framework for perception enhancement and may find wide applications in healthcare, wearable devices, search-and-rescue and other areas.
Affiliation(s)
- Tianshuo Qiu
  - Department of Biomedical Engineering, Fourth Military Medical University, Xi'an, China
  - Fundamentals Department, Air Force Engineering University, Xi'an, China
  - State Key Laboratory of Millimeter Waves, Southeast University, Nanjing, China
- Qiang An
  - Department of Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Jianqi Wang
  - Department of Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Jiafu Wang
  - Aerospace Metamaterials Laboratory of Suzhou National Laboratory, Suzhou, China
- Cheng-Wei Qiu
  - Department of Electrical and Computer Engineering, National University of Singapore, Singapore, Singapore
- Shiyong Li
  - School of Integrated Circuits and Electronics, Beijing Institute of Technology, Beijing, China
- Hao Lv
  - Department of Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Ming Cai
  - Fundamentals Department, Air Force Engineering University, Xi'an, China
- Jianyi Wang
  - Department of Neurology, the First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, China
- Lin Cong
  - Department of Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Shaobo Qu
  - Aerospace Metamaterials Laboratory of Suzhou National Laboratory, Suzhou, China
5. RaviChandran N, Teo ZL, Ting DSW. Artificial intelligence enabled smart digital eye wearables. Curr Opin Ophthalmol 2023;34:414-421. PMID: 37527195; DOI: 10.1097/icu.0000000000000985
Abstract
PURPOSE OF REVIEW: Smart eyewear is a head-worn wearable device that is evolving as the next phase of ubiquitous wearables. Although its applications in healthcare are still being explored, it has the potential to revolutionize teleophthalmology care. This review highlights its applications in ophthalmology care and discusses future scope.
RECENT FINDINGS: Smart eyewear integrates advanced sensors, optical displays, and processing capabilities in a wearable form factor. Rapid technological developments and the integration of artificial intelligence are expanding its reach from the consumer space to healthcare applications. This review systematically presents applications in treating and managing eye-related conditions, including remote assessments, real-time monitoring, telehealth consultations, and the facilitation of personalized interventions. Smart eyewear also serves as a low-vision assistive device for the visually impaired and can aid physicians with operational and surgical tasks.
SUMMARY: Wearables such as smart eyewear collect rich, continuous, objective, individual-specific data that are difficult to obtain in a clinical setting. By leveraging sophisticated data processing and artificial-intelligence-based algorithms, these data can be used to identify at-risk patients, recognize behavioral patterns, and support timely interventions. Smart eyewear promises cost-effective, personalized treatment for vision impairments, helping to mitigate the global burden of eye-related conditions and aging.
Affiliation(s)
- Zhen Ling Teo
  - Singapore National Eye Center, Singapore Eye Research Institute
- Daniel S W Ting
  - AI and Digital Innovations
  - Singapore National Eye Center, Singapore Eye Research Institute
  - Duke-NUS Medical School, National University Singapore, Singapore
6. Ebner C, Mohr P, Langlotz T, Peng Y, Schmalstieg D, Wetzstein G, Kalkofen D. Off-Axis Layered Displays: Hybrid Direct-View/Near-Eye Mixed Reality with Focus Cues. IEEE Trans Vis Comput Graph 2023;29:2816-2825. PMID: 37027729; DOI: 10.1109/tvcg.2023.3247077
Abstract
This work introduces off-axis layered displays, the first approach to stereoscopic direct-view displays with support for focus cues. Off-axis layered displays combine a head-mounted display with a traditional direct-view display for encoding a focal stack and thus for providing focus cues. To explore the novel display architecture, we present a complete processing pipeline for the real-time computation and post-render warping of off-axis display patterns. In addition, we build two prototypes using a head-mounted display in combination with a stereoscopic direct-view display, and with a more widely available monoscopic direct-view display. We also show how extending off-axis layered displays with an attenuation layer and with eye tracking can improve image quality. We thoroughly analyze each component in a technical evaluation and present examples captured through our prototypes.
7. Güzel AH, Beyazian J, Chakravarthula P, Akșit K. ChromaCorrect: prescription correction in virtual reality headsets through perceptual guidance. Biomed Opt Express 2023;14:2166-2180. PMID: 37206152; PMCID: PMC10191670; DOI: 10.1364/boe.485776
Abstract
A large portion of today's world population suffers from vision impairments and wears prescription eyeglasses. However, prescription glasses cause additional bulk and discomfort when used with virtual reality (VR) headsets, negatively impacting the viewer's visual experience. In this work, we eliminate the need for prescription eyeglasses when viewing screens by shifting the optical complexity into software. Our proposal is a prescription-aware rendering approach that provides sharper and more immersive imagery for screens, including VR headsets. To this end, we develop a differentiable display and visual perception model encapsulating the human visual system's display-specific parameters, color, visual acuity, and user-specific refractive errors. Using this differentiable visual perception model, we optimize the rendered imagery in the display using gradient-descent solvers. In this way, we provide sharper images, without prescription glasses, for a person with vision impairments. We evaluate our approach and show significant quality and contrast improvements for users with vision impairments.
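The optimization mechanic can be illustrated with a toy 1-D analogue: if the eye's refractive blur is modeled as a known convolution, gradient descent can pre-distort the displayed signal so that the blurred percept matches the target. The paper's model is far richer (display parameters, color, acuity); this numpy sketch, with made-up kernel and signal, only shows the gradient-descent pre-correction step:

```python
import numpy as np

def precorrect(target, kernel, steps=500, lr=0.5):
    """Toy 1-D prescription-aware rendering: find a `display` signal such
    that the eye's blur (convolution with `kernel`) reproduces `target`,
    by gradient descent on the squared error."""
    display = target.copy()
    for _ in range(steps):
        perceived = np.convolve(display, kernel, mode="same")
        err = perceived - target
        # Gradient of 0.5*||conv(display, k) - target||^2 w.r.t. display
        # is the correlation of the error with the kernel.
        grad = np.convolve(err, kernel[::-1], mode="same")
        display -= lr * grad
    return display

kernel = np.array([0.25, 0.5, 0.25])          # crude defocus-blur model
target = np.zeros(32); target[12:20] = 1.0    # sharp block pattern
display = precorrect(target, kernel)
perceived = np.convolve(display, kernel, mode="same")
print(np.max(np.abs(perceived - target)))     # residual, far below the blur error
```

Displaying `display` instead of `target` makes the *perceived* image sharp; displaying `target` directly would leave the full blur error of the kernel.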
Affiliation(s)
- Jeanne Beyazian
  - University College London, Computer Science Department, London, UK
- Kaan Akșit
  - University College London, Computer Science Department, London, UK
8. Agarwala R, Lukashova Sanz O, Seitz IP, Reichel FF, Wahl S. Evaluation of a liquid membrane-based tunable lens and a solid-state LIDAR camera feedback system for presbyopia. Biomed Opt Express 2022;13:5849-5859. PMID: 36733729; PMCID: PMC9872906; DOI: 10.1364/boe.471190
Abstract
Presbyopia is an age-related loss of accommodation ability of the eye which affects individuals in their late 40s or early 50s. Presbyopia reduces the ability of a person to focus on closer objects at will. In this study, we assessed electronically tunable lenses for their aberration properties as well as for their use as correction lenses. The tunable lenses were evaluated in healthy subjects with cycloplegia by measuring visual acuity and contrast sensitivity for their use in presbyopia correction. Furthermore, we have developed and demonstrated the feasibility of a feedback mechanism for the operation of tunable lenses using a portable solid-state LIDAR camera with a processing time of 40 ± 5 ms.
Affiliation(s)
- Rajat Agarwala
  - Institute for Ophthalmic Research, University of Tuebingen, Elfriede-Aulhorn-Str. 7, Tuebingen, 72076, Germany
- Olga Lukashova Sanz
  - Institute for Ophthalmic Research, University of Tuebingen, Elfriede-Aulhorn-Str. 7, Tuebingen, 72076, Germany
  - Carl Zeiss Vision International GmbH, Turnstr. 27, Aalen, 73430, Germany
- Immanuel P. Seitz
  - Institute for Ophthalmic Research, University of Tuebingen, Elfriede-Aulhorn-Str. 7, Tuebingen, 72076, Germany
  - Carl Zeiss Vision International GmbH, Turnstr. 27, Aalen, 73430, Germany
- Felix F. Reichel
  - Institute for Ophthalmic Research, University of Tuebingen, Elfriede-Aulhorn-Str. 7, Tuebingen, 72076, Germany
  - University Eye Hospital, Centre for Ophthalmology, University Hospital Tübingen, Tübingen, Germany
- Siegfried Wahl
  - Institute for Ophthalmic Research, University of Tuebingen, Elfriede-Aulhorn-Str. 7, Tuebingen, 72076, Germany
  - Carl Zeiss Vision International GmbH, Turnstr. 27, Aalen, 73430, Germany
9. Ebner C, Mori S, Mohr P, Peng Y, Schmalstieg D, Wetzstein G, Kalkofen D. Video See-Through Mixed Reality with Focus Cues. IEEE Trans Vis Comput Graph 2022;28:2256-2266. PMID: 35167471; DOI: 10.1109/tvcg.2022.3150504
Abstract
This work introduces the first approach to video see-through mixed reality with full support for focus cues. By combining the flexibility to adjust the focus distance found in varifocal designs with the robustness to eye-tracking error found in multifocal designs, our novel display architecture reliably delivers focus cues over a large workspace. In particular, we introduce gaze-contingent layered displays and mixed reality focal stacks, an efficient representation of mixed reality content that lends itself to fast processing for driving layered displays in real time. We thoroughly evaluate this approach by building a complete end-to-end pipeline for capture, render, and display of focus cues in video see-through displays that uses only off-the-shelf hardware and compute components.
10. Acosta-Vargas P, Salvador-Acosta B, Salvador-Ullauri L, Jadán-Guerrero J. Accessibility challenges of e-commerce websites. PeerJ Comput Sci 2022;8:e891. PMID: 35494830; PMCID: PMC9044289; DOI: 10.7717/peerj-cs.891
Abstract
Today, there are many e-commerce websites, but not all of them are accessible. Accessibility is a crucial element that can make a difference and determine the success or failure of a digital business. The study was applied to 50 e-commerce sites in the top rankings according to the classification proposed by ecommerceDB. In evaluating the web accessibility of e-commerce sites, we applied an automatic review method based on a modification of the Website Accessibility Conformance Evaluation Methodology (WCAG-EM) 1.0. To evaluate accessibility, we used the Web Accessibility Evaluation Tool (WAVE) with its extension for Google Chrome, which helps verify password-protected, locally stored, or highly dynamic pages. The study found a correlation of 0.329 between the ranking of e-commerce websites and the number of accessibility barriers, a low positive correlation according to Spearman's rho. According to the WAVE analysis, the top 10 most accessible websites are Sainsbury's Supermarkets, Walmart, Target Corporation, Macy's, IKEA, H&M Hennes, Chewy, Kroger, QVC, and Nike. The greatest number of accessibility barriers relates to contrast errors, which must be corrected for e-commerce websites to reach an acceptable level of accessibility. The most neglected accessibility principle is perceivable, accounting for 83.1% of barriers, followed by operable with 13.7%, robust with 1.7%, and understandable with 1.5%. Future work suggests constructing a software tool with artificial intelligence algorithms to help identify accessibility barriers.
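The reported Spearman's rho can be reproduced mechanically: rank both variables (averaging ranks for ties), then take the Pearson correlation of the ranks. A self-contained sketch with made-up data, since the study's raw per-site counts are not given here:

```python
def rank(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # average rank of the tie group
        for k in order[i:j + 1]:
            ranks[k] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: ranking position vs. number of barriers found per site.
ranking = [1, 2, 3, 4, 5, 6]
barriers = [12, 40, 8, 25, 30, 15]
print(round(spearman_rho(ranking, barriers), 3))   # 0.143
```

A rho near 0.33, as in the study, would indicate that better-ranked shops are only weakly more (or less) barrier-laden than lower-ranked ones.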
Affiliation(s)
- Patricia Acosta-Vargas
  - Intelligent and Interactive Systems Laboratory/FICA/Industrial Engineering, Universidad de Las Américas - Ecuador, Quito, Ecuador
- Luis Salvador-Ullauri
  - Department of Software and Computing Systems, University of Alicante, Alicante, Spain
- Janio Jadán-Guerrero
  - Centro de Investigación en Mecatrónica y Sistemas Interactivos - MIST, Universidad Tecnológica Indoamérica, Quito, Ecuador
11. Tringali D, Haci D, Mazza F, Nikolic K, Demarchi D, Constandinou TG. Eye Accommodation Sensing for Adaptive Focus Adjustment. Annu Int Conf IEEE Eng Med Biol Soc 2021;2021:7460-7464. PMID: 34892819; DOI: 10.1109/embc46164.2021.9629844
Abstract
Over 2 billion people across the world are affected by some visual impairment, mostly related to optical issues, and this number is estimated to grow. Often, particularly in the elderly, more than one condition can affect the eyes at the same time, e.g., myopia and presbyopia. Bifocal or multifocal lenses can be used; however, these may become uncomfortable or disturbing and are not adapted to the user. There is therefore a need and opportunity for a new type of glasses able to adaptively change the lenses' focus. This paper explores the feasibility of recording the eye accommodation process in a non-invasive way using a wearable device. This can provide a way to measure eye convergence in real time to determine what a person's eye is focused on. In this study, electro-oculography (EoG) is used to observe eye muscle activity and estimate eye movement. To assess this, a group of 11 participants were each asked to switch their gaze from a near to a far target and vice versa, whilst their EoG was measured. This revealed two distinct waveforms: one for the transition from a far to a near target, and one for the transition from a near to a far target. This informed the design of a correlation-based classifier to detect which signals relate to a far-to-near or near-to-far transition. The classifier achieved an accuracy of 97.9±1.37% across the experimental results gathered from our 11 participants. This pilot data provides a basic starting point to justify future device development.
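A correlation-based classifier of the kind described reduces to comparing a measured trace against one stored template per transition direction and picking the better match. A minimal sketch with idealized, made-up waveforms (the paper's actual EoG templates are not reproduced here):

```python
def normalized_correlation(signal, template):
    """Zero-mean normalized cross-correlation at zero lag (equal lengths)."""
    n = len(signal)
    ms = sum(signal) / n
    mt = sum(template) / n
    num = sum((s - ms) * (t - mt) for s, t in zip(signal, template))
    den = (sum((s - ms) ** 2 for s in signal) ** 0.5 *
           sum((t - mt) ** 2 for t in template) ** 0.5)
    return num / den if den else 0.0

def classify_transition(signal, template_near_to_far, template_far_to_near):
    """Assign the trace to whichever template it correlates with best."""
    c_nf = normalized_correlation(signal, template_near_to_far)
    c_fn = normalized_correlation(signal, template_far_to_near)
    return "near_to_far" if c_nf >= c_fn else "far_to_near"

# Illustrative templates: here the two transition waveforms are taken to be
# roughly mirror images of each other (an assumption for this sketch).
t_nf = [0, 1, 3, 5, 4, 3, 3]
t_fn = [5, 4, 2, 0, 1, 2, 2]
noisy = [0.2, 1.1, 2.8, 5.3, 4.1, 2.9, 3.2]    # a noisy near-to-far trace
print(classify_transition(noisy, t_nf, t_fn))  # near_to_far
```

In practice the templates would be averaged from labeled calibration traces, and the correlation would be computed over a sliding window of the EoG stream.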
12. Karkhanis MU, Ghosh C, Banerjee A, Hasan N, Likhite R, Ghosh T, Kim H, Mastrangelo CH. Correcting Presbyopia With Autofocusing Liquid-Lens Eyeglasses. IEEE Trans Biomed Eng 2021;69:390-400. PMID: 34232861; DOI: 10.1109/tbme.2021.3094964
Abstract
OBJECTIVE: Presbyopia, an age-related ocular disorder, is characterized by the loss of the accommodative abilities of the human eye. Conventional methods of correcting presbyopia divide the field of view, thereby resulting in significant vision impairment. We demonstrate the design, assembly and evaluation of autofocusing eyeglasses that restore accommodation without dividing the field of view.
METHODS: The adaptive optics eyeglasses comprise two variable-focus liquid lenses, a time-of-flight range sensor and low-power, dual-microprocessor control electronics, housed within an ergonomic frame. Subject-specific accommodation deficiency models were utilized to demonstrate high-fidelity accommodative correction. The system's ability to reduce accommodation deficiency, its power consumption, response time, optical performance and MTF were evaluated.
RESULTS: Average corrected accommodation deficiencies for 5 subjects ranged from -0.021 D to 0.016 D. Each accommodation correction calculation was performed in ∼67 ms and consumed 4.86 mJ of energy. The optical resolution of the system was 10.5 cycles/degree, with a restorative accommodative range of 4.3 D. The system ran for up to 19 hours between charge cycles and weighed ∼132 g.
CONCLUSION: The design, assembly and performance of an autofocusing eyeglasses system to restore accommodation in presbyopes has been demonstrated.
SIGNIFICANCE: The autofocusing eyeglasses system presented in this article has the potential to restore pre-presbyopic levels of accommodation in subjects diagnosed with presbyopia.
13. Angelopoulos AN, Martel JNP, Kohli AP, Conradt J, Wetzstein G. Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz. IEEE Trans Vis Comput Graph 2021;27:2577-2586. PMID: 33780340; DOI: 10.1109/tvcg.2021.3067784
Abstract
The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, realistically constraining data acquisition speed to about 300 Hz. This obstructs the use of mobile eye trackers for, e.g., low-latency predictive rendering, or for studying quick and subtle eye motions such as microsaccades using head-mounted devices in the wild. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated in the same conditions. Our system, previewed in Figure 1, builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil-fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, we demonstrate that our system achieves accuracies of 0.45°-1.75° for fields of view from 45° to 98°. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
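The regression step can be sketched as an ordinary least-squares fit of gaze coordinates onto polynomial features of the pupil-model parameters. The sketch below simplifies the pupil model to a 2-D pupil center and uses a synthetic calibration set; the feature set and data are illustrative, not the paper's:

```python
import numpy as np

def poly_features(px, py):
    """Second-order polynomial features of a 2-D pupil position."""
    return np.stack([np.ones_like(px), px, py, px * py, px**2, py**2], axis=-1)

def fit_gaze_regressor(pupil_xy, gaze_xy):
    """Least-squares fit of gaze coordinates from pupil positions
    (the calibration step of a regression-based gaze mapper)."""
    X = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
    coeffs, *_ = np.linalg.lstsq(X, gaze_xy, rcond=None)
    return coeffs

def predict_gaze(coeffs, pupil_xy):
    X = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
    return X @ coeffs

# Synthetic calibration: gaze is a known polynomial of pupil position,
# so the fit should recover it almost exactly.
rng = np.random.default_rng(0)
pupil = rng.uniform(-1, 1, size=(50, 2))
gaze = np.stack([2.0 * pupil[:, 0] + 0.3 * pupil[:, 0]**2,
                 1.5 * pupil[:, 1] - 0.2 * pupil[:, 0] * pupil[:, 1]], axis=1)
coeffs = fit_gaze_regressor(pupil, gaze)
err = np.abs(predict_gaze(coeffs, pupil) - gaze).max()
print(err)   # essentially zero: the feature set contains the true mapping
```

At runtime, `predict_gaze` would be evaluated per event-driven pupil update, which is what keeps the gaze estimate cheap enough for >10 kHz rates.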
14. Itoh Y, Langlotz T, Zollmann S, Iwai D, Kiyoshi K, Amano T. Computational Phase-Modulated Eyeglasses. IEEE Trans Vis Comput Graph 2021;27:1916-1928. PMID: 31613772; DOI: 10.1109/tvcg.2019.2947038
Abstract
We present computational phase-modulated eyeglasses, a see-through optical system that modulates the view of the user using phase-only spatial light modulators (PSLMs). A PSLM is a programmable reflective device that can selectively retard, or delay, incoming light rays. As a result, a PSLM works as a computational dynamic lens. We demonstrate our computational phase-modulated eyeglasses with either a single PSLM or dual PSLMs and show that the concept can realize various optical operations, including focus correction, bi-focus, image shift, and field-of-view manipulation, namely optical zoom. Compared to other programmable optics, computational phase-modulated eyeglasses have the advantage of versatility. In addition, we present prototypical focus-loop applications in which the lens is dynamically optimized based on the distances of objects observed by a scene camera. We further discuss the implementation and applications, as well as the limitations of the current prototypes and remaining issues to be addressed in future research.
15. Aydındoğan G, Kavaklı K, Şahin A, Artal P, Ürey H. Applications of augmented reality in ophthalmology [Invited]. Biomed Opt Express 2021;12:511-538. PMID: 33659087; PMCID: PMC7899512; DOI: 10.1364/boe.405026
Abstract
Throughout the last decade, augmented reality (AR) head-mounted displays (HMDs) have gradually become a substantial part of modern life, with increasing applications ranging from gaming and driver assistance to medical training. Owing to the tremendous progress in miniaturized displays, cameras, and sensors, HMDs are now used for the diagnosis, treatment, and follow-up of several eye diseases. In this review, we discuss the current state-of-the-art as well as potential uses of AR in ophthalmology. This review includes the following topics: (i) underlying optical technologies, displays and trackers, holography, and adaptive optics; (ii) accommodation, 3D vision, and related problems such as presbyopia, amblyopia, strabismus, and refractive errors; (iii) AR technologies in lens and corneal disorders, in particular cataract and keratoconus; (iv) AR technologies in retinal disorders including age-related macular degeneration (AMD), glaucoma, color blindness, and vision simulators developed for other types of low-vision patients.
Affiliation(s)
- Güneş Aydındoğan
  - Koç University, Department of Electrical Engineering and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
- Koray Kavaklı
  - Koç University, Department of Electrical Engineering and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
- Afsun Şahin
  - Koç University, School of Medicine and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
- Pablo Artal
  - Laboratorio de Óptica, Instituto Universitario de Investigación en Óptica y Nanofísica, Universidad de Murcia, Campus de Espinardo, E-30100 Murcia, Spain
- Hakan Ürey
  - Koç University, Department of Electrical Engineering and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
16. Kaminokado T, Hiroi Y, Itoh Y. StainedView: Variable-Intensity Light-Attenuation Display with Cascaded Spatial Color Filtering for Improved Color Fidelity. IEEE Trans Vis Comput Graph 2020;26:3576-3586. PMID: 32941143; DOI: 10.1109/tvcg.2020.3023569
Abstract
We present StainedView, an optical see-through display that spatially filters the spectral distribution of light to form an image with improved color fidelity. Existing light-attenuation displays have limited color fidelity and contrast, resulting in a degraded appearance of virtual images. To use these displays to present virtual images that are more consistent with the real world, we require three things: intensity modulation of incoming light, spatial color filtering with narrower bandwidth, and appropriate light modulation for incoming light with an arbitrary spectral distribution. In StainedView, we address these three requirements by cascading two phase-only spatial light modulators (PSLMs), a digital micromirror device, and polarization optics to control both light intensity and spectral distribution. We show that our design has a 1.8-times wider color gamut (75.8% fulfillment of the sRGB color space) compared to the existing single-PSLM approach (41.4%) under a reference white light. We demonstrated the design with a proof-of-concept display system. We further introduce our optics design and pixel-selection algorithm for the given light input, evaluate the spatial color filter, and discuss the limitations of the current prototype.
Collapse
|
17
|
Portable device for presbyopia correction with optoelectronic lenses driven by pupil response. Sci Rep 2020; 10:20293. [PMID: 33219301 PMCID: PMC7680150 DOI: 10.1038/s41598-020-77465-5] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/07/2020] [Accepted: 11/11/2020] [Indexed: 11/09/2022] Open
Abstract
A novel portable device has been developed and built to dynamically and automatically correct presbyopia by means of a pair of optoelectronic lenses driven by pupil tracking. The system is fully portable and provides a wide defocus-correction range of up to 10 D. The glasses are controlled and powered by a smartphone. To achieve a truly real-time response, the image-processing algorithms were implemented in OpenCL and run on the GPU of the smartphone. To validate the system, visual experiments were carried out in presbyopic subjects. Visual acuity remained nearly constant over a range of distances from 5 m to 20 cm.
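The defocus the lenses must supply at each viewing distance follows from simple vergence arithmetic: an object at distance d meters demands 1/d diopters. A minimal sketch, where the function names and the clipping to the device's range are illustrative assumptions rather than the device's actual control law:

```python
def defocus_demand_d(distance_m: float) -> float:
    """Defocus demand in diopters for an object at distance_m (vergence = 1/d)."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return 1.0 / distance_m

def lens_power_for_distance(distance_m: float, max_power_d: float = 10.0) -> float:
    """Tunable-lens power needed to focus a fully presbyopic eye at distance_m,
    clipped to the correction range (10 D per the abstract)."""
    return min(defocus_demand_d(distance_m), max_power_d)

near = lens_power_for_distance(0.20)  # 20 cm, the near end of the tested range
far = lens_power_for_distance(5.0)    # 5 m, the far end of the tested range
```

At the reported extremes, 5 m corresponds to only 0.2 D while 20 cm requires 5 D, both comfortably inside the 10 D correction range.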
Collapse
|
18
|
Nam SW, Moon S, Lee B, Kim D, Lee S, Lee CK, Lee B. Aberration-corrected full-color holographic augmented reality near-eye display using a Pancharatnam-Berry phase lens. OPTICS EXPRESS 2020; 28:30836-30850. [PMID: 33115076 DOI: 10.1364/oe.405131] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/11/2023]
Abstract
We present a full-color holographic augmented reality near-eye display using a Pancharatnam-Berry phase lens (PBP lens) and a method for correcting its aberrations. Monochromatic and chromatic aberrations of the PBP lens are corrected by exploiting the complex wavefront modulation of the holographic display. A hologram calculation method incorporating the phase profile of the PBP lens is proposed to correct the monochromatic aberration, while the chromatic aberration is corrected by warping the image with a mapping function obtained from ray tracing. The proposed system is demonstrated with a benchtop prototype, and the experimental results show that it offers a 50° field of view with full-color holographic images free of optical aberrations.
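The chromatic-aberration step (warping each color channel with a mapping function) can be illustrated with a generic per-channel inverse warp. The radial-scaling model, the per-channel scale factors, and the function names below are illustrative assumptions, not the paper's ray-traced calibration:

```python
import numpy as np

def warp_channel(channel: np.ndarray, map_y: np.ndarray, map_x: np.ndarray) -> np.ndarray:
    """Inverse-warp one color channel: output pixel (i, j) samples the input
    at (map_y[i, j], map_x[i, j]); nearest-neighbor sampling for brevity."""
    h, w = channel.shape
    yi = np.clip(np.rint(map_y), 0, h - 1).astype(int)
    xi = np.clip(np.rint(map_x), 0, w - 1).astype(int)
    return channel[yi, xi]

def prewarp_rgb(img: np.ndarray, scales: tuple = (1.01, 1.0, 0.99)) -> np.ndarray:
    """Counteract lateral chromatic aberration by radially scaling each channel
    about the image center (the scale factors here are illustrative)."""
    h, w, _ = img.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.empty_like(img)
    for c, s in enumerate(scales):
        out[..., c] = warp_channel(img[..., c], cy + (yy - cy) * s, cx + (xx - cx) * s)
    return out
```

In a real system the mapping arrays would come from the ray-traced model of the PBP lens rather than a single radial scale per channel.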
Collapse
|
19
|
Rathinavel K, Wetzstein G, Fuchs H. Varifocal Occlusion-Capable Optical See-through Augmented Reality Display based on Focus-tunable Optics. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2019; 25:3125-3134. [PMID: 31502977 DOI: 10.1109/tvcg.2019.2933120] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Optical see-through augmented reality (AR) systems are a next-generation computing platform that offers unprecedented user experiences by seamlessly combining physical and digital content. Many of the traditional challenges of these displays have been significantly mitigated over the last few years, but the AR experiences offered by today's systems are far from seamless and perceptually realistic. Mutually consistent occlusions between physical and digital objects are typically not supported; when mutual occlusion is supported, it works only at a fixed depth. We propose a new optical see-through AR display system that renders mutual occlusion in a depth-dependent, perceptually realistic manner. To this end, we introduce varifocal occlusion displays based on focus-tunable optics, which combine a varifocal lens system with spatial light modulators to enable depth-corrected hard-edge occlusions for AR experiences. We derive formal optimization methods and closed-form solutions for driving this tunable lens system and demonstrate a monocular varifocal occlusion-capable optical see-through AR display that achieves perceptually realistic occlusion across a large depth range.
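The role of the tunable lens can be sketched with Gaussian thin-lens optics: to make an occlusion mask at a fixed distance appear at the depth of the real object it should occlude, the lens power is the difference of the two vergences. This single-thin-lens model and the sign convention stated in the comment are simplifications of the paper's full optimization, and the names are illustrative:

```python
def occlusion_lens_power(mask_dist_m: float, target_depth_m: float) -> float:
    """Power (diopters) a focus-tunable lens needs so that an occlusion mask
    at mask_dist_m appears at target_depth_m. Uses the Gaussian form
    P = 1/d_mask - 1/d_target with both distances positive, measured from
    the lens on the same side (virtual image of the mask)."""
    if mask_dist_m <= 0 or target_depth_m <= 0:
        raise ValueError("distances must be positive")
    return 1.0 / mask_dist_m - 1.0 / target_depth_m

# A mask 0.5 m from the lens, pushed out to a real object 2 m away:
power = occlusion_lens_power(0.5, 2.0)
```

Driving the lens continuously with this relation, as the tracked object depth changes, is what makes the hard-edge occlusion depth-corrected rather than fixed-depth.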
Collapse
|