1
Plabst L, Niebling F, Oberdorfer S, Ortega F. Order Up! Multimodal Interaction Techniques for Notifications in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 2025; 31:2258-2267. [PMID: 40053630] [DOI: 10.1109/tvcg.2025.3549186]
Abstract
As augmented reality (AR) headsets become increasingly integrated into professional and social settings, a critical challenge emerges: how can users effectively manage and interact with the frequent notifications they receive? With adults receiving nearly 200 notifications daily on their smartphones, which serve as primary computing devices for many, translating this interaction to AR systems is paramount. Unlike traditional devices, AR systems augment the physical world, requiring interaction techniques that blend seamlessly with real-world behaviors. This study explores the complexities of multimodal interaction with notifications in AR. We investigated user preferences, usability, workload, and performance during a virtual cooking task, where participants managed customer orders while interacting with notifications. Various interaction techniques were tested: Point and Pinch, Gaze and Pinch, Point and Voice, Gaze and Voice, and Touch. Our findings reveal significant impacts on workload, performance, and usability based on the interaction method used. We identify key issues in multimodal interaction and offer guidance for optimizing these techniques in AR environments.
2
Park SY, Koo DK. The Impact of Virtual Reality Content Characteristics on Cybersickness and Head Movement Patterns. Sensors (Basel, Switzerland) 2025; 25:215. [PMID: 39797006] [PMCID: PMC11722776] [DOI: 10.3390/s25010215]
Abstract
Virtual reality (VR) technology has gained popularity across various fields; however, its use often induces cybersickness, characterized by symptoms such as dizziness, nausea, and eye strain. This study investigated the differences in cybersickness levels and head movement patterns under three distinct VR viewing conditions: dynamic VR (DVR), static VR (SVR), and a control condition (CON) using a simulator. Thirty healthy adults participated, and their head movements were recorded using the Meta Quest 2 VR headset and analyzed using Python. The Virtual Reality Sickness Questionnaire (VRSQ) assessed subjective cybersickness levels. The results revealed that the SVR condition induced the highest VRSQ scores (M = 58.057), indicating the most severe cybersickness symptoms, while the DVR condition elicited significantly higher values in head movement variables, particularly in the coefficient of variation (CV) and integral values of head position along the vertical axis, and mean velocity (p < 0.05). These findings suggest that VR content characteristics directly influence users' head movement patterns, closely related to cybersickness occurrence and severity. This study highlights the importance of analyzing head movement patterns in cybersickness research and provides insights for VR content design.
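For readers unfamiliar with the head-movement measures named in this abstract, the short Python sketch below shows how a coefficient of variation, an integrated deviation of vertical head position, and a mean velocity can be computed from a sampled position trace. It is a minimal illustration of that kind of analysis, not the authors' code; the 72 Hz sampling rate, the use of an absolute integrated deviation, and the function name are assumptions.

```python
import numpy as np

def head_movement_metrics(y_pos, fs=72.0):
    """Illustrative metrics from a vertical head-position trace.

    y_pos: 1-D array of vertical head position (metres), one sample per frame.
    fs:    sampling rate in Hz (72 Hz is a typical Quest 2 frame rate; assumed).
    """
    y = np.asarray(y_pos, dtype=float)
    dt = 1.0 / fs

    cv = np.std(y) / abs(np.mean(y))                  # coefficient of variation of position
    integral = np.sum(np.abs(y - np.mean(y))) * dt    # integrated deviation (m*s), rectangle rule
    mean_velocity = np.mean(np.abs(np.diff(y)) / dt)  # mean absolute vertical velocity (m/s)

    return {"cv": cv, "integral": integral, "mean_velocity": mean_velocity}
```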
Affiliation(s)
- Seo-Yoon Park
- Department of Physical Therapy, College of Health and Welfare, Woosuk University, 443 Samnye-ro, Samnye-eup, Wanju-gun 55338, Republic of Korea
- Dong-Kyun Koo
- University-Industrial Cooperation Corps of HiVE Center, Wonkwang Health Science University, 514, Iksan-daero, Iksan-si 54538, Republic of Korea
3
Johnson PB, Bradley J, Lampotang S, Jackson A, Lizdas D, Johnson W, Brooks E, Vega RBM, Mendenhall N. First-in-human trial using mixed-reality visualization for patient setup during breast or chest wall radiotherapy. Radiat Oncol 2024; 19:163. [PMID: 39558366] [PMCID: PMC11574990] [DOI: 10.1186/s13014-024-02552-0]
Abstract
BACKGROUND The purpose of this study is to assess the feasibility of mixed-reality (MixR) visualization for patient setup in breast and chest wall radiotherapy (RT) by performing a first-in-human clinical trial comparing MixR with a 3-point alignment. METHODS IRB approval was granted for a study incorporating MixR during the setup process for patients undergoing proton (n = 10) or photon (n = 8) RT to the breast or chest wall. For each patient, MixR was utilized for five fractions and compared against another five fractions using 3-point alignment. During fractions with MixR, the patient was aligned by at least one therapist wearing a HoloLens 2 device who was able to guide the process by simultaneously and directly viewing the patient and a hologram of the patient's surface derived from their simulation CT scan. Alignment accuracy was quantified with cone-beam CT (CBCT) for photon treatments and CBCT plus kV/kV imaging for proton treatments. Registration time was tracked throughout the setup process as well as the amount of image guidance (IGRT) utilized for final alignment. RESULTS In the proton cohort, the mean 3D shift was 0.96 cm using 3-point alignment and 1.18 cm using MixR. An equivalence test indicated that the difference in registration accuracy between the two techniques was less than 0.5 cm. In the photon cohort, the mean 3D shift was 1.18 cm using 3-point alignment and 1.00 cm using MixR. An equivalence test indicated that the difference in registration accuracy was less than 0.3 cm. Minor differences were seen in registration time and the amount of IGRT utilization. CONCLUSIONS MixR for patient setup for breast cancer RT is possible at the level of accuracy and efficiency provided by a 3-point alignment. Further developments in marker tracking, feedback, and a better understanding of the perceptual challenges of MixR are needed to achieve a similar level of accuracy as provided by modern surface-guided radiotherapy (SGRT) systems. TRIAL REGISTRATION ClinicalTrials.gov, UFHPTI 2015-BR05: Improving Breast Radiotherapy Setup and Delivery Using Mixed-Reality Visualization, NCT05178927.
Affiliation(s)
- Perry B Johnson
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- University of Florida College of Medicine, Gainesville, FL, USA
- Julie Bradley
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- University of Florida College of Medicine, Gainesville, FL, USA
- Samsun Lampotang
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- Amanda Jackson
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- David Lizdas
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- William Johnson
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- Eric Brooks
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- University of Florida College of Medicine, Gainesville, FL, USA
- Raymond B Mailhot Vega
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- University of Florida College of Medicine, Gainesville, FL, USA
- Nancy Mendenhall
- University of Florida Health Proton Therapy Institute, Jacksonville, FL, USA
- University of Florida College of Medicine, Gainesville, FL, USA
4
Li D, Jabbireddy S, Zhang Y, Metzler C, Varshney A. Instant-SFH: Non-Iterative Sparse Fourier Holograms Using Perlin Noise. Sensors (Basel, Switzerland) 2024; 24:7358. [PMID: 39599134] [PMCID: PMC11598788] [DOI: 10.3390/s24227358]
Abstract
Holographic displays are an upcoming technology for AR and VR applications, with the ability to show 3D content with accurate depth cues, including accommodation and motion parallax. Recent research reveals that only a fraction of holographic pixels are needed to display images with high fidelity, improving energy efficiency in future holographic displays. However, the existing iterative method for computing sparse amplitude and phase layouts does not run in real time; instead, it takes hundreds of milliseconds to render an image into a sparse hologram. In this paper, we present a non-iterative amplitude and phase computation for sparse Fourier holograms that uses Perlin noise in the image-plane phase. We conduct simulated and optical experiments. Compared to the Gaussian-weighted Gerchberg-Saxton method, our method achieves a run time improvement of over 600 times while producing a nearly equal PSNR and SSIM quality. The real-time performance of our method enables the presentation of dynamic content crucial to AR and VR applications, such as video streaming and interactive visualization, on holographic displays.
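As a rough illustration of the non-iterative idea summarized above (attach a smooth pseudo-random phase to the image plane, take a single FFT, and keep only the strongest hologram pixels), here is a hedged Python sketch. It is not the authors' Instant-SFH algorithm: Gaussian-filtered noise stands in for Perlin noise, and the keep_fraction, noise_sigma, and function names are assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sparse_fourier_hologram(target, keep_fraction=0.05, noise_sigma=8, seed=0):
    """Illustrative non-iterative sparse Fourier hologram.

    A smooth pseudo-random phase (a stand-in for Perlin noise) is attached to the
    target image, one FFT maps it to the hologram plane, and only the strongest
    `keep_fraction` of hologram pixels are kept (amplitude and phase).
    """
    target = np.asarray(target, dtype=float)
    rng = np.random.default_rng(seed)
    noise = gaussian_filter(rng.standard_normal(target.shape), sigma=noise_sigma)
    phase = 2 * np.pi * (noise - noise.min()) / (np.ptp(noise) + 1e-12)
    field = np.sqrt(np.clip(target, 0, None)) * np.exp(1j * phase)

    holo = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))  # single FFT, no iterations

    k = max(1, int(keep_fraction * holo.size))
    thresh = np.partition(np.abs(holo).ravel(), -k)[-k]
    sparse = np.where(np.abs(holo) >= thresh, holo, 0)
    return np.abs(sparse), np.angle(sparse)

def reconstruct(amplitude, phase):
    """Simulated replay: the inverse FFT of the sparse hologram approximates the image."""
    holo = amplitude * np.exp(1j * phase)
    img = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(holo)))
    return np.abs(img) ** 2
```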
Affiliation(s)
- David Li
- Department of Computer Science, University of Maryland, College Park, MD 20742, USA
- Susmija Jabbireddy
- Department of Computer Science, University of Maryland, College Park, MD 20742, USA
- Yang Zhang
- Department of Electrical and Computer Engineering, University of Maryland, College Park, MD 20742, USA
- Christopher Metzler
- Department of Computer Science, University of Maryland, College Park, MD 20742, USA
- Amitabh Varshney
- Department of Computer Science, University of Maryland, College Park, MD 20742, USA
5
Dymczyk M, Przekoracka-Krawczyk A, Kapturek Z, Pyżalska P. Effect of a vergence-accommodation conflict induced during a 30-minute Virtual Reality game on vergence-accommodation parameters and related symptoms. Journal of Optometry 2024; 17:100524. [PMID: 39520823] [PMCID: PMC11585873] [DOI: 10.1016/j.optom.2024.100524]
Abstract
PURPOSE The aim of the study was to test the hypothesis that a vergence-accommodation conflict (VAC) induced with a head-mounted display (HMD) could cause symptoms related to changes in the accommodative-vergence system. To test this hypothesis, Virtual Reality (VR) exposures were carried out with two types of VAC: VACsmall and VAClarge. METHOD Eighteen females, with a mean age of 22.5 ± 2.0 years, participated in two 30-minute VR sessions separated by at least one week. The two sessions differed in the intensity of the VAC presented in the VR system (VACsmall and VAClarge). Visual parameters such as associated and dissociated phoria, accommodative response, the near point of convergence (NPC), and fusional vergence ranges (FVR) were measured, and subjective complaints were assessed using the Simulator Sickness Questionnaire (SSQ). The parameters were measured immediately before (Pre-test) and after (Post-test) the VR exposure. RESULTS Subjective symptoms such as nausea, oculomotor symptoms, and disorientation increased significantly after 30 minutes of exposure to VAClarge (P<0.05). The associated and dissociated phoria, lag of accommodation, FVR, and the NPC did not change significantly after the VR exposure (P>0.05). CONCLUSION Short-term (30-minute) use of an HMD did not significantly affect accommodative-vergence functions regardless of the size of the VAC (VACsmall and VAClarge). However, the level of symptoms increased after the VR sessions, which was probably related to an inappropriate oculo-vestibular relationship.
Affiliation(s)
- Maciej Dymczyk
- Laboratory of Vision Science and Optometry, Faculty of Physics, Adam Mickiewicz University, Poznan, 61614, Poland
- Anna Przekoracka-Krawczyk
- Laboratory of Vision Science and Optometry, Faculty of Physics, Adam Mickiewicz University, Poznan, 61614, Poland
- Zuzanna Kapturek
- Laboratory of Vision Science and Optometry, Faculty of Physics, Adam Mickiewicz University, Poznan, 61614, Poland
- Paulina Pyżalska
- Laboratory of Vision Science and Optometry, Faculty of Physics, Adam Mickiewicz University, Poznan, 61614, Poland
6
Hirota M, Sasaki K, Kato K, Nakagomi R, Takigawa R, Kageyama C, Morino S, Suzuki M, Mihashi T, Mizota A, Hayashi T. Ocular Accommodative and Pupillary Responses During Fixation on Augmented Reality With a Maxwellian Display. Invest Ophthalmol Vis Sci 2024; 65:30. [PMID: 39292450] [PMCID: PMC11412603] [DOI: 10.1167/iovs.65.11.30]
Abstract
Purpose This study aimed to investigate the changes in ocular refraction and pupillary diameter during fixation on augmented reality (AR) images using a Maxwellian display. Methods Twenty-two healthy young volunteers (average age, 20.7 ± 0.5 years) wore a Maxwellian display device in front of their right eye and fixated on an asterisk displayed on both a liquid-crystal display (real target) and a Maxwellian display (AR target) for 29 seconds (real as a baseline for 3 seconds, AR for 13 seconds, and real for 13 seconds) at distances of 5.0, 0.5, 0.33, and 0.2 meters. A binocular open-view autorefractometer was used to measure the ocular refraction and pupillary diameter of the left eye. Results Accommodative (5.0 meters, 0.28 ± 0.29 diopter [D]; 0.5 meter, -0.12 ± 0.35 D; 0.33 meter, -0.43 ± 0.57 D; 0.2 meter, -1.20 ± 0.82 D) and pupillary (5.0 meters, 0.07 ± 0.22 mm; 0.5 meter, -0.08 ± 0.17 mm; 0.33 meter, -0.16 ± 0.20 mm; 0.2 meter, -0.25 ± 0.24 mm) responses were negative when the real target distances were farther away. The accommodative response was significantly and positively correlated with the pupillary response during fixation on the AR target (R2 = 0.187, P < 0.001). Conclusions Fixating on AR images using a Maxwellian display induces accommodative and pupillary responses. Accommodative responses depend on the distance between real objects. Overall, the Maxwellian display does not completely eliminate accommodation in real space.
Affiliation(s)
- Masakazu Hirota
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Department of Ophthalmology, School of Medicine, Teikyo University, Itabashi-ku, Tokyo, Japan
- Graduate Degree Program of Health Data Science, Teikyo University, Itabashi-ku, Tokyo, Japan
- Graduate Degree Program of Comprehensive Applied Data Science, Teikyo University, Itabashi-ku, Tokyo, Japan
- Kakeru Sasaki
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Kanako Kato
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Ryota Nakagomi
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Ryusei Takigawa
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Chinatsu Kageyama
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Toshifumi Mihashi
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Atsushi Mizota
- Department of Ophthalmology, School of Medicine, Teikyo University, Itabashi-ku, Tokyo, Japan
- Nishikasai Inouye Eye Hospital, Edogawa-ku, Tokyo, Japan
- Takao Hayashi
- Department of Orthoptics, Faculty of Medical Technology, Teikyo University, Itabashi-ku, Tokyo, Japan
- Department of Ophthalmology, School of Medicine, Teikyo University, Itabashi-ku, Tokyo, Japan
7
Luckykumar Dwarkadas A, Talasila V, Challa RK, K G S. A review of the application of virtual and augmented reality in physical and occupational therapy. Software: Practice and Experience 2024; 54:1378-1407. [DOI: 10.1002/spe.3323]
Abstract
This paper presents a literature review, conducted across five bibliographic databases, on the application of virtual reality (VR) and augmented reality (AR) in physical and occupational therapy (POT). The review addresses five research questions and two sub-research questions. A total of 36 relevant studies were selected based on the defined keywords and inclusion-exclusion criteria. The primary motivation for applying VR and AR in POT is that they are accurate, promote higher patient participation, and require less therapy recovery time. The standard software tool used is the Unity 3D game engine, and the most common device is the Oculus Rift HMD. The applications reviewed comprise a range of VR environments and AR content used in POT. Post-stroke rehabilitation, rehabilitation exercises, pain management, mental and behavioral disorders, and autism in children are the main areas addressed through VR and AR environments. The literature indicates that questionnaires, interviews, and observation are the primary means of measuring therapy effectiveness. The findings report positive results, such as reduced treatment time, nervousness, pain, and hospitalization period, more enjoyable and encouraging therapy, and improved quality of life and focus. This review will be relevant to researchers, VR and AR application designers, doctors, and patients using VR and AR in POT. Further research involving larger numbers of participants in clinical trials, new VR environments and AR content, follow-up sessions, and longer training programs is recommended.
Affiliation(s)
- Agrawal Luckykumar Dwarkadas
- Computer Science and Engineering Department, National Institute of Technical Teachers Training and Research, Chandigarh, India
- Viswanath Talasila
- Electronics and Telecommunication Department, Ramaiah Institute of Technology, Bengaluru, India
- Rama Krishna Challa
- Computer Science and Engineering Department, National Institute of Technical Teachers Training and Research, Chandigarh, India
- Srinivasa K G
- Computer Science and Engineering Department, International Institute of Information Technology, Chhattisgarh, India
8
Saeedpour-Parizi MR, Williams NL, Wong T, Guan P, Manocha D, Erkelens IM. Perceptual Thresholds for Radial Optic Flow Distortion in Near-Eye Stereoscopic Displays. IEEE Transactions on Visualization and Computer Graphics 2024; 30:2570-2579. [PMID: 38437086] [DOI: 10.1109/tvcg.2024.3372075]
Abstract
We provide the first perceptual quantification of users' sensitivity to radial optic flow artifacts and demonstrate a promising approach for masking this optic flow artifact via blink suppression. Near-eye HMDs allow users to feel immersed in virtual environments by providing visual cues, like motion parallax and stereoscopy, that mimic how we view the physical world. However, these systems exhibit a variety of perceptual artifacts that can limit their usability and the user's sense of presence in VR. One well-known artifact is the vergence-accommodation conflict (VAC). Varifocal displays can mitigate VAC, but bring with them other artifacts such as a change in virtual image size (radial optic flow) when the focal plane changes. We conducted a set of psychophysical studies to measure users' ability to perceive this radial flow artifact before, during, and after self-initiated blinks. Our results showed that visual sensitivity was reduced by a factor of 10 at the start and for ~70 ms after a blink was detected. Pre- and post-blink sensitivity was, on average, ~0.15% image size change during normal viewing and increased to ~1.5-2.0% during blinks. Our results imply that a rapid (under 70 ms) radial optic flow distortion can go unnoticed during a blink. Furthermore, our results provide empirical data that can be used to inform engineering requirements for both hardware design and software-based graphical correction algorithms for future varifocal near-eye displays. Our project website is available at https://gamma.umd.edu/ROF/.
9
Wang XM, Southwick D, Robinson I, Nitsche M, Resch G, Mazalek A, Welsh TN. The geometry of the vergence-accommodation conflict in mixed reality systems. Virtual Reality 2024; 28:95. [PMID: 39233779] [PMCID: PMC11371868] [DOI: 10.1007/s10055-024-00991-4]
Abstract
Mixed reality technologies, such as virtual (VR) and augmented (AR) reality, present promising opportunities to advance education and professional training due to their adaptability to diverse contexts. Distortions in the perceived distance in such mediated conditions, however, are well documented and have imposed nontrivial challenges that complicate and limit transferring task performance in a virtual setting to the unmediated reality (UR). One potential source of the distance distortion is the vergence-accommodation conflict-the discrepancy between the depth specified by the eyes' accommodative state and the angle at which the eyes converge to fixate on a target. The present study involved the use of a manual pointing task in UR, VR, and AR to quantify the magnitude of the potential depth distortion in each modality. Conceptualizing the effect of vergence-accommodation offset as a constant offset to the vergence angle, a model was developed based on the stereoscopic viewing geometry. Different versions of the model were used to fit and predict the behavioral data for all modalities. Results confirmed the validity of the conceptualization of vergence-accommodation as a device-specific vergence offset, which predicted up to 66% of the variance in the data. The fitted parameters indicate that, due to the vergence-accommodation conflict, participants' vergence angle was driven outwards by approximately 0.2°, which disrupted the stereoscopic viewing geometry and produced distance distortion in VR and AR. The implications of this finding are discussed in the context of developing virtual environments that minimize the effect of depth distortion.
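The "constant vergence offset" idea described in this abstract lends itself to a small worked example. The Python sketch below assumes the conflict simply subtracts a fixed angle (about 0.2°, as estimated in the abstract) from the geometric vergence angle and shows how perceived distance would then deviate from the true distance; the 6.4 cm interpupillary distance and the function names are illustrative choices, not the paper's fitted model.

```python
import numpy as np

def vergence_angle(distance_m, ipd_m=0.064):
    """Binocular vergence angle (radians) for a fixation target at distance_m."""
    return 2 * np.arctan(ipd_m / (2 * distance_m))

def perceived_distance(distance_m, offset_deg=0.2, ipd_m=0.064):
    """Distance implied by the vergence angle after removing a constant,
    device-specific outward offset, i.e. treating the vergence-accommodation
    conflict as a fixed vergence bias."""
    theta = vergence_angle(distance_m, ipd_m) - np.deg2rad(offset_deg)
    return ipd_m / (2 * np.tan(theta / 2))

for d in (0.3, 0.5, 0.8, 1.2):
    print(f"target {d:.1f} m -> perceived ~{perceived_distance(d):.3f} m")
```

With these placeholder numbers the overestimation grows with distance (roughly 0.5 cm at 0.3 m and 8 cm at 1.2 m), which is the qualitative pattern a constant angular offset predicts.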
Affiliation(s)
- Xiaoye Michael Wang
- Faculty of Kinesiology & Physical Education, University of Toronto, Toronto, ON, Canada
- Daniel Southwick
- Synaesthetic Media Lab, Toronto Metropolitan University, Toronto, ON, Canada
- Ian Robinson
- Synaesthetic Media Lab, Toronto Metropolitan University, Toronto, ON, Canada
- Michael Nitsche
- Ivan Allen College of Liberal Arts, Georgia Institute of Technology, Atlanta, GA, USA
- Gabby Resch
- Faculty of Business and Information Technology, Ontario Tech University, Oshawa, ON, Canada
- Ali Mazalek
- Synaesthetic Media Lab, Toronto Metropolitan University, Toronto, ON, Canada
- Timothy N Welsh
- Faculty of Kinesiology & Physical Education, University of Toronto, Toronto, ON, Canada
10
Louis B, Tigran G. Dynamic control of defocus, astigmatism, and tilt aberrations with a large area foveal liquid crystal lens. Applied Optics 2024; 63:2798-2805. [PMID: 38856374] [DOI: 10.1364/ao.517797]
Abstract
We have recently reported the dynamic adjustment of the focal length in an electrically tunable liquid crystal "foveal" lens, the center of which can be shifted over a large working area. In the present work, we show that this design also allows the independent generation of astigmatism with an arbitrary axis, as well as tilt of the light wavefront, simply by changing the phase and voltage differences between four control electrodes. Furthermore, we demonstrate the capability of generating highly localized negative (defocusing) lenses with the same device by using a dual-frequency liquid crystal.
11
Han S, Kim S, Jung JH. The effect of visual rivalry in peripheral head-mounted displays on mobility. Sci Rep 2023; 13:20199. [PMID: 37980436] [PMCID: PMC10657352] [DOI: 10.1038/s41598-023-47427-8]
Abstract
Recent head-mounted displays and smart glasses use vision multiplexing, an optical approach where two or more views are superimposed on each other. In vision multiplexing, augmented information is presented over an observer's natural field of view, providing field expansion and critical information during mobility situations like walking and driving. Yet despite its utility, vision multiplexing may produce visual rivalry, a phenomenon where perception alternates between the augmented information and the background scene for seconds at a time. To investigate, we compared the effect of different peripheral vision multiplexing configurations (unilateral opaque, unilateral see-through and bilateral see-through) on the detection of augmented information, incorporating at the same time real-world characteristics (target eccentricity, depth condition, and gaze movement) for a more realistic assessment. Results showed a persistently lower target detection rate in unilateral configurations than the bilateral configuration, suggesting a larger effect of binocular rivalry on target visibility. Nevertheless, this effect does become attenuated when more naturalistic elements are incorporated, and we discuss recommendations for vision multiplexing design and possible avenues for further research.
Affiliation(s)
- Shui'er Han
- Institute for Infocomm Research, Agency for Science, Technology and Research (A*STAR), Singapore, Singapore
- Sujin Kim
- Department of Ophthalmology, Harvard Medical School, Schepens Eye Research Institute of Massachusetts Eye and Ear, Boston, MA, USA
- Jae-Hyun Jung
- Department of Ophthalmology, Harvard Medical School, Schepens Eye Research Institute of Massachusetts Eye and Ear, Boston, MA, USA
12
Ghasemi F, Harris LR, Jörges B. Simulated eye height impacts size perception differently depending on real-world posture. Sci Rep 2023; 13:20075. [PMID: 37974023] [PMCID: PMC10654384] [DOI: 10.1038/s41598-023-47364-6]
Abstract
Changes in perceived eye height influence visually perceived object size in both the real world and in virtual reality. In virtual reality, conflicts can arise between the eye height in the real world and the eye height simulated in a VR application. We hypothesized that participants would be influenced more by variation in simulated eye height when they had a clear expectation about their eye height in the real world, such as when sitting or standing, and less so when they did not have a clear estimate of the distance between their eyes and the real-life ground plane, e.g., when lying supine. Using virtual reality, 40 participants compared the height of a red square simulated at three different distances (6, 12, and 18 m) against the length of a physical stick (38.1 cm) held in their hands. They completed this task in all combinations of four real-life postures (supine, sitting, standing, standing on a table) and three simulated eye heights that corresponded to each participant's real-world eye height (on average, 123 cm sitting, 161 cm standing, and 201 cm on the table). Confirming previous results, the square's perceived size varied inversely with simulated eye height. Variations in simulated eye height affected participants' perception of size significantly more when sitting than in the other postures (supine, standing, standing on a table). This shows that real-life posture can influence the perception of size in VR. However, since simulated eye height did not affect size estimates less in the supine than in the standing position, our hypothesis that humans would be more influenced by variations in eye height when they had a reliable estimate of the distance between their eyes and the ground plane in the real world was not fully confirmed.
Affiliation(s)
- Fatemeh Ghasemi
- Center for Vision Research, York University, 4700 Keele Street, Toronto, ON, M3J 1P3, Canada
- Laurence R Harris
- Center for Vision Research, York University, 4700 Keele Street, Toronto, ON, M3J 1P3, Canada
- Björn Jörges
- Center for Vision Research, York University, 4700 Keele Street, Toronto, ON, M3J 1P3, Canada
13
Ye X, Fan F, Wen S. Cascaded transflective liquid crystal planar lenses enable multi-plane augmented reality. Optics Letters 2023; 48:5919-5922. [PMID: 37966752] [DOI: 10.1364/ol.503343]
Abstract
In this Letter, we report and experimentally demonstrate multi-plane augmented reality (AR) achieved by combining a reflective polarization volume lens (PVL) with an electrically controlled transmissive Pancharatnam-Berry (PB) liquid crystal (LC) lens. The strategy is based on electrically switching the optical power, which significantly alleviates the vergence-accommodation conflict (VAC) of current near-eye displays (NEDs). As a proof of concept, a birdbath-architecture dual-plane optical see-through (OST) display was implemented experimentally by changing the power of the lens. The proposed method is expected to enable a compact, light, and fatigue-free NED.
14
Cooper EA. The Perceptual Science of Augmented Reality. Annu Rev Vis Sci 2023; 9:455-478. [PMID: 36944311] [DOI: 10.1146/annurev-vision-111022-123758]
Abstract
Augmented reality (AR) systems aim to alter our view of the world and enable us to see things that are not actually there. The resulting discrepancy between perception and reality can create compelling entertainment and can support innovative approaches to education, guidance, and assistive tools. However, building an AR system that effectively integrates with our natural visual experience is hard. AR systems often suffer from visual limitations and artifacts, and addressing these flaws requires basic knowledge of perception. At the same time, AR system development can serve as a catalyst that drives innovative new research in perceptual science. This review describes recent perceptual research pertinent to and driven by modern AR systems, with the goal of highlighting thought-provoking areas of inquiry and open questions.
Affiliation(s)
- Emily A Cooper
- Herbert Wertheim School of Optometry & Vision Science, Helen Wills Neuroscience Institute, University of California, Berkeley, California, USA
15
Srinivasan S, Tripathi AB, Suryakumar R. Evolution of operating microscopes and development of 3D visualization systems for intraocular surgery. J Cataract Refract Surg 2023; 49:988-995. [PMID: 37144641] [DOI: 10.1097/j.jcrs.0000000000001216]
Abstract
The recent development of high-resolution, heads-up, 3D visualization microscopy systems has provided new technical and visualization options for ophthalmic surgeons. In this review, we explore the evolution of microscope technologies, the science behind modern 3D visualization microscopy systems, and the practical benefits (as well as disadvantages) that these systems provide over conventional microscopes for intraocular surgical practice. Overall, modern 3D visualization systems reduce the requirement for artificial illumination, provide enhanced visualization and resolution of ocular structures, improve ergonomics, and facilitate a superior educational experience. Even when considering their disadvantages, such as those related to technical feasibility, 3D visualization systems have an overall positive benefit/risk ratio. It is hoped these systems will be adopted into routine clinical practice, pending further clinical evidence on the benefits they may provide for clinical outcomes.
Affiliation(s)
- Sathish Srinivasan
- University Hospital Ayr, Ayr, Scotland, United Kingdom (Srinivasan); University of West of Scotland, Ayr, Scotland, United Kingdom (Srinivasan); Alcon Research LLC, Fort Worth, Texas (Tripathi, Suryakumar)
16
Fogt JS. Novel silicone elastomer contact lenses designed for simultaneous viewing of distance and near eye displays. Cont Lens Anterior Eye 2023; 46:101870. [PMID: 37277258] [PMCID: PMC10445844] [DOI: 10.1016/j.clae.2023.101870]
Abstract
SIGNIFICANCE As technology advances, there is a need for a safe and well-fitting contact lens that can be utilized to carry embedded components without concerns of decreasing oxygen permeability to the eye. PURPOSE The purpose of this study was to assess fitting characteristics, vision and performance of a novel ultra-high Dk silicone elastomer contact lens having a fully encapsulated two-state polarizing filter and a high-powered central lenslet that allows viewing at distance and viewing of a near eye display, while managing the concomitant high water vapor permeability of the material. METHODS Fifteen participants were fit with the silicone elastomer study lenses. Biomicroscopy was conducted before and after lens wear. Visual acuity with manifest refraction and visual acuity with an over-refraction while wearing the plano-powered study lenses were measured. Participants wore spectacles with micro-displays at the focal length of the lenslet on each eye. Lens fit was assessed, including ease of lens removal. Subjective assessments of viewing the micro-displays were completed on a 1 (unable) to 10 (immediate/profound/stable) scale. RESULTS Biomicroscopy revealed no eyes had moderate or severe corneal staining after study lens wear. Mean (± standard deviation) LogMAR acuity for all eyes was -0.13 (0.08) with best corrected refraction and -0.03 (0.06) with the study lenses and over-refraction. The mean spherical equivalent of the manifest refraction for both eyes was -3.12 D and was -2.75 D over the plano study lenses. Subjective assessments revealed a mean score of 7.67 (1.91) for ease of obtaining fusion, 8.47 (1.30) for ease of observing three-dimensional vision, and 8.27 (1.49) for stability of the fused binocular display vision. CONCLUSION The silicone elastomer study lenses with a two-state polarizing filter and central lenslet allow for vision at distance and on spectacle-mounted micro-displays.
17
Adhanom IB, MacNeilage P, Folmer E. Eye Tracking in Virtual Reality: a Broad Review of Applications and Challenges. Virtual Reality 2023; 27:1481-1505. [PMID: 37621305] [PMCID: PMC10449001] [DOI: 10.1007/s10055-022-00738-z]
Abstract
Eye tracking is becoming increasingly available in head-mounted virtual reality displays, with various headsets with integrated eye trackers already commercially available. The applications of eye tracking in virtual reality are highly diversified and span multiple disciplines. As a result, the number of peer-reviewed publications that study eye tracking applications has surged in recent years. We performed a broad review to comprehensively search academic literature databases with the aim of assessing the extent of published research dealing with applications of eye tracking in virtual reality, and of highlighting challenges, limitations and areas for future research.
Affiliation(s)
- Paul MacNeilage
- University of Nevada Reno, 1664 N Virginia St, Reno, NV 89557, USA
- Eelke Folmer
- University of Nevada Reno, 1664 N Virginia St, Reno, NV 89557, USA
18
Ebner C, Mohr P, Langlotz T, Peng Y, Schmalstieg D, Wetzstein G, Kalkofen D. Off-Axis Layered Displays: Hybrid Direct-View/Near-Eye Mixed Reality with Focus Cues. IEEE Transactions on Visualization and Computer Graphics 2023; 29:2816-2825. [PMID: 37027729] [DOI: 10.1109/tvcg.2023.3247077]
Abstract
This work introduces off-axis layered displays, the first approach to stereoscopic direct-view displays with support for focus cues. Off-axis layered displays combine a head-mounted display with a traditional direct-view display for encoding a focal stack and thus, for providing focus cues. To explore the novel display architecture, we present a complete processing pipeline for the real-time computation and post-render warping of off-axis display patterns. We build two prototypes using a head-mounted display in combination with a stereoscopic direct-view display, and with a more widely available monoscopic direct-view display. We also show how extending off-axis layered displays with an attenuation layer and with eye tracking can improve image quality. We thoroughly analyze each component in a technical evaluation and present examples captured through our prototypes.
19
Fernandes AS, Murdison TS, Proulx MJ. Leveling the Playing Field: A Comparative Reevaluation of Unmodified Eye Tracking as an Input and Interaction Modality for VR. IEEE Transactions on Visualization and Computer Graphics 2023; 29:2269-2279. [PMID: 37027619] [DOI: 10.1109/tvcg.2023.3247058]
Abstract
In this study, we establish a much-needed baseline for evaluating eye tracking interactions using an eye-tracking-enabled Meta Quest 2 VR headset with 30 participants. Each participant went through 1098 targets using multiple conditions representative of AR/VR targeting and selecting tasks, including both traditional standards and those more aligned with AR/VR interactions today. We use circular white world-locked targets, and an eye tracking system with sub-1-degree mean accuracy errors running at approximately 90 Hz. In a targeting and button press selection task, we, by design, compare completely unadjusted, cursor-less eye tracking with controller and head tracking, which both had cursors. Across all inputs, we presented targets in a configuration similar to the ISO 9241-9 reciprocal selection task and another format with targets more evenly distributed near the center. Targets were laid out either flat on a plane or tangent to a sphere and rotated toward the user. Even though we intended this to be a baseline study, we saw unmodified eye tracking, without any form of a cursor or feedback, outperform the head by 27.9% and perform comparably to the controller (5.63% decrease) in throughput. Eye tracking had improved subjective ratings relative to head in Ease of Use, Adoption, and Fatigue (66.4%, 89.8%, and 116.1% improvements, respectively) and had similar ratings relative to the controller (reductions of 4.2%, 8.9%, and 5.2%, respectively). Eye tracking had a higher miss percentage than controller and head (17.3% vs 4.7% vs 7.2%, respectively). Collectively, the results of this baseline study serve as a strong indicator that eye tracking, with even minor sensible interaction design modifications, has tremendous potential in reshaping interactions in next-generation AR/VR head-mounted displays.
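As background on the "throughput" figures quoted in this abstract, ISO 9241-9-style selection studies usually compute an effective throughput: an effective index of difficulty divided by movement time, with effective target width taken as 4.133 times the standard deviation of selection endpoints. The sketch below shows that standard calculation under those common assumptions; it describes the metric in general, not code or data from this paper, and the example numbers are placeholders.

```python
import numpy as np

def effective_throughput(distances, movement_times, endpoint_sd):
    """ISO 9241-9-style effective throughput (bits/s).

    distances:      nominal target distances per condition (any consistent unit)
    movement_times: mean movement times per condition (seconds)
    endpoint_sd:    SD of selection endpoints along the task axis (same unit as distances)
    """
    we = 4.133 * np.asarray(endpoint_sd, dtype=float)             # effective target width
    ide = np.log2(np.asarray(distances, dtype=float) / we + 1.0)  # effective index of difficulty (bits)
    return float(np.mean(ide / np.asarray(movement_times, dtype=float)))

# Example with three made-up conditions (distances in degrees of visual angle):
print(effective_throughput([10, 15, 20], [0.55, 0.62, 0.70], [0.9, 1.1, 1.3]))
```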
20
Wolffsohn JS, Lingham G, Downie LE, Huntjens B, Inomata T, Jivraj S, Kobia-Acquah E, Muntz A, Mohamed-Noriega K, Plainis S, Read M, Sayegh RR, Singh S, Utheim TP, Craig JP. TFOS Lifestyle: Impact of the digital environment on the ocular surface. Ocul Surf 2023; 28:213-252. [PMID: 37062428] [DOI: 10.1016/j.jtos.2023.04.004]
Abstract
Eye strain when performing tasks reliant on a digital environment can cause discomfort, affecting productivity and quality of life. Digital eye strain (the preferred terminology) was defined as "the development or exacerbation of recurrent ocular symptoms and/or signs related specifically to digital device screen viewing". Digital eye strain prevalence of up to 97% has been reported, a figure that reflects the lack of a previously agreed definition and diagnostic criteria and the limitations of current questionnaires, which fail to differentiate such symptoms from those arising from non-digital tasks. Objective signs such as blink rate or critical flicker frequency changes are not 'diagnostic' of digital eye strain nor validated as sensitive. The mechanisms attributed to ocular surface disease exacerbation are mainly reduced blink rate and completeness, partial/uncorrected refractive error and/or underlying binocular vision anomalies, together with the cognitive demand of the task and differences in position, size, brightness and glare compared to an equivalent non-digital task. In general, interventions are not well established; patients experiencing digital eye strain should be provided with a full refractive correction for the appropriate working distances. Improving blinking, optimizing the work environment and encouraging regular breaks may help. Based on current, best evidence, blue-light blocking interventions do not appear to be an effective management strategy. More and larger clinical trials are needed to assess artificial tear effectiveness for relieving digital eye strain, particularly comparing different constituents; a systematic review within the report identified the use of secretagogues and warm compress/humidity goggles/ambient humidifiers as promising strategies, along with nutritional supplementation (such as omega-3 fatty acid supplementation and berry extracts).
Affiliation(s)
- James S Wolffsohn
- College of Health & Life Sciences, School of Optometry, Aston University, Birmingham, UK; Department of Ophthalmology, New Zealand National Eye Centre, The University of Auckland, Auckland, New Zealand
- Gareth Lingham
- Centre for Eye Research Ireland, Technological University Dublin, Dublin, Ireland
- Laura E Downie
- Department of Optometry and Vision Sciences, The University of Melbourne, Parkville, Victoria, Australia
- Byki Huntjens
- Division of Optometry and Visual Sciences, City, University of London, EC1V 0HB, UK
- Takenori Inomata
- Department of Ophthalmology, Juntendo University Graduate School of Medicine, Bunkyo-ku, Tokyo, Japan
- Saleel Jivraj
- College of Health & Life Sciences, School of Optometry, Aston University, Birmingham, UK
- Alex Muntz
- Department of Ophthalmology, New Zealand National Eye Centre, The University of Auckland, Auckland, New Zealand
- Karim Mohamed-Noriega
- Department of Ophthalmology, University Hospital and Faculty of Medicine, Autonomous University of Nuevo León (UANL), Monterrey, 64460, Mexico
- Sotiris Plainis
- College of Health & Life Sciences, School of Optometry, Aston University, Birmingham, UK; Laboratory of Optics and Vision, School of Medicine, University of Crete, Greece
- Michael Read
- Division of Pharmacy and Optometry, The University of Manchester, Manchester, UK
- Rony R Sayegh
- Cole Eye Institute, Cleveland Clinic, Cleveland, OH, USA
- Sumeer Singh
- Department of Optometry and Vision Sciences, The University of Melbourne, Parkville, Victoria, Australia
- Tor P Utheim
- Department of Ophthalmology, Oslo University Hospital, Oslo, Norway
- Jennifer P Craig
- College of Health & Life Sciences, School of Optometry, Aston University, Birmingham, UK; Department of Ophthalmology, New Zealand National Eye Centre, The University of Auckland, Auckland, New Zealand
21
Combe T, Chardonnet JR, Merienne F, Ovtcharova J. CAVE and HMD: distance perception comparative study. Virtual Reality 2023; 27:1-11. [PMID: 37360808] [PMCID: PMC10054200] [DOI: 10.1007/s10055-023-00787-y]
Abstract
This paper proposes to analyse user experience using two different immersive device categories: a cave automatic virtual environment (CAVE) and a head-mounted display (HMD). While most past studies focused on one of these devices to characterize user experience, we propose to fill the gap in comparative studies by conducting investigations using both devices, considering the same application, method and analysis. Through this study, we want to highlight the differences in user experience induced when using either one of these technologies in terms of visualization and interaction. We performed two experiments, each focusing on a specific aspect of the devices employed. The first one is related to distance perception when walking and the possible influence of the HMD's weight, which does not occur with CAVE systems as they do not require wearing any heavy equipment. Past studies found that weight may impact distance perception. Several walking distances were considered. Results revealed that the HMD's weight does not induce significant differences over short distances (above three meters). In the second experiment, we focused on distance perception over short distances. We considered that the HMD's screen being closer to the user's eyes than in CAVE systems might induce substantial distance perception differences, especially for short-distance interaction. We designed a task in which users had to move an object from one place to another at several distances using the CAVE and an HMD. Results revealed significant underestimation compared to reality as in past work, but no significant differences between the immersive devices. These results provide a better understanding of the differences between the two emblematic virtual reality displays.
Affiliation(s)
- Théo Combe
- Arts et Métiers Institute of Technology, LISPEN, HESAM Université, UBFC, F-71100, 2 Rue Thomas Dumorey, 71100 Chalon-sur-Saône, France
- Jean-Rémy Chardonnet
- Arts et Métiers Institute of Technology, LISPEN, HESAM Université, UBFC, F-71100, 2 Rue Thomas Dumorey, 71100 Chalon-sur-Saône, France
- Frédéric Merienne
- Arts et Métiers Institute of Technology, LISPEN, HESAM Université, UBFC, F-71100, 2 Rue Thomas Dumorey, 71100 Chalon-sur-Saône, France
- Jivka Ovtcharova
- IMI, Karlsruhe Institute of Technology, Kriegsstraße 77, 76133 Karlsruhe, Germany
22
Feasibility and Acceptance of Augmented and Virtual Reality Exergames to Train Motor and Cognitive Skills of Elderly. Computers 2023. [DOI: 10.3390/computers12030052]
Abstract
The GAME2AWE platform aims to provide a versatile tool for elderly fall prevention through exergames that integrate exercises and simulate real-world environments and situations to train balance and reaction time using augmented and virtual reality technologies. To lay out the research area of interest, a review of the literature on systems that provide exergames for the elderly using such technologies was conducted. The proposed use of augmented reality exergames on mobile devices as a complement to the traditional Kinect-based approach has been examined in the past with younger individuals in the context of physical activity interventions, but has not been studied adequately as an exergame tool for the elderly. An evaluation study was conducted with seniors, using multiple measuring scales to assess aspects such as usability, tolerability, applicability, and technology acceptance. In particular, the Unified Theory of Acceptance and Use of Technology (UTAUT) model was used to assess acceptance and identify factors that influence the seniors' intention to use the game platform in the long term, while the correlation between UTAUT factors was also investigated. The results, drawing on both qualitative and quantitative data, indicate a positive assessment of the above user experience aspects.
23
Aizenman AM, Koulieris GA, Gibaldi A, Sehgal V, Levi DM, Banks MS. The Statistics of Eye Movements and Binocular Disparities during VR Gaming: Implications for Headset Design. ACM Transactions on Graphics 2023; 42:7. [PMID: 37122317] [PMCID: PMC10139447] [DOI: 10.1145/3549529]
Abstract
The human visual system evolved in environments with statistical regularities. Binocular vision is adapted to these regularities such that depth perception and eye movements are more precise, faster, and performed more comfortably in environments consistent with them. We measured the statistics of eye movements and binocular disparities in virtual-reality (VR) gaming environments and found that they are quite different from those in the natural environment. Fixation distance and direction are more restricted in VR, and fixation distance is farther. The pattern of disparity across the visual field is less regular in VR and does not conform to a prominent property of naturally occurring disparities. From this, we predict that double vision is more likely in VR than in the natural environment. We also determined the optimal screen distance to minimize discomfort due to the vergence-accommodation conflict, and the optimal nasal-temporal positioning of head-mounted display (HMD) screens to maximize binocular field of view. Finally, in a user study we investigated how VR content affects comfort and performance. Content that is more consistent with the statistics of the natural world yields less discomfort than content that is not. Furthermore, consistent content yields slightly better performance than inconsistent content.
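The "optimal screen distance" question can be made concrete with a small calculation: express both the fixation distance and the headset's focal (screen) distance in diopters and look at the residual conflict. The Python sketch below does this for a set of placeholder fixation distances and a 0.5 D comfort zone; the fixation values, comfort-zone width, and function names are assumptions for illustration, not numbers from the paper.

```python
import numpy as np

# Illustrative fixation distances in metres; the paper derives these from
# measured VR-gaming statistics, the values here are placeholders.
fixations_m = np.array([0.6, 0.9, 1.2, 1.8, 2.5, 4.0, 8.0])

def mean_conflict_d(screen_m, fixations_m, comfort_zone_d=0.5):
    """Mean vergence-accommodation conflict (diopters) beyond a comfort zone.

    Conflict for one fixation = |1/fixation - 1/screen|, i.e. vergence demand
    vs. the fixed accommodation demand of the display's focal plane.
    """
    conflict = np.abs(1.0 / fixations_m - 1.0 / screen_m)
    return np.mean(np.maximum(conflict - comfort_zone_d, 0.0))

candidates_m = np.linspace(0.5, 3.0, 26)
best = min(candidates_m, key=lambda d: mean_conflict_d(d, fixations_m))
print(f"screen distance minimizing residual conflict: ~{best:.2f} m")
```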
24
Hsiao CY, Kuo CC, Liou YA, Wang MJ. Determining Work-Rest Schedules for Visual Tasks That Use Optical Head-Mounted Displays Based on Visual Fatigue and Visually Induced Motion Sickness Recovery. International Journal of Environmental Research and Public Health 2023; 20:1880. [PMID: 36767244] [PMCID: PMC9914630] [DOI: 10.3390/ijerph20031880]
Abstract
This study aimed to determine work-rest schedules for visual tasks of different lengths by evaluating visual fatigue and visually induced motion sickness (VIMS) using an optical head-mounted display (OHMD). Thirty participants were recruited to perform 15 and 30 min visual tasks using an OHMD. After completing each visual task, six levels of rest time were evaluated. Critical flicker fusion frequency (CFF) values, relative electroencephalography indices, and Simulator Sickness Questionnaire (SSQ) scores were collected and analyzed. Results indicated that after completing the 15 and 30 min visual tasks, participants experienced visual fatigue and VIMS. There was no significant difference between baseline CFF values, four electroencephalography relative power index values, and SSQ scores when participants completed a 15 min visual task followed by a 20 min rest or a 30 min visual task followed by a 30 min rest. Based on our results, a 20 min rest for visual fatigue and VIMS recovery after a 15 min visual task on an OHMD and a 25 min rest for visual fatigue and VIMS recovery after a 30 min visual task on an OHMD are recommended. This study suggests a work-rest schedule for OHMDs that can be used as a reference for OHMD user guidelines to reduce visual fatigue and visually induced motion sickness.
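As background on the "relative electroencephalography indices" mentioned here, a relative power index is typically a frequency band's power divided by total power over a broad range. The Python sketch below computes such indices from a Welch spectrum; the band edges, sampling rate, and 1-40 Hz normalisation range are common choices assumed for illustration, not necessarily those used in the study.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz; illustrative band edges

def relative_band_power(eeg, fs=256.0):
    """Relative EEG power per band: band power / total power in 1-40 Hz.

    eeg: 1-D array of samples from one channel; fs: sampling rate in Hz.
    Summing PSD bins is proportional to integrated power because the
    frequency bins from welch() are evenly spaced.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    total_mask = (freqs >= 1) & (freqs <= 40)
    total = np.sum(psd[total_mask])
    return {name: np.sum(psd[(freqs >= lo) & (freqs < hi)]) / total
            for name, (lo, hi) in BANDS.items()}
```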
Affiliation(s)
- Chih-Yu Hsiao
- Department of Industrial Engineering and Engineering Management, National Tsing Hua University, Hsinchu City 30013, Taiwan
- Chia-Chen Kuo
- Department of Industrial Engineering and Management, National Chin-Yi University of Technology, Taichung City 41170, Taiwan
- Yi-An Liou
- Department of Industrial Engineering and Engineering Management, National Tsing Hua University, Hsinchu City 30013, Taiwan
- Mao-Jiun Wang
- Department of Industrial Engineering and Enterprise Information, Tunghai University, Taichung City 40704, Taiwan
25
Jiang J, Zhang J, Sun J, Wu D, Xu S. User's image perception improved strategy and application of augmented reality systems in smart medical care: A review. Int J Med Robot 2023; 19:e2497. [PMID: 36629798] [DOI: 10.1002/rcs.2497]
Abstract
BACKGROUND Augmented reality (AR) is a new human-computer interaction technology that combines virtual reality, computer vision, and computer networks. With the rapid advancement of the medical field towards intelligence and data visualisation, AR systems are becoming increasingly popular in medicine because they can provide doctors with sufficiently clear medical images and accurate image navigation in practical applications. However, different display types of AR systems have different effects on doctors' perception of the image after virtual-real fusion in actual medical applications. If doctors cannot correctly perceive the image, they may be unable to correctly match the virtual information with the real world, which will have a significant impact on their ability to recognise complex structures. METHODS This paper uses CiteSpace, a literature analysis tool, to visualise and analyse research hotspots for AR systems in the medical field. RESULTS A visual analysis of the 1163 articles retrieved from the Web of Science Core Collection database reveals that display technology and visualisation technology are currently the key research directions for AR systems. CONCLUSION This paper categorises AR systems based on their display principles, reviews current image perception optimisation schemes for the various types of systems, and analyses and compares different display types of AR systems based on their practical applications in smart medical care, so that doctors can select the appropriate display type for different application scenarios. Finally, the future development direction of AR display technology is discussed so that AR technology can be applied more effectively in smart medical care. The advancement of display technology is critical for the use of AR systems in the medical field, and the advantages and disadvantages of the various display types should be weighed in each application scenario to select the most suitable AR system.
Collapse
Affiliation(s)
- Jingang Jiang
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China; Robotics & Its Engineering Research Center, Harbin University of Science and Technology, Harbin, Heilongjiang, China
| | - Jiawei Zhang
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
| | - Jianpeng Sun
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
| | - Dianhao Wu
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
| | - Shuainan Xu
- Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
| |
Collapse
|
26
|
Bruno RR, Wolff G, Wernly B, Masyuk M, Piayda K, Leaver S, Erkens R, Oehler D, Afzal S, Heidari H, Kelm M, Jung C. Virtual and augmented reality in critical care medicine: the patient's, clinician's, and researcher's perspective. Crit Care 2022; 26:326. [PMID: 36284350 PMCID: PMC9593998 DOI: 10.1186/s13054-022-04202-x] [Citation(s) in RCA: 69] [Impact Index Per Article: 23.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2022] [Accepted: 10/12/2022] [Indexed: 11/09/2022] Open
Abstract
Virtual reality (VR) and augmented reality (AR) are aspiring, new technologies with increasing use in critical care medicine. While VR fully immerses the user into a virtual three-dimensional space, AR adds overlaid virtual elements into a real-world environment. VR and AR offer great potential to improve critical care medicine for patients, relatives and health care providers. VR may help to ameliorate anxiety, stress, fear, and pain for the patient. It may assist patients in mobilisation and rehabilitation and can improve communication between all those involved in the patient's care. AR can be an effective tool to support continuous education of intensive care medicine providers, and may complement traditional learning methods to acquire key practical competences such as central venous line placement, cardiopulmonary resuscitation, extracorporeal membrane oxygenation device management or endotracheal intubation. Currently, technical, human, and ethical challenges remain. The adaptation and integration of VR/AR modalities into useful clinical applications that can be used routinely on the ICU is challenging. Users may experience unwanted side effects (so-called "cybersickness") during VR/AR sessions, which may limit its applicability. Furthermore, critically ill patients are one of the most vulnerable patient groups and warrant special ethical considerations if new technologies are to be introduced into their daily care. To date, most studies involving AR/VR in critical care medicine provide only a low level of evidence due to their research design. Here we summarise background information, current developments, and key considerations that should be taken into account for future scientific investigations in this field.
Collapse
Affiliation(s)
- Raphael Romano Bruno
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Georg Wolff
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Bernhard Wernly
- Department of Internal Medicine, General Hospital Oberndorf, Teaching Hospital of the Paracelsus Medical University Salzburg, Paracelsusstraße 37, 5110 Oberndorf, Salzburg, Austria; Center for Public Health and Healthcare Research, Paracelsus Medical University Salzburg, 5020 Salzburg, Austria
| | - Maryna Masyuk
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Kerstin Piayda
- Department of Cardiology and Angiology, Universitätsklinikum Gießen und Marburg, 35391 Giessen, Germany
| | - Susannah Leaver
- General Intensive Care, St George’s University Hospitals NHS Foundation Trust, London, UK
| | - Ralf Erkens
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Daniel Oehler
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Shazia Afzal
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Houtan Heidari
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| | - Malte Kelm
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany; CARID, Cardiovascular Research Institute Duesseldorf, 40225 Düsseldorf, Germany
| | - Christian Jung
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine-University Düsseldorf, 40225 Düsseldorf, Germany
| |
Collapse
|
27
|
Xu H, Tabata S, Liang H, Wang L, Ishikawa M. Accurate measurement of virtual image distance for near-eye displays based on auto-focusing. APPLIED OPTICS 2022; 61:9093-9098. [PMID: 36607038 DOI: 10.1364/ao.472931] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/12/2022] [Accepted: 09/29/2022] [Indexed: 06/17/2023]
Abstract
Virtual reality (VR) and augmented reality (AR) are able to project virtual images to human eyes at a certain depth distance. This virtual image distance can be adjusted by controlling the diopter of the near-eye display. However, it is difficult to measure accurately and continuously since this virtual image distance spans a large range. In this work, we propose a method to accurately determine the virtual image distance of commercial VR/AR equipment. The measurement apparatus is built and calibrated to validate the feasibility. The focal distance of the focus-tunable lens can be automatically adjusted via a step motor by cooperating with the image sharpness analyzing program. Compared with other proposed methods, ours provides an effective means to achieve high accuracy, a wide and continuous testing range, and automatic evaluation of virtual image distance for compact near-eye displays.
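For background (elementary geometrical optics, not a detail of the authors' apparatus): virtual image distance and the display's optical power in diopters are reciprocals, which is why small diopter changes near zero correspond to very large distance changes and make continuous, accurate measurement demanding:

\[ d = \frac{1}{D}, \qquad \text{e.g. } D = 0.5\ \mathrm{D} \Rightarrow d = 2\ \mathrm{m}, \quad D = 0.25\ \mathrm{D} \Rightarrow d = 4\ \mathrm{m}. \]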
Collapse
|
28
|
Yamamoto Y, Shimobaba T, Ito T. HORN-9: Special-purpose computer for electroholography with the Hilbert transform. OPTICS EXPRESS 2022; 30:38115-38127. [PMID: 36258393 DOI: 10.1364/oe.471720] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/26/2022] [Accepted: 09/20/2022] [Indexed: 06/16/2023]
Abstract
Holography is a technology that uses light interference and diffraction to record and reproduce three-dimensional (3D) information. Using computers, holographic 3D scenes (electroholography) have been widely studied. Nevertheless, its practical application requires enormous computing power, and current computers have limitations in real-time processing. In this study, we show that holographic reconstruction (HORN)-9, a special-purpose computer for electroholography with the Hilbert transform, can compute a 1,920 × 1,080-pixel computer-generated hologram from a point cloud of 65,000 points in 0.030 s (33 fps) on a single card. This performance is 8, 7, and 170 times more efficient than a previously developed HORN-8, a graphics processing unit, and a central processing unit (CPU), respectively. We also demonstrated the real-time processing and display of 400,000 points on multiple HORN-9s, achieving an acceleration of 600 times with four HORN-9 units compared with a single CPU.
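To make the computational load concrete, a basic point-cloud computer-generated hologram is a superposition of spherical waves, one per object point. The sketch below is a textbook-style NumPy formulation offered only for illustration (it is not the Hilbert-transform pipeline implemented in HORN-9, and all names are ours); its cost scales as points × pixels, roughly 1.3 × 10^11 complex exponentials per frame for 65,000 points on a 1,920 × 1,080 grid, which is what motivates special-purpose hardware.

import numpy as np

def point_cloud_cgh(points, amplitudes, wavelength, pitch, res=(1080, 1920)):
    # Naive point-source CGH: superpose a spherical wave from each object point
    # onto a hologram of `res` pixels with pixel pitch `pitch` (metres), then
    # keep only the phase (phase-only hologram).
    k = 2 * np.pi / wavelength
    ys, xs = np.indices(res)
    xh = (xs - res[1] / 2) * pitch   # hologram-plane x coordinates (m)
    yh = (ys - res[0] / 2) * pitch   # hologram-plane y coordinates (m)
    field = np.zeros(res, dtype=np.complex128)
    for (x, y, z), a in zip(points, amplitudes):
        r = np.sqrt((xh - x) ** 2 + (yh - y) ** 2 + z ** 2)
        field += a * np.exp(1j * k * r)
    return np.angle(field)

# Example: 100 random points 10-20 cm from the hologram, 532 nm light, 8 um pitch
pts = np.random.uniform([-1e-3, -1e-3, 0.1], [1e-3, 1e-3, 0.2], size=(100, 3))
hologram = point_cloud_cgh(pts, np.ones(len(pts)), 532e-9, 8e-6)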
Collapse
|
29
|
Neural Research on Depth Perception and Stereoscopic Visual Fatigue in Virtual Reality. Brain Sci 2022; 12:brainsci12091231. [PMID: 36138967 PMCID: PMC9497221 DOI: 10.3390/brainsci12091231] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2022] [Revised: 09/04/2022] [Accepted: 09/07/2022] [Indexed: 11/29/2022] Open
Abstract
Virtual reality (VR) technology provides highly immersive depth perception experiences; nevertheless, stereoscopic visual fatigue (SVF) has become an important factor currently hindering the development of VR applications. However, there is scant research on the underlying neural mechanism of SVF, especially SVF induced by VR displays. In this paper, a Go/NoGo paradigm based on disparity variations is proposed to induce SVF associated with depth perception, and the underlying neural mechanism of SVF in a VR environment was investigated. The effects of disparity variations and SVF on the temporal characteristics of visual evoked potentials (VEPs) were explored. Point-by-point permutation statistics with repeated-measures ANOVA revealed that the amplitudes and latencies of the posterior VEP component P2 were modulated by disparity, and that posterior P2 amplitudes were modulated differently by SVF in different depth perception situations. Cortical source localization analysis was performed to identify the cortical areas associated with particular fatigue levels and disparities; the results showed that the posterior P2 generated from the precuneus could represent depth perception in binocular vision and could therefore be used to distinguish SVF induced by disparity variations. Our findings could help extend the understanding of the neural mechanisms underlying depth perception and SVF and provide useful information for improving the visual experience in VR applications.
Collapse
|
30
|
Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022; 8:jimaging8070203. [PMID: 35877647 PMCID: PMC9318659 DOI: 10.3390/jimaging8070203] [Citation(s) in RCA: 22] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2022] [Revised: 07/15/2022] [Accepted: 07/18/2022] [Indexed: 02/01/2023] Open
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy that described the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial surgeries (n=8). For preoperative input data, computed tomography (CT) (n=34), and surface rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly directly superimposed with the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy in the order of 2–5 mm has been demonstrated in phantom models, several human factors and technical challenges—perception, ease of use, context, interaction, and occlusion—remain to be addressed prior to widespread adoption of OST-HMD led surgical navigation.
Collapse
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; (N.R.G.); (G.A.W.)
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Correspondence:
| | - Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; (N.R.G.); (G.A.W.)
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
| | - Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; (N.R.G.); (G.A.W.)
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
| |
Collapse
|
31
|
Kim D, Kim B, Shin B, Shin D, Lee CK, Chung JS, Seo J, Kim YT, Sung G, Seo W, Kim S, Hong S, Hwang S, Han S, Kang D, Lee HS, Koh JS. Actuating compact wearable augmented reality devices by multifunctional artificial muscle. Nat Commun 2022; 13:4155. [PMID: 35851053 PMCID: PMC9293895 DOI: 10.1038/s41467-022-31893-1] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2021] [Accepted: 07/06/2022] [Indexed: 11/17/2022] Open
Abstract
An artificial muscle actuator resolves practical engineering problems in compact wearable devices, which are otherwise limited by conventional actuators such as electromagnetic actuators. Abstracting the fundamental advantages of an artificial muscle actuator provides a small-scale, high-power actuating system with a sensing capability for developing varifocal augmented reality glasses and naturally fitting haptic gloves. Here, we design a shape memory alloy-based lightweight and high-power artificial muscle actuator, the so-called compliant amplified shape memory alloy actuator. Despite its light weight (0.22 g), the actuator has a high power density of 1.7 kW/kg and an actuation strain of 300% under an 80 g external payload. We show how the actuator enables image depth control and an immersive tactile response in the form of augmented reality glasses and two-way communication haptic gloves whose thin form factor and high power density can hardly be achieved by conventional actuators. Artificial muscle actuators enabled by responsive functional materials such as shape memory alloys are promising candidates for compact e-wearable devices. Here, the authors demonstrate augmented reality glasses and two-way communication haptic gloves capable of image depth control and immersive tactile response.
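As a simple sanity check on the headline figures (arithmetic from the numbers quoted in this abstract only), the stated power density and actuator mass imply a mechanical output on the order of a few hundred milliwatts:

\[ P \approx 1.7\ \mathrm{kW/kg} \times 0.22\ \mathrm{g} = 1700\ \mathrm{W/kg} \times 2.2\times10^{-4}\ \mathrm{kg} \approx 0.37\ \mathrm{W}. \]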
Collapse
Affiliation(s)
- Dongjin Kim
- Department of Mechanical Engineering, Ajou University, 206 Worldcup-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16499, Republic of Korea
| | - Baekgyeom Kim
- Department of Mechanical Engineering, Ajou University, 206 Worldcup-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16499, Republic of Korea
| | - Bongsu Shin
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung Electronics, 34, Seongchon-gil, Seocho-gu, Seoul, 06765, Republic of Korea
| | - Dongwook Shin
- Department of Mechanical Engineering, Ajou University, 206 Worldcup-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16499, Republic of Korea
| | - Chang-Kun Lee
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung Electronics, 34, Seongchon-gil, Seocho-gu, Seoul, 06765, Republic of Korea
| | - Jae-Seung Chung
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung Electronics, 34, Seongchon-gil, Seocho-gu, Seoul, 06765, Republic of Korea
| | - Juwon Seo
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung Electronics, 34, Seongchon-gil, Seocho-gu, Seoul, 06765, Republic of Korea
| | - Yun-Tae Kim
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung Electronics, 34, Seongchon-gil, Seocho-gu, Seoul, 06765, Republic of Korea
| | - Geeyoung Sung
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung Electronics, 34, Seongchon-gil, Seocho-gu, Seoul, 06765, Republic of Korea
| | - Wontaek Seo
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea
| | - Sunil Kim
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea
| | - Sunghoon Hong
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea
| | - Sungwoo Hwang
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Samsung SDS, 125, Olympic-ro, 35-gil, Songpa-gu, Seoul, 05510, Republic of Korea
| | - Seungyong Han
- Department of Mechanical Engineering, Ajou University, 206 Worldcup-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16499, Republic of Korea.
| | - Daeshik Kang
- Department of Mechanical Engineering, Ajou University, 206 Worldcup-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16499, Republic of Korea.
| | - Hong-Seok Lee
- Samsung Advanced Institute of Technology, Samsung Electronics, 130 Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16678, Republic of Korea; Department of Electrical and Computer Engineering, Seoul National University, 1, Gwanak-ro, Gwanak-gu, Seoul, 08826, Republic of Korea.
| | - Je-Sung Koh
- Department of Mechanical Engineering, Ajou University, 206 Worldcup-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 16499, Republic of Korea.
| |
Collapse
|
32
|
Shi X, Xue Z, Ma S, Wang B, Liu Y, Wang Y, Song W. Design of a dual focal-plane near-eye display using diffractive waveguides and multiple lenses. APPLIED OPTICS 2022; 61:5844-5849. [PMID: 36255821 DOI: 10.1364/ao.461300] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/15/2022] [Accepted: 06/15/2022] [Indexed: 06/16/2023]
Abstract
We propose a method to construct a compact dual focal-plane optical see-through near-eye display using diffractive waveguides and multiple lenses. A virtual image from a display device is projected into a three-grating waveguide using an objective lens, and a virtual image can be shown at a far distance with an extended eye box. One negative lens is employed to reduce the focus distance of the virtual image, and a corresponding positive lens is used to compensate for the distortion and accommodation errors. Thus, not only can a near-distance virtual image be achieved, but a second virtual plane at a farther distance can also be generated by introducing another projection module and waveguide. Only two waveguides and two lenses are used in front of one eye, keeping the assembly lightweight. To verify the proposed method, a proof-of-concept prototype was developed to provide vivid virtual images at different depths in front of the human eye.
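In simple thin-lens terms (our simplified reading of the design idea, ignoring lens separations, and not the authors' exact prescription): the waveguide delivers an essentially collimated image, so a negative lens of focal length f_neg < 0 on the eye side re-images it to a virtual plane at distance |f_neg|, while a matched positive lens on the world side leaves the see-through view nearly unaffected because the net power roughly cancels:

\[ d_{\mathrm{virtual}} \approx |f_{\mathrm{neg}}|, \qquad \frac{1}{f_{\mathrm{neg}}} + \frac{1}{f_{\mathrm{pos}}} \approx 0. \]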
Collapse
|
33
|
Yin K, Hsiang EL, Zou J, Li Y, Yang Z, Yang Q, Lai PC, Lin CL, Wu ST. Advanced liquid crystal devices for augmented reality and virtual reality displays: principles and applications. LIGHT, SCIENCE & APPLICATIONS 2022; 11:161. [PMID: 35637183 PMCID: PMC9151772 DOI: 10.1038/s41377-022-00851-3] [Citation(s) in RCA: 75] [Impact Index Per Article: 25.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/08/2022] [Revised: 05/04/2022] [Accepted: 05/14/2022] [Indexed: 05/20/2023]
Abstract
Liquid crystal displays (LCDs) and photonic devices play a pivotal role in augmented reality (AR) and virtual reality (VR). The recently emerging high-dynamic-range (HDR) mini-LED backlit LCDs significantly boost the image quality and brightness and reduce the power consumption for VR displays. Such a light engine is particularly attractive for compensating the optical loss of the pancake structure to achieve compact and lightweight VR headsets. On the other hand, high-resolution-density, high-brightness liquid-crystal-on-silicon (LCoS) is a promising image source for see-through AR displays, especially under high ambient lighting conditions. Meanwhile, high-speed LCoS spatial light modulators open a new door for holographic displays and focal surface displays. Finally, ultrathin planar diffractive LC optical elements, such as geometric phase LC gratings and lenses, have found useful applications in AR and VR for enhancing resolution, widening the field-of-view, suppressing chromatic aberrations, creating multiple planes to overcome the vergence-accommodation conflict, and dynamic pupil steering to achieve gaze-matched Maxwellian displays, just to name a few. The operation principles, potential applications, and future challenges of these advanced LC devices will be discussed.
Collapse
Affiliation(s)
- Kun Yin
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - En-Lin Hsiang
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Junyu Zou
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Yannanqi Li
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Zhiyong Yang
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Qian Yang
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Po-Cheng Lai
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Chih-Lung Lin
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Shin-Tson Wu
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA.
| |
Collapse
|
34
|
Arefin MS, Phillips N, Plopski A, Gabbard JL, Swan JE. The Effect of Context Switching, Focal Switching Distance, Binocular and Monocular Viewing, and Transient Focal Blur on Human Performance in Optical See-Through Augmented Reality. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:2014-2025. [PMID: 35167470 DOI: 10.1109/tvcg.2022.3150503] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/14/2023]
Abstract
In optical see-through augmented reality (AR), information is often distributed between real and virtual contexts, and often appears at different distances from the user. To integrate information, users must repeatedly switch context and change focal distance. If the user's task is conducted under time pressure, they may attempt to integrate information while their eye is still changing focal distance, a phenomenon we term transient focal blur. Previously, Gabbard, Mehra, and Swan (2018) examined these issues, using a text-based visual search task on a one-eye optical see-through AR display. This paper reports an experiment that partially replicates and extends this task on a custom-built AR Haploscope. The experiment examined the effects of context switching, focal switching distance, binocular and monocular viewing, and transient focal blur on task performance and eye fatigue. Context switching increased eye fatigue but did not decrease performance. Increasing focal switching distance increased eye fatigue and decreased performance. Monocular viewing also increased eye fatigue and decreased performance. The transient focal blur effect resulted in additional performance decrements, and is an addition to knowledge about AR user interface design issues.
Collapse
|
35
|
Danyluk K, Ulusoy T, Wei W, Willett W. Touch and Beyond: Comparing Physical and Virtual Reality Visualizations. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:1930-1940. [PMID: 32915741 DOI: 10.1109/tvcg.2020.3023336] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
We compare physical and virtual reality (VR) versions of simple data visualizations and explore how the addition of virtual annotation and filtering tools affects how viewers solve basic data analysis tasks. We report on two studies, inspired by previous examinations of data physicalizations. The first study examines differences in how viewers interact with physical hand-scale, virtual hand-scale, and virtual table-scale visualizations and the impact that the different forms had on viewer's problem solving behavior. A second study examines how interactive annotation and filtering tools might support new modes of use that transcend the limitations of physical representations. Our results highlight challenges associated with virtual reality representations and hint at the potential of interactive annotation and filtering tools in VR visualizations.
Collapse
|
36
|
Ferrari V, Cattari N, Fontana U, Cutolo F. Parallax Free Registration for Augmented Reality Optical See-Through Displays in the Peripersonal Space. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:1608-1618. [PMID: 32881688 DOI: 10.1109/tvcg.2020.3021534] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
Egocentric augmented reality (AR) interfaces are quickly becoming a key asset for assisting high precision activities in the peripersonal space in several application fields. In these applications, accurate and robust registration of computer-generated information to the real scene is hard to achieve with traditional Optical See-Through (OST) displays given that it relies on the accurate calibration of the combined eye-display projection model. The calibration is required to efficiently estimate the projection parameters of the pinhole model that encapsulate the optical features of the display and whose values vary according to the position of the user's eye. In this article, we describe an approach that prevents any parallax-related AR misregistration at a pre-defined working distance in OST displays with infinity focus; our strategy relies on the use of a magnifier placed in front of the OST display, and features a proper parameterization of the virtual rendering camera achieved through a dedicated calibration procedure that accounts for the contribution of the magnifier. We model the registration error due to the viewpoint parallax outside the ideal working distance. Finally, we validate our strategy on an OST display, and we show that sub-millimetric registration accuracy can be achieved for working distances of ±100 mm around the focal length of the magnifier.
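As a first-order intuition for the parallax term (a similar-triangles estimate under simplifying assumptions, not the authors' full model): if registration is exact at working distance d_0 and the eye sits a lateral distance \delta away from the calibrated viewpoint, a real point at distance d is misregistered in its own plane by roughly

\[ e \approx \delta\,\frac{\lvert d - d_0 \rvert}{d_0}, \]

which vanishes at the working distance and grows linearly as the scene departs from it.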
Collapse
|
37
|
Lin CH, Lin HC, Chen CY, Lih CC. Variations in intraocular pressure and visual parameters before and after using mobile virtual reality glasses and their effects on the eyes. Sci Rep 2022; 12:3176. [PMID: 35210496 PMCID: PMC8873506 DOI: 10.1038/s41598-022-07090-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2021] [Accepted: 02/11/2022] [Indexed: 11/11/2022] Open
Abstract
We examined the effects of using mobile devices with immersive virtual reality for a short period on the physiological parameters of both eyes. The average age of the 50 participants (23 men and 27 women) was 17.72 ± 1.48 years, and refractive error ranged from 0 D to − 5.00 D. All the participants wore + 3.00 D glasses and underwent a 5-min relaxation adjustment through the atomization method. The participants wore immersive virtual reality (VR) glasses to watch a movie on a roller coaster for 10 min. Their relevant physiological parameters of the eyes were measured both before and after using VR glasses. Compared with before VR use, no significant difference (P > 0.05) was observed in the near-horizontal vergence and refractive error but a significant difference (P < 0.05) was observed in the amplitude of accommodation, intraocular pressure, divergence/convergence, and stereopsis after VR use. The corneal elastic coefficient was > 0.2 MPa, and we used Friedenwald’s eye rigidity relationship to obtain the K value (0.065–0.09). Approximately 10% of the participants experienced cybersickness symptoms such as nausea and dizziness. The use of VR to watch three-dimensional movies reduced intraocular pressure, which may help prevent or treat glaucoma. Moreover, the binocular convergence was higher when viewing near-field objects in VR than in the real world. Therefore, individuals with convergence excess may experience symptoms. Binocular parallax is the most likely cause of cybersickness symptoms. Thus, mobile VR devices with higher quality and comfort are necessary.
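For readers unfamiliar with the rigidity relationship invoked here, Friedenwald's equation is commonly written in the form below (textbook statement; the authors' K values of 0.065–0.09 may reflect a different parameterisation or units):

\[ \log P_2 - \log P_1 = K\,(V_2 - V_1), \]

where P is intraocular pressure, V is intraocular volume, and K is the ocular rigidity coefficient.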
Collapse
Affiliation(s)
- Ching-Huang Lin
- Department of Electronic Engineering, National Yunlin University of Science and Technology, Yunlin, Taiwan, 640
| | - Hsien-Chang Lin
- Graduate School of Engineering Science and Technology, National Yunlin University of Science and Technology, Yunlin, Taiwan, 640
| | - Chien-Yu Chen
- Graduate Institute of Color and Illumination Technology, National Taiwan University of Science and Technology, Taipei, Taiwan, 106
| | - Chong-Chung Lih
- Department of Optometry, Jenteh Junior College of Medicine, Nursing and Management, Miaoli, Taiwan, 35664.
| |
Collapse
|
38
|
Fatigue-free visual perception of high-density super-multiview augmented reality images. Sci Rep 2022; 12:2959. [PMID: 35194078 PMCID: PMC8863894 DOI: 10.1038/s41598-022-06778-4] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/03/2021] [Accepted: 02/07/2022] [Indexed: 11/09/2022] Open
Abstract
It is well known that wearing virtual reality (VR) and augmented reality (AR) devices for long periods can cause visual fatigue and motion sickness due to vergence-accommodation conflict (VAC). VAC is considered the main obstacle to the development of advanced three-dimensional VR and AR technology. In this paper, we present a novel AR high-density super-multiview (HDSMV) display technique capable of eliminating VAC over a wide range. The designed binocular time-sequential AR HDSMV projection, which delivers 11 views to each eye pupil, is experimentally demonstrated, confirming that VAC is eliminated over a wide range of viewer focus distances. It is believed that the proposed time-sequential AR HDSMV method will pave the way for the development of VAC-free AR technology.
Collapse
|
39
|
Stebryte M. Reflective optical components based on chiral liquid crystal for head-up displays. LIQUID CRYSTALS TODAY 2022. [DOI: 10.1080/1358314x.2021.2036431] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Affiliation(s)
- Migle Stebryte
- LCP Group, ELIS Department, Ghent University, Ghent, Belgium
| |
Collapse
|
40
|
Zhang S, Zhang Z, Liu J. Adjustable and continuous eyebox replication for a holographic Maxwellian near-eye display. OPTICS LETTERS 2022; 47:445-448. [PMID: 35103647 DOI: 10.1364/ol.438855] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/03/2021] [Accepted: 11/11/2021] [Indexed: 06/14/2023]
Abstract
A Maxwellian display presents always-focused images to the viewer, alleviating the vergence-accommodation conflict (VAC) in near-eye displays (NEDs). Recently, many methods of improving its limited eyebox have been proposed, among which viewpoint replication has attracted a lot of attention. However, double-image, blind-area, and image-shift effects always happen in typical eyebox-replication Maxwellian NEDs when the eye moves between the replicated viewpoints, which prevents these NEDs from being applied more widely. In this Letter, we propose a method for designing a holographic Maxwellian NED system with continuous eyebox replication as well as flexible interval adjustment by changing the projection angles of the reconstructed images. Thus, holograms corresponding to the positions of different viewpoints are calculated to match the interval of the replicated viewpoints with the human pupil diameter, making it possible to eliminate or alleviate double-image or blind-area effects. Also, seamless viewpoint conversion in the eyebox is achieved by aligning the images of adjacent viewpoints on the retina via hologram pre-processing independently. These effects are verified successfully in optical experiments and have the potential to be applied in near-eye three-dimensional displays without VAC.
Collapse
|
41
|
Doughty M, Ghugre NR. Head-Mounted Display-Based Augmented Reality for Image-Guided Media Delivery to the Heart: A Preliminary Investigation of Perceptual Accuracy. J Imaging 2022; 8:jimaging8020033. [PMID: 35200735 PMCID: PMC8878166 DOI: 10.3390/jimaging8020033] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/29/2021] [Revised: 01/25/2022] [Accepted: 01/28/2022] [Indexed: 01/14/2023] Open
Abstract
By aligning virtual augmentations with real objects, optical see-through head-mounted display (OST-HMD)-based augmented reality (AR) can enhance user-task performance. Our goal was to compare the perceptual accuracy of several visualization paradigms involving an adjacent monitor, or the Microsoft HoloLens 2 OST-HMD, in a targeted task, as well as to assess the feasibility of displaying imaging-derived virtual models aligned with the injured porcine heart. With 10 participants, we performed a user study to quantify and compare the accuracy, speed, and subjective workload of each paradigm in the completion of a point-and-trace task that simulated surgical targeting. To demonstrate the clinical potential of our system, we assessed its use for the visualization of magnetic resonance imaging (MRI)-based anatomical models, aligned with the surgically exposed heart in a motion-arrested open-chest porcine model. Using the HoloLens 2 with alignment of the ground truth target and our display calibration method, users were able to achieve submillimeter accuracy (0.98 mm) and required 1.42 min for calibration in the point-and-trace task. In the porcine study, we observed good spatial agreement between the MRI-models and target surgical site. The use of an OST-HMD led to improved perceptual accuracy and task-completion times in a simulated targeting task.
Collapse
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada;
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Correspondence:
| | - Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada;
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
| |
Collapse
|
42
|
Lee SH, Kim M, Kim H, Park CY. Visual fatigue induced by watching virtual reality device and the effect of anisometropia. ERGONOMICS 2021; 64:1522-1531. [PMID: 34270388 DOI: 10.1080/00140139.2021.1957158] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/25/2021] [Accepted: 07/14/2021] [Indexed: 06/13/2023]
Abstract
The effect of small anisometropia on visual fatigue when using virtual reality (VR) devices was investigated. Participants (n = 34) visited three times. In the first visit, VR exposure (10 min) was conducted with full correction of the refractive error of both eyes. Experimental anisometropia was induced by adding a +1.0 dioptre spherical lens either on the dominant eye in the second visit or on the non-dominant eye in the third visit. At each visit, the participants played a predetermined video game using a head-mounted display VR for 10 min. Visual fatigue was assessed before and after playing the VR game using the Virtual Reality Symptom Questionnaire (VRSQ) and the high-frequency component of accommodative microfluctuation. Results showed that watching VR induced a significant increase in VRSQ score, a significant decrease in maximum accommodation power, and an objective increase in visual fatigue. Inducing experimental anisometropia on either the dominant or the non-dominant eye did not aggravate visual fatigue. Practitioner summary: Mild differences in refractive error (up to 1.0 dioptre) between the two eyes do not significantly increase ocular fatigue when viewing a virtual reality device for 10 min. The impact of small anisometropia may therefore be limited when developing a virtual reality device. Abbreviations: VR: virtual reality; VRSQ: virtual reality symptom questionnaire; HMD: head-mounted display; HFC: high-frequency component.
Collapse
Affiliation(s)
- Sang Hyeok Lee
- Department of Ophthalmology, Dongguk University, Ilsan Hospital, Goyang, South Korea
| | - Martha Kim
- Department of Ophthalmology, Dongguk University, Ilsan Hospital, Goyang, South Korea
| | - Hyosun Kim
- Samsung Display, Display R&D center, Suwon, South Korea
| | - Choul Yong Park
- Department of Ophthalmology, Dongguk University, Ilsan Hospital, Goyang, South Korea
| |
Collapse
|
43
|
Johnson PB, Jackson A, Saki M, Feldman E, Bradley J. Patient posture correction and alignment using mixed reality visualization and the HoloLens 2. Med Phys 2021; 49:15-22. [PMID: 34780068 DOI: 10.1002/mp.15349] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/18/2021] [Revised: 10/08/2021] [Accepted: 11/02/2021] [Indexed: 12/17/2022] Open
Abstract
PURPOSE The purpose of this study was to develop and preliminarily test a radiotherapy system for patient posture correction and alignment using mixed reality (MixR) visualization. The write-up of this work also provides an opportunity to introduce the concepts and technology of MixR for a medical physics audience who may be unfamiliar with the topic. METHODS A MixR application was developed for an optical see-through head-mounted display (HoloLens 2) allowing a user to simultaneously and directly view a patient and a reference hologram derived from their simulation CT scan. The hologram provides a visual reference for the exact posture needed during treatment and is initialized in relation to the origin of a radiotherapy device using marker-based tracking. The system further provides marker-less tracking that allows the user to freely navigate the room as they view and align the patient from various angles. The system was preliminarily tested using both a rigid (pelvis) and nonrigid (female mannequin) anthropomorphic phantom. Each phantom was aligned via hologram and accuracy quantified using CBCT and CT. RESULTS A fully realized system was developed. Rigid registration accuracy was on the order of 3.0 ± 1.5 mm based on the performance of three users repeating alignment five times each. The lateral direction showed the most variability among users and was associated with the largest offsets (approximately 2.0 mm). For nonrigid alignment, the MixR setup outperformed a setup based on three-point alignment and setup photos, the latter of which showed a difference in arm position of 2 cm and a torso roll of 6–7°. CONCLUSIONS MixR visualization is a rapidly emerging domain that has the potential to significantly impact the field of medicine. The current application is an illustration of this and highlights the advantages of MixR for patient setup in radiation oncology. The key feature of the system is the way in which it transforms nonrigid registration into rigid registration by providing an efficient, portable, and cost-effective mechanism for reproducing patient posture without the use of ionizing radiation. Preliminary estimates of registration accuracy indicate clinical viability and form the foundation for further development and clinical testing.
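For orientation, the kind of point-based rigid alignment used to quantify such setup errors is typically a least-squares rotation-plus-translation fit. The sketch below is a generic SVD-based (Kabsch-style) implementation in Python, offered purely as an illustration of the concept rather than the authors' actual registration pipeline; all function and variable names are ours.

import numpy as np

def rigid_register(src, dst):
    # Least-squares rigid (rotation + translation) alignment of paired 3-D points,
    # e.g. fiducial positions in the hologram frame vs. in the CBCT frame.
    # Returns the rotation R, translation t, and the post-alignment RMS residual.
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    residual = dst - (src @ R.T + t)
    rms = np.sqrt((residual ** 2).sum(axis=1).mean())
    return R, t, rms

The RMS residual from such a fit is the kind of millimetre-scale accuracy figure reported in the RESULTS above.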
Collapse
Affiliation(s)
- Perry B Johnson
- Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida, USA; University of Florida Health Proton Therapy Institute, Jacksonville, Florida, USA
| | - Amanda Jackson
- Department of Radiology, University of Florida College of Medicine, Gainesville, Florida, USA
| | - Mohammad Saki
- University of Florida Health Proton Therapy Institute, Jacksonville, Florida, USA
| | - Emily Feldman
- University of Florida Health Proton Therapy Institute, Jacksonville, Florida, USA
| | - Julie Bradley
- Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida, USA; University of Florida Health Proton Therapy Institute, Jacksonville, Florida, USA
| |
Collapse
|
44
|
Xiong J, Hsiang EL, He Z, Zhan T, Wu ST. Augmented reality and virtual reality displays: emerging technologies and future perspectives. LIGHT, SCIENCE & APPLICATIONS 2021; 10:216. [PMID: 34697292 PMCID: PMC8546092 DOI: 10.1038/s41377-021-00658-8] [Citation(s) in RCA: 212] [Impact Index Per Article: 53.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/06/2021] [Revised: 09/26/2021] [Accepted: 10/04/2021] [Indexed: 05/19/2023]
Abstract
With rapid advances in high-speed communication and computation, augmented reality (AR) and virtual reality (VR) are emerging as next-generation display platforms for deeper human-digital interactions. Nonetheless, simultaneously matching the exceptional performance of human vision and keeping the near-eye display module compact and lightweight imposes unprecedented challenges on optical engineering. Fortunately, recent progress in holographic optical elements (HOEs) and lithography-enabled devices provides innovative ways to tackle these obstacles in AR and VR that are otherwise difficult with traditional optics. In this review, we begin by introducing the basic structures of AR and VR headsets and then describe the operation principles of various HOEs and lithography-enabled devices. Their properties are analyzed in detail, including the strong wavelength and incident-angle selectivity and multiplexing ability of volume HOEs, the polarization dependency and active switching of liquid crystal HOEs, the fabrication and properties of micro-LEDs (light-emitting diodes), and the large design freedom of metasurfaces. Afterwards, we discuss how these devices help enhance AR and VR performance, with detailed description and analysis of some state-of-the-art architectures. Finally, we offer a perspective on potential developments and research directions of these photonic devices for future AR and VR displays.
Collapse
Affiliation(s)
- Jianghao Xiong
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - En-Lin Hsiang
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Ziqian He
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Tao Zhan
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA
| | - Shin-Tson Wu
- College of Optics and Photonics, University of Central Florida, Orlando, FL, 32816, USA.
| |
Collapse
|
45
|
Jung C, Wolff G, Wernly B, Bruno RR, Franz M, Schulze PC, Silva JNA, Silva JR, Bhatt DL, Kelm M. Virtual and Augmented Reality in Cardiovascular Care: State-of-the-Art and Future Perspectives. JACC Cardiovasc Imaging 2021; 15:519-532. [PMID: 34656478 DOI: 10.1016/j.jcmg.2021.08.017] [Citation(s) in RCA: 43] [Impact Index Per Article: 10.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/07/2021] [Accepted: 08/17/2021] [Indexed: 12/19/2022]
Abstract
Applications of virtual reality (VR) and augmented reality (AR) assist both health care providers and patients in cardiovascular education, complementing traditional learning methods. Interventionalists have successfully used VR to plan difficult procedures and AR to facilitate complex interventions. VR/AR has already been used to treat patients, during interventions in rehabilitation programs and in immobilized intensive care patients. There are numerous additional potential applications in the catheterization laboratory. By using AR, interventionalists could combine visual fluoroscopy information projected and registered on the patient body with data derived from preprocedural imaging and live fusion of different imaging modalities such as fluoroscopy with echocardiography. Persistent technical challenges to overcome include the integration of different imaging modalities into VR/AR and the harmonization of data flow and interfaces. Cybersickness might exclude some patients and users from the potential benefits of VR/AR. Critical ethical considerations arise in the application of VR/AR in vulnerable patients. In addition, digital applications must not distract physicians from the patient. It is our duty as physicians to participate in the development of these innovations to ensure a virtual health reality benefit for our patients in a real-world setting. The purpose of this review is to summarize the current and future role of VR and AR in different fields within cardiology, its challenges, and perspectives.
Collapse
Affiliation(s)
- Christian Jung
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, Heinrich-Heine-University, University Hospital Düsseldorf, Düsseldorf, Germany.
| | - Georg Wolff
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, Heinrich-Heine-University, University Hospital Düsseldorf, Düsseldorf, Germany
| | - Bernhard Wernly
- Department of Anesthesiology and Intensive Care, Paracelsus Medical University of Salzburg, Salzburg, Austria; Division of Cardiology, Department of Medicine, Karolinska Institutet, Karolinska University Hospital, Stockholm, Sweden
| | - Raphael Romano Bruno
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, Heinrich-Heine-University, University Hospital Düsseldorf, Düsseldorf, Germany
| | - Marcus Franz
- Department of Internal Medicine I, Medical Faculty, Friedrich Schiller University Jena, University Hospital Jena, Jena, Germany
| | - P Christian Schulze
- Department of Internal Medicine I, Medical Faculty, Friedrich Schiller University Jena, University Hospital Jena, Jena, Germany
| | - Jennifer N Avari Silva
- Pediatric Cardiology Division, Department of Pediatrics, Washington University in Saint Louis, School of Medicine, Saint Louis, Missouri, USA; Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in Saint Louis, Saint Louis, Missouri, USA; SentiAR, Saint Louis, Missouri, USA
| | - Jonathan R Silva
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in Saint Louis, Saint Louis, Missouri, USA; SentiAR, Saint Louis, Missouri, USA
| | - Deepak L Bhatt
- Brigham and Women's Hospital Heart and Vascular Center, Harvard Medical School, Boston, Massachusetts, USA. https://twitter.com/DLBHATTMD
| | - Malte Kelm
- Division of Cardiology, Pulmonology, and Vascular Medicine, Medical Faculty, Heinrich-Heine-University, University Hospital Düsseldorf, Düsseldorf, Germany; Cardiovascular Research Institute Duesseldorf, Düsseldorf, Germany
| |
Collapse
|
46
|
Kaimara P, Oikonomou A, Deliyannis I. Could virtual reality applications pose real risks to children and adolescents? A systematic review of ethical issues and concerns. VIRTUAL REALITY 2021; 26:697-735. [PMID: 34366688 PMCID: PMC8328811 DOI: 10.1007/s10055-021-00563-w] [Citation(s) in RCA: 30] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/12/2020] [Accepted: 07/20/2021] [Indexed: 06/13/2023]
Abstract
Virtual reality technologies (VRTs) are high-tech human-computer interfaces used to develop digital content and can be applied to many different areas, often offering innovative solutions to existing problems. A wide range of digital games is also being developed with VRTs, and together with their components, the games' structural elements appeal to children and engage them more deeply in virtual worlds. Our research interest is directed towards children's development and the effects of VRTs within gaming environments. Contemporary psychology studies perceive human development as a holistic and lifelong process with important interrelationships between physical, mental, social and emotional aspects. For the objectives and scope of this work, we examine children's development across three domains: physical, cognitive and psychosocial. In this context, the authors review the literature on the impact of VRTs on children, in terms of software and hardware. Since this research requires a wide-ranging approach, we study the evidence reported on the brain and neural structure, knowledge, behaviour, pedagogy, academic performance, and wellness. Our main concern is to outline the emerging ethical issues and the worries of parents, educators, ophthalmologists, neurologists, psychologists, paediatricians and all relevant scientists, as well as the industry's views and actions. The systematic review was performed on the Scopus, IEEE Xplore, PubMed, and Google Scholar databases from 2010 to 2020, and 85 studies were selected. The review concluded that findings remain contradictory, especially for the psychosocial domain. Official recommendations from organizations and well-documented research by academics on child well-being are reassuring if health and safety specifications, particularly the time limit, are met. Research is still ongoing and constantly updated, and it remains a priority for the scientific community given that technology continues to evolve.
Collapse
Affiliation(s)
- Polyxeni Kaimara
- Department of Audio and Visual Arts, Ionian University, Tsirigoti Sq. 7, 49100 Corfu, Greece
| | - Andreas Oikonomou
- School of Pedagogical and Technological Education (ASPETE), Alexandrou Papanastasiou 13, Thessaloniki, Greece
| | - Ioannis Deliyannis
- Department of Audio and Visual Arts, Ionian University, Tsirigoti Sq. 7, 49100 Corfu, Greece
| |
Collapse
|
47
|
Recent Advances in Photoalignment Liquid Crystal Polarization Gratings and Their Applications. CRYSTALS 2021. [DOI: 10.3390/cryst11080900] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
Abstract
Liquid crystal (LC) circular polarization gratings (PGs), also known as Pancharatnam–Berry (PB) phase deflectors, are diffractive waveplates whose optical anisotropy axis varies linearly across the aperture. Owing to their high diffraction efficiency, polarization selectivity, and simple fabrication process, photoalignment LC PGs have been widely studied and developed, especially for polarization management and beam splitting. In this review paper, we analyze the physical principles, describe the exposure methods and fabrication process, and present relevant promising applications in photonics and imaging optics.
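As background on the geometric-phase mechanism summarised above (standard PB-optics relations, stated in their usual textbook form rather than taken from this review): a half-wave LC layer whose optic axis orientation rotates linearly across the aperture, \alpha(x) = \pi x / \Lambda, imprints a phase of \pm 2\alpha(x) on circularly polarized light, so nearly all of the light is steered into a single first diffraction order whose sign follows the handedness:

\[ \Phi(x) = \pm 2\alpha(x) = \pm\frac{2\pi x}{\Lambda}, \qquad \sin\theta_{\pm 1} = \pm\frac{\lambda}{\Lambda}. \]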
Collapse
|
48
|
Bruckheimer E, Goreczny S. Advanced imaging techniques to assist transcatheter congenital heart defects therapies. PROGRESS IN PEDIATRIC CARDIOLOGY 2021. [DOI: 10.1016/j.ppedcard.2021.101373] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 12/01/2022]
|
49
|
Drewes J, Feder S, Einhäuser W. Gaze During Locomotion in Virtual Reality and the Real World. Front Neurosci 2021; 15:656913. [PMID: 34108857 PMCID: PMC8180583 DOI: 10.3389/fnins.2021.656913] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2021] [Accepted: 04/27/2021] [Indexed: 11/20/2022] Open
Abstract
How vision guides gaze in realistic settings has been researched for decades. Human gaze behavior is typically measured in laboratory settings that are well controlled but feature-reduced and movement-constrained, in sharp contrast to real-life gaze control that combines eye, head, and body movements. Previous real-world research has shown environmental factors such as terrain difficulty to affect gaze; however, real-world settings are difficult to control or replicate. Virtual reality (VR) offers the experimental control of a laboratory, yet approximates freedom and visual complexity of the real world (RW). We measured gaze data in 8 healthy young adults during walking in the RW and simulated locomotion in VR. Participants walked along a pre-defined path inside an office building, which included different terrains such as long corridors and flights of stairs. In VR, participants followed the same path in a detailed virtual reconstruction of the building. We devised a novel hybrid control strategy for movement in VR: participants did not actually translate: forward movements were controlled by a hand-held device, rotational movements were executed physically and transferred to the VR. We found significant effects of terrain type (flat corridor, staircase up, and staircase down) on gaze direction, on the spatial spread of gaze direction, and on the angular distribution of gaze-direction changes. The factor world (RW and VR) affected the angular distribution of gaze-direction changes, saccade frequency, and head-centered vertical gaze direction. The latter effect vanished when referencing gaze to a world-fixed coordinate system, and was likely due to specifics of headset placement, which cannot confound any other analyzed measure. Importantly, we did not observe a significant interaction between the factors world and terrain for any of the tested measures. This indicates that differences between terrain types are not modulated by the world. The overall dwell time on navigational markers did not differ between worlds. The similar dependence of gaze behavior on terrain in the RW and in VR indicates that our VR captures real-world constraints remarkably well. High-fidelity VR combined with naturalistic movement control therefore has the potential to narrow the gap between the experimental control of a lab and ecologically valid settings.
Collapse
Affiliation(s)
- Jan Drewes
- Institute of Brain and Psychological Sciences, Sichuan Normal University, Chengdu, China
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
| | - Sascha Feder
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
| | - Wolfgang Einhäuser
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
| |
Collapse
|
50
|
Zhang Q, Song W, Hu X, Hu K, Weng D, Liu Y, Wang Y. Design of a near-eye display measurement system using an anthropomorphic vision imaging method. OPTICS EXPRESS 2021; 29:13204-13218. [PMID: 33985060 DOI: 10.1364/oe.421920] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/04/2021] [Accepted: 04/05/2021] [Indexed: 06/12/2023]
Abstract
We developed a new near-eye display measurement system using anthropomorphic vision imaging to measure the key parameters of near-eye displays, including field-of-view (FOV), angular resolution, eye box, and virtual image depth. The characteristics of the human eye, such as pupil position, pupil size variation, accommodation function, and the high resolution of the fovea, are imitated by the proposed measurement system. A FOV scanning structure, together with a non-vignetting image-telecentric lens system, captures the virtual image from the near-eye display by imitating human eye function. As a proof-of-concept, a prototype device was used to obtain large-range, high-resolution measurements for key parameters of near-eye displays.
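Two of the quantities being measured reduce to simple ratios that such an instrument effectively samples with an artificial eye (general definitions only, not specifics of this apparatus): angular resolution expressed in pixels per degree, and virtual image depth expressed in diopters,

\[ \mathrm{PPD} \approx \frac{N_{\mathrm{pixels}}}{\mathrm{FOV}\ (\mathrm{deg})}, \qquad D_{\mathrm{image}} = \frac{1}{d_{\mathrm{image}}\ (\mathrm{m})}. \]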
Collapse
|