1. Dastan M, Fiorentino M, Walter ED, Diegritz C, Uva AE, Eck U, Navab N. Co-Designing Dynamic Mixed Reality Drill Positioning Widgets: A Collaborative Approach with Dentists in a Realistic Setup. IEEE Transactions on Visualization and Computer Graphics 2024; 30:7053-7063. [PMID: 39250405] [DOI: 10.1109/tvcg.2024.3456149]
Abstract
Mixed Reality (MR) has been shown in the literature to support precise spatial dental drill positioning by superimposing 3D widgets. Nevertheless, knowledge about widget visual design and interactive user feedback remains limited. This study therefore contributes MR drill positioning widgets co-designed with two expert dentists and three MR experts. The co-design yielded two static widgets (SWs), a simple entry point and a target axis, and two dynamic widgets (DWs), variants of dynamic error visualization with and without a target axis (DWTA and DWEP). We evaluated the co-designed widgets in a virtual reality simulation supported by a realistic setup with a tracked phantom patient, a virtual magnifying loupe, and a dentist's foot pedal. The user study involved 35 dentists with varied backgrounds and years of experience. DWs outperformed SWs in positional and rotational precision, especially among younger participants and those with gaming experience. User preference leaned toward DWs (19) over SWs (16). However, the gain in precision came at the cost of longer completion times. The post-experience questionnaire (NASA-TLX) showed that DWs increase mental and physical demand, effort, and frustration more than SWs. Comparisons between DWEP and DWTA show that a DW's complexity level influences time and physical and mental demands. The DWs are extensible to diverse medical and industrial scenarios that demand precision.
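As a rough illustration of the quantities such positioning widgets visualize, the following Python sketch (not from the paper; the function name and all poses are illustrative assumptions) computes the positional error at the entry point and the angular deviation between a planned and an actual drill axis.

```python
# Minimal sketch, assuming planned/actual drill poses are available as
# 3D points and direction vectors. Not the authors' implementation.
import numpy as np

def drill_errors(entry_planned, axis_planned, entry_actual, axis_actual):
    """Positional error at the entry point (mm) and angular error
    between planned and actual drill axes (degrees)."""
    positional_mm = np.linalg.norm(np.asarray(entry_actual, float)
                                   - np.asarray(entry_planned, float))
    a = np.asarray(axis_planned, float) / np.linalg.norm(axis_planned)
    b = np.asarray(axis_actual, float) / np.linalg.norm(axis_actual)
    angular_deg = np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
    return positional_mm, angular_deg

# Illustrative values only: 1 mm lateral offset, slightly tilted axis.
pos, ang = drill_errors([0, 0, 0], [0, 0, 1], [1.0, 0, 0], [0.05, 0, 1.0])
print(f"positional error: {pos:.2f} mm, angular error: {ang:.2f} deg")
```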
2. Saruwatari MS, Nguyen TN, Talari HF, Matisoff AJ, Sharma KV, Donoho KG, Basu S, Dwivedi P, Bost JE, Shekhar R. Assessing the Effect of Augmented Reality on Procedural Outcomes During Ultrasound-Guided Vascular Access. Ultrasound in Medicine & Biology 2023; 49:2346-2353. [PMID: 37573178] [PMCID: PMC10658651] [DOI: 10.1016/j.ultrasmedbio.2023.07.011]
Abstract
OBJECTIVE: Augmented reality devices are increasingly accepted in health care, though most applications involve education and pre-operative planning. A novel augmented reality ultrasound application, HoloUS, was developed for the Microsoft HoloLens 2 to project real-time ultrasound images directly into the user's field of view. In this work, we assessed the effect of using HoloUS on vascular access procedural outcomes. METHODS: A single-center user study was completed with participants with (N = 22) and without (N = 12) experience performing ultrasound-guided vascular access. Users completed a venipuncture and aspiration task a total of four times: three times on study day 1, and once on study day 2, between 2 and 4 weeks later. Users were randomized to use conventional ultrasound during either their first or second task and the HoloUS application at all other times. Task completion time, number of needle redirections, number of head adjustments, and needle visualization rate were recorded. RESULTS: For expert users, task completion time was significantly faster using HoloUS (11.5 s, interquartile range [IQR] = 6.5-23.5 s, vs. 18.5 s, IQR = 11.0-36.5 s; p = 0.04). The number of head adjustments was significantly lower using HoloUS (1.0, IQR = 0.0-1.0, vs. 3.0, IQR = 1.0-5.0; p < 0.0001). No significant differences were identified in the other measured outcomes. CONCLUSION: This is the first investigation of augmented reality-based ultrasound-guided vascular access using the second-generation HoloLens. It demonstrates equivalent procedural efficiency and accuracy, with favorable usability, ergonomics, and user independence, compared with traditional ultrasound techniques.
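To illustrate the kind of summary statistics reported above, here is a minimal Python sketch with invented paired task-time samples. The abstract does not name the statistical test used, so a paired Wilcoxon signed-rank test stands in as a plausible choice for skewed, paired timing data.

```python
# Minimal sketch, not the study's analysis code. Sample values are invented.
import numpy as np
from scipy import stats

# Hypothetical paired task times (s): each user under both conditions.
holous       = np.array([11.5,  6.5, 23.5,  9.0, 14.0, 12.5,  8.0, 20.0])
conventional = np.array([18.5, 11.0, 36.5, 15.0, 22.0, 17.0, 12.0, 30.0])

for name, x in (("HoloUS", holous), ("conventional", conventional)):
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    print(f"{name}: median {med:.1f} s, IQR {q1:.1f}-{q3:.1f} s")

# Paired, non-parametric comparison of the two conditions.
stat, p = stats.wilcoxon(holous, conventional)
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p:.3f}")
```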
Affiliation(s)
- Michele S Saruwatari
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; Department of Surgery, MedStar Georgetown University Hospital and Washington Hospital Center, Washington, DC, USA
- Hadi Fooladi Talari
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA
- Andrew J Matisoff
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Karun V Sharma
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Kelsey G Donoho
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Sonali Basu
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Pallavi Dwivedi
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA
- James E Bost
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Raj Shekhar
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; IGI Technologies, Silver Spring, MD, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
3. Song T, Yu K, Eck U, Navab N. Augmented reality collaborative medical displays (ARC-MeDs) for multi-user surgical planning and intra-operative communication. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization 2022. [DOI: 10.1080/21681163.2022.2150892]
Affiliation(s)
- Tianyu Song
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Kevin Yu
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Research & Development, MedPhoton GmbH, Salzburg, Austria
- Ulrich Eck
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Nassir Navab
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Computer Aided Medical Procedures, Johns Hopkins University, Baltimore, MD, USA
4. Advances and Innovations in Ablative Head and Neck Oncologic Surgery Using Mixed Reality Technologies in Personalized Medicine. J Clin Med 2022; 11:4767. [PMID: 36013006] [PMCID: PMC9410374] [DOI: 10.3390/jcm11164767]
Abstract
The benefit of computer-assisted planning in head and neck ablative and reconstructive surgery has been extensively documented over the last decade. This approach has been shown to make surgical procedures safer. In the treatment of head and neck cancer, computer-assisted surgery can be used to visualize and estimate the location and extent of the tumor mass. Some software tools now even allow the visualization of the structures of interest in a mixed reality environment. However, the precise integration of mixed reality systems into daily clinical routine remains a challenge: to date, this technology is not yet fully integrated into clinical settings such as the tumor board, surgical planning for head and neck tumors, or medical and surgical education. As a consequence, the handling of these systems remains experimental, and decision-making based on the presented data is not yet widely practiced. The aim of this paper is to present a novel, user-friendly 3D planning and mixed reality software and its potential application in ablative and reconstructive head and neck surgery.
5. Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022; 8:203. [PMID: 35877647] [PMCID: PMC9318659] [DOI: 10.3390/jimaging8070203]
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR)-assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy describing the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial (n=8) surgeries. For preoperative input data, computed tomography (CT) (n=34) and surface-rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly superimposed directly on the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy on the order of 2–5 mm has been demonstrated in phantom models, several human-factors and technical challenges (perception, ease of use, context, interaction, and occlusion) remain to be addressed before widespread adoption of OST-HMD-led surgical navigation.
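The fiducial-based alignment counted above typically reduces to rigid point-based registration. The sketch below shows the classic Kabsch/Horn least-squares solution and the resulting fiducial registration error (FRE); the fiducial coordinates are invented, and no reviewed system's code is reproduced.

```python
# Minimal sketch of rigid point-based (fiducial) registration. Illustrative
# data only; accuracy figures in the review come from the cited studies.
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t with dst ~ R @ src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Hypothetical fiducials (mm) in image space and as seen by the tracker,
# related by a known rotation and translation plus measurement noise.
rng = np.random.default_rng(0)
image_pts = rng.uniform(-50, 50, size=(4, 3))
theta = np.radians(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
tracker_pts = image_pts @ R_true.T + [5, -3, 12] + rng.normal(0, 0.3, (4, 3))

R, t = rigid_register(image_pts, tracker_pts)
fre = np.linalg.norm(image_pts @ R.T + t - tracker_pts, axis=1).mean()
print(f"mean fiducial registration error: {fre:.2f} mm")
```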
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
- Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
6. Doughty M, Ghugre NR. Head-Mounted Display-Based Augmented Reality for Image-Guided Media Delivery to the Heart: A Preliminary Investigation of Perceptual Accuracy. J Imaging 2022; 8:33. [PMID: 35200735] [PMCID: PMC8878166] [DOI: 10.3390/jimaging8020033]
Abstract
By aligning virtual augmentations with real objects, optical see-through head-mounted display (OST-HMD)-based augmented reality (AR) can enhance user task performance. Our goal was to compare the perceptual accuracy of several visualization paradigms, involving an adjacent monitor or the Microsoft HoloLens 2 OST-HMD, in a targeted task, and to assess the feasibility of displaying imaging-derived virtual models aligned with the injured porcine heart. With 10 participants, we performed a user study to quantify and compare the accuracy, speed, and subjective workload of each paradigm in the completion of a point-and-trace task that simulated surgical targeting. To demonstrate the clinical potential of our system, we assessed its use for the visualization of magnetic resonance imaging (MRI)-based anatomical models, aligned with the surgically exposed heart in a motion-arrested, open-chest porcine model. Using the HoloLens 2 with our display calibration method and alignment to the ground-truth target, users achieved submillimeter accuracy (0.98 mm) in the point-and-trace task and required 1.42 min for calibration. In the porcine study, we observed good spatial agreement between the MRI-based models and the target surgical site. The use of an OST-HMD led to improved perceptual accuracy and task-completion times in a simulated targeting task.
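The abstract does not specify how tracing accuracy was scored; a common choice, shown in the hypothetical sketch below, is the mean distance from each traced sample to the nearest point on a ground-truth polyline. All names and coordinates are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of one plausible point-and-trace scoring metric.
import numpy as np

def point_to_segment(p, a, b):
    """Distance from point p to the segment from a to b."""
    ab, ap = b - a, p - a
    u = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + u * ab))

def trace_error(trace, path):
    """Mean distance from each trace sample to the ground-truth polyline."""
    segments = list(zip(path[:-1], path[1:]))
    return float(np.mean([min(point_to_segment(p, a, b) for a, b in segments)
                          for p in trace]))

# Illustrative L-shaped target path (mm) and a noisy user trace.
path = np.array([[0, 0, 0], [10, 0, 0], [10, 10, 0]], float)
trace = path + np.random.default_rng(1).normal(0, 0.5, path.shape)
print(f"mean tracing error: {trace_error(trace, path):.2f} mm")
```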
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
7. Carbone M, Domeneghetti D, Cutolo F, D'Amato R, Cigna E, Parchi PD, Gesi M, Morelli L, Ferrari M, Ferrari V. Can Liquid Lenses Increase Depth of Field in Head Mounted Video See-Through Devices? J Imaging 2021; 7:138. [PMID: 34460773] [PMCID: PMC8404927] [DOI: 10.3390/jimaging7080138]
Abstract
Wearable Video See-Through (VST) devices for Augmented Reality (AR) and for obtaining a magnified view are taking hold in the medical and surgical fields. However, these devices are not yet usable in daily clinical practice because of focusing problems and a limited depth of field. This study investigates the use of liquid-lens optics to create an autofocus system for wearable VST visors. The autofocus system is based on a Time of Flight (TOF) distance sensor and an active autofocus control system. Integrated into the wearable VST visor, the autofocus system showed good potential, providing rapid focusing at various distances together with a magnified view.
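As a sketch of how such a system might close the loop, the following Python example converts a TOF distance reading into liquid-lens optical power. read_distance_m and set_lens_power are hypothetical stand-ins (the paper's sensor and driver APIs are not given here), and the 1/d thin-lens relation, smoothing, and deadband are assumptions rather than the authors' controller.

```python
# Minimal sketch of a TOF-driven liquid-lens autofocus loop, under the
# stated assumptions. Sensor and lens interfaces are hypothetical.
import time

def read_distance_m():
    """Hypothetical TOF sensor read; returns working distance in meters."""
    return 0.45  # fixed placeholder so the sketch runs standalone

def set_lens_power(diopters):
    """Hypothetical liquid-lens driver command (optical power, diopters)."""
    print(f"lens power -> {diopters:+.2f} D")

def autofocus(iterations=40, alpha=0.5, deadband_d=0.05, period_s=0.05):
    """Keep the lens focused on the plane reported by the TOF sensor.

    Assuming a thin-lens model nominally focused at infinity, an object
    at distance d needs roughly 1/d diopters of added power. Exponential
    smoothing (alpha) and a deadband suppress focus hunting from noise.
    """
    power, last_sent = 0.0, None
    for _ in range(iterations):
        d = max(read_distance_m(), 0.1)     # clamp: ignore readings < 10 cm
        target = 1.0 / d                    # required optical power (D)
        power += alpha * (target - power)   # low-pass filter
        if last_sent is None or abs(power - last_sent) > deadband_d:
            set_lens_power(power)
            last_sent = power
        time.sleep(period_s)

autofocus()
```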
Affiliation(s)
- Marina Carbone
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Davide Domeneghetti
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
- Fabrizio Cutolo
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Renzo D'Amato
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Emanuele Cigna
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Paolo Domenico Parchi
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Marco Gesi
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Luca Morelli
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Mauro Ferrari
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Vincenzo Ferrari
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy