1
Borde T, Saccenti L, Li M, Varble NA, Hazen LA, Kassin MT, Ukeh IN, Horton KM, Delgado JF, Martin C, Xu S, Pritchard WF, Karanian JW, Wood BJ. Smart goggles augmented reality CT-US fusion compared to conventional fusion navigation for percutaneous needle insertion. Int J Comput Assist Radiol Surg 2024. PMID: 38814530. DOI: 10.1007/s11548-024-03148-5.
Abstract
PURPOSE Targeting accuracy determines outcomes for percutaneous needle interventions. Augmented reality (AR) in interventional radiology (IR) may improve procedural guidance and facilitate access to complex locations. This study aimed to evaluate percutaneous needle placement accuracy using a goggle-based AR system compared to an ultrasound (US)-based fusion navigation system. METHODS Six interventional radiologists performed 24 independent needle placements in an anthropomorphic phantom (CIRS 057A) in four needle guidance cohorts (n = 6 each): (1) US-based fusion, (2) goggle-based AR with stereoscopically projected anatomy (AR-overlay), (3) goggle AR without the projection (AR-plain), and (4) CT-guided freehand. US-based fusion included US/CT registration with electromagnetic (EM) tracking of the needle, transducer, and patient. For AR-overlay, the US image, EM-tracked needle, and stereoscopic anatomical structures and targets were superimposed over the phantom. Needle placement accuracy (distance from needle tip to target center), placement time (from skin puncture to final position), and procedure time (time to completion) were measured. RESULTS Mean needle placement accuracy using US-based fusion, AR-overlay, AR-plain, and freehand was 4.5 ± 1.7 mm, 7.0 ± 4.7 mm, 4.7 ± 1.7 mm, and 9.2 ± 5.8 mm, respectively. AR-plain demonstrated accuracy comparable to US-based fusion (p = 0.7) and AR-overlay (p = 0.06). Excluding two outliers, AR-overlay accuracy improved to 5.9 ± 2.6 mm. US-based fusion had the longest mean placement time (44.3 ± 27.7 s) compared to all other navigation cohorts (p < 0.001). The longest procedure times were recorded with AR-overlay (34 ± 10.2 min) compared to AR-plain (22.7 ± 8.6 min, p = 0.09), US-based fusion (19.5 ± 5.6 min, p = 0.02), and freehand (14.8 ± 1.6 min, p = 0.002). CONCLUSION Goggle-based AR showed no difference in needle placement accuracy compared to the commercially available US-based fusion navigation platform. Differences in accuracy and procedure times were apparent between display modes (with versus without stereoscopic projections). AR-based projection of the US image and needle trajectory over the body may be a helpful tool to enhance visuospatial orientation. This study thus refines the potential role of AR for needle placement and may serve as a catalyst for informed implementation of AR techniques in IR.
Affiliation(s)
- Tabea Borde
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA.
- Laetitia Saccenti
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Henri Mondor Biomedical Research Institute, Inserm U955, Team N°18, Créteil, France
- Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Nicole A Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Philips Healthcare, Cambridge, MA, 02141, USA
- Lindsey A Hazen
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Michael T Kassin
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Ifechi N Ukeh
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Keith M Horton
- Department of Radiology, Georgetown Medical School, Medstar Washington Hospital Center, Washington, DC, 20007, USA
- Jose F Delgado
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Fischell Department of Bioengineering, University of Maryland, College Park, MD, 20742, USA
- Charles Martin
- Department of Interventional Radiology, Cleveland Clinic, Cleveland, OH, 44195, USA
- Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- William F Pritchard
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- John W Karanian
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Fischell Department of Bioengineering, University of Maryland, College Park, MD, 20742, USA
2
Saccenti L, Bessy H, Ben Jedidia B, Longere B, Tortolano L, Derbel H, Luciani A, Kobeiter H, Grandpierre T, Tacher V. Performance Comparison of Augmented Reality Versus Ultrasound Guidance for Puncture: A Phantom Study. Cardiovasc Intervent Radiol 2024. PMID: 38710797. DOI: 10.1007/s00270-024-03727-8.
Abstract
PURPOSE Augmented reality (AR) is an innovative approach that could assist percutaneous procedures; by seeing directly "through" a phantom, targeting a lesion might be more intuitive than using ultrasound (US). The objective of this study was to compare the performance of experienced interventional radiologists and untrained operators in soft tissue lesion puncture using AR guidance versus standard US guidance. MATERIALS AND METHODS Three trained interventional radiologists with 5-10 years of experience and three untrained operators each punctured five targets in an abdominal phantom, under US guidance and under AR guidance. Correct targeting, accuracy (defined as the Euclidean distance between the needle tip and the center of the target), planning time, and puncture time were documented. RESULTS Accuracy was higher for the trained group than for the untrained group using US guidance (1 mm versus 4 mm, p = 0.001), but not using AR guidance (4 mm vs. 4 mm, p = 0.76). With all operators combined, no significant difference in accuracy was found between US and AR guidance (2 mm vs. 4 mm, p = 0.09), but planning time and puncture time were significantly shorter using AR (15.1 s vs. 74 s, p < 0.001, and 16.1 s vs. 59 s, p < 0.001, respectively). CONCLUSION Untrained and trained operators achieved comparable accuracy in percutaneous punctures using AR guidance, whereas US performance was better in the experienced group. With all operators combined, accuracy was similar between US and AR guidance, but planning and puncture times were shorter with AR guidance.
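The accuracy endpoint used in this and several of the studies below (the Euclidean distance between the final needle tip position and the target center) can be computed as in the following minimal sketch; the function name and coordinate values are illustrative, not from any of the cited studies:

```python
import math

def tip_to_target_error(tip_mm, target_center_mm):
    """Euclidean tip-to-target distance in mm, the accuracy endpoint used
    in these phantom studies. Inputs are (x, y, z) coordinates, e.g. from
    CT or an electromagnetic tracker (hypothetical values below)."""
    return math.dist(tip_mm, target_center_mm)

# A needle tip at (1, 2, 2) mm relative to a target at the origin is 3 mm off.
print(tip_to_target_error((1.0, 2.0, 2.0), (0.0, 0.0, 0.0)))  # → 3.0
```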
Affiliation(s)
- Laetitia Saccenti
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France.
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France.
- Hugo Bessy
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Benjamin Longere
- Department of Cardiovascular Radiology, CHU Lille, 59000, Lille, France
- Haytham Derbel
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Alain Luciani
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Hicham Kobeiter
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Thierry Grandpierre
- Ecole superieure d'ingenieurs en electrotechnique et electronique, ESIEE Paris, Noisy Le Grand, France
- Vania Tacher
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
3
Lee KH, Li M, Varble N, Negussie AH, Kassin MT, Arrichiello A, Carrafiello G, Hazen LA, Wakim PG, Li X, Xu S, Wood BJ. Smartphone Augmented Reality Outperforms Conventional CT Guidance for Composite Ablation Margins in Phantom Models. J Vasc Interv Radiol 2024;35:452-461.e3. PMID: 37852601. DOI: 10.1016/j.jvir.2023.10.005.
Abstract
PURPOSE To develop and evaluate a smartphone augmented reality (AR) system for ablation of a large (50-mm) liver tumor, with treatment planning for composite overlapping ablation zones. MATERIALS AND METHODS A smartphone AR application was developed to display the tumor, probe, projected probe paths, ablated zones, and real-time percentage of the target tumor volume ablated. Fiducial markers were attached to phantoms and to the ablation probe hub for tracking. The system was evaluated with tissue-mimicking thermochromic phantoms and gel phantoms. Four interventional radiologists each performed 2 trials of 3 probe insertions each, using AR guidance versus computed tomography (CT) guidance in 2 gel phantoms. Insertion points and optimal probe paths were predetermined. On Gel Phantom 2, serially ablated zones were saved and continuously displayed after each probe placement/adjustment, enabling feedback and iterative planning. The percentages of tumor ablated with AR guidance versus CT guidance, and with versus without display of recorded ablated zones, were compared among interventional radiologists with pairwise t-tests. RESULTS The mean percentages of tumor ablated were 36% ± 7 for CT freehand guidance and 47% ± 4 for AR guidance (P = .004). The mean composite percentages of tumor ablated for AR guidance were 43% ± 1 without and 50% ± 2 with display of the ablation zones (P = .033). There was no strong correlation between AR-guided percentage of ablation and years of experience (r < 0.5), whereas there was a strong correlation between CT-guided percentage of ablation and years of experience (r > 0.9). CONCLUSIONS A smartphone AR guidance system for dynamic iterative ablation of a large (50-mm) liver tumor was accurate, performed better than conventional CT guidance, especially for less experienced interventional radiologists, and supported more standardized performance across experience levels.
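The "real-time percentage of the target tumor volume ablated" described above can be approximated by voxelizing the tumor and testing coverage by the union of the ablation zones. The sketch below assumes idealized spherical zones (real ablation zones are ellipsoidal and device-specific) and is not code from the study; all names and values are illustrative:

```python
import numpy as np

def percent_ablated(tumor_center, tumor_radius, ablation_zones, grid_step=1.0):
    """Estimate the percentage of a spherical tumor volume covered by the
    union of spherical ablation zones, on a voxel grid (units: mm).
    ablation_zones: list of (center, radius) tuples. Illustrative model only."""
    c = np.asarray(tumor_center, dtype=float)
    r = float(tumor_radius)
    # Voxel grid bounding the tumor
    axes = [np.arange(c[i] - r, c[i] + r + grid_step, grid_step) for i in range(3)]
    X, Y, Z = np.meshgrid(*axes, indexing="ij")
    pts = np.stack([X, Y, Z], axis=-1)
    in_tumor = np.linalg.norm(pts - c, axis=-1) <= r
    covered = np.zeros_like(in_tumor)
    for zc, zr in ablation_zones:
        covered |= np.linalg.norm(pts - np.asarray(zc, float), axis=-1) <= zr
    return 100.0 * (in_tumor & covered).sum() / in_tumor.sum()

# A single 25-mm-radius zone centered on a 25-mm-radius tumor covers it fully.
print(round(percent_ablated((0, 0, 0), 25.0, [((0, 0, 0), 25.0)])))  # → 100
```

Iterative planning as in the study would re-run this after each probe placement, appending the newly recorded zone to `ablation_zones`.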
Affiliation(s)
- Katerina H Lee
- McGovern Medical School at UTHealth, Houston, Texas; Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Ming Li
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Nicole Varble
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland; Philips Research North America, Cambridge, Massachusetts
- Ayele H Negussie
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Michael T Kassin
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Antonio Arrichiello
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Gianpaolo Carrafiello
- Department of Radiology, Foundation IRCCS Ca' Granda Ospedale Maggiore Policlinico, University of Milan, Milan, Italy
- Lindsey A Hazen
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Paul G Wakim
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Sheng Xu
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Bradford J Wood
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
4
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023;85:102757. PMID: 36706637. DOI: 10.1016/j.media.2023.102757.
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main driver of the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, paving the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria.
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
5
Li M, Mehralivand S, Xu S, Varble N, Bakhutashvili I, Gurram S, Pinto PA, Choyke PL, Wood BJ, Turkbey B. HoloLens augmented reality system for transperineal free-hand prostate procedures. J Med Imaging (Bellingham) 2023;10:025001. PMID: 36875636. PMCID: PMC9976411. DOI: 10.1117/1.jmi.10.2.025001.
Abstract
Purpose An augmented reality (AR) system was developed to facilitate free-hand real-time needle guidance for transperineal prostate (TP) procedures and to overcome the limitations of a traditional guidance grid. Approach The HoloLens AR system enables the superimposition of annotated anatomy derived from preprocedural volumetric images onto a patient and addresses the most challenging part of free-hand TP procedures by providing real-time needle tip localization and needle depth visualization during insertion. The AR system accuracy, or image overlay accuracy (n = 56), and needle targeting accuracy (n = 24) were evaluated within a 3D-printed phantom. Three operators each used a planned-path guidance method (n = 4) and free-hand guidance (n = 4) to guide needles into targets in a gel phantom. Placement error was recorded. The feasibility of the system was further evaluated by delivering soft tissue markers into tumors of an anthropomorphic pelvic phantom via the perineum. Results The image overlay error was 1.29 ± 0.57 mm, and the needle targeting error was 2.13 ± 0.52 mm. Planned-path guidance placements showed error similar to free-hand guidance (4.14 ± 1.08 mm versus 4.20 ± 1.08 mm, p = 0.90). The markers were successfully implanted either into or in close proximity to the target lesion. Conclusions The HoloLens AR system can provide accurate needle guidance for TP interventions. AR support for free-hand lesion targeting is feasible and may provide more flexibility than grid-based methods, due to the real-time 3D and immersive experience during free-hand TP procedures.
Affiliation(s)
- Ming Li
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Sherif Mehralivand
- National Institutes of Health, Molecular Imaging Branch, National Cancer Institute, Bethesda, Maryland, United States
- Sheng Xu
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Nicole Varble
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Philips Research of North America, Cambridge, Massachusetts, United States
- Ivane Bakhutashvili
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Sandeep Gurram
- National Institutes of Health, Urologic Oncology Branch, National Cancer Institute, Bethesda, Maryland, United States
- Peter A. Pinto
- National Institutes of Health, Urologic Oncology Branch, National Cancer Institute, Bethesda, Maryland, United States
- Peter L. Choyke
- National Institutes of Health, Molecular Imaging Branch, National Cancer Institute, Bethesda, Maryland, United States
- Bradford J. Wood
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Baris Turkbey
- National Institutes of Health, Molecular Imaging Branch, National Cancer Institute, Bethesda, Maryland, United States
6
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023;68. PMID: 36580681. DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization is divided into two categories, in situ visualization and non-in situ visualization, and the rendered content varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
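Of the registration categories this review lists, point-based registration has a standard closed-form core: paired fiducial points measured in the image and tracker/AR coordinate systems are aligned by a rigid fit (Kabsch/Umeyama). The sketch below is an illustrative implementation of that general technique, not code from the review; the point values are made up:

```python
import numpy as np

def rigid_register(src, dst):
    """Point-based rigid registration (Kabsch, no scaling): find rotation R
    and translation t minimizing ||R @ src_i + t - dst_i|| over paired
    fiducials. src, dst: (N, 3) arrays of corresponding points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

# A 90° rotation about z plus a shift is recovered exactly from 4 fiducials.
pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
moved = pts @ R_true.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_register(pts, moved)
print(np.allclose(R, R_true), np.allclose(t, [10, -5, 2]))  # → True True
```

With noisy fiducials the same fit returns the least-squares rotation; the residual is the fiducial registration error reported in many of the studies above.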
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
7
Out-of-Plane Needle Placements Using 3D Augmented Reality Protractor on Smartphone: An Experimental Phantom Study. Cardiovasc Intervent Radiol 2023;46:675-679. PMID: 36658373. DOI: 10.1007/s00270-023-03357-6.
Abstract
PURPOSE To evaluate the accuracy of needle placement using a three-dimensional (3D) augmented reality (AR) protractor on smartphones (AR Puncture). MATERIALS AND METHODS A smartphone AR protractor with angle guidance lines that can be rotated in three directions relative to the CT plane was developed. The protractor center can be aligned with an entry point either by manually moving the smartphone with the protractor center fixed at the center of the screen (Fix-On-Screen) or by image tracking of a printed QR code placed at the entry point (QR-Tracking). Needle placement was performed by viewing a target line in the tangent direction using the bull's-eye method. Needle placement errors for four operators inserting needles in six out-of-plane directions in a phantom using a smartphone (iPhone XR, Apple, Cupertino, CA, USA) were compared between the two registration methods. RESULTS No significant difference in average needle placement error was observed between the Fix-On-Screen and QR-Tracking methods (5.6 ± 1.7 mm vs. 6.1 ± 2.9 mm, p = 0.475). The average procedural time of the Fix-On-Screen method was shorter than that of the QR-Tracking method (71.0 ± 23.9 s vs. 98.4 ± 59.5 s, p = 0.042). CONCLUSION Out-of-plane needle placements using the 3D AR protractor were equally accurate with both registration methods, with short procedure times. In clinical use, the Fix-On-Screen registration method would be more convenient because no additional markers are required.
8
Moreta-Martínez R, Rubio-Pérez I, García-Sevilla M, García-Elcano L, Pascau J. Evaluation of optical tracking and augmented reality for needle navigation in sacral nerve stimulation. Comput Methods Programs Biomed 2022;224:106991. PMID: 35810510. DOI: 10.1016/j.cmpb.2022.106991.
Abstract
BACKGROUND AND OBJECTIVE Sacral nerve stimulation (SNS) is a minimally invasive procedure in which an electrode lead is implanted through the sacral foramina to stimulate the nerve modulating colonic and urinary functions. One of the most crucial steps in SNS procedures is the placement of the tined lead close to the sacral nerve, yet needle insertion is very challenging for surgeons. Several x-ray projections are required to interpret the needle position correctly, and in many cases multiple punctures are needed, increasing surgical time and patient discomfort and pain. In this work we propose and evaluate two navigation systems for guiding electrode placement in SNS surgery, designed to reduce surgical time, minimize patient discomfort and improve surgical outcomes. METHODS For the first alternative, we developed an open-source navigation software to guide electrode placement by real-time needle tracking with an optical tracking system (OTS). For the second, we present a smartphone-based AR application that displays virtual guidance elements directly on the affected area, using a 3D-printed reference marker placed on the patient, facilitating needle insertion along a predefined trajectory. Both techniques were evaluated against the current surgical procedure. For this comparison, we developed an x-ray software tool that calculates a digitally reconstructed radiograph, simulating the fluoroscopy acquisitions during the procedure. Twelve physicians (inexperienced and experienced users) performed needle insertions through several specific targets on a realistic patient-based phantom. RESULTS With each navigation solution, users took less time on average to complete each insertion (36.83 s and 44.43 s for the OTS and AR methods, respectively) and needed fewer punctures on average to reach the target (1.23 and 1.96 for the OTS and AR methods, respectively) than with the standard clinical method (189.28 s and 3.65 punctures). CONCLUSIONS We have presented two navigation alternatives that could improve surgical outcomes by significantly reducing needle insertions, surgical time and patient pain in SNS procedures. We believe these solutions are suitable for training surgeons and could even replace current SNS clinical procedures.
Affiliation(s)
- Rafael Moreta-Martínez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid 28007, Spain
- Inés Rubio-Pérez
- Servicio de Cirugía General, Hospital Universitario La Paz, Madrid 28046, Spain
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid 28007, Spain
- Laura García-Elcano
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Centro de Investigación Médica Aplicada, Clínica Universidad de Navarra, Madrid 28027, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid 28007, Spain
9
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022;77:102361. PMID: 35168103. PMCID: PMC10466024. DOI: 10.1016/j.media.2022.102361.
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
- P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
10
Mixed Reality Needle Guidance Application on Smartglasses Without Pre-procedural CT Image Import with Manually Matching Coordinate Systems. Cardiovasc Intervent Radiol 2022; 45:349-356. [PMID: 35022858] [DOI: 10.1007/s00270-021-03029-3] [Received: 05/20/2021] [Accepted: 10/28/2021] [Indexed: 11/02/2022]
Abstract
PURPOSE To develop and assess the accuracy of a mixed reality (MR) needle guidance application on smartglasses. MATERIALS AND METHODS An MR needle guidance application was developed on HoloLens2 that requires no pre-procedural CT image reconstruction or import; instead, the spatial and MR coordinate systems are matched manually. First, the accuracy of the MR application was verified by placing targets in the image overlay at 63 points arranged on a 45 × 35 × 21 cm box and at needle angles from 0° to 80°. Needle placement errors from 12 different entry points in a phantom by seven operators (four physicians and three non-physicians) were then compared between MR guidance and the conventional method using protractors, with a linear mixed model. RESULTS The average errors of the target locations and needle angles placed using the MR application were 5.9 ± 2.6 mm and 2.3 ± 1.7°, respectively. The average needle insertion error with MR guidance was slightly smaller than with the conventional method (8.4 ± 4.0 mm vs. 9.6 ± 5.1 mm, p = 0.091), particularly in the out-of-plane approach (9.6 ± 3.5 mm vs. 12.3 ± 4.6 mm, p = 0.003). The procedural time was longer with MR guidance than with the conventional method (412 ± 134 s vs. 219 ± 66 s, p < 0.001). CONCLUSION MR needle guidance without pre-procedural CT image import is feasible by manually matching coordinate systems, and its needle insertion accuracy is slightly better than that of the conventional method.
11
Long DJ, Li M, De Ruiter QMB, Hecht R, Li X, Varble N, Blain M, Kassin MT, Sharma KV, Sarin S, Krishnasamy VP, Pritchard WF, Karanian JW, Wood BJ, Xu S. Comparison of Smartphone Augmented Reality, Smartglasses Augmented Reality, and 3D CBCT-guided Fluoroscopy Navigation for Percutaneous Needle Insertion: A Phantom Study. Cardiovasc Intervent Radiol 2021; 44:774-781. [PMID: 33409547] [DOI: 10.1007/s00270-020-02760-7] [Received: 07/14/2020] [Accepted: 12/23/2020] [Indexed: 11/30/2022]
Abstract
PURPOSE To compare needle placement performance using an augmented reality (AR) navigation platform implemented on smartphone or smartglasses devices to that of cone-beam CT (CBCT)-guided fluoroscopy in a phantom. MATERIALS AND METHODS An AR application was developed to display a planned percutaneous needle trajectory on the smartphone (iPhone7) and smartglasses (HoloLens1) devices in real time. The two AR-guided needle placement systems and CBCT-guided fluoroscopy with navigation software (XperGuide, Philips) were compared using an anthropomorphic phantom (CIRS, Norfolk, VA). Six interventional radiologists each performed 18 independent needle placements using smartphone (n = 6), smartglasses (n = 6), and XperGuide (n = 6) guidance. Placement error was defined as the distance from the needle tip to the target center. Placement time was recorded. For XperGuide, dose-area product (DAP, mGy*cm2) and fluoroscopy time (sec) were recorded. Statistical comparisons were made using a two-way repeated measures ANOVA. RESULTS The placement error using the smartphone, smartglasses, or XperGuide was similar (3.98 ± 1.68 mm, 5.18 ± 3.84 mm, 4.13 ± 2.38 mm, respectively, p = 0.11). Compared to CBCT-guided fluoroscopy, the smartphone and smartglasses reduced placement time by 38% (p = 0.02) and 55% (p = 0.001), respectively. The DAP for insertion using XperGuide was 3086 ± 2920 mGy*cm2, whereas augmented reality required no intra-procedural radiation. CONCLUSIONS Smartphone- and smartglasses-based augmented reality reduced needle placement time and radiation exposure while maintaining placement accuracy compared to a clinically validated needle navigation platform.
Affiliation(s)
- Dilara J Long
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Quirina M B De Ruiter
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Rachel Hecht
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Nicole Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA; Philips Research of North America, Cambridge, MA, 02141, USA
- Maxime Blain
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Michael T Kassin
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Karun V Sharma
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, Washington, DC, USA
- Shawn Sarin
- Department of Interventional Radiology, George Washington University Hospital, Washington, DC, USA
- Venkatesh P Krishnasamy
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- William F Pritchard
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- John W Karanian
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA