1
Saccenti L, Varble N, Borde T, Huth H, Kassin M, Ukeh I, Bakhutashvili I, Golbus A, Hazen L, Xu S, Tacher V, Kobeiter H, Horton K, Li M, Wood B. Integrated Needle Guide on Smartphone for Percutaneous Interventions Using Augmented Reality. Cardiovasc Intervent Radiol 2025. [PMID: 40295398] [DOI: 10.1007/s00270-025-04044-4] [Received: 10/03/2024] [Accepted: 04/03/2025] [Indexed: 04/30/2025]
Abstract
PURPOSE This study aimed to describe the workflow and evaluate the accuracy, in a phantom, of a novel smartphone augmented reality (AR) application that includes an integrated needle guide. MATERIALS AND METHODS A smartphone cover with an integrated needle guide was designed and 3D-printed. An AR application for percutaneous interventions was developed, which projected a needle path based on the rigid needle guide. After planning the needle path with this tool, the operator could place the needle through the guide to reach the target. Six lesions with out-of-plane entry points were targeted in an abdominal phantom. Timing and accuracy of needle placements were measured on post-procedural CT, both with smartphone AR guidance and with a freehand approach. Results were compared using Wilcoxon rank-sum and Pearson's chi-squared tests. RESULTS A total of 108 needle placements were performed by 9 physicians with widely varying experience. Median accuracy was 4 mm (IQR 3-6 mm) with the smartphone versus 18 mm (IQR 9-27 mm) freehand (P < 0.001). With the smartphone AR application, planning time was 91 s (IQR 71-151 s) and puncture time was 68 s (IQR 57-77 s). Accuracy, planning time, and puncture time did not differ by experience level when using the AR tool. CONCLUSION This smartphone application with an integrated needle guide allows path planning and accurate needle placement in phantoms, with real-time angular feedback, in less than 3 min. This technology could promote standardization, reduce experience requirements, and provide accurate low-cost guidance for percutaneous interventions in environments without procedural imaging.
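The group comparison reported in the abstract above (a Wilcoxon rank-sum test on placement accuracy) can be illustrated with a small, self-contained sketch. The numbers below are fabricated purely to echo the reported medians (4 mm AR vs. 18 mm freehand); they are not the study's data, and the normal-approximation test shown is a simplified stand-in for a full statistical analysis.

```python
import math

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test using the normal approximation
    (no tie correction; adequate for a rough illustration like this)."""
    n, m = len(a), len(b)
    combined = sorted(a + b)

    def midrank(v):
        # Average 1-based rank of value v across the pooled sample
        lo = combined.index(v)
        hi = lo + combined.count(v) - 1
        return (lo + hi) / 2 + 1

    w = sum(midrank(v) for v in a)            # rank sum of the first sample
    mu = n * (n + m + 1) / 2                  # expected rank sum under H0
    sigma = math.sqrt(n * m * (n + m + 1) / 12)
    z = (w - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Fabricated accuracy samples (mm), loosely matching the reported values
ar_mm = [3, 3, 4, 4, 4, 5, 5, 6, 6, 7]
freehand_mm = [8, 9, 12, 15, 18, 19, 22, 25, 27, 30]

z, p = rank_sum_test(ar_mm, freehand_mm)
print(f"z = {z:.2f}, p = {p:.5f}")  # strongly significant on this toy data
```

With completely separated samples like these, the test rejects decisively, which mirrors the direction (not the magnitude) of the study's P < 0.001 result.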
Affiliation(s)
- Laetitia Saccenti
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
  - Henri Mondor's Institute of Biomedical Research - Inserm, U955 Team N°18, Creteil, France
- Nicole Varble
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
  - Philips Research of North America, Cambridge, MA, USA
- Tabea Borde
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Hannah Huth
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Michael Kassin
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Ifechi Ukeh
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Ivan Bakhutashvili
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Ashley Golbus
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Lindsey Hazen
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Sheng Xu
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Vania Tacher
  - Henri Mondor's Institute of Biomedical Research - Inserm, U955 Team N°18, Creteil, France
  - Department of Radiology, Henri Mondor University Hospital, Assistance Publique - Hopitaux de Paris (AP-HP), Creteil, France
- Hicham Kobeiter
  - Henri Mondor's Institute of Biomedical Research - Inserm, U955 Team N°18, Creteil, France
  - Department of Radiology, Henri Mondor University Hospital, Assistance Publique - Hopitaux de Paris (AP-HP), Creteil, France
- Keith Horton
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Ming Li
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
- Bradford Wood
  - Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Bethesda, MD, 20892, USA
2
Rohmer K, Becker M, Georgiades M, March C, Melekh B, Sperka P, Spinczyk D, Wolińska-Sołtys A, Pech M. Acceptance and feasibility of an augmented reality-based navigation system with optical tracking for percutaneous procedures in interventional radiology - a simulation-based phantom study. Fortschr Röntgenstr 2024. [PMID: 39366404] [DOI: 10.1055/a-2416-1080] [Indexed: 10/06/2024]
Abstract
Augmented reality (AR) projects additional information into the user's field of view during interventions. The aim was to evaluate the acceptance and clinical feasibility of an AR system and to compare users with different levels of experience. The system examined projects a CT-generated 3D model of a phantom into the field of view using a HoloLens 2; the tracked needle is displayed and navigated live, and a projected ultrasound image provides live control of needle positioning. This should minimize radiation exposure and improve orientation. Acceptance and usability of the AR navigation system were evaluated by 10 physicians and medical students with different levels of experience, who performed punctures with the system in a phantom. The time required was compared, and a questionnaire was completed to assess clinical acceptance and feasibility. For statistical analysis, frequencies were calculated for qualitative characteristics, measures of location and dispersion for quantitative characteristics, and Spearman rank correlations for associations. Nine out of 10 subjects hit all 5 target regions on the first attempt, taking an average of 29:39 minutes for all punctures. There was a significant correlation between previous experience in interventional radiology, years in the profession, and the time required: the average time ranged from 43:00 min for medical students to 15:00 min for chief physicians. All subjects showed high acceptance of the system and rated the potential clinical feasibility, the simplification of the puncture, and the image quality especially positively. However, the majority would require further training to use the system with sufficient safety. The system offers distinct advantages for navigation and orientation, facilitates percutaneous interventions during training, and enables experienced physicians to achieve short intervention times. In addition, it improves ergonomics during the procedure by keeping important information directly available in the field of view, and it has the potential to reduce staff radiation exposure in particular by combining AR and sonography and thus shortening CT-fluoroscopy times.
- AR navigation offers advantages for orientation during percutaneous radiological interventions.
- The subjects would like to use the AR system in everyday clinical practice on patients.
- AR improves ergonomics by making important information directly available in the field of view.
- The combination of AR and sonography can significantly reduce radiation exposure for staff.
- Rohmer K, Becker M, Georgiades M et al. Acceptance and feasibility of an augmented reality-based navigation system with optical tracking for percutaneous procedures in interventional radiology - a simulation-based phantom study. Fortschr Röntgenstr 2024; DOI 10.1055/a-2416-1080.
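The experience-versus-time relationship reported in this abstract was assessed with a Spearman rank correlation. A minimal sketch of that statistic, on fabricated numbers chosen only to echo the reported trend (more experience, shorter puncture time), might look like:

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1))
    formula (assumes no ties, which holds for the toy data below)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Fabricated: years in the profession vs. total puncture time (minutes)
years = [0, 1, 2, 4, 6, 9, 12, 15, 20, 25]
minutes = [43, 40, 36, 33, 30, 26, 22, 19, 17, 15]
print(f"rho = {spearman_rho(years, minutes):.2f}")  # -1.00: perfectly monotone toy data
```

On these perfectly monotone toy values rho is exactly -1; real study data would give something weaker but in the same direction.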
Affiliation(s)
- Karl Rohmer
  - University Clinic for Radiology and Nuclear Medicine, University Hospital Magdeburg, Magdeburg, Germany
- Mathias Becker
  - University Clinic for Radiology and Nuclear Medicine, University Hospital Magdeburg, Magdeburg, Germany
- Marilena Georgiades
  - University Clinic for Radiology and Nuclear Medicine, University Hospital Magdeburg, Magdeburg, Germany
- Christine March
  - University Clinic for Radiology and Nuclear Medicine, University Hospital Magdeburg, Magdeburg, Germany
- Bohdan Melekh
  - University Clinic for Radiology and Nuclear Medicine, University Hospital Magdeburg, Magdeburg, Germany
- Piotr Sperka
  - Holo4Med Spółka S.A., Białystok, Poland
- Dominik Spinczyk
  - Faculty of Biomedical Engineering/Department of Medical Informatics and Artificial Intelligence, Silesian University of Technology, Gliwice, Poland
  - Holo4Med Spółka S.A., Białystok, Poland
- Maciej Pech
  - University Clinic for Radiology and Nuclear Medicine, University Hospital Magdeburg, Magdeburg, Germany
  - Research Campus STIMULATE, Otto von Guericke University Magdeburg, Magdeburg, Germany
3
Saccenti L, Bessy H, Ben Jedidia B, Longere B, Tortolano L, Derbel H, Luciani A, Kobeiter H, Grandpierre T, Tacher V. Performance Comparison of Augmented Reality Versus Ultrasound Guidance for Puncture: A Phantom Study. Cardiovasc Intervent Radiol 2024; 47:993-999. [PMID: 38710797] [DOI: 10.1007/s00270-024-03727-8] [Received: 11/09/2023] [Accepted: 04/02/2024] [Indexed: 05/08/2024]
Abstract
PURPOSE Augmented reality (AR) is an innovative approach that could assist percutaneous procedures: by seeing directly "through" a phantom, targeting a lesion might be more intuitive than using ultrasound (US). The objective of this study was to compare the performance of experienced interventional radiologists and untrained operators in soft tissue lesion puncture using AR guidance and standard US guidance. MATERIALS AND METHODS Three trained interventional radiologists with 5-10 years of experience and three untrained operators performed punctures of five targets in an abdominal phantom, with US guidance and with AR guidance. Correct targeting, accuracy (defined as the Euclidean distance between the needle tip and the center of the target), planning time, and puncture time were documented. RESULTS Accuracy was higher for the trained group than the untrained group using US guidance (1 mm versus 4 mm, p = 0.001), but not when using AR guidance (4 mm vs. 4 mm, p = 0.76). With all operators combined, no significant difference in accuracy was found between US and AR guidance (2 mm vs. 4 mm, p = 0.09), but planning and puncture times were significantly shorter using AR (15.1 s vs. 74 s, p < 0.001; 16.1 s vs. 59 s, p < 0.001, respectively). CONCLUSION Untrained and trained operators achieved comparable accuracy in percutaneous punctures using AR guidance, whereas US performance was better in the experienced group. With all operators combined, accuracy was similar between US and AR guidance, but planning and puncture times were shorter with AR guidance.
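The accuracy metric in this abstract is the Euclidean distance between the needle tip and the target center. As a trivial worked example (the 3D coordinates below are made up for illustration):

```python
import math

def accuracy_mm(tip, target):
    """Euclidean tip-to-target distance in mm, the accuracy metric
    described in the abstract above."""
    return math.dist(tip, target)

# Hypothetical tip and target positions (mm): offsets of 2, 2, and 1 mm
print(accuracy_mm((10.0, 20.0, 30.0), (12.0, 22.0, 31.0)))  # sqrt(4+4+1) = 3.0
```

`math.dist` (Python 3.8+) computes the straight-line distance between two points of equal dimension, so the same helper works for 2D CT-slice or full 3D coordinates.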
Affiliation(s)
- Laetitia Saccenti
  - Imagerie Medicale, Hopital Henri Mondor, Creteil, France
  - Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Hugo Bessy
  - Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Benjamin Longere
  - Department of Cardiovascular Radiology, CHU Lille, 59000, Lille, France
- Haytham Derbel
  - Imagerie Medicale, Hopital Henri Mondor, Creteil, France
  - Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Alain Luciani
  - Imagerie Medicale, Hopital Henri Mondor, Creteil, France
  - Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Hicham Kobeiter
  - Imagerie Medicale, Hopital Henri Mondor, Creteil, France
  - Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Thierry Grandpierre
  - Ecole superieure d'ingenieurs en electrotechnique et electronique, ESIEE Paris, Noisy Le Grand, France
- Vania Tacher
  - Imagerie Medicale, Hopital Henri Mondor, Creteil, France
  - Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
4
Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023; 23:6202. [PMID: 37448050] [DOI: 10.3390/s23136202] [Received: 04/22/2023] [Revised: 06/09/2023] [Accepted: 07/03/2023] [Indexed: 07/15/2023]
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), most devices remain focused on improving end-effector dexterity and precision, as well as improving access to minimally invasive surgeries. This paper provides a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions with increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while providing mitigating solutions to internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul
  - Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee
  - Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
  - School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis
  - School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
  - Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
5
Advances and Innovations in Ablative Head and Neck Oncologic Surgery Using Mixed Reality Technologies in Personalized Medicine. J Clin Med 2022; 11:jcm11164767. [PMID: 36013006] [PMCID: PMC9410374] [DOI: 10.3390/jcm11164767] [Received: 07/29/2022] [Revised: 08/10/2022] [Accepted: 08/12/2022] [Indexed: 11/17/2022]
Abstract
The benefit of computer-assisted planning in head and neck ablative and reconstructive surgery has been extensively documented over the last decade, and this approach has been shown to make surgical procedures more secure. In the treatment of cancer of the head and neck, computer-assisted surgery can be used to visualize and estimate the location and extent of the tumor mass. Some software tools now even allow the visualization of the structures of interest in a mixed reality environment. However, the precise integration of mixed reality systems into daily clinical routine remains a challenge: to date, this technology is not yet fully integrated into clinical settings such as the tumor board, surgical planning for head and neck tumors, or medical and surgical education. As a consequence, the handling of these systems is still experimental, and decision-making based on the presented data is not yet widely used. The aim of this paper is to present a novel, user-friendly 3D planning and mixed reality software and its potential application for ablative and reconstructive head and neck surgery.
6
Abstract
Although substantial advancements have been achieved in robot-assisted surgery, the blueprint of existing snake robotics predominantly focuses on preliminary structural design, control, and human–robot interfaces, with features that have not been thoroughly explored in the literature. This paper reviews planning and operation concepts of hyper-redundant serpentine robots for surgical use, as well as future challenges and solutions for better manipulation. Researchers working on the manufacture and navigation of snake robots have faced issues such as low dexterity of the end-effectors around delicate organs, state estimation, and the lack of depth perception on two-dimensional screens. A wide range of robots is analysed, such as the i²Snake robot, inspiring the use of force and position feedback, visual servoing, and augmented reality (AR). We present the actuation methods, robot kinematics, dynamics, sensing, and prospects of AR integration in snake robots, while addressing their shortcomings to facilitate the surgeon's task. For smoother gait control, validation and optimization algorithms such as deep learning databases are examined to mitigate redundancy in module linkage backlash and accidental self-collision. In essence, we aim to provide an outlook on robot configurations during motion by enhancing their material compositions within anatomical biocompatibility standards.
7
Guo Z, Tai Y, Du J, Chen Z, Li Q, Shi J. Automatically Addressing System for Ultrasound-Guided Renal Biopsy Training Based on Augmented Reality. IEEE J Biomed Health Inform 2021; 25:1495-1507. [PMID: 33684049] [DOI: 10.1109/jbhi.2021.3064308] [Indexed: 11/08/2022]
Abstract
Chronic kidney disease has become one of the kidney diseases with the highest morbidity and mortality, and its surgical management still presents problems. During the operation, the surgeon can only work from two-dimensional ultrasound images and cannot determine the spatial relationship between the lesion and the puncture needle in real time; the average number of punctures per patient reaches 3 to 4, increasing the incidence of post-puncture complications. This article starts from ultrasound-guided renal biopsy navigation training and optimizes puncture path planning and puncture training assistance. Augmented reality technology combined with renal puncture surgery training was studied, and a prototype ultrasound-guided renal biopsy surgery training system was developed, which improves the accuracy and reliability of training. The system was compared with a VR training system; the results show that the augmented reality training platform is more suitable as a surgical training platform because it requires less time and has a good training effect.