1
Autelitano M, Cattari N, Carbone M, Cutolo F, Montemurro N, Cigna E, Ferrari V. Augmented reality for rhinoplasty: 3D scanning and projected AR for intraoperative planning validation. Healthc Technol Lett 2025;12:e12116. PMID: 39816704; PMCID: PMC11730711; DOI: 10.1049/htl2.12116.
Abstract
Rhinoplasty is among the most popular major surgical procedures. It is generally performed by remodelling the internal bones and cartilage through a closed approach that limits damage to the soft tissues, whose final shape is determined by how they settle over the remodelled rigid structures. Optimal planning is achievable thanks to advanced 3D image acquisition and to virtual simulation of the intervention with dedicated software. Nevertheless, the final result also depends on factors that cannot be fully predicted regarding the settlement of the soft tissues on the rigid structures, so an objective final check would be useful to allow adjustments before concluding the intervention. The main idea of the present work is to use a 3D scanner to acquire the final shape of the nose directly in the operating room and to show the surgeon the differences with respect to the plan in an intuitive way, using augmented reality (AR) to display false colours directly over the patient's face. This work motivates the selection of the devices integrated in our system from both a technical and an ergonomic point of view. The system's global error, evaluated on an anthropomorphic phantom, is lower than ±1.2 mm with a 95% confidence interval, while the mean error in detecting depth thickness variations is 0.182 mm.
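The final check described above amounts to measuring, point by point, how far the scanned nose surface deviates from the planned one and rendering that deviation as false colour. A minimal numerical sketch of the idea, not the authors' implementation: the point clouds, the use of the ±1.2 mm figure as a colour threshold, and the green/red scheme are illustrative assumptions.

```python
import numpy as np

def deviation_colors(scan_pts, plan_pts, tol_mm=1.2):
    """Nearest-neighbour distance (mm) from each scanned point to the
    planned surface samples, plus an RGB false colour per point:
    green within tolerance, red beyond it."""
    diffs = scan_pts[:, None, :] - plan_pts[None, :, :]   # (N, M, 3) pairwise offsets
    dist = np.sqrt((diffs ** 2).sum(-1)).min(axis=1)      # (N,) unsigned distances
    colors = np.where(dist[:, None] <= tol_mm,
                      [0.0, 1.0, 0.0],                    # within tolerance -> green
                      [1.0, 0.0, 0.0])                    # out of tolerance -> red
    return dist, colors

# Toy example: a flat planned patch; the scan has one point displaced by 0.5 mm
plan = np.array([[x, y, 0.0] for x in range(3) for y in range(3)], dtype=float)
scan = plan.copy()
scan[4, 2] += 0.5                                         # residual 0.5 mm bump
dist, colors = deviation_colors(scan, plan)
```

In the real system the scanned cloud would first be registered to the planning data; here the two clouds are assumed already aligned.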
Affiliation(s)
- Martina Autelitano
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, University of Pisa, Pisa, Italy
- Nadia Cattari
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, University of Pisa, Pisa, Italy
- Marina Carbone
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, University of Pisa, Pisa, Italy
- Fabrizio Cutolo
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, University of Pisa, Pisa, Italy
- Nicola Montemurro
- Department of Neuroscience, Azienda Ospedaliero Universitaria Pisana (AOUP), Pisa, Italy
- Emanuele Cigna
- EndoCAS Center, University of Pisa, Pisa, Italy
- Plastic Surgery and Microsurgery Unit, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Vincenzo Ferrari
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, University of Pisa, Pisa, Italy
2
Keramati H, Lu X, Cabanag M, Wu L, Kushwaha V, Beier S. Applications and advances of immersive technology in cardiology. Curr Probl Cardiol 2024;49:102762. PMID: 39067719; DOI: 10.1016/j.cpcardiol.2024.102762.
Abstract
Different forms of immersive technology, such as Virtual Reality (VR) and Augmented Reality (AR), are attracting increasing investment in medicine. Advances in head-mounted display technology, processing, and rendering power have demonstrated the growing utility of immersive technology in medicine and the healthcare environment. A growing number of publications address the use of immersive technology in cardiology. We reviewed articles published within the last decade that reported case studies or research using or investigating immersive technology across the broad field of cardiology, from education to preoperative planning and intraoperative guidance. We summarize the advantages and disadvantages of using AR and VR for the various categories of application. Our review highlights the need for a robust assessment of the effectiveness of these methods and discusses the technical limitations that hinder the full integration of AR and VR in cardiology, including cost-effectiveness and regulatory compliance. Despite the limitations and gaps that have so far prevented immersive technologies from reaching their full potential in cardiology settings, their promising, impactful future in standard cardiovascular care is beyond doubt.
Affiliation(s)
- Hamed Keramati
- School of Mechanical and Manufacturing Engineering, Faculty of Engineering, The University of New South Wales, Sydney 2052, NSW, Australia.
- Xueqing Lu
- Learning and Digital Environments, Deputy Vice-Chancellor Education and Student Experience, The University of New South Wales, Sydney 2052, NSW, Australia
- Matt Cabanag
- School of Art and Design, Faculty of Arts, Design and Architecture, The University of New South Wales, Sydney 2052, NSW, Australia
- Liao Wu
- School of Mechanical and Manufacturing Engineering, Faculty of Engineering, The University of New South Wales, Sydney 2052, NSW, Australia
- Virag Kushwaha
- Eastern Heart Clinic, Prince of Wales Hospital, Barker Street Randwick, NSW 2031, Australia; Faculty of Medicine, The University of New South Wales, Kensington, Sydney 2033, NSW, Australia
- Susann Beier
- School of Mechanical and Manufacturing Engineering, Faculty of Engineering, The University of New South Wales, Sydney 2052, NSW, Australia
3
Ding H, Sun W, Zheng G. Robot-Assisted Augmented Reality (AR)-Guided Surgical Navigation for Periacetabular Osteotomy. Sensors (Basel) 2024;24:4754. PMID: 39066150; PMCID: PMC11280818; DOI: 10.3390/s24144754.
Abstract
Periacetabular osteotomy (PAO) is an effective approach for the surgical treatment of developmental dysplasia of the hip (DDH). However, the complex anatomical structure around the hip joint and the limited field of view (FoV) during surgery make PAO challenging to perform. To address this challenge, we propose a robot-assisted, augmented reality (AR)-guided surgical navigation system for PAO. The system mainly consists of a robot arm, an optical tracker, and a Microsoft HoloLens 2 headset, a state-of-the-art optical see-through (OST) head-mounted display (HMD). For AR guidance, we propose an optical marker-based AR registration method that estimates the transformation from the optical tracker coordinate system (COS) to the virtual space COS, so that virtual models can be superimposed on their physical counterparts. Furthermore, to guide the osteotomy, the developed system automatically aligns a bone saw with osteotomy planes planned in preoperative images. It then provides surgeons not only with virtual constraints that restrict movement of the bone saw but also with AR guidance for visual feedback without sight diversion, leading to higher surgical accuracy and improved surgical safety. Comprehensive experiments were conducted to evaluate both the AR registration accuracy and the osteotomy accuracy of the developed system. The proposed AR registration method achieved an average mean absolute distance error (mADE) of 1.96 ± 0.43 mm. The robotic system achieved an average center translation error of 0.96 ± 0.23 mm, an average maximum distance of 1.31 ± 0.20 mm, and an average angular deviation of 3.77 ± 0.85°.
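The AR registration step described above reduces to a generic sub-problem: estimating a rigid transformation between two coordinate systems from corresponding marker positions. A sketch of the standard SVD-based (Kabsch) least-squares solution for that sub-problem, with invented fiducial coordinates; this is not the paper's implementation, only the textbook closed form it builds on.

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rotation R and translation t with Q ~ (R @ P.T).T + t,
    from N >= 3 non-collinear corresponding points (N x 3 arrays)."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Toy check: recover a known 30-degree rotation about z plus a translation
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([10.0, -5.0, 2.0])
P = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
Q = P @ R_true.T + t_true
R_est, t_est = rigid_register(P, Q)
```

With noise-free correspondences the known transform is recovered exactly; with measurement noise the same closed form gives the least-squares optimum over all rigid transforms.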
Affiliation(s)
- Guoyan Zheng
- Institute of Medical Robotics, School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China; (H.D.); (W.S.)
4
Armstrong DG, Bazikian S, Armstrong AA, Clerici G, Casini A, Pillai A. An Augmented Vision of Our Medical and Surgical Future, Today? J Diabetes Sci Technol 2024;18:968-973. PMID: 38439541; PMCID: PMC11307216; DOI: 10.1177/19322968241236458.
Abstract
Incorporating consumer electronics into the operating room, we evaluated the Apple Vision Pro (AVP) during limb preservation surgeries, just as we evaluated Google Glass and FaceTime more than a decade ago. Although AVP's real-time mixed-reality data overlay and controls offer potential enhancements to surgical precision and team communication, our assessment recognized limitations in adapting consumer technology to clinical environments. The initial use facilitated intraoperative decision-making and educational interactions with trainees. The current mixed-reality pass-through resolution allows for input but not for highly dexterous surgical interactions. These early observations indicate that while AVP may soon improve aspects of surgical performance and education, further iteration, evaluation, and experience are needed to fully understand its impact on patient outcomes and to refine its integration into clinical practice.
Affiliation(s)
- David G. Armstrong
- Southwestern Academic Limb Salvage Alliance (SALSA), Department of Surgery, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Sebouh Bazikian
- Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Andrea Casini
- Diabetic Foot Unit, Policlinico Abano Terme, Padua, Italy
- Anand Pillai
- Manchester University NHS Foundation Trust, Manchester, UK
5
Tătaru OS, Ferro M, Marchioni M, Veccia A, Coman O, Lasorsa F, Brescia A, Crocetto F, Barone B, Catellani M, Lazar A, Petrisor M, Vartolomei MD, Lucarelli G, Antonelli A, Schips L, Autorino R, Rocco B, Azamfirei L. HoloLens® platform for healthcare professionals simulation training, teaching, and its urological applications: an up-to-date review. Ther Adv Urol 2024;16:17562872241297554. PMID: 39654822; PMCID: PMC11626676; DOI: 10.1177/17562872241297554.
Abstract
Advancements in devices and software are putting mixed reality at the forefront of teaching medical personnel. The Microsoft® HoloLens 2® offers a unique 3D visualization of a hologram in a physical, real environment and allows urologists to interact with it. This review provides a state-of-the-art analysis of the applications of the HoloLens® in medical and healthcare teaching through simulation designed for medical students, nurses, and residents, especially in urology. Our objective was to comprehensively analyse studies in the PubMed/MEDLINE database from January 2016 to April 2023. Articles investigating the Microsoft HoloLens that described feasibility and teaching outcomes in medicine, with an emphasis on urological healthcare, were included. The qualitative analysis identifies increasing use of the HoloLens in teaching settings covering a broad range of medical sciences (anatomy, anatomic pathology, biochemistry, pharmacogenomics, clinical skills, emergency medicine and nurse education, imaging); urology applications in particular (urological procedures and techniques, skill improvement, perception of complex renal tumors, accuracy of calyx puncture guidance in percutaneous nephrolithotomy, and targeted biopsy of the prostate) stand to benefit most. The future potential of HoloLens technology in teaching is immense. So far, studies have focused on feasibility, applicability, perception, comparisons with traditional methods, and limitations. Moving forward, research should also prioritize the development of applications specifically for urology. This will require validation of needs and the creation of adequate protocols to standardize future research efforts.
Affiliation(s)
- Octavian Sabin Tătaru
- Department of Simulation Applied in Medicine, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Matteo Ferro
- Istituto Europeo di Oncologia, IRCCS—Istituto di Ricovero e Cura a Carattere Scientifico, via Ripamonti 435, Milano, Italy
- Università degli Studi di Milano, Milan, Italy
- Michele Marchioni
- Department of Medical, Oral and Biotechnological Sciences, G. d’Annunzio University of Chieti, Urology Unit, “SS. Annunziata” Hospital, Chieti, Italy
- Department of Urology, ASL Abruzzo 2, Chieti, Italy
- Alessandro Veccia
- Department of Urology, University of Verona, Azienda Ospedaliera Universitaria Integrata of Verona, Verona, Italy
- Oana Coman
- Department of Simulation Applied in Medicine, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Francesco Lasorsa
- Department of Emergency and Organ Transplantation, Urology, Andrology and Kidney Transplantation Unit, University of Bari, Bari, Italy
- Antonio Brescia
- Department of Urology, European Institute of Oncology, IRCCS, Milan, Italy
- Università degli Studi di Milano, Milan, Italy
- Felice Crocetto
- Department of Neurosciences and Reproductive Sciences and Odontostomatology, University of Naples Federico II, Naples, Italy
- Biagio Barone
- Department of Surgical Sciences, Urology Unit, AORN Sant’Anna e San Sebastiano, Caserta, Italy
- Michele Catellani
- Department of Urology, European Institute of Oncology, IRCCS, Milan, Italy
- Università degli Studi di Milano, Milan, Italy
- Alexandra Lazar
- Department of Anesthesia and Intensive Care, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Marius Petrisor
- Department of Simulation Applied in Medicine, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Giuseppe Lucarelli
- Department of Emergency and Organ Transplantation, Urology, Andrology and Kidney Transplantation Unit, University of Bari, Bari, Italy
- Alessandro Antonelli
- Department of Urology, Azienda Ospedaliera Universitaria Integrata of Verona, University of Verona, Verona, Italy
- Luigi Schips
- Department of Medical, Oral and Biotechnological Sciences, G. d’Annunzio University of Chieti, Urology Unit, “SS. Annunziata” Hospital, Chieti, Italy
- Department of Urology, ASL Abruzzo 2, Chieti, Italy
- Riccardo Autorino
- Department of Urology, Rush University Medical Center, Chicago, IL, USA
- Bernardo Rocco
- Unit of Urology, Department of Health Science, University of Milan, ASST Santi Paolo and Carlo, Milan, Italy
- Matteo Ferro is also affiliated to the Unit of Urology, Department of Health Science, University of Milan, ASST Santi Paolo and Carlo, Milan, Italy
- Bernardo Rocco is also affiliated to U.O.C. Clinica Urologica, Dipartimento Universitario di Medicina e Chirurgia Traslazionale, Fondazione Policlinico Universitario, IRCCS, Rome, Italy; Università Cattolica del Sacro Cuore, Milan, Italy
- Giuseppe Lucarelli is also affiliated to the Department of Precision and Regenerative Medicine and Ionian Area, Urology, Andrology and Kidney Transplantation Unit, Aldo Moro University of Bari, Bari, Italy
- Leonard Azamfirei
- Department of Anesthesia and Intensive Care, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
6
Verhellen A, Elprama SA, Scheerlinck T, Van Aerschot F, Duerinck J, Van Gestel F, Frantz T, Jansen B, Vandemeulebroucke J, Jacobs A. Exploring technology acceptance of head-mounted device-based augmented reality surgical navigation in orthopaedic surgery. Int J Med Robot 2023:e2585. PMID: 37830305; DOI: 10.1002/rcs.2585.
Abstract
BACKGROUND This study used the Unified Theory of Acceptance and Use of Technology (UTAUT) to investigate the acceptance of HMD-based AR surgical navigation. METHODS An experiment was conducted in which participants drilled 12 predefined holes using freehand drilling, proprioceptive control, and AR assistance. Technology acceptance was assessed through a survey and non-participant observations. RESULTS Participants' intention to use AR correlated (p < 0.05) with social influence (Spearman's rho (rs) = 0.599), perceived performance improvement (rs = 0.592) and attitude towards AR (rs = 0.542). CONCLUSIONS While most participants acknowledged the potential of AR, they also highlighted persistent barriers to adoption, such as issues related to user-friendliness, time efficiency and device discomfort. To overcome these challenges, future AR surgical navigation systems should focus on enhancing surgical performance while minimising disruptions to workflows and operating times. Engaging orthopaedic surgeons in the development process can facilitate the creation of tailored solutions and accelerate adoption.
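Spearman's rho, the statistic reported above, is the Pearson correlation of the rank-transformed scores, with tied responses (common on Likert-scale survey items) assigned their average rank. A self-contained sketch with invented response vectors, not the study's data:

```python
def average_ranks(xs):
    """1-based ranks of xs, with tied values sharing their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                          # extend the run of tied values
        avg = (i + j) / 2 + 1               # average of 1-based ranks i+1 .. j+1
        for k in order[i:j + 1]:
            r[k] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A perfectly monotone pair of score vectors yields rho = 1.0; in practice a survey analysis would more likely call scipy.stats.spearmanr, which also reports a p-value.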
Affiliation(s)
- Thierry Scheerlinck
- Department of Orthopedic Surgery and Traumatology - Research Group BEFY-ORTHO, Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Fiene Van Aerschot
- Department of Orthopedic Surgery and Traumatology - Research Group BEFY-ORTHO, Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Johnny Duerinck
- Department of Neurosurgery - Research Group Center for Neurosciences (C4N-NEUR), Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Frederick Van Gestel
- Department of Neurosurgery - Research Group Center for Neurosciences (C4N-NEUR), Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Taylor Frantz
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussel, Belgium
- Bart Jansen
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussel, Belgium
- Jef Vandemeulebroucke
- Department of Radiology - Department of Electronics and Informatics (ETRO), Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel - Imec, Brussel, Belgium
- An Jacobs
- IMEC-SMIT, Vrije Universiteit, Brussel, Belgium
7
Tzelnick S, Rampinelli V, Sahovaler A, Franz L, Chan HHL, Daly MJ, Irish JC. Skull-Base Surgery-A Narrative Review on Current Approaches and Future Developments in Surgical Navigation. J Clin Med 2023;12:2706. PMID: 37048788; PMCID: PMC10095207; DOI: 10.3390/jcm12072706.
Abstract
Surgical navigation technology combines patient imaging studies with intraoperative real-time data to improve surgical precision and patient outcomes. The navigation workflow can also include preoperative planning, which can reliably simulate the intended resection and reconstruction. The advantage of this approach in skull-base surgery is that it guides access into a complex three-dimensional area and orients tumors intraoperatively with regard to critical structures, such as the orbit, carotid artery, and brain. This enhances a surgeon's ability to preserve normal anatomy while resecting tumors with adequate margins. The aim of this narrative review is to outline the state of the art and future directions of surgical navigation in the skull base, focusing on the advantages and pitfalls of this technique. We also present our group's experience in this field, within the frame of current research trends.
Affiliation(s)
- Sharon Tzelnick
- Division of Head and Neck Surgery, Princess Margaret Cancer Center, University of Toronto, Toronto, ON M5G 2M9, Canada
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Vittorio Rampinelli
- Unit of Otorhinolaryngology—Head and Neck Surgery, Department of Medical and Surgical Specialties, Radiologic Sciences and Public Health, University of Brescia, 25121 Brescia, Italy
- Technology for Health (PhD Program), Department of Information Engineering, University of Brescia, 25121 Brescia, Italy
- Axel Sahovaler
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Head & Neck Surgery Unit, University College London Hospitals, London NW1 2PG, UK
- Leonardo Franz
- Department of Neuroscience DNS, Otolaryngology Section, University of Padova, 35122 Padua, Italy
- Harley H. L. Chan
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Michael J. Daly
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Jonathan C. Irish
- Division of Head and Neck Surgery, Princess Margaret Cancer Center, University of Toronto, Toronto, ON M5G 2M9, Canada
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
8
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023;85:102757. PMID: 36706637; DOI: 10.1016/j.media.2023.102757.
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through the year 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria.
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
9
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023;68. PMID: 36580681; DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these systems in different surgical fields. AR visualization falls into two categories, in situ visualization and non-in situ visualization, with a wide variety of rendered content. Registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. Tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR across surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical studies, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
10
Eves J, Sudarsanam A, Shalhoub J, Amiras D. Augmented Reality in Vascular and Endovascular Surgery: Scoping Review. JMIR Serious Games 2022;10:e34501. PMID: 36149736; PMCID: PMC9547335; DOI: 10.2196/34501.
Abstract
BACKGROUND Technological advances have transformed vascular intervention in recent decades. In particular, improvements in imaging and data processing have allowed for the development of increasingly complex endovascular and hybrid interventions. Augmented reality (AR) is a subject of growing interest in surgery, with the potential to improve clinicians' understanding of 3D anatomy and aid in the processing of real-time information. This study hopes to elucidate the potential impact of AR technology in the rapidly evolving fields of vascular and endovascular surgery. OBJECTIVE The aim of this review is to summarize the fundamental concepts of AR technologies and conduct a scoping review of the impact of AR and mixed reality in vascular and endovascular surgery. METHODS A systematic search of MEDLINE, Scopus, and Embase was performed in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. All studies written in English from inception until January 8, 2021, were included in the search. Combinations of the following keywords were used in the systematic search string: ("augmented reality" OR "hololens" OR "image overlay" OR "daqri" OR "magic leap" OR "immersive reality" OR "extended reality" OR "mixed reality" OR "head mounted display") AND ("vascular surgery" OR "endovascular"). Studies were selected through a blinded process between 2 investigators (JE and AS) and assessed using data quality tools. RESULTS AR technologies have had a number of applications in vascular and endovascular surgery. Most studies (22/32, 69%) used 3D imaging of computed tomography angiogram-derived images of vascular anatomy to augment clinicians' anatomical understanding during procedures. A wide range of AR technologies were used, with heads-up fusion imaging and AR head-mounted displays being the most commonly applied clinically. AR applications included guiding open, robotic, and endovascular surgery while minimizing dissection, improving procedural times, and reducing radiation and contrast exposure. CONCLUSIONS AR has shown promising developments in the field of vascular and endovascular surgery, with potential benefits to surgeons and patients alike. These include reductions in patient risk and operating times as well as in contrast and radiation exposure for radiological interventions. Further technological advances are required to overcome current limitations, including processing capacity and vascular deformation by instrumentation.
Affiliation(s)
- Joshua Eves
- Imperial Vascular Unit, Imperial College Healthcare NHS Trust, London, United Kingdom
- Abhilash Sudarsanam
- Imperial Vascular Unit, Imperial College Healthcare NHS Trust, London, United Kingdom
- Joseph Shalhoub
- Imperial Vascular Unit, Imperial College Healthcare NHS Trust, London, United Kingdom
- Department of Surgery & Cancer, Imperial College London, London, United Kingdom
- Dimitri Amiras
- Department of Surgery & Cancer, Imperial College London, London, United Kingdom
- Department of Radiology, Imperial College Healthcare NHS Trust, London, United Kingdom
Collapse
|
11
|
Condino S, Piazza R, Carbone M, Bath J, Troisi N, Ferrari M, Berchiolli R. Bioengineering, augmented reality, and robotic surgery in vascular surgery: A literature review. Front Surg 2022; 9:966118. [PMID: 36061062] [PMCID: PMC9437582] [DOI: 10.3389/fsurg.2022.966118]
Abstract
Biomedical engineering integrates a variety of applied sciences with life sciences to improve human health and reduce the invasiveness of surgical procedures. Technological advances, achieved through biomedical engineering, have contributed to significant improvements in the field of vascular and endovascular surgery. This paper aims to review the most cutting-edge technologies of the last decade involving the use of augmented reality devices and robotic systems in vascular surgery, highlighting benefits and limitations. Accordingly, two distinct literature surveys were conducted through the PubMed database: the first review provides a comprehensive assessment of augmented reality technologies, including the different techniques available for the visualization of virtual content (11 papers revised); the second review collects studies with bioengineering content that highlight the research trend in robotic vascular surgery, excluding works focused only on the clinical use of commercially available robotic systems (15 papers revised). Technological flow is constant and further advances in imaging techniques and hardware components will inevitably bring new tools for a clinical translation of innovative therapeutic strategies in vascular surgery.
Affiliation(s)
- Sara Condino
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Roberta Piazza
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Marina Carbone
- Department of Information Engineering, University of Pisa, Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Correspondence: Marina Carbone
- Jonathan Bath
- Division of Vascular Surgery, University of Missouri, Columbia, MO, United States
- Nicola Troisi
- Vascular Surgery Unit, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Mauro Ferrari
- Vascular Surgery Unit, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Raffaella Berchiolli
- Vascular Surgery Unit, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy

12
Projected cutting guides using an augmented reality system to improve surgical margins in maxillectomies: A preclinical study. Oral Oncol 2022; 127:105775. [DOI: 10.1016/j.oraloncology.2022.105775]
13
Architecture of a Hybrid Video/Optical See-through Head-Mounted Display-Based Augmented Reality Surgical Navigation Platform. INFORMATION 2022. [DOI: 10.3390/info13020081]
Abstract
In the context of image-guided surgery, augmented reality (AR) represents a groundbreaking improvement, especially when paired with wearability in the case of open surgery. Commercially available AR head-mounted displays (HMDs), designed for general purposes, are increasingly used outside their intended indications to develop surgical guidance applications that aim to demonstrate the potential of AR in surgery. The applications proposed in the literature underline the demand for AR guidance in the operating room, together with the limitations that prevent commercial HMDs from meeting that need. The medical domain demands purpose-built devices that address not only ergonomics but also surgical accuracy objectives and compliance with medical device regulations. Within an EU Horizon 2020 project, a hybrid video and optical see-through augmented reality headset, paired with a software architecture, was developed, both specifically designed to integrate seamlessly into the surgical workflow. In this paper, the overall architecture of the system is described. The AR HMD surgical navigation platform was successfully tested on seven patients to aid the surgeon in performing Le Fort 1 osteotomy in cranio-maxillofacial surgery, demonstrating the value of the hybrid approach and the safety and usability of the navigation platform.
14
Cercenelli L, Babini F, Badiali G, Battaglia S, Tarsitano A, Marchetti C, Marcelli E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front Oncol 2022; 11:804748. [PMID: 35071009] [PMCID: PMC8770836] [DOI: 10.3389/fonc.2021.804748]
Abstract
Background Augmented Reality (AR) represents an evolution of navigation-assisted surgery, providing surgeons with a virtual aid contextually merged with the real surgical field. We recently reported a case series of AR-assisted fibular flap harvesting for mandibular reconstruction. However, the registration accuracy between the real and the virtual content needs to be systematically evaluated before this tool can be widely promoted in clinical practice. In this paper, after describing the AR-based protocol implemented for both a tablet and the HoloLens 2 smart glasses, we evaluated in a first test session the achievable registration accuracy with the two display solutions, and in a second test session the success rate in executing the AR-guided skin paddle incision task on a 3D-printed leg phantom. Methods From a real computed tomography dataset, 3D virtual models of a human leg, including the fibula, arteries, and skin with the planned paddle profile for harvesting, were obtained. All virtual models were imported into the Unity software to develop a marker-less AR application suitable for use both via tablet and via the HoloLens 2 headset. The registration accuracy of both solutions was verified on a 3D-printed leg phantom obtained from the virtual models, by repeatedly applying the tracking function and computing pose deviations between the AR-projected virtual skin paddle profile and the real one transferred to the phantom via a CAD/CAM cutting guide. The success rate in completing the AR-guided task of skin paddle harvesting was evaluated using CAD/CAM templates positioned on the phantom model surface. Results On average, the marker-less AR protocol showed comparable registration errors (within 1-5 mm) for the tablet-based and HoloLens-based solutions. Registration accuracy appears to be quite sensitive to ambient light conditions. We found a good success rate in completing the AR-guided task within an error margin of 4 mm (97% and 100% for the tablet and HoloLens, respectively).
All subjects reported greater usability and ergonomics for the HoloLens 2 solution. Conclusions Results revealed that the proposed marker-less AR-based protocol may guarantee a registration error within 1-5 mm for assisting skin paddle harvesting in the clinical setting. Optimal lighting conditions and further improvement of marker-less tracking technologies have the potential to increase the efficiency and precision of this AR-assisted reconstructive surgery.
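The registration check described in this abstract (pose deviations between the AR-projected virtual skin paddle profile and the real one) comes down to measuring point-to-point deviations between two sampled outlines. A minimal sketch of that comparison, using hypothetical point sets rather than the authors' actual pipeline:

```python
import numpy as np

def outline_deviation(projected, reference):
    """For each point of the AR-projected outline, distance to the nearest
    point of the reference outline; both are (N, 3) arrays in mm.
    Returns (mean deviation, max deviation)."""
    d = np.linalg.norm(projected[:, None, :] - reference[None, :, :], axis=2)
    nearest = d.min(axis=1)              # per-point deviation, shape (N,)
    return nearest.mean(), nearest.max()

# toy check: a 30 mm-radius outline vs. a copy shifted by 2 mm
t = np.linspace(0.0, 2.0 * np.pi, 200)
ref = np.stack([30.0 * np.cos(t), 30.0 * np.sin(t), np.zeros_like(t)], axis=1)
proj = ref + np.array([2.0, 0.0, 0.0])
mean_err, max_err = outline_deviation(proj, ref)   # both on the order of the 2 mm shift
```

Nearest-point deviation is only one plausible metric; a study may instead resample both curves and compare corresponding arc-length positions.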
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Federico Babini
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Giovanni Badiali
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Salvatore Battaglia
- Maxillofacial Surgery Unit, Policlinico San Marco University Hospital, University of Catania, Catania, Italy
- Achille Tarsitano
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Claudio Marchetti
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Emanuela Marcelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy

15
Study on augmented reality for robotic surgery bedside assistants. J Robot Surg 2021; 16:1019-1026. [PMID: 34762249] [DOI: 10.1007/s11701-021-01335-z]
Abstract
Robotic surgery bedside assistants play an important role in robotic procedures by performing intra-corporeal tasks while accommodating the physical presence of the robot. We hypothesized that an augmented reality headset enabling 3D intra-corporeal vision while facing the surgical field could decrease the time and improve the accuracy of robotic bedside tasks. Bedside assistants (one physician assistant, one medical student, three surgical trainees, and two attending surgeons) performed validated tasks within a mock abdominal cavity with a surgical robot docked. Tasks were performed with a bedside monitor providing 2D or 3D vision, or with an optical see-through head-mounted augmented reality device providing 2D or 3D vision. The effect of augmented reality device resolution on performance was also evaluated. For the simplest task of touching a straw, performance was generally high regardless of the mode of visualization. For more complex tasks, including stapling and pulling a ring along a path, 3D augmented reality decreased the time and number of errors per task, and allowed the physician assistant to perform at the level of an attending surgeon (p = 0.08). All participants had improved times for the ring path task with better resolution (lower resolution 23 ± 11 s vs higher resolution 14 ± 4 s, p = 0.002). 3D augmented reality vision with high resolution decreased time and improved accuracy on more complex tasks, enabling a less experienced robotic surgical bedside assistant to function similarly to attending surgeons. These data warrant further study with additional complex tasks and bedside assistants at various levels of training.
16
Hu X, Baena FRY, Cutolo F. Head-Mounted Augmented Reality Platform for Markerless Orthopaedic Navigation. IEEE J Biomed Health Inform 2021; 26:910-921. [PMID: 34115600] [DOI: 10.1109/jbhi.2021.3088442]
Abstract
Visual augmented reality (AR) has the potential to improve the accuracy, efficiency and reproducibility of computer-assisted orthopaedic surgery (CAOS). AR head-mounted displays (HMDs) further allow non-eye-shift target observation and an egocentric view. Recently, a markerless tracking and registration (MTR) algorithm was proposed to avoid the artificial markers that are conventionally pinned into the target anatomy for tracking, as their use prolongs the surgical workflow, introduces human-induced errors, and necessitates additional surgical invasion of the patient. However, such an MTR-based method has neither been explored for surgical applications nor integrated into current AR HMDs, making ergonomic HMD-based markerless AR CAOS navigation hard to achieve. To these aims, we present a versatile, device-agnostic and accurate HMD-based AR platform. Our software platform, supporting both video see-through (VST) and optical see-through (OST) modes, integrates two proposed fast calibration procedures using a specially designed calibration tool. According to the camera-based evaluation, our AR platform achieves a display error of 6.31 ± 2.55 arcmin for VST and 7.72 ± 3.73 arcmin for OST. A proof-of-concept markerless surgical navigation system to assist in femoral bone drilling was then developed based on the platform and the Microsoft HoloLens 1. According to the user study, both the VST and OST markerless navigation systems are reliable, with the OST system providing the best usability. The measured navigation error is 4.90 ± 1.04 mm, 5.96 ± 2.22° for the VST system and 4.36 ± 0.80 mm, 5.65 ± 1.42° for the OST system.
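Display error expressed in arcmin, as in the figures above, maps to a metric error through the viewing distance. A small helper to convert between the two (the 500 mm working distance below is an illustrative assumption, not a value from the paper):

```python
import math

def metric_to_arcmin(err_mm, distance_mm):
    """Angular error (arcmin) subtended by a metric error at a viewing distance."""
    return math.degrees(math.atan2(err_mm, distance_mm)) * 60.0

def arcmin_to_metric(err_arcmin, distance_mm):
    """Metric error (mm) on a plane at distance_mm for a given angular error."""
    return distance_mm * math.tan(math.radians(err_arcmin / 60.0))

# the reported 6.31 arcmin VST display error at an assumed 500 mm working distance
mm = arcmin_to_metric(6.31, 500.0)   # ≈ 0.92 mm on the working plane
```

Angular units are the natural way to report see-through display error because the equivalent metric error grows linearly with the distance to the target.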
17
Augmented Reality Based Surgical Navigation of Complex Pelvic Osteotomies—A Feasibility Study on Cadavers. APPLIED SCIENCES-BASEL 2021. [DOI: 10.3390/app11031228]
Abstract
Augmented reality (AR)-based surgical navigation may offer new possibilities for safe and accurate surgical execution of complex osteotomies. In this study we investigated the feasibility of navigating the periacetabular osteotomy of Ganz (PAO), known as one of the most complex orthopedic interventions, on two cadaveric pelves under realistic operating room conditions. Preoperative planning was conducted on computed tomography (CT)-reconstructed 3D models using an in-house developed software, which allowed creating cutting plane objects for planning of the osteotomies and reorientation of the acetabular fragment. An AR application was developed comprising point-based registration, motion compensation and guidance for osteotomies as well as fragment reorientation. Navigation accuracy was evaluated on CT-reconstructed 3D models, resulting in an error of 10.8 mm for osteotomy starting points and 5.4° for osteotomy directions. The reorientation errors were 6.7°, 7.0° and 0.9° for the x-, y- and z-axis, respectively. Average postoperative error of LCE angle was 4.5°. Our study demonstrated that the AR-based execution of complex osteotomies is feasible. Fragment realignment navigation needs further improvement, although it is more accurate than the state of the art in PAO surgery.
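Point-based registration, as mentioned in the abstract above, is typically implemented as a least-squares rigid fit between planned and digitized fiducial points (the Kabsch/Umeyama solution). A generic sketch under that assumption, not the study's own implementation:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ src @ R.T + t.
    src, dst: (N, 3) arrays of corresponding fiducial points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def fre(src, dst, R, t):
    """Fiducial registration error: RMS residual after the fit."""
    res = dst - (src @ R.T + t)
    return float(np.sqrt((res ** 2).sum(axis=1).mean()))

# synthetic check: recover a known 30-degree rotation about z plus a translation
rng = np.random.default_rng(0)
src = rng.normal(size=(10, 3))
ang = np.radians(30.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([1.0, -2.0, 0.5])
R, t = rigid_register(src, src @ R_true.T + t_true)
```

Note that a low fiducial registration error does not by itself guarantee a low error at the surgical target, which is one reason navigation accuracy is reported separately, as in the abstract above.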
18
Cercenelli L, Carbone M, Condino S, Cutolo F, Marcelli E, Tarsitano A, Marchetti C, Ferrari V, Badiali G. The Wearable VOSTARS System for Augmented Reality-Guided Surgery: Preclinical Phantom Evaluation for High-Precision Maxillofacial Tasks. J Clin Med 2020; 9:jcm9113562. [PMID: 33167432] [PMCID: PMC7694536] [DOI: 10.3390/jcm9113562]
Abstract
BACKGROUND In the context of guided surgery, augmented reality (AR) represents a groundbreaking improvement. The Video and Optical See-Through Augmented Reality Surgical System (VOSTARS) is a new AR wearable head-mounted display (HMD), recently developed as an advanced navigation tool for maxillofacial and plastic surgery and other non-endoscopic surgeries. In this study, we report results of phantom tests with VOSTARS aimed to evaluate its feasibility and accuracy in performing maxillofacial surgical tasks. METHODS An early prototype of VOSTARS was used. Le Fort 1 osteotomy was selected as the experimental task to be performed under VOSTARS guidance. A dedicated set-up was prepared, including the design of a maxillofacial phantom, an ad hoc tracker anchored to the occlusal splint, and cutting templates for accuracy assessment. Both qualitative and quantitative assessments were carried out. RESULTS VOSTARS, used in combination with the designed maxilla tracker, showed excellent tracking robustness under operating room lighting. Accuracy tests showed that 100% of Le Fort 1 trajectories were traced with an accuracy of ±1.0 mm, and on average, 88% of the trajectory's length was within ±0.5 mm accuracy. CONCLUSIONS Our preliminary results suggest that the VOSTARS system can be a feasible and accurate solution for guiding maxillofacial surgical tasks, paving the way to its validation in clinical trials and for a wide spectrum of maxillofacial applications.
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab—Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Correspondence: Tel.: +39-0516364603
- Marina Carbone
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
- Sara Condino
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
- Emanuela Marcelli
- eDIMES Lab—Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Sciences and S. Orsola-Malpighi Hospital, University of Bologna, 40138 Bologna, Italy
- Claudio Marchetti
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Sciences and S. Orsola-Malpighi Hospital, University of Bologna, 40138 Bologna, Italy
- Vincenzo Ferrari
- Information Engineering Department, University of Pisa, 56126 Pisa, Italy
- Giovanni Badiali
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Sciences and S. Orsola-Malpighi Hospital, University of Bologna, 40138 Bologna, Italy