1
Strong EB, Patel A, Marston AP, Sadegh C, Potts J, Johnston D, Ahn D, Bryant S, Li M, Raslan O, Lucero SA, Fischer MJ, Zwienenberg M, Sharma N, Thieringer F, El Amm C, Shahlaie K, Metzger M, Strong EB. Augmented Reality Navigation in Craniomaxillofacial/Head and Neck Surgery. OTO Open 2025;9:e70108. PMID: 40224293; PMCID: PMC11986686; DOI: 10.1002/oto2.70108. Open Access.
Abstract
Objective: This study aims to (1) develop an augmented reality (AR) navigation platform for craniomaxillofacial (CMF) and head and neck surgery; (2) apply it to a range of surgical cases; and (3) evaluate the advantages, disadvantages, and clinical opportunities of AR navigation. Study Design: Multi-center retrospective case series. Setting: Four tertiary care academic centers. Methods: A novel AR navigation platform was collaboratively developed with Xironetic and deployed intraoperatively using only a head-mounted display (Microsoft HoloLens 2). Virtual surgical plans were generated from computed tomography/magnetic resonance imaging data and uploaded onto the AR platform. A reference array was mounted to the patient, and the virtual plan was registered to the patient intraoperatively. A retrospective review of all AR-navigated CMF cases since September 2023 was performed. Results: Thirty-three cases were reviewed and classified as trauma, orthognathic, tumor, or craniofacial. The AR platform had several advantages over traditional navigation, including real-time 3D visualization of the surgical plan, identification of critical structures, and real-time tracking. Furthermore, this case series presents the first known examples of (1) AR instrument tracking for midface osteotomies, (2) AR tracking of the zygomaticomaxillary complex during fracture reduction, (3) mandibular tracking in orthognathic surgery, (4) AR fibula cutting guides for mandibular reconstruction, and (5) integration of real-time infrared visualization in an AR headset for vasculature identification. Conclusion: While still a developing technology, AR navigation provides several advantages over traditional navigation for CMF and head and neck surgery, including heads-up, interactive 3D visualization of the surgical plan, identification of critical anatomy, and real-time tracking.
Affiliation(s)
- E. Brandon Strong
- Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Anuj Patel
- Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Alexander P. Marston
- Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Cameron Sadegh
- Department of Neurological Surgery, University of California, Davis, Davis, California, USA
- Jeffrey Potts
- Department of Plastic and Reconstructive Surgery, University of Oklahoma, Oklahoma City, Oklahoma, USA
- Darin Johnston
- Department of Oral and Maxillofacial Surgery, David Grant Medical Center, Fairfield, California, USA
- David Ahn
- Department of Oral and Maxillofacial Surgery, David Grant Medical Center, Fairfield, California, USA
- Shae Bryant
- Department of Oral and Maxillofacial Surgery, David Grant Medical Center, Fairfield, California, USA
- Michael Li
- Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Osama Raslan
- Department of Radiology, University of California, Davis, Davis, California, USA
- Steven A. Lucero
- Department of Biomedical Engineering, University of California, Davis, Davis, California, USA
- Marc J. Fischer
- Department of Computer Science, Technical University of Munich, Munich, Germany
- Marike Zwienenberg
- Department of Neurological Surgery, University of California, Davis, Davis, California, USA
- Neha Sharma
- Clinic of Oral and Craniomaxillofacial Surgery, University Hospital Basel, Basel, Switzerland
- Medical Additive Manufacturing (Swiss MAM) Research Group, Department of Biomedical Engineering, University of Basel, Basel, Switzerland
- Florian Thieringer
- Clinic of Oral and Craniomaxillofacial Surgery, University Hospital Basel, Basel, Switzerland
- Medical Additive Manufacturing (Swiss MAM) Research Group, Department of Biomedical Engineering, University of Basel, Basel, Switzerland
- Christian El Amm
- Department of Plastic and Reconstructive Surgery, University of Oklahoma, Oklahoma City, Oklahoma, USA
- Kiarash Shahlaie
- Department of Neurological Surgery, University of California, Davis, Davis, California, USA
- Marc Metzger
- Department of Oral and Maxillofacial Surgery, University Hospital Freiburg, Freiburg, Germany
- E. Bradley Strong
- Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
2
Ye J, Chen Q, Zhong T, Liu J, Gao H. Is Overlain Display a Right Choice for AR Navigation? A Qualitative Study of Head-Mounted Augmented Reality Surgical Navigation on Accuracy for Large-Scale Clinical Deployment. CNS Neurosci Ther 2025;31:e70217. PMID: 39817491; PMCID: PMC11736426; DOI: 10.1111/cns.70217. Open Access.
Abstract
BACKGROUND: Over the past two decades, head-mounted augmented reality surgical navigation (HMARSN) systems have been increasingly employed across surgical specialties, driven by advances in augmented reality technologies and by surgeons' desire to overcome drawbacks inherent to conventional surgical navigation systems. Most current experimental HMARSN systems adopt an overlain display (OD) that overlays virtual models and planned routes of surgical tools on the corresponding physical tissues, organs, and lesions in the surgical field, giving surgeons an intuitive, direct view that improves hand-eye coordination and avoids attention shift and loss of sight (LOS) during procedures. Yet system accuracy, the most crucial performance indicator of any surgical navigation system, is difficult to ascertain because it is highly subjective and user-dependent. The aim of this study was therefore to qualitatively review currently available experimental OD HMARSN systems, explore how their accuracy is affected by overlain display, and determine whether such systems are suited to large-scale clinical deployment. METHOD: We searched PubMed and ScienceDirect with the terms "head mounted augmented reality surgical navigation," which returned 445 records in total. After screening and eligibility assessment, 60 papers were analyzed. Specifically, we focused on how their accuracies were defined and measured, and on whether those accuracies are stable in clinical practice and competitive with corresponding commercially available systems. RESULTS AND CONCLUSIONS: The primary finding is that the accuracy of OD HMARSN systems is seriously affected by the transformation between the space of the user's eyes and that of the surgical field, because measurement of this transformation is heavily individualized and user-dependent. Additionally, the transformation itself is potentially subject to change during surgical procedures, and hence unstable. OD HMARSN systems are therefore not suitable for large-scale clinical deployment.
Affiliation(s)
- Jian Ye
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Qingwen Chen
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Tao Zhong
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Jian Liu
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Han Gao
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
3
Serrano CM, Atenas MJ, Rodriguez PJ, Vervoorn JM. From Virtual Reality to Reality: Fine-Tuning the Taxonomy for Extended Reality Simulation in Dental Education. Eur J Dent Educ 2024. PMID: 39698875; DOI: 10.1111/eje.13064.
Abstract
INTRODUCTION: Digital simulation in dental education has evolved substantially, addressing several educational challenges in dentistry. Following global lockdowns and sustainability concerns, dental educators are increasingly adopting digital simulation to enhance or replace traditional training methods. This review aimed to contribute to a uniform taxonomy for extended reality (XR) simulation within dental education. METHODS: This scoping review followed the PRISMA and PRISMA-ScR guidelines. PubMed/MEDLINE, EMBASE, Web of Science, and Google Scholar were searched. Eligible studies included English-language publications in indexed journals related to digital simulation in dental/maxillofacial education that provided theoretical descriptions of extended reality (XR) and/or immersive training tools (ITT). The outcomes of the scoping review were used as building blocks for a uniform XR-simulation taxonomy. RESULTS: A total of 141 articles from 2004 to 2024 were selected and categorised into Virtual Reality (VR), Mixed Reality (MR), Augmented Reality (AR), Augmented Virtuality (AV), and Computer Simulation (CS). Stereoscopic vision, immersion, interaction, modification, and haptic feedback were identified as recurring features of XR simulation in dentistry; these features formed the basis for a general XR-simulation taxonomy. DISCUSSION: While XR-simulation features were consistent in the literature, the variety of definitions and classifications complicated the development of a taxonomy framework; VR was frequently used as an umbrella term. To address this, operational definitions were proposed for each category within the virtuality continuum, clarifying distinctions and commonalities. CONCLUSION: This scoping review highlights the need for a uniform taxonomy for XR simulation within dental education. Establishing consensus on XR-related terminology and definitions facilitates future research, allowing clear evidence reporting and analysis. The proposed taxonomy may also be of use in medical education, promoting alignment and the creation of a comprehensive body of evidence on XR technologies.
Affiliation(s)
- Carlos M Serrano
- Digital Dentistry, Academic Centre for Dentistry Amsterdam (ACTA), Amsterdam, The Netherlands
- María J Atenas
- Digital Dentistry, Academic Centre for Dentistry Amsterdam (ACTA), Amsterdam, The Netherlands
- Patricio J Rodriguez
- Digital Dentistry, Academic Centre for Dentistry Amsterdam (ACTA), Amsterdam, The Netherlands
- Johanna M Vervoorn
- Digital Dentistry, Academic Centre for Dentistry Amsterdam (ACTA), Amsterdam, The Netherlands
4
Li B, Wei H, Yan J, Wang X. A novel portable augmented reality surgical navigation system for maxillofacial surgery: technique and accuracy study. Int J Oral Maxillofac Surg 2024;53:961-967. PMID: 38839534; DOI: 10.1016/j.ijom.2024.02.007.
Abstract
Surgical navigation, despite its potential benefits, faces challenges in widespread adoption in clinical practice. Possible reasons include the high cost, increased surgery time, attention shifts during surgery, and the mental task of mapping from the monitor to the patient. To address these challenges, a portable, all-in-one surgical navigation system using augmented reality (AR) was developed, and its feasibility and accuracy were investigated. The system achieves AR visualization by capturing a live video stream of the actual surgical field using a visible light camera and merging it with preoperative virtual images. A skull model with reference spheres was used to evaluate the accuracy. After registration, virtual models were overlaid on the real skull model. The discrepancies between the centres of the real spheres and the virtual model were measured to assess the AR visualization accuracy. This AR surgical navigation system demonstrated precise AR visualization, with an overall overlap error of 0.53 ± 0.21 mm. By seamlessly integrating the preoperative virtual plan with the intraoperative field of view in a single view, this novel AR navigation system could provide a feasible solution for the use of AR visualization to guide the surgeon in performing the operation as planned.
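As an illustration only (the reviewer's sketch, not the authors' software), the overlap-error metric reported above, i.e. the mean and standard deviation of the Euclidean distances between corresponding real and virtual sphere centres, can be computed as follows; the function name and the example coordinates are hypothetical:

```python
import math

def overlap_error(real_centres, virtual_centres):
    """Mean and population standard deviation (mm) of the Euclidean
    distances between corresponding real and virtual sphere centres."""
    dists = [math.dist(r, v) for r, v in zip(real_centres, virtual_centres)]
    mean = sum(dists) / len(dists)
    sd = math.sqrt(sum((d - mean) ** 2 for d in dists) / len(dists))
    return mean, sd

# Hypothetical example: two sphere centres, each displaced by 0.5 mm
mean_mm, sd_mm = overlap_error([(0, 0, 0), (1, 0, 0)],
                               [(0.5, 0, 0), (1.5, 0, 0)])
```

A result such as the study's 0.53 ± 0.21 mm would correspond to the returned (mean, sd) pair aggregated over all measured spheres.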
Affiliation(s)
- B Li
- Departments of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China; National Clinical Research Center of Stomatology, Shanghai, China
- H Wei
- Departments of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China; National Clinical Research Center of Stomatology, Shanghai, China
- J Yan
- Departments of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China; National Clinical Research Center of Stomatology, Shanghai, China
- X Wang
- Departments of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China; National Clinical Research Center of Stomatology, Shanghai, China
5
Bochet Q, Raoul G, Lauwers L, Nicot R. Augmented reality in implantology: Virtual surgical checklist and augmented implant placement. J Stomatol Oral Maxillofac Surg 2024;125:101813. PMID: 38452901; DOI: 10.1016/j.jormas.2024.101813.
Abstract
OBJECTIVES: The aim of the present study was to create a pedagogical checklist for the implant surgical protocol, combined with augmented reality (AR)-guided freehand surgery using a head-mounted display (HMD) with tracking, for inexperienced surgeons. METHODS: The anatomical model of a patient with two missing mandibular teeth requiring conventional single-tooth implants was selected. The computed tomography (CT) scans were extracted and imported into segmentation and implant-planning software. A patient-specific dental splint supported a 3D-printed QR code via an intermediate strut. A checklist was generated to guide the surgical procedure. After tracking, the AR-HMD projected the virtual presurgical plan (inferior alveolar nerve (IAN), implant axis, implant location) onto the real 3D-printed anatomical models. The entire drilling sequence followed the manufacturer's recommendations and was performed on the 3D-printed anatomical models. After the implant surgical procedure, CT of the 3D-printed models was performed to compare the actual and simulated implant placements. All procedures in the study were performed in accordance with the Declaration of Helsinki. RESULTS: In total, two implants were placed in a 3D-printed anatomical model of a female patient who required implant rehabilitation for dental agenesis at the second mandibular premolar positions (#35 and #45). Superimposition of the actual and simulated implants showed high concordance. CONCLUSION: AR in education offers crucial surgical information to novice surgeons in real time. However, the benefits of AR in clinical and educational implantology must be demonstrated in further studies involving larger numbers of patients, surgeons, and apprentices.
Affiliation(s)
- Quentin Bochet
- Univ. Lille, CHU Lille, Department of Oral and Maxillofacial Surgery, Lille F-59000, France
- Gwénaël Raoul
- Univ. Lille, CHU Lille, INSERM, Department of Oral and Maxillo-Facial Surgery, U1008 - Advanced Drug Delivery Systems, Lille F-59000, France
- Ludovic Lauwers
- Univ. Lille, CHU Lille, Department of Oral and Maxillofacial Surgery, URL 2694 - METRICS, Lille F-59000, France
- Romain Nicot
- Univ. Lille, CHU Lille, INSERM, Department of Oral and Maxillo-Facial Surgery, U1008 - Advanced Drug Delivery Systems, Lille F-59000, France; CNRS, Centrale Lille, Univ. Lille, UMR 9013 - LaMcube - Laboratoire de Mécanique, Multiphysique, Multiéchelle, Lille F-59000, France
6
Du Y, Liu K, Ju Y, Wang H. Effect of prolonged wear and frame tightness of AR glasses on comfort. Heliyon 2024;10:e35899. PMID: 39220948; PMCID: PMC11365371; DOI: 10.1016/j.heliyon.2024.e35899. Open Access.
Abstract
The present study aimed to investigate the effect of frame tightness on the wearing comfort of augmented reality (AR) glasses during a prolonged video viewing task. A frame prototype of AR glasses with an adjustable frame width was adopted to accommodate variations in head size within the Chinese population, and two hundred participants were recruited to wear the glasses for an hour under five different tightness conditions. Local and overall discomfort ratings were obtained as outcome measures, and the ratings exhibited a significant increase with higher tightness levels. Furthermore, females and older people reported greater discomfort than other participants did, whereas previous spectacle use and body type had nonsignificant effects on wearing comfort. Consideration of approaches to alleviate frame tightness is crucial in the design of AR glasses targeting females and older people. These findings provide valuable ergonomic insights for AR glasses design and offer considerations applicable to the glasses-type wearable device industry.
Affiliation(s)
- Yujia Du
- School of Design, Hunan University, 410012, Changsha, China
- Kexiang Liu
- School of Design, Hunan University, 410012, Changsha, China
- Yuxin Ju
- School of Design, Hunan University, 410012, Changsha, China
- Haining Wang
- School of Design, Hunan University, 410012, Changsha, China
7
Al Hamad KQ, Said KN, Engelschalk M, Matoug-Elwerfelli M, Gupta N, Eric J, Ali SA, Ali K, Daas H, Abu Alhaija ES. Taxonomic discordance of immersive realities in dentistry: A systematic scoping review. J Dent 2024;146:105058. PMID: 38729286; DOI: 10.1016/j.jdent.2024.105058. Open Access.
Abstract
OBJECTIVES: This review aimed to map taxonomy frameworks, descriptions, and applications of immersive technologies in the dental literature. DATA: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for scoping reviews (PRISMA-ScR) guidelines were followed, and the protocol was registered on the Open Science Framework platform (https://doi.org/10.17605/OSF.IO/H6N8M). SOURCES: A systematic search was conducted in the MEDLINE (via PubMed), Scopus, and Cochrane Library databases, complemented by a manual search. STUDY SELECTION: A total of 84 articles were included, 81% published between 2019 and 2023. Most studies were experimental (62%), including education (25%), protocol feasibility (20%), in vitro (11%), and cadaver (6%) studies. Other study types included clinical reports/technique articles (24%), clinical studies (9%), technical notes/tips to the reader (4%), and randomized controlled trials (1%). Three-quarters of the included studies were published in the oral and maxillofacial surgery (38%), dental education (26%), and implant (12%) disciplines. Methods of display included head-mounted display devices (HMD) (55%), see-through screens (32%), 2D screen displays (11%), and projector displays (2%). Descriptions of immersive realities were fragmented and inconsistent, with no clear taxonomy framework for the umbrella and subset terms, including virtual reality (VR), augmented reality (AR), mixed reality (MR), augmented virtuality (AV), extended reality, and X reality. CONCLUSIONS: Immersive reality applications in dentistry are gaining popularity, with a notable surge in publications in the last 5 years. Ambiguities are apparent in the descriptions of immersive realities. A taxonomy framework based on method of display (full or partial) and reality class (VR, AR, or MR) is proposed. CLINICAL SIGNIFICANCE: Understanding the different reality classes can be perplexing because of their blurred boundaries and conceptual overlap. Immersive technologies offer novel educational and clinical applications, and the domain is developing fast. Given the current fragmented and inconsistent terminology, a comprehensive taxonomy framework is necessary.
Affiliation(s)
- Khaled Q Al Hamad
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar.
- Khalid N Said
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Marcus Engelschalk
- Department of Oral and Maxillofacial Surgery, University Medical Center Hamburg-Eppendorf, Germany
- Nidhi Gupta
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Jelena Eric
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Shaymaa A Ali
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Kamran Ali
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Hanin Daas
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
8
Niloy I, Liu RH, Pham NM, Yim CMR. Novel Use of Virtual Reality and Augmented Reality in Temporomandibular Total Joint Replacement Using Stock Prosthesis. J Oral Maxillofac Surg 2024;82:632-640. PMID: 38442876; DOI: 10.1016/j.joms.2024.02.010.
Abstract
This technical innovation demonstrates the use of ImmersiveTouch virtual reality (VR) planning and augmented reality (AR) guidance for total temporomandibular joint replacement (TJR) using the Biomet stock prosthesis in 2 patients with condylar degeneration. TJR VR planning includes condylar resection, prosthesis selection and positioning, and interference identification. AR provides real-time guidance for osteotomies, placement of prostheses and fixation screws, and occlusion verification, along with the flexibility to modify the surgical course. Radiographic analysis demonstrated high correspondence between the preoperative plan and the postoperative result: the average differences in the positioning of the condylar and fossa prostheses were 1.252 ± 0.269 mm and 1.393 ± 0.335 mm, respectively. The main challenges include a steep learning curve, intraoperative technical difficulties, added surgical time, and additional costs. In conclusion, this pilot case report demonstrates the advantages of implementing AR and VR technology in TJRs using stock prostheses. Further clinical trials are needed before this innovation becomes mainstream practice.
Affiliation(s)
- Injamamul Niloy
- Department of Oral & Maxillofacial Surgery, Walter Reed National Military Medical Center, Bethesda, MD
- Robert H Liu
- Department of Oral & Maxillofacial Surgery, Walter Reed National Military Medical Center, Bethesda, MD
- Nikole M Pham
- Department of Oral & Maxillofacial Surgery, Walter Reed National Military Medical Center, Bethesda, MD
- Chang Min Richard Yim
- Department of Oral & Maxillofacial Surgery, Rutgers School of Dental Medicine, Newark, NJ
9
Tel A, Raccampo L, Vinayahalingam S, Troise S, Abbate V, Orabona GD, Sembronio S, Robiony M. Complex Craniofacial Cases through Augmented Reality Guidance in Surgical Oncology: A Technical Report. Diagnostics (Basel) 2024;14:1108. PMID: 38893634; PMCID: PMC11171943; DOI: 10.3390/diagnostics14111108. Open Access.
Abstract
Augmented reality (AR) is a promising technology for enhancing image-guided surgery and represents the perfect bridge between precise virtual planning and computer-aided execution of surgical maneuvers in the operating room. In craniofacial surgical oncology, AR brings a digital, three-dimensional representation of the anatomy to the surgeon's sight and helps identify tumor boundaries and optimal surgical paths. Intraoperatively, real-time AR guidance provides surgeons with accurate spatial information, ensuring precise tumor resection and preservation of critical structures. In this paper, the authors review current evidence on AR applications in craniofacial surgery, focusing on real surgical applications. They compare the existing literature with their own experience of an AR- and navigation-guided craniofacial resection, analyze which technological trajectories will shape the future of AR, and define new perspectives of application for this transformative technology.
Affiliation(s)
- Alessandro Tel
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
- Luca Raccampo
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
- Shankeeth Vinayahalingam
- Department of Oral and Maxillofacial Surgery, Radboud University Medical Center, 6525 GA Nijmegen, The Netherlands
- Stefania Troise
- Neurosciences Reproductive and Odontostomatological Sciences Department, University of Naples “Federico II”, 80131 Naples, Italy
- Vincenzo Abbate
- Neurosciences Reproductive and Odontostomatological Sciences Department, University of Naples “Federico II”, 80131 Naples, Italy
- Giovanni Dell’Aversana Orabona
- Neurosciences Reproductive and Odontostomatological Sciences Department, University of Naples “Federico II”, 80131 Naples, Italy
- Salvatore Sembronio
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
- Massimo Robiony
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
10
Samuel S, Elvezio C, Khan S, Bitzer LZ, Moss-Salentijn L, Feiner S. Visuo-Haptic VR and AR Guidance for Dental Nerve Block Education. IEEE Trans Vis Comput Graph 2024;30:2839-2848. PMID: 38498761; DOI: 10.1109/tvcg.2024.3372125.
Abstract
The inferior alveolar nerve block (IANB) is a dental anesthetic injection that is critical to the performance of many dental procedures. Dental students typically learn to administer an IANB through videos and practice on silicone molds and, in many dental schools, on other students. This causes significant stress for both the students and their early patients. To reduce discomfort and improve clinical outcomes, we created an anatomically informed virtual reality headset-based educational system for the IANB. It combines a layered 3D anatomical model, dynamic visual guidance for syringe position and orientation, and active force feedback to emulate syringe interaction with tissue. A companion mobile augmented reality application allows students to step through a visualization of the procedure on a phone or tablet. We conducted a user study to determine the advantages of preclinical training with our IANB simulator. We found that in comparison to dental students who were exposed only to traditional supplementary study materials, dental students who used our IANB simulator were more confident administering their first clinical injections, had less need for syringe readjustments, and had greater success in numbing patients.
11
Hsu MC, Lin CC, Hsu JT, Yu JH, Huang HL. Effects of an augmented reality aided system on the placement precision of orthodontic miniscrews: A pilot study. J Dent Sci 2024;19:100-108. PMID: 38303815; PMCID: PMC10829748; DOI: 10.1016/j.jds.2023.05.025. Open Access.
Abstract
Background/purpose: Augmented reality (AR) is gaining popularity in medical applications and may aid clinicians in achieving improved clinical outcomes. The purpose of this study was to determine the positional and angular errors of orthodontic miniscrew placement using a self-developed AR-aided system. Materials and methods: Cone beam computed tomography (CBCT) and printed patient models were used in in vitro experiments. The participants were divided into a control group and an AR group, in which traditional orthodontic methods and the AR-aided system were used, respectively. After the information obtained from the CBCT images and the navigation system was combined on the display device, the AR-aided system indicated the planned miniscrew position to guide clinicians during the placement of miniscrews. Both methods were compared for a senior and a junior dentist, and the position and angle of miniscrew placement were statistically analyzed using Wilcoxon's signed-rank and Mann-Whitney U tests. Results: With the AR-aided system, the accuracy of miniscrew placement in the mesiodistal position increased considerably (by 83%) when the procedure was performed by the senior clinician. In addition, the accuracy of the mesiodistal position and of the placement angle increased by approximately 67% and 72%, respectively, when the procedure was performed by the junior clinician. The position error of miniscrew placement was smaller for the junior clinician than for the senior clinician when the AR-aided system was used. Conclusion: The AR-aided system improved the accuracy of miniscrew placement regardless of the clinician's level of experience.
Affiliation(s)
- Meng-Chu Hsu: School of Dentistry, China Medical University, Taichung, Taiwan
- Chih-Chieh Lin: Department of Dentistry, China Medical University Hospital, Taichung, Taiwan
- Jui-Ting Hsu: Department of Biomedical Engineering, China Medical University, Taichung, Taiwan
- Jian-Hong Yu: School of Dentistry, China Medical University, Taichung, Taiwan; Department of Dentistry, China Medical University Hospital, Taichung, Taiwan
- Heng-Li Huang: School of Dentistry, China Medical University, Taichung, Taiwan; Department of Bioinformatics and Medical Engineering, Asia University, Taichung, Taiwan
12
Stucki J, Dastgir R, Baur DA, Quereshy FA. The use of virtual reality and augmented reality in oral and maxillofacial surgery: A narrative review. Oral Surg Oral Med Oral Pathol Oral Radiol 2024; 137:12-18. [PMID: 37723007] [DOI: 10.1016/j.oooo.2023.07.001]
Abstract
OBJECTIVE The purpose of this article is to review the current uses of virtual reality (VR) and augmented reality (AR) in oral and maxillofacial surgery. We discuss the use of VR/AR in educational training and surgical planning, advances in hardware and software, and the implementation of VR/AR in this field. STUDY DESIGN A comprehensive retrospective search of PubMed, Web of Science, Embase, and the Cochrane Library was conducted, identifying 313 English-language articles from the last 10 years. RESULTS A total of 38 articles were selected after a meticulous review of the aims, objectives, and methodology by 2 independent reviewers. CONCLUSIONS VR/AR technology offers significant potential in various aspects, including student education, resident evaluation, surgical planning, and overall surgical implementation. However, its widespread adoption in practice is hindered by factors such as the need for further research, cost concerns, unfamiliarity among current educators, and the necessity for technological improvement. Furthermore, residency programs hold a unique position to influence the future of oral and maxillofacial surgery. As VR/AR has demonstrated substantial benefits in resident education and other applications, residency programs have much to gain by integrating these emerging technologies into their curricula.
Affiliation(s)
- Jacob Stucki: Resident, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
- Ramtin Dastgir: Research Fellow, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
- Dale A Baur: Professor and Chair, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
- Faisal A Quereshy: Professor and Program Director, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
13
Tel A, Zeppieri M, Robiony M, Sembronio S, Vinayahalingam S, Pontoriero A, Pergolizzi S, Angileri FF, Spadea L, Ius T. Exploring Deep Cervical Compartments in Head and Neck Surgical Oncology through Augmented Reality Vision: A Proof of Concept. J Clin Med 2023; 12:6650. [PMID: 37892787] [PMCID: PMC10607265] [DOI: 10.3390/jcm12206650]
Abstract
Virtual surgical planning allows surgeons to meticulously define surgical procedures by creating a digital replica of the patient's anatomy. This enables precise preoperative assessment, facilitating the selection of optimal surgical approaches and the customization of treatment plans. In neck surgery, virtual planning has been significantly underreported compared with craniofacial surgery, owing to a multitude of factors, including the predominance of soft tissues, the unavailability of intraoperative navigation, and the complexity of segmenting such areas. Augmented reality represents the most innovative approach to translating virtual planning to real patients, as it merges the digital world with the surgical field in real time. Surgeons can access patient-specific data directly within their field of view through dedicated visors. In head and neck surgical oncology, augmented reality systems overlay critical anatomical information onto the surgeon's visual field. This aids in locating and preserving vital structures, such as nerves and blood vessels, during complex procedures. In this paper, the authors examine a series of patients undergoing complex neck surgical oncology procedures with prior virtual surgical planning analysis. For each patient, the surgical plan was imported into a HoloLens headset to allow intraoperative augmented reality visualization. The authors discuss the results of this preliminary investigation, outlining a conceptual framework for increasing AR implementation in complex head and neck surgical oncology procedures.
Affiliation(s)
- Alessandro Tel: Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, Piazzale S. Maria della Misericordia 15, 33100 Udine, Italy
- Marco Zeppieri: Department of Ophthalmology, University Hospital of Udine, Piazzale S. Maria della Misericordia 15, 33100 Udine, Italy
- Massimo Robiony: Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, Piazzale S. Maria della Misericordia 15, 33100 Udine, Italy
- Salvatore Sembronio: Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, Piazzale S. Maria della Misericordia 15, 33100 Udine, Italy
- Shankeeth Vinayahalingam: Department of Maxillofacial Surgery, Radboud Medical University, Geert Grooteplein Zuid 10, 6525 GA Nijmegen, The Netherlands
- Antonio Pontoriero: Radiation Oncology Unit, Department of Biomedical, Dental Science and Morphological and Functional Images, University of Messina, 98125 Messina, Italy
- Stefano Pergolizzi: Radiation Oncology Unit, Department of Biomedical, Dental Science and Morphological and Functional Images, University of Messina, 98125 Messina, Italy
- Filippo Flavio Angileri: Neurosurgery Unit, Department of Biomedical, Dental Science and Morphological and Functional Images, University of Messina, 98125 Messina, Italy
- Leopoldo Spadea: Eye Clinic, Policlinico Umberto I, “Sapienza” University of Rome, 00142 Rome, Italy
- Tamara Ius: Neurosurgery Unit, Head-Neck and NeuroScience Department, University Hospital of Udine, Piazzale S. Maria della Misericordia 15, 33100 Udine, Italy
14
Martinho FC, Griffin IL, Price JB, Tordik PA. Augmented Reality and 3-Dimensional Dynamic Navigation System Integration for Osteotomy and Root-end Resection. J Endod 2023; 49:1362-1368. [PMID: 37453501] [DOI: 10.1016/j.joen.2023.07.007]
Abstract
INTRODUCTION Augmented reality (AR) superimposes high-definition computer-generated virtual content onto the existing environment, providing users with an enhanced perception of reality. This study investigates the feasibility of integrating an AR head-mounted device into a 3-dimensional dynamic navigation system (3D-DNS) for osteotomy and root-end resection (RER), and compares the accuracy and efficiency of AR + 3D-DNS with 3D-DNS alone. METHODS Seventy-two tooth roots of 3D-printed surgical jaw models were divided into two groups: AR + 3D-DNS (n = 36) and 3D-DNS (n = 36). Cone-beam computed tomography scans were taken pre- and postoperatively. The osteotomy and RER were virtually planned in X-guide software and delivered under 3D-DNS guidance. For the AR + 3D-DNS group, an AR head-mounted device (Microsoft HoloLens 2) was integrated into the 3D-DNS. The 2D and 3D deviations were calculated, and the osteotomy and RER time and the number of procedural mishaps were recorded. RESULTS Osteotomy and RER were completed in all samples (72/72). AR + 3D-DNS was more accurate than 3D-DNS, showing lower 2D- and 3D-deviation values (P < .05), and was more time-efficient (P < .05). There was no significant difference in the number of mishaps (P > .05). CONCLUSIONS Within the limitations of this in vitro study, the integration of an AR head-mounted device into 3D-DNS is feasible for osteotomy and RER. AR improved the accuracy and time efficiency of 3D-DNS in osteotomy and RER. Head-mounted AR has the potential to be safely and reliably integrated into 3D-DNS for endodontic microsurgery.
Affiliation(s)
- Frederico C Martinho: Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland School of Dentistry, Baltimore, Maryland
- Ina L Griffin: Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland School of Dentistry, Baltimore, Maryland
- Jeffery B Price: Division of Oral Radiology, Department of Oncology and Diagnostic Sciences, University of Maryland School of Dentistry, Baltimore, Maryland
- Patricia A Tordik: Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland School of Dentistry, Baltimore, Maryland
15
Du Y, Liu K, Ju Y, Wang H. A comfort analysis of AR glasses on physical load during long-term wearing. Ergonomics 2023; 66:1325-1339. [PMID: 36377507] [DOI: 10.1080/00140139.2022.2146207]
Abstract
The present study investigated the effect of the physical load of augmented reality (AR) glasses on subjective discomfort during an extended video-viewing task. Ninety-six subjects were recruited and grouped by spectacle use, sex, age, and body mass index (BMI). Four glasses-frame weights were assessed. A novel prototype adopting three design interventions, (1) adjustable frame width, (2) ergonomic temples, and (3) a fixed centre of gravity, was designed, and its effectiveness was evaluated using subjective discomfort ratings (nose, ear, and overall). Subjective discomfort in all regions increased significantly with increasing physical load on the nose. In addition, non-spectacle users, women, older users, and participants in the middle BMI category reported higher discomfort than the other groups. This finding could have important implications for the ergonomic design of AR glasses and could help to identify design considerations relevant to the emerging wearable display industry. Practitioner summary: This research explores the influence of the physical load of augmented reality (AR) glasses. Discomfort increased with added nose load, and non-spectacle users, women, older users, and participants in the middle BMI category were more sensitive to discomfort. The results have important implications for the design of glasses-type wearables.
Affiliation(s)
- Yujia Du: School of Design, Hunan University, Changsha, China
- Kexiang Liu: School of Design, Hunan University, Changsha, China
- Yuxin Ju: School of Design, Hunan University, Changsha, China
- Haining Wang: School of Design, Hunan University, Changsha, China
16
Malenova Y, Ortner F, Liokatis P, Haidari S, Tröltzsch M, Fegg F, Obermeier KT, Hartung JT, Kakoschke TK, Burian E, Otto S, Sabbagh H, Probst FA. Accuracy of maxillary positioning using computer-designed and manufactured occlusal splints or patient-specific implants in orthognathic surgery. Clin Oral Investig 2023; 27:5063-5072. [PMID: 37382718] [PMCID: PMC10492762] [DOI: 10.1007/s00784-023-05125-9]
Abstract
OBJECTIVE To determine the accuracy of maxillary positioning using computer-designed and manufactured occlusal splints or patient-specific implants in orthognathic surgery. MATERIAL AND METHODS A retrospective analysis was conducted of 28 patients who underwent virtually planned orthognathic surgery with maxillary Le Fort I osteotomy using either VSP-generated splints (n = 13) or patient-specific implants (PSI) (n = 15). The accuracy and surgical outcome of both techniques were compared by superimposing the preoperative surgical plan onto postoperative CT scans and measuring translational and rotational deviations for each patient. RESULTS The 3D global geometric deviation between the planned position and the postoperative outcome was 0.60 mm (95% CI 0.46-0.74, range 0.32-1.11 mm) for patients with PSI and 0.86 mm (95% CI 0.44-1.28, range 0.09-2.60 mm) for patients with surgical splints. Postoperative differences in absolute and signed single linear deviations between the planned and postoperative positions were slightly higher for PSI along the x-axis and in pitch, but lower along the y- and z-axes and in yaw and roll, compared with surgical splints. There were no significant differences between the groups in global geometric deviation, absolute and signed linear deviations along the x-, y-, and z-axes, or rotations (yaw, pitch, and roll). CONCLUSIONS For positioning of maxillary segments after Le Fort I osteotomy in orthognathic surgery, patient-specific implants and surgical splints provide equivalently high accuracy. CLINICAL RELEVANCE Patient-specific implants for maxillary positioning and fixation facilitate the concept of splintless orthognathic surgery and can be reliably used in clinical routine.
Affiliation(s)
- Yoana Malenova: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Florian Ortner: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Paris Liokatis: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Selgai Haidari: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Matthias Tröltzsch: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany; Center for Oral, Maxillofacial, and Facial Reconstructive Surgery, Ansbach, Germany
- Florian Fegg: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Katharina T Obermeier: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Jens T Hartung: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Tamara K Kakoschke: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Egon Burian: Institute of Diagnostic and Interventional Radiology, School of Medicine, Technical University of Munich, Munich, Germany
- Sven Otto: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
- Hisham Sabbagh: Department of Orthodontics and Dentofacial Orthopedics, University Hospital LMU Munich, Munich, Germany
- Florian A Probst: Department of Oral and Maxillofacial Surgery and Facial Plastic Surgery, University Hospital LMU Munich, Munich, Germany
17
Benmahdjoub M, Thabit A, van Veelen MLC, Niessen WJ, Wolvius EB, Walsum TV. Evaluation of AR visualization approaches for catheter insertion into the ventricle cavity. IEEE Trans Vis Comput Graph 2023; 29:2434-2445. [PMID: 37027733] [DOI: 10.1109/tvcg.2023.3247042]
Abstract
Augmented reality (AR) has shown potential in computer-aided surgery. It allows for the visualization of hidden anatomical structures and assists in navigating and locating surgical instruments at the surgical site. Various modalities (devices and/or visualizations) have been used in the literature, but few studies have investigated the adequacy or superiority of one modality over another. For instance, the use of optical see-through (OST) HMDs has not always been scientifically justified. Our goal is to compare various visualization modalities for catheter insertion in external ventricular drain and ventricular shunt procedures. We investigate two AR approaches: (1) 2D approaches, consisting of a smartphone and a 2D window visualized through an OST device (Microsoft HoloLens 2); and (2) 3D approaches, consisting of a fully aligned patient model and a model that is adjacent to the patient and rotationally aligned using an OST device. Thirty-two participants joined this study. For each visualization approach, participants were asked to perform five insertions, after which they filled in NASA-TLX and SUS forms. The position and orientation of the needle with respect to the plan were collected during each insertion task. The results show that participants achieved significantly better insertion performance under the 3D visualizations, and the NASA-TLX and SUS forms reflected participants' preference for these approaches over the 2D approaches.
18
Ruggiero F, Cercenelli L, Emiliani N, Badiali G, Bevini M, Zucchelli M, Marcelli E, Tarsitano A. Preclinical Application of Augmented Reality in Pediatric Craniofacial Surgery: An Accuracy Study. J Clin Med 2023; 12:2693. [PMID: 37048777] [PMCID: PMC10095377] [DOI: 10.3390/jcm12072693]
Abstract
Background: Augmented reality (AR) allows the overlapping and integration of virtual information with the real environment: the camera of the AR device reads the object and integrates the virtual data. AR has been widely applied to the medical and surgical sciences in recent years and has the potential to enhance intraoperative navigation. Materials and methods: In this study, the authors assess the accuracy of AR guidance using the commercial HoloLens 2 head-mounted display (HMD) in pediatric craniofacial surgery. Fronto-orbital remodeling (FOR) was selected as the test procedure (specifically, frontal osteotomy and nasal osteotomy were considered). Six people (three surgeons and three engineers) were recruited to perform the osteotomies on a 3D-printed stereolithographic model under AR guidance. By means of calibrated CAD/CAM cutting guides with different grooves, the authors measured the accuracy of the osteotomies performed, testing accuracy levels of ±1.5 mm, ±1 mm, and ±0.5 mm. Results: With the HoloLens 2, the majority of participants were able to trace the trajectories of the frontal and nasal osteotomies with an accuracy of ±1.5 mm. In addition, 80% achieved ±1 mm accuracy for the nasal osteotomy and 52% for the frontal osteotomy, while 61% achieved ±0.5 mm accuracy for the nasal osteotomy and 33% for the frontal osteotomy. Conclusions: Despite this being an in vitro study, the authors report encouraging results for the prospective use of AR on actual patients.
Affiliation(s)
- Federica Ruggiero: Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy; Maxillo-Facial Surgery Unit, AUSL Bologna, 40124 Bologna, Italy
- Laura Cercenelli: Laboratory of Bioengineering (eDIMES Lab), Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Nicolas Emiliani: Laboratory of Bioengineering (eDIMES Lab), Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali: Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy; Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Mirko Bevini: Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy; Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Mino Zucchelli: Pediatric Neurosurgery, IRCCS Istituto delle Scienze Neurologiche di Bologna, Via Altura 3, 40138 Bologna, Italy
- Emanuela Marcelli: Laboratory of Bioengineering (eDIMES Lab), Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano: Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy; Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
19
Jiang J, Zhang J, Sun J, Wu D, Xu S. User's image perception improved strategy and application of augmented reality systems in smart medical care: A review. Int J Med Robot 2023; 19:e2497. [PMID: 36629798] [DOI: 10.1002/rcs.2497]
Abstract
BACKGROUND Augmented reality (AR) is a human-computer interaction technology that combines virtual reality, computer vision, and computer networks. With the rapid advancement of the medical field towards intelligence and data visualisation, AR systems are becoming increasingly popular because, in practical applications, they can provide doctors with sufficiently clear medical images and accurate image navigation. However, different display types of AR systems have different effects on doctors' perception of the image after virtual-real fusion in actual medical applications. If doctors cannot correctly perceive the image, they may be unable to match the virtual information with the real world correctly, which has a significant impact on their ability to recognise complex structures. METHODS This paper uses CiteSpace, a literature analysis tool, to visualise and analyse research hotspots in the use of AR systems in the medical field. RESULTS A visual analysis of 1163 articles retrieved from the Web of Science Core Collection database reveals that display technology and visualisation technology are currently the key research directions for AR systems. CONCLUSION This paper categorises AR systems by their display principles, reviews current image perception optimisation schemes for each type of system, and compares AR systems with different display types based on their practical applications in smart medical care, so that doctors can select the appropriate display type for a given application scenario. Finally, the future development of AR display technology is anticipated so that AR can be applied more effectively in smart medical care. The advancement of display technology is critical for the use of AR systems in the medical field, and the advantages and disadvantages of the various display types should be weighed in each application scenario to select the best AR system.
Affiliation(s)
- Jingang Jiang: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China; Robotics & Its Engineering Research Center, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jiawei Zhang: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jianpeng Sun: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Dianhao Wu: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Shuainan Xu: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
20
Ceccariglia F, Cercenelli L, Badiali G, Marcelli E, Tarsitano A. Application of Augmented Reality to Maxillary Resections: A Three-Dimensional Approach to Maxillofacial Oncologic Surgery. J Pers Med 2022; 12:2047. [PMID: 36556268] [PMCID: PMC9785494] [DOI: 10.3390/jpm12122047]
Abstract
Although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only now have technological and scientific advances made them suitable for revolutionizing clinical care and medical settings through the provision of advanced features and improved healthcare services. Over the past fifteen years, tools and applications using augmented reality (AR) have been designed and tested in various surgical and medical disciplines, including maxillofacial surgery. The purpose of this paper is to show how a marker-less AR guidance system using the Microsoft® HoloLens 2 can be applied in mandibular and maxillary resection surgery to guide maxillary osteotomies. We describe three mandibular and maxillary oncologic resections performed during 2021 with AR support. In these three patients, we applied a marker-less tracking method based on recognition of the patient's facial profile. The surgeon, using HoloLens 2 smart glasses, could see the virtual surgical plan superimposed on the patient's anatomy. We show that performing osteotomies under AR guidance is feasible and viable, as demonstrated by comparison with osteotomies performed using CAD-CAM cutting guides. The technology has both advantages and disadvantages, and further research is needed to improve the stability and robustness of the marker-less tracking method applied to patient face recognition.
Affiliation(s)
- Francesco Ceccariglia (corresponding author; Tel.: +39-051-2144197): Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy; Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Laura Cercenelli: eDimes Lab (Laboratory of Bioengineering), Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali: Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy; Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Emanuela Marcelli: eDimes Lab (Laboratory of Bioengineering), Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano: Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy; Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
21
Faus-Matoses V, Faus-Llácer V, Moradian T, Riad Deglow E, Ruiz-Sánchez C, Hamoud-Kharrat N, Zubizarreta-Macho Á, Faus-Matoses I. Accuracy of Endodontic Access Cavities Performed Using an Augmented Reality Appliance: An In Vitro Study. Int J Environ Res Public Health 2022; 19:11167. [PMID: 36141439] [PMCID: PMC9517686] [DOI: 10.3390/ijerph191811167]
Abstract
INTRODUCTION The purpose of this study was to compare the accuracy of endodontic access cavities created using an augmented reality appliance with that of cavities prepared using the conventional technique. MATERIALS AND METHODS Sixty single-rooted anterior teeth were selected and randomly divided between two study groups: Group A, endodontic access cavities created using an augmented reality appliance as a guide (n = 30) (AR); and Group B, endodontic access cavities performed with the manual (freehand) technique (n = 30) (MN). For the AR group, 3D implant planning software was used to plan the endodontic access cavities, with a cone-beam computed tomography (CBCT) scan and a 3D intraoral surface scan taken preoperatively and subsequently transferred to the augmented reality device. A second CBCT scan was taken after the endodontic access cavities were performed, to compare the planned and performed access for accuracy. Therapeutic planning software and Student's t-test were used to analyze the cavities at the apical, coronal, and angular levels. The repeatability and reproducibility of the digital measurement technique were analyzed using Gage R&R statistical analysis. RESULTS The paired t-test found statistically significant differences between the study groups at the coronal (p = 0.0029) and apical (p = 0.0063) levels; no statistically significant difference was found between the AR and MN groups at the angular level (p = 0.6596). CONCLUSIONS Augmented reality devices enable safer and more accurate endodontic access cavities compared with the conventional freehand technique.
Affiliation(s)
- Vicente Faus-Matoses: Department of Stomatology, Faculty of Medicine and Dentistry, University of Valencia, 46010 Valencia, Spain
- Vicente Faus-Llácer: Department of Stomatology, Faculty of Medicine and Dentistry, University of Valencia, 46010 Valencia, Spain
- Tanaz Moradian: Department of Stomatology, Faculty of Medicine and Dentistry, University of Valencia, 46010 Valencia, Spain
- Elena Riad Deglow: Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, 28691 Madrid, Spain
- Celia Ruiz-Sánchez: Department of Stomatology, Faculty of Medicine and Dentistry, University of Valencia, 46010 Valencia, Spain
- Nirmine Hamoud-Kharrat: Department of Stomatology, Faculty of Medicine and Dentistry, University of Valencia, 46010 Valencia, Spain
- Álvaro Zubizarreta-Macho: Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, 28691 Madrid, Spain; Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Ignacio Faus-Matoses: Department of Stomatology, Faculty of Medicine and Dentistry, University of Valencia, 46010 Valencia, Spain
22
Han B, Li R, Huang T, Ma L, Liang H, Zhang X, Liao H. An accurate 3D augmented reality navigation system with enhanced autostereoscopic display for oral and maxillofacial surgery. Int J Med Robot 2022; 18:e2404. [DOI: 10.1002/rcs.2404] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2021] [Revised: 03/03/2022] [Accepted: 04/05/2022] [Indexed: 11/10/2022]
Affiliation(s)
- Boxuan Han: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Ruiyang Li: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Tianqi Huang: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Longfei Ma: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Hanying Liang: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Xinran Zhang: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Hongen Liao: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
23
Shi J, Liu S, Zhu Z, Deng Z, Bian G, He B. Augmented reality for oral and maxillofacial surgery: The feasibility of a marker‐free registration method. Int J Med Robot 2022; 18:e2401. [DOI: 10.1002/rcs.2401] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2021] [Revised: 02/28/2022] [Accepted: 03/31/2022] [Indexed: 11/07/2022]
Affiliation(s)
- Jiafeng Shi: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
- Shaofeng Liu: Department of Oral and Maxillofacial Surgery, The First Affiliated Hospital, Laboratory of Facial Plastic and Reconstruction, Fujian Medical University, Fuzhou, China
- Zhaoju Zhu: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China; Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Zhen Deng: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
- Guibin Bian: Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Bingwei He: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
24
Pu JJ, Hakim SG, Melville JC, Su YX. Current Trends in the Reconstruction and Rehabilitation of Jaw following Ablative Surgery. Cancers (Basel) 2022; 14:cancers14143308. [PMID: 35884369 PMCID: PMC9320033 DOI: 10.3390/cancers14143308] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2022] [Revised: 06/18/2022] [Accepted: 06/21/2022] [Indexed: 12/04/2022] Open
Abstract
Simple Summary The maxilla and mandible provide skeletal support for the middle and lower thirds of the face, allowing for the normal functioning of breathing, chewing, swallowing, and speech. In the past, ablative surgery of the jaws often led to serious disfigurement and disruption of form and function. However, with recent strides made in computer-assisted surgery and patient-specific implants, individual functional reconstruction of the jaw is evolving rapidly, and the prompt rehabilitation of both masticatory function and aesthetics after jaw resection has been made possible. In the present review, the recent advancements in jaw reconstruction technology and future perspectives will be discussed. Abstract The reconstruction and rehabilitation of jaws following ablative surgery have been transformed in recent years by the development of computer-assisted surgery and virtual surgical planning. In this narrative literature review, we aim to discuss the current state of the art in jaw reconstruction and to preview potential future developments. The application of patient-specific implants and the “jaw-in-a-day technique” have made the fast restoration of jaw function and aesthetics possible. The improved efficiency of primary reconstructive surgery allows for the rehabilitation of neurosensory function following ablative surgery. Currently, a great deal of research has been conducted on augmented/mixed reality, artificial intelligence, virtual surgical planning for soft tissue reconstruction, and the rehabilitation of the stomatognathic system. This will lead to an even more exciting future for the functional reconstruction and rehabilitation of the jaw following ablative surgery.
Affiliation(s)
- Jane J. Pu: Division of Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong
- Samer G. Hakim: Department of Oral and Maxillofacial Surgery, University Hospital of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
- James C. Melville: Department of Oral and Maxillofacial Surgery, University of Texas Health Science Center at Houston, Houston, TX 77030, USA
- Yu-Xiong Su: Division of Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong (corresponding author)
25
Augmented Reality in Orthopedic Surgery and Its Application in Total Joint Arthroplasty: A Systematic Review. APPLIED SCIENCES-BASEL 2022. [DOI: 10.3390/app12105278] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/11/2022]
Abstract
The development of augmented reality (AR) and its application in total joint arthroplasty (TJA) aims at improving the accuracy and precision of implant component positioning, hopefully leading to improved outcomes and survivorship. However, this field is far from being thoroughly explored. We therefore performed a systematic review of the literature in order to examine the application, the results, and the different AR systems available in TJA. A systematic review of the literature according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines was performed. A comprehensive search of PubMed, MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews was conducted for English articles on the application of augmented reality in total joint arthroplasty using various combinations of keywords from the inception of the database to 31 March 2022. Accuracy was defined as the mean error from the targeted positioning angle and compared as mean values and standard deviations. In all, 14 articles met the inclusion criteria. Among them, four studies reported on the application of AR in total knee arthroplasty, six studies on total hip arthroplasty, three studies on reverse shoulder arthroplasty, and one study on total elbow arthroplasty. Nine of the included studies were preclinical (sawbones or cadaveric), while five of them reported results of AR's clinical application. The main common feature was the high accuracy and precision when implant positioning was compared with preoperative targeted angles, with errors ≤2 mm and/or ≤2°. Despite the promising results in terms of increased accuracy and precision, this technology is far from being widely adopted in daily clinical practice. However, the recent exponential growth in machine learning techniques and technologies may eventually lead to the resolution of the ongoing limitations, including depth perception and high system complexity, favorably encouraging the widespread usage of AR systems.
26
Arpaia P, De Benedetto E, De Paolis L, D’Errico G, Donato N, Duraccio L. Performance and Usability Evaluation of an Extended Reality Platform to Monitor Patient’s Health during Surgical Procedures. SENSORS 2022; 22:s22103908. [PMID: 35632317 PMCID: PMC9143436 DOI: 10.3390/s22103908] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/26/2022] [Revised: 05/14/2022] [Accepted: 05/18/2022] [Indexed: 02/01/2023]
Abstract
An extended-reality (XR) platform for real-time monitoring of patients’ health during surgical procedures is proposed. The proposed system provides real-time access to a comprehensive set of patient information, which is made promptly available to the surgical team in the operating room (OR). In particular, the XR platform supports the medical staff by automatically acquiring the patient’s vitals from the operating room instrumentation and displaying them in real-time directly on an XR headset. Furthermore, information regarding the patient’s clinical record is also shown upon request. Finally, the XR-based monitoring platform also allows displaying in XR the video stream coming directly from the endoscope. The innovative aspect of the proposed XR-based monitoring platform lies in the comprehensiveness of the available information, in its modularity and flexibility (in terms of adaptation to different sources of data), ease of use, and, most importantly, in reliable communication, which are critical requirements for the healthcare field. To validate the proposed system, experimental tests were conducted using instrumentation typically available in the operating room (i.e., a respiratory ventilator, a patient monitor for intensive care, and an endoscope). The overall results showed (i) an accuracy of the data communication greater than 99%, along with (ii) an average response time below ms, and (iii) satisfactory feedback from the SUS questionnaires filled out by the physicians after intensive use.
Affiliation(s)
- Pasquale Arpaia: Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, 80138 Naples, Italy; Augmented Reality for Health Monitoring Laboratory (ARHeMLAB), Department of Information Technology and Electrical Engineering, University of Naples Federico II, 80138 Naples, Italy (corresponding author)
- Egidio De Benedetto: Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, 80138 Naples, Italy; Augmented Reality for Health Monitoring Laboratory (ARHeMLAB), Department of Information Technology and Electrical Engineering, University of Naples Federico II, 80138 Naples, Italy
- Lucio De Paolis: Department of Engineering for Innovation, University of Salento, 73100 Lecce, Italy
- Giovanni D’Errico: Department of Applied Science and Technology, Polytechnic University of Turin, 10129 Turin, Italy
- Nicola Donato: Department of Engineering, University of Messina, 98122 Messina, Italy
- Luigi Duraccio: Department of Electronics and Telecommunications, Polytechnic University of Turin, 10129 Turin, Italy
27
Fahim S, Maqsood A, Das G, Ahmed N, Saquib S, Lal A, Khan AAG, Alam MK. Augmented Reality and Virtual Reality in Dentistry: Highlights from the Current Research. APPLIED SCIENCES 2022; 12:3719. [DOI: 10.3390/app12083719] [Citation(s) in RCA: 24] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
Many modern advancements have taken place in dentistry that have exponentially impacted the progress and practice of dentistry. Augmented reality (AR) and virtual reality (VR) are becoming the trend in the practice of modern dentistry because of their impact on changing the patient’s experience. The use of AR and VR has been beneficial in different fields of science, but their use in dentistry is yet to be thoroughly explored, and conventional ways of dentistry are still practiced at large. Over the past few years, dental treatment has been significantly reshaped by technological advancements. In dentistry, the use of AR and VR systems has not become widespread, but their different uses should be explored. Therefore, the aim of this review was to provide an update on the contemporary knowledge, to report on the ongoing progress of AR and VR in various fields of dental medicine and education, and to identify the further research required to achieve their translation into clinical practice. A literature search was performed in PubMed, Scopus, Web of Science, and Google Scholar for articles in peer-reviewed English-language journals published in the last 10 years up to 31 March 2021, with the help of specific keywords related to AR and VR in various dental fields. Of the total of 101 articles found in the literature search, 68 abstracts were considered suitable and further evaluated, and consequently, 33 full-texts were identified. Finally, a total of 13 full-texts were excluded from further analysis, resulting in 20 articles for final inclusion. The overall number of studies included in this review was low; thus, at this point in time, scientifically-proven recommendations could not be stated. AR and VR have been found to be beneficial tools for clinical practice and for enhancing the learning experiences of students during their pre-clinical education and training sessions. Clinicians can use VR technology to show their patients the expected outcomes before they undergo dental procedures. Additionally, AR and VR can be implemented to overcome dental phobia, which is commonly experienced by pediatric patients. Future studies should focus on forming technological standards with high-quality data and developing scientifically-proven AR/VR gadgets for dental practice.
Affiliation(s)
- Sidra Fahim: Department of Oral Medicine, Altamash Institute of Dental Medicine, Karachi 75500, Pakistan
- Afsheen Maqsood: Department of Oral Pathology, Bahria University Dental College, Karachi 07557, Pakistan
- Gotam Das: Department of Prosthodontics, College of Dentistry, King Khalid University, Abha 61341, Saudi Arabia
- Naseer Ahmed: Department of Prosthodontics, Altamash Institute of Dental Medicine, Karachi 75500, Pakistan
- Shahabe Saquib: Department of Periodontics and Community Dental Sciences, College of Dentistry, King Khalid University, Abha 61341, Saudi Arabia
- Abhishek Lal: Department of Prosthodontics, Altamash Institute of Dental Medicine, Karachi 75500, Pakistan
- Abdul Ahad Ghaffar Khan: Department of Oral and Maxillofacial Surgery, College of Dentistry, King Khalid University, Abha 61341, Saudi Arabia
- Mohammad Khursheed Alam: Department of Preventive Dentistry, College of Dentistry, Jouf University, Sakaka 72345, Saudi Arabia; Center for Transdisciplinary Research (CFTR), Saveetha Dental College, Institute of Medical and Technical Sciences, Saveetha University, Chennai 600077, India; Department of Public Health, Faculty of Allied Health Sciences, Daffodil International University, Dhaka 1341, Bangladesh
28
Ferrari V, Cattari N, Fontana U, Cutolo F. Parallax Free Registration for Augmented Reality Optical See-Through Displays in the Peripersonal Space. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:1608-1618. [PMID: 32881688 DOI: 10.1109/tvcg.2020.3021534] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
Egocentric augmented reality (AR) interfaces are quickly becoming a key asset for assisting high precision activities in the peripersonal space in several application fields. In these applications, accurate and robust registration of computer-generated information to the real scene is hard to achieve with traditional Optical See-Through (OST) displays given that it relies on the accurate calibration of the combined eye-display projection model. The calibration is required to efficiently estimate the projection parameters of the pinhole model that encapsulate the optical features of the display and whose values vary according to the position of the user's eye. In this article, we describe an approach that prevents any parallax-related AR misregistration at a pre-defined working distance in OST displays with infinity focus; our strategy relies on the use of a magnifier placed in front of the OST display, and features a proper parameterization of the virtual rendering camera achieved through a dedicated calibration procedure that accounts for the contribution of the magnifier. We model the registration error due to the viewpoint parallax outside the ideal working distance. Finally, we validate our strategy on an OST display, and we show that sub-millimetric registration accuracy can be achieved for working distances of ±100 mm around the focal length of the magnifier.
29
Architecture of a Hybrid Video/Optical See-through Head-Mounted Display-Based Augmented Reality Surgical Navigation Platform. INFORMATION 2022. [DOI: 10.3390/info13020081] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/11/2022] Open
Abstract
In the context of image-guided surgery, augmented reality (AR) represents a ground-breaking and enticing improvement, especially when paired with wearability in the case of open surgery. Commercially available AR head-mounted displays (HMDs), designed for general purposes, are increasingly used outside their indications to develop surgical guidance applications with the ambition to demonstrate the potential of AR in surgery. The applications proposed in the literature underline the demand for AR guidance in the operating room together with the limitations that hinder commercial HMDs from being the answer to such a need. The medical domain demands specifically developed devices that address, together with ergonomics, the achievement of surgical accuracy objectives and compliance with medical device regulations. In the framework of an EU Horizon2020 project, a hybrid video and optical see-through augmented reality headset paired with a software architecture, both specifically designed to be seamlessly integrated into the surgical workflow, has been developed. In this paper, the overall architecture of the system is described. The developed AR HMD surgical navigation platform was positively tested on seven patients to aid the surgeon while performing Le Fort 1 osteotomy in cranio-maxillofacial surgery, demonstrating the value of the hybrid approach and the safety and usability of the navigation platform.
30
Jones JP, Amarista FJ, Jeske NA, Szalay D, Ellis E. Comparison of the Accuracy of Maxillary Positioning with Interim Splints versus Patient Specific Guides and Plates in Executing a Virtual Bimaxillary Surgical Plan. J Oral Maxillofac Surg 2022; 80:827-837. [DOI: 10.1016/j.joms.2022.01.006] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2021] [Revised: 12/22/2021] [Accepted: 01/07/2022] [Indexed: 11/25/2022]
31
Rahimov C, Aliyev D, Rahimov N, Farzaliyev I. Mixed reality in the reconstruction of orbital floor: An experimental and clinical evaluative study. Ann Maxillofac Surg 2022; 12:46-53. [PMID: 36199454 PMCID: PMC9527844 DOI: 10.4103/ams.ams_141_21] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2021] [Revised: 06/22/2022] [Accepted: 07/21/2022] [Indexed: 11/04/2022] Open
32
Kassutto SM, Baston C, Clancy C. Virtual, Augmented, and Alternate Reality in Medical Education: Socially Distanced but Fully Immersed. ATS Sch 2021; 2:651-664. [PMID: 35079743 PMCID: PMC8751670 DOI: 10.34197/ats-scholar.2021-0002re] [Citation(s) in RCA: 28] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/06/2021] [Accepted: 08/10/2021] [Indexed: 12/14/2022] Open
Abstract
BACKGROUND Advancements in technology continue to transform the landscape of medical education. The need for technology-enhanced distance learning has been further accelerated by the coronavirus disease (COVID-19) pandemic. The relatively recent emergence of virtual reality (VR), augmented reality (AR), and alternate reality has expanded the possible applications of simulation-based education (SBE) outside of the traditional simulation laboratory, making SBE accessible asynchronously and in geographically diverse locations. OBJECTIVE In this review, we will explore the evidence base for use of emerging technologies in SBE as well as the strengths and limitations of each modality in a variety of settings. METHODS PubMed was searched for peer-reviewed articles published between 1995 and 2021 that focused on VR in medical education. The search terms included medical education, VR, simulation, AR, and alternate reality. We also searched reference lists from selected articles to identify additional relevant studies. RESULTS VR simulations have been used successfully in resuscitation, communication, and bronchoscopy training. In contrast, AR has demonstrated utility in teaching anatomical correlates with the use of diagnostic imaging, such as point-of-care ultrasound. Alternate reality has been used as a tool for developing clinical reasoning skills, longitudinal patient panel management, and crisis resource management via multiplayer platforms. CONCLUSION Although each of these modalities has a variety of educational applications in health profession education, there are benefits and limitations to each that are important to recognize prior to the design and implementation of educational content, including differences in equipment requirements, cost, and scalability.
Affiliation(s)
- Stacey M Kassutto: Department of Medicine, Division of Pulmonary, Allergy and Critical Care, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
- Cameron Baston: Department of Medicine, Division of Pulmonary, Allergy and Critical Care, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
- Caitlin Clancy: Department of Medicine, Division of Pulmonary, Allergy and Critical Care, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
33
Benmahdjoub M, Niessen WJ, Wolvius EB, van Walsum T. Virtual extensions improve perception-based instrument alignment using optical see-through devices. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2021; 27:4332-4341. [PMID: 34449385 DOI: 10.1109/tvcg.2021.3106506] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/16/2023]
Abstract
Instrument alignment is a common task in various surgical interventions using navigation. The goal of the task is to position and orient an instrument as it has been planned preoperatively. To this end, surgeons rely on patient-specific data visualized on screens alongside preplanned trajectories. The purpose of this manuscript is to investigate the effect of instrument visualization/non-visualization on alignment tasks, and to compare it with a virtual-extensions approach, which augments the realistic representation of the instrument with simple 3D objects. Eighteen volunteers performed six alignment tasks under each of the following conditions: no visualization of the instrument; realistic visualization of the instrument; and realistic visualization extended with virtual elements (virtual extensions). The first condition represents an egocentric-based alignment, while the two other conditions additionally make use of exocentric depth estimation to perform the alignment. The device used was a see-through device (Microsoft HoloLens 2). The positions of the head and the instrument were acquired during the experiment. Additionally, the users were asked to fill NASA-TLX and SUS forms for each condition. The results show that instrument visualization is essential for a good alignment using see-through devices. Moreover, virtual extensions helped achieve the best performance compared to the other conditions, with medians of 2 mm and 2° positional and angular error, respectively. Furthermore, the virtual extensions decreased the average head velocity while similarly reducing the frustration levels. Therefore, making use of virtual extensions could facilitate alignment tasks in augmented and virtual reality (AR/VR) environments, specifically in AR navigated surgical procedures when using optical see-through devices.
34
Yi Z, Deng Z, Liu Y, He B, Huang S, Hong W, Shi J, Chen Z. Marker-less augmented reality based on monocular vision for falx meningioma localization. Int J Med Robot 2021; 18:e2341. [PMID: 34647683 DOI: 10.1002/rcs.2341] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/26/2021] [Revised: 10/12/2021] [Accepted: 10/12/2021] [Indexed: 11/09/2022]
Abstract
BACKGROUND The existing augmented reality (AR) based neuronavigation systems typically require markers and additional tracking devices for model registration, which causes excessive preparatory steps. METHODS For fast and accurate intraoperative navigation, this work proposes a marker-less AR system that tracks the head features with a monocular camera. After the semi-automatic initialization process, the feature points between the captured image and the pre-loaded keyframes are matched to obtain correspondences. The camera pose is estimated by solving the Perspective-n-Point problem. RESULTS The localization error of AR visualization on the scalp and falx meningioma is 0.417 ± 0.057 and 1.413 ± 0.282 mm, respectively. The maximum localization error is less than 2 mm. The AR system is robust to occlusions and changes in viewpoint and scale. CONCLUSIONS We demonstrate that the developed system can successfully display the augmented falx meningioma with sufficient accuracy and provide guidance for neurosurgeons to locate the tumour in the brain.
Affiliation(s)
- Zongchao Yi: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
- Zhen Deng: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
- Yuqing Liu: Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China; Department of Neurosurgery, Fujian Provincial Hospital, Fuzhou, China
- Bingwei He: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
- Shengyue Huang: Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China; Department of Neurosurgery, Fujian Provincial Hospital, Fuzhou, China
- Wenyao Hong: Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China; Department of Neurosurgery, Fujian Provincial Hospital, Fuzhou, China
- Jiafeng Shi: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China
- Zhongyi Chen: Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, China; Department of Neurosurgery, Fujian Provincial Hospital, Fuzhou, China
35
Mehrotra D, Markus A. Emerging simulation technologies in global craniofacial surgical training. J Oral Biol Craniofac Res 2021; 11:486-499. [PMID: 34345584 PMCID: PMC8319526 DOI: 10.1016/j.jobcr.2021.06.002] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2021] [Accepted: 06/22/2021] [Indexed: 12/14/2022] Open
Abstract
The last few decades have seen an exponential growth in the development and adoption of novel technologies in the medical and surgical training of residents globally. Simulation is an active and innovative teaching method, and can be achieved via physical or digital models. Simulation allows learners to practice repeatedly without the risk of causing any error in an actual patient, and to enhance their surgical skills and efficiency. Simulation may also allow the clinical instructor to objectively test the ability of the trainee to carry out the clinical procedure competently and independently prior to the trainee's completion of the program. This review aims to explore the role of emerging simulation technologies globally in the craniofacial training of students and residents in improving their surgical knowledge and skills. These technologies include 3D printed biomodels, virtual and augmented reality, the use of Google Glass, HoloLens, and haptic feedback, surgical boot camps, serious games and escape games, and how they can be implemented in low- and middle-income countries. Craniofacial surgical training methods will probably go through a sea change in the coming years, with the integration of these new technologies in the surgical curriculum, allowing learning in a safe environment with a virtual patient, through repeated exercise. In the future, simulation may also be used as an assessment tool for performing a specific procedure, without putting the actual patient at risk. Although these new technologies are being enthusiastically welcomed by young surgeons, they should only be used as an addition to the actual curriculum and not as a replacement for conventional tools, as the mentor-mentee relationship can never be replaced by any technology.
Affiliation(s)
- Divya Mehrotra: Department of Oral and Maxillofacial Surgery, KGMU, Lucknow, India
- A.F. Markus: Emeritus Consultant Maxillofacial Surgeon, Poole Hospital, University of Bournemouth; University of Duisburg-Essen; Trinity College, Dublin, Ireland
36
Kurt Y, Öztürk H. The effect of mobile augmented reality application developed for injections on the knowledge and skill levels of nursing students: An experimental controlled study. NURSE EDUCATION TODAY 2021; 103:104955. [PMID: 34051543 DOI: 10.1016/j.nedt.2021.104955] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/11/2020] [Revised: 04/17/2021] [Accepted: 05/04/2021] [Indexed: 05/29/2023]
Abstract
OBJECTIVE To evaluate the effect of Mobile Augmented Reality (MAR) educational materials on the knowledge and skill levels of nursing students in injection practices. METHOD This controlled experimental study was carried out with 122 first-year nursing students, 64 in the experimental group and 58 in the control group. Data were collected between March and April 2018 using an information form, a pre-test, a post-test, a persistence test, and injection evaluation checklists. The experimental group used MAR applications and the control group used traditional teaching methods to learn injection practices. RESULTS There was no statistically significant difference between the pre-test scores of the experimental and control groups, which measured knowledge of subcutaneous, intramuscular, and intravenous injections before the lesson (p > 0.05). After the lesson, the post-test and persistence test scores of the experimental group were significantly higher than those of the control group (p < 0.05). In the first and second (persistence) skill evaluations, the injection skill scores of the experimental group were likewise significantly higher than those of the control group (p < 0.05). In addition, the students in the experimental group stated that MAR applications increased their motivation and self-confidence and reduced their concerns. CONCLUSION MAR applications had a positive effect on the knowledge and skill levels of nursing students regarding injection practices and promoted retention of the learned knowledge and skills.
Affiliation(s)
- Yeter Kurt, Faculty of Health Sciences, Karadeniz Technical University, Nursing Department, Trabzon, Turkey
- Havva Öztürk, Faculty of Health Sciences, Karadeniz Technical University, Nursing Department, Trabzon, Turkey
37
The utility of augmented reality in lateral skull base surgery: A preliminary report. Am J Otolaryngol 2021; 42:102942. [PMID: 33556837 DOI: 10.1016/j.amjoto.2021.102942]
Abstract
OBJECTIVE To discuss the utility of augmented reality in lateral skull base surgery. PATIENTS Those undergoing lateral skull base surgery at our institution. INTERVENTION(S) Cerebellopontine angle tumor resection using an augmented reality interface. MAIN OUTCOME MEASURE(S) Ease of use, utility, and future directions of augmented reality in lateral skull base surgery. RESULTS Anecdotally, we have found an augmented reality interface helpful in simulating cerebellopontine angle tumor resection, as well as in planning the incision and craniotomy. CONCLUSIONS Augmented reality has the potential to be a useful adjunct in lateral skull base surgery, but more study with larger series is needed.
38
Kovoor JG, Gupta AK, Gladman MA. Validity and effectiveness of augmented reality in surgical education: A systematic review. Surgery 2021; 170:88-98. [PMID: 33744003 DOI: 10.1016/j.surg.2021.01.051]
Abstract
BACKGROUND Current challenges in surgical training have led to the investigation of augmented reality as a potential method of supplementary education. However, its value for this purpose remains uncertain. The aim of this study was to perform a systematic review of the published literature to evaluate the validity and effectiveness of augmented reality in surgical education, and to compare it with other simulation modalities. METHODS Electronic literature searches were performed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines. Two authors independently extracted pertinent data and assessed study quality. The primary outcome measures of interest were the validity and effectiveness of augmented reality as an educational tool. RESULTS Of 6,500 articles, 24 studies met eligibility criteria for inclusion, of which 2 were randomized. Ten studies investigated validity: 7 established both face and content validity, and 1 more established content validity alone. Construct validity was demonstrated in 9 of 11 studies. Of the 11 studies that examined the effectiveness of augmented reality in skills acquisition, 9 demonstrated enhanced learning. Of the 5 studies in which the effectiveness of augmented reality as an educational tool was compared with other modes of simulation, augmented reality was found to be superior in 2 and equivalent in the others. CONCLUSION Overall, the majority of studies, including 2 high-quality randomized controlled trials, demonstrated the validity and effectiveness of augmented reality in surgical education. However, the quality of published studies was poor, with marked heterogeneity. Although these results are encouraging, additional high-quality studies, preferably in the real-life environment, are required before the widespread implementation of augmented reality within surgical curricula can be recommended.
Affiliation(s)
- Joshua G Kovoor, Adelaide Medical School, Faculty of Health & Medical Sciences, The University of Adelaide, South Australia
- Aashray K Gupta, Adelaide Medical School, Faculty of Health & Medical Sciences, The University of Adelaide, South Australia
- Marc A Gladman, Adelaide Medical School, Faculty of Health & Medical Sciences, The University of Adelaide, South Australia
39
Benmahdjoub M, van Walsum T, van Twisk P, Wolvius EB. Augmented reality in craniomaxillofacial surgery: added value and proposed recommendations through a systematic review of the literature. Int J Oral Maxillofac Surg 2021; 50:969-978. [PMID: 33339731 DOI: 10.1016/j.ijom.2020.11.015]
Abstract
This systematic review provides an overview of augmented reality (AR) and its benefits in craniomaxillofacial surgery in an attempt to answer the question: Is AR beneficial for craniomaxillofacial surgery? The review includes a description of the studies conducted, the systems used, and their technical characteristics. The search was performed in four databases: PubMed, Cochrane Library, Embase, and Web of Science. All journal articles published during the past 11 years related to AR, mixed reality, craniomaxillofacial, and surgery were considered. From a total of 7067 articles identified using AR- and surgery-related keywords, 39 articles were finally selected. From these articles, a classification of study types, surgery types, devices used, metrics reported, and benefits was compiled. The findings of this review indicate that AR could provide various benefits, addressing the challenges of conventional navigation systems such as hand-eye coordination and depth perception. However, three main concerns were raised while performing this study: (1) it is complicated to aggregate the metrics reported in the articles, (2) it is difficult to obtain statistical value from the current studies, and (3) user evaluation studies are lacking. The article concludes with recommendations for future studies addressing the latter points.
Affiliation(s)
- M Benmahdjoub, Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands; Biomedical Imaging Group Rotterdam, Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- T van Walsum, Biomedical Imaging Group Rotterdam, Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- P van Twisk, Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- E B Wolvius, Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
40
Lee SH, Quan YH, Kim MS, Kwon KH, Choi BH, Kim HK, Kim BM. Design and Testing of Augmented Reality-Based Fluorescence Imaging Goggle for Intraoperative Imaging-Guided Surgery. Diagnostics (Basel) 2021; 11:927. [PMID: 34064205 PMCID: PMC8224390 DOI: 10.3390/diagnostics11060927]
Abstract
The different pathways between the position of a near-infrared camera and the user's eye limit the use of existing near-infrared fluorescence imaging systems for tumor margin assessments. By utilizing an optical system that precisely matches the near-infrared fluorescence image and the optical path of visible light, we developed an augmented reality (AR)-based fluorescence imaging system that provides users with a fluorescence image matching the real field, without requiring any additional algorithms. Commercial smart glasses, dichroic beam splitters, mirrors, and custom near-infrared cameras were employed to develop the proposed system, and each mount was designed and utilized. After its performance was assessed in the laboratory, preclinical experiments involving tumor detection and lung lobectomy in mice and rabbits using indocyanine green (ICG) were conducted. The results showed that the proposed system provided a stable fluorescence image that matched the actual site. In addition, preclinical experiments confirmed that the proposed system could be used to detect tumors using ICG and evaluate lung lobectomies. The AR-based intraoperative smart goggle system could detect fluorescence images for tumor margin assessments in animal models without disrupting the surgical workflow in an operating room. Additionally, it was confirmed that, even when the system itself was distorted when worn, the fluorescence image consistently matched the actual site.
Affiliation(s)
- Seung Hyun Lee, Institute of Global Health Technology, College of Health Science, Korea University, Seoul 02841, Korea
- Yu Hua Quan, Department of Biomedical Sciences, College of Medicine, Korea University, Seoul 02841, Korea; Department of Thoracic and Cardiovascular Surgery, Korea University Guro Hospital, College of Medicine, Korea University, Seoul 08308, Korea
- Min Sub Kim, Department of Bio-convergence Engineering, College of Health Science, Korea University, Seoul 02841, Korea
- Ki Hyeok Kwon, Department of Interdisciplinary Bio/Micro System Technology, College of Engineering, Korea University, Seoul 02841, Korea
- Byeong Hyeon Choi, Department of Biomedical Sciences, College of Medicine, Korea University, Seoul 02841, Korea; Department of Thoracic and Cardiovascular Surgery, Korea University Guro Hospital, College of Medicine, Korea University, Seoul 08308, Korea
- Hyun Koo Kim (corresponding author), Department of Biomedical Sciences, College of Medicine, Korea University, Seoul 02841, Korea; Department of Thoracic and Cardiovascular Surgery, Korea University Guro Hospital, College of Medicine, Korea University, Seoul 08308, Korea
- Beop-Min Kim (corresponding author), Department of Bioengineering, College of Health Science, Korea University, Seoul 02841, Korea; Interdisciplinary Program in Precision Public Health, Korea University, Seoul 02841, Korea
41
Liu K, Gao Y, Abdelrehem A, Zhang L, Chen X, Xie L, Wang X. Augmented reality navigation method for recontouring surgery of craniofacial fibrous dysplasia. Sci Rep 2021; 11:10043. [PMID: 33976233 PMCID: PMC8113548 DOI: 10.1038/s41598-021-88860-x]
Abstract
The objective of this study is to introduce the application of an augmented reality (AR) navigation system developed by the authors in recontouring surgery for craniofacial fibrous dysplasia. Five consecutive patients with craniofacial fibrous dysplasia were enrolled. Through three-dimensional (3D) simulation, a virtual plan was designed to reconstruct the normal anatomical contour of the deformed region. Surgical recontouring was achieved with the assistance of the AR navigation system. The accuracy of the surgical procedure was assessed by superimposing the post-operative 3D craniomaxillofacial model onto the virtual plan. The pre-operative preparation time and operation time were also recorded. In all patients, AR navigation was performed successfully, with a mean ± SD error of 1.442 ± 0.234 mm. The operative time ranged from 60 to 80 min, and the pre-operative preparation time was 20 min for each patient. All patients showed uneventful healing without any complications and were satisfied with the post-operative aesthetics. Using our AR navigation system in recontouring surgery can provide surgeons with a comprehensive and intuitive view of the recontouring border, as well as its depth, in real time. This method could improve the efficiency and safety of craniofacial fibrous dysplasia recontouring procedures.
Affiliation(s)
- Kai Liu, Department of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiaotong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China
- Yuan Gao, Institute of Forming Technology and Equipment, Shanghai JiaoTong University, Shanghai, China
- Ahmed Abdelrehem, Department of Craniomaxillofacial and Plastic Surgery, Faculty of Dentistry, Alexandria University, Alexandria, Egypt
- Lei Zhang, Department of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiaotong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China
- Xi Chen, Department of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiaotong University College of Medicine, Shanghai, China
- Le Xie, Institute of Forming Technology and Equipment, Shanghai JiaoTong University, Shanghai, China; Institute of Medical Robot, Shanghai JiaoTong University, Shanghai, China; Quanzhou Normal University, Fujian, China
- Xudong Wang, Department of Oral and Craniomaxillofacial Surgery, Shanghai 9th People's Hospital, Shanghai Jiaotong University College of Medicine, Shanghai, China; Shanghai Key Laboratory of Stomatology, Shanghai, China
42
Sethi RKV, Spector ME, Chinn SB. New Technologies in Bony Reconstruction of Complex Head and Neck Defects. Curr Surg Rep 2021. [DOI: 10.1007/s40137-021-00290-w]
43
Gerbino G, Autorino U, Giaccone E, Novaresio A, Ramieri G. Virtual planning and CAD/CAM-assisted distraction for maxillary hypoplasia in cleft lip and palate patients: Accuracy evaluation and clinical outcome. J Craniomaxillofac Surg 2021; 49:799-808. [PMID: 33906808 DOI: 10.1016/j.jcms.2021.03.004]
Abstract
The aim of this prospective study was to report experience with a specific guided distraction protocol for the treatment of CLP patients with severe midface hypoplasia. From January 2016 to April 2019, six consecutive, non-growing CLP patients with maxillary hypoplasia underwent a specific distraction protocol based on the use of VSP, CAD/CAM-generated surgical splints, cutting guides, prebent internal maxillary distractors, early removal of distractors, and acute callus manipulation and fixation. STL files for VSP, derived from multislice CT scans taken preoperatively (T0) and 3 months after distractor removal (T1), were superimposed using the free software 3D Slicer and Geomagic Wrap to evaluate the accuracy of maxillary repositioning and assess 3D bone changes. Clinical outcome was evaluated at the 1-year follow-up (T2). The patients and surgeon were satisfied with the occlusal and aesthetic outcomes. A maximum difference of 2 mm between the VSP and the actual surgical outcome was chosen as the success criterion for accuracy. The average linear difference for selected points was <2 mm in four patients and >2 mm in two patients. The average distance of the postoperative maxilla from the VSP model was 2.28 mm (median 1.85), while the average forward movement of the maxilla was 10.18 mm. The protocol used is effective and accurate in the correction of severe maxillary hypoplasia in CLP patients. Early removal of the distractor and stabilization with plates reduces patient discomfort and does not jeopardize stability. This protocol should be reserved for complex cases due to the costs of the procedure, which are not negligible.
Affiliation(s)
- Giovanni Gerbino, Division of Maxillofacial Surgery, Città della Salute e della Scienza Hospital, University of Torino, Italy
- Umberto Autorino, Division of Maxillofacial Surgery, Città della Salute e della Scienza Hospital, University of Torino, Italy
- Elena Giaccone, Division of Maxillofacial Surgery, Città della Salute e della Scienza Hospital, University of Torino, Italy
- Andrea Novaresio, Department of Surgical Sciences, University of Torino; Department of Management and Production Engineering, Politecnico of Torino, Italy
- Gugliemo Ramieri, Division of Maxillofacial Surgery, Città della Salute e della Scienza Hospital, University of Torino, Italy
44
Glas HH, Kraeima J, van Ooijen PMA, Spijkervet FKL, Yu L, Witjes MJH. Augmented Reality Visualization for Image-Guided Surgery: A Validation Study Using a Three-Dimensional Printed Phantom. J Oral Maxillofac Surg 2021; 79:1943.e1-1943.e10. [PMID: 34033801 DOI: 10.1016/j.joms.2021.04.001]
Abstract
BACKGROUND Oral and maxillofacial surgery currently relies on virtual surgery planning based on image data (CT, MRI). Three-dimensional (3D) visualizations are typically used to plan and predict the outcome of complex surgical procedures. To translate the virtual surgical plan to the operating room, it is either converted into physical 3D-printed guides or directly translated using real-time navigation systems. PURPOSE This study aims to improve the translation of the virtual surgery plan to a surgical procedure, such as oncologic or trauma surgery, in terms of accuracy and speed. Here we report an augmented reality visualization technique for image-guided surgery. It describes how surgeons can visualize and interact with the virtual surgery plan and navigation data while in the operating room. User-friendliness and usability were objectified by a formal user study that compared our augmented reality-assisted technique to the gold-standard setup of a perioperative navigation system (Brainlab). Moreover, the accuracy of typical navigation tasks, such as reaching landmarks and following trajectories, was compared. RESULTS Overall completion time of navigation tasks was 1.71 times faster using augmented reality (P = .034). Accuracy improved significantly using augmented reality (P < .001); for reaching physical landmarks, a weaker correlation was found (P = .087). Although the participants were relatively unfamiliar with VR/AR (rated 2.25/5) and gesture-based interaction (rated 2/5), they reported that navigation tasks became easier to perform using augmented reality (difficulty rated 3.25/5 for Brainlab, 2.4/5 for HoloLens). CONCLUSION The proposed workflow can be used in a wide range of image-guided surgery procedures as an addition to existing verified image guidance systems. The results of this user study imply that our technique enables typical navigation tasks to be performed faster and more accurately than the current gold standard. In addition, qualitative feedback on our augmented reality-assisted technique was more positive than for the standard setup.
Affiliation(s)
- H H Glas, Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- J Kraeima, Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- P M A van Ooijen, Associate Professor, Faculty of Medical Sciences, Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- F K L Spijkervet, Professor, Oral and Maxillofacial Surgeon, Head of the Department, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- L Yu, Lecturer, Department of Computer Science and Software Engineering (CSSE), Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- M J H Witjes, Oral and Maxillofacial Surgeon, Principal Investigator, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
45
Use of augmented reality navigation to optimise the surgical management of craniofacial fibrous dysplasia. Br J Oral Maxillofac Surg 2021; 60:162-167. [PMID: 34930644 DOI: 10.1016/j.bjoms.2021.03.011]
Abstract
The aim of this study was to apply an augmented reality (AR) navigation technique based on a head-mounted display in the treatment of craniofacial fibrous dysplasia and to explore the feasibility and value of AR in craniofacial surgery. With preoperative planning and three-dimensional simulation, the normal anatomical contours of the deformed area were recreated by superimposing the unaffected side onto the affected side. We completed the recontouring procedures in real time with the aid of an AR navigation system. The surgical outcome was assessed by superimposing the postoperative computed tomographic images onto the preoperative virtual plan. The preparation and operation times were recorded. With intraoperative AR guidance, facial bone recontouring was performed uneventfully in all cases. The mean (SD) discrepancy between the actual surgical reduction and preoperative planning was 1.036 (0.081) mm (range: 0.913 (0.496) to 1.165 (0.498) mm). The operation time ranged from 50 to 80 minutes, with an average of 66.4 minutes. The preoperative preparation time ranged from 26 to 36 minutes, with a mean of 29.6 minutes. AR navigation-assisted facial bone recontouring is a valuable treatment modality in managing craniomaxillofacial fibrous dysplasia and shows benefits in improving the efficiency and safety of this complicated procedure.
46
Gsaxner C, Pepe A, Li J, Ibrahimpasic U, Wallner J, Schmalstieg D, Egger J. Augmented Reality for Head and Neck Carcinoma Imaging: Description and Feasibility of an Instant Calibration, Markerless Approach. Comput Methods Programs Biomed 2021; 200:105854. [PMID: 33261944 DOI: 10.1016/j.cmpb.2020.105854]
Abstract
BACKGROUND AND OBJECTIVE Augmented reality (AR) can help to overcome current limitations in computer assisted head and neck surgery by granting "X-ray vision" to physicians. Still, the acceptance of AR in clinical applications is limited by technical and clinical challenges. We aim to demonstrate the benefit of a marker-free, instant calibration AR system for head and neck cancer imaging, which we hypothesize to be acceptable and practical for clinical use. METHODS We implemented a novel AR system for visualization of medical image data registered with the head or face of the patient prior to intervention. Our system allows the localization of head and neck carcinoma in relation to the outer anatomy. Our system does not require markers or stationary infrastructure, provides instant calibration and allows 2D and 3D multi-modal visualization for head and neck surgery planning via an AR head-mounted display. We evaluated our system in a pre-clinical user study with eleven medical experts. RESULTS Medical experts rated our application with a system usability scale score of 74.8 ± 15.9, which signifies above average, good usability and clinical acceptance. An average of 12.7 ± 6.6 minutes of training time was needed by physicians, before they were able to navigate the application without assistance. CONCLUSIONS Our AR system is characterized by a slim and easy setup, short training time and high usability and acceptance. Therefore, it presents a promising, novel tool for visualizing head and neck cancer imaging and pre-surgical localization of target structures.
Affiliation(s)
- Christina Gsaxner, Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Antonio Pepe, Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Jianning Li, Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Una Ibrahimpasic, Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria
- Jürgen Wallner, Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria; Department of Cranio-Maxillofacial Surgery, AZ Monica Hospital Antwerp and Antwerp University Hospital, Antwerp, Belgium
- Dieter Schmalstieg, Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Jan Egger, Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
47
Johnson AA, Reidler JS, Speier W, Fuerst B, Wang J, Osgood GM. Visualization of Fluoroscopic Imaging in Orthopedic Surgery: Head-Mounted Display vs Conventional Monitor. Surg Innov 2021; 29:353-359. [PMID: 33517863 DOI: 10.1177/1553350620987978]
Abstract
Purpose. See-through head-mounted displays (HMDs) can be used to view fluoroscopic imaging during orthopedic surgical procedures. The goals of this study were to determine whether HMDs reduce procedure time, number of fluoroscopic images required, or number of head turns by the surgeon compared with standard monitors. Methods. Sixteen orthopedic surgery residents each performed fluoroscopy-guided drilling of 8 holes for placement of tibial nail distal interlocking screws in an anatomical model, with 4 holes drilled while using HMD and 4 holes drilled while using a standard monitor. Procedure time, number of fluoroscopic images needed, and number of head turns by the resident during the procedure were compared between the 2 modalities. Statistical significance was set at P < .05. Results. Mean (SD) procedure time did not differ significantly between attempts using the standard monitor (55 [37] seconds) vs the HMD (56 [31] seconds) (P = .73). Neither did mean number of fluoroscopic images differ significantly between attempts using the standard monitor vs the HMD (9 [5] images for each) (P = .84). Residents turned their heads significantly more times when using the standard monitor (9 [5] times) vs the HMD (1 [2] times) (P < .001). Conclusions. Head-mounted displays lessened the need for residents to turn their heads away from the surgical field while drilling holes for tibial nail distal interlocking screws in an anatomical model; however, there was no difference in terms of procedure time or number of fluoroscopic images needed using the HMD compared with the standard monitor.
Affiliation(s)
- Alex A Johnson, The American Sports Medicine Institute, Birmingham, AL, USA
- Jay S Reidler, The Och Spine Hospital at Columbia University, New York, NY, USA
- William Speier, Department of Radiology, University of California, Los Angeles Medical Center, Los Angeles, CA, USA
- Jiangxia Wang, The Johns Hopkins Biostatistics Center, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Greg M Osgood, Department of Orthopedic Surgery, Johns Hopkins School of Medicine, Baltimore, MD, USA
48
Mondal SB, Achilefu S. Virtual and Augmented Reality Technologies in Molecular and Anatomical Imaging. In: Mol Imaging 2021. [DOI: 10.1016/b978-0-12-816386-3.00066-1]
49
Buch VP, Mensah-Brown KG, Germi JW, Park BJ, Madsen PJ, Borja AJ, Haldar D, Basenfelder P, Yoon JW, Schuster JM, Chen HCI. Development of an Intraoperative Pipeline for Holographic Mixed Reality Visualization During Spinal Fusion Surgery. Surg Innov 2020; 28:427-437. [PMID: 33382008 DOI: 10.1177/1553350620984339]
Abstract
Objective. Holographic mixed reality (HMR) allows for the superimposition of computer-generated virtual objects onto the operator's view of the world. Innovative solutions can be developed to enable the use of this technology during surgery. The authors developed and iteratively optimized a pipeline to construct, visualize, and register intraoperative holographic models of patient landmarks during spinal fusion surgery. Methods. The study was carried out in two phases. In phase 1, the custom intraoperative pipeline to generate patient-specific holographic models was developed over 7 patients. In phase 2, registration accuracy was optimized iteratively for 6 patients in a real-time operative setting. Results. In phase 1, an intraoperative pipeline was successfully employed to generate and deploy patient-specific holographic models. In phase 2, the registration error with the native hand-gesture registration was 20.2 ± 10.8 mm (n = 7 test points). Custom controller-based registration significantly reduced the mean registration error to 4.18 ± 2.83 mm (n = 24 test points, P < .01). Accuracy improved over time (B = -.69, P < .0001) with the final patient achieving a registration error of 2.30 ± .58 mm. Across both phases, the average model generation time was 18.0 ± 6.1 minutes (n = 6) for isolated spinal hardware and 33.8 ± 8.6 minutes (n = 6) for spinal anatomy. Conclusions. A custom pipeline is described for the generation of intraoperative 3D holographic models during spine surgery. Registration accuracy dramatically improved with iterative optimization of the pipeline and technique. While significant improvements and advancements need to be made to enable clinical utility, HMR demonstrates significant potential as the next frontier of intraoperative visualization.
Affiliation(s)
- Vivek P Buch
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Kobina G Mensah-Brown
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- James W Germi
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Brian J Park
- Department of Radiology, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA
- Peter J Madsen
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Austin J Borja
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Debanjan Haldar
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Patricia Basenfelder
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Jang W Yoon
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- James M Schuster
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
- Han-Chiao I Chen
- Department of Neurosurgery, University of Pennsylvania Health System Penn Presbyterian Medical Center, Philadelphia, PA, USA
50
Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery. Expert Rev Med Devices 2020; 18:47-62. [PMID: 33283563 DOI: 10.1080/17434440.2021.1860750] [Citation(s) in RCA: 65] [Impact Index Per Article: 13.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Background: Research suggests that the apprenticeship model, which is the gold standard for training surgical residents, is becoming obsolete. For that reason, there is a continuing effort toward the development of high-fidelity surgical simulators to replace the apprenticeship model. Applying Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) in surgical simulators increases the fidelity, level of immersion, and overall experience of these simulators. Areas covered: The objective of this review is to provide a comprehensive overview of the application of VR, AR, and MR for distinct surgical disciplines, including maxillofacial surgery and neurosurgery. The current developments in these areas, as well as potential future directions, are discussed. Expert opinion: The key components for incorporating VR into surgical simulators are visual and haptic rendering. These components ensure that the user is completely immersed in the virtual environment and can interact in the same way as in the physical world. The key components for the application of AR and MR into surgical simulators include the tracking system as well as the visual rendering. The advantages of these surgical simulators are the ability to perform user evaluations and increase the training frequency of surgical residents.
Affiliation(s)
- Abel J Lungu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Wout Swinkels
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
- Luc Claesen
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria; Department of Oral & Maxillofacial Surgery, Medical University of Graz, Graz, Austria; The Laboratory of Computer Algorithms for Medicine, Medical University of Graz, Graz, Austria
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China