1. Chou DW, Annadata V, Willson G, Gray M, Rosenberg J. Augmented and Virtual Reality Applications in Facial Plastic Surgery: A Scoping Review. Laryngoscope 2024; 134:2568-2577. [PMID: 37947302] [DOI: 10.1002/lary.31178]
Abstract
OBJECTIVES Augmented reality (AR) and virtual reality (VR) are emerging technologies with wide potential applications in health care. We performed a scoping review of the current literature on the application of AR and VR in the field of facial plastic and reconstructive surgery (FPRS). DATA SOURCES PubMed and Web of Science. REVIEW METHODS According to PRISMA guidelines, PubMed and Web of Science were used to perform a scoping review of the literature regarding the utilization of AR and/or VR relevant to FPRS. RESULTS Fifty-eight articles spanning 1997-2023 met the criteria for review. Five overarching categories of AR and/or VR applications were identified across the articles: preoperative, intraoperative, training/education, feasibility, and technical. The following clinical areas were identified: burn, craniomaxillofacial surgery (CMF), face transplant, face lift, facial analysis, facial palsy, free flaps, head and neck surgery, injectables, locoregional flaps, mandible reconstruction, mandibuloplasty, microtia, skin cancer, oculoplastic surgery, rhinology, rhinoplasty, and trauma. CONCLUSION AR and VR have broad applications in FPRS. AR for surgical navigation may have the most emerging potential in CMF surgery and free flap harvest. VR is useful as distraction analgesia for patients and as an immersive training tool for surgeons. More data on these technologies' direct impact on objective clinical outcomes are still needed. LEVEL OF EVIDENCE N/A. Laryngoscope, 134:2568-2577, 2024.
Affiliation(s)
- David W Chou
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Emory University School of Medicine, Atlanta, Georgia, USA
- Vivek Annadata
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Gloria Willson
- Education and Research Services, Levy Library, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Mingyang Gray
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Joshua Rosenberg
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Icahn School of Medicine at Mount Sinai, New York, New York, USA
2. Arensmeyer J, Bedetti B, Schnorr P, Buermann J, Zalepugas D, Schmidt J, Feodorovici P. A System for Mixed-Reality Holographic Overlays of Real-Time Rendered 3D-Reconstructed Imaging Using a Video Pass-through Head-Mounted Display-A Pathway to Future Navigation in Chest Wall Surgery. J Clin Med 2024; 13:2080. [PMID: 38610849] [PMCID: PMC11012529] [DOI: 10.3390/jcm13072080]
Abstract
Background: Three-dimensional reconstructions of state-of-the-art high-resolution imaging are increasingly used for preprocedural assessment in thoracic surgery. They are a promising tool for improving patient-specific treatment planning, for example, for minimally invasive or robotic-assisted lung resections. Increasingly available mixed-reality hardware based on video pass-through technology enables the projection of image data as a hologram onto the patient. We describe a novel method of real-time 3D surgical planning in a mixed-reality setting by presenting three representative cases utilizing volume rendering. Materials: A mixed-reality system was set up using a high-performance workstation running a video pass-through-based head-mounted display. Image data from computed tomography were imported and volume-rendered in real time to be customized through live editing. The image-based hologram was projected onto the patient, highlighting the regions of interest. Results: Three oncological cases were selected to explore the potential of the mixed-reality system. Two presented large tumor masses in the thoracic cavity, while the third presented an unclear lesion of the chest wall. We aligned real-time rendered 3D holographic image data onto the patient, allowing us to investigate the relationship between anatomical structures and their respective body position. Conclusions: Holographic overlay has proven promising for improving preprocedural surgical planning, particularly for complex oncological tasks in the thoracic surgical field. Further studies on outcome-related surgical planning and navigation should therefore be conducted. Ongoing technological progress in extended reality hardware and intelligent software features will most likely enhance applicability and the range of use in surgical fields in the near future.
Affiliation(s)
- Jan Arensmeyer
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
- Benedetta Bedetti
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Philipp Schnorr
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Jens Buermann
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Donatas Zalepugas
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Joachim Schmidt
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
- Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Philipp Feodorovici
- Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
- Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
3. Kos TM, Colombo E, Bartels LW, Robe PA, van Doormaal TPC. Evaluation Metrics for Augmented Reality in Neurosurgical Preoperative Planning, Surgical Navigation, and Surgical Treatment Guidance: A Systematic Review. Oper Neurosurg (Hagerstown) 2023. [PMID: 38146941] [PMCID: PMC11008635] [DOI: 10.1227/ons.0000000000001009]
Abstract
BACKGROUND AND OBJECTIVE Recent years have shown an advancement in the development of augmented reality (AR) technologies for preoperative visualization, surgical navigation, and intraoperative guidance in neurosurgery. However, proving added value for AR in clinical practice is challenging, partly because of a lack of standardized evaluation metrics. We performed a systematic review to provide an overview of the reported evaluation metrics for AR technologies in neurosurgical practice and to establish a foundation for assessment and comparison of such technologies. METHODS PubMed, Embase, and Cochrane were searched systematically for publications on assessment of AR for cranial neurosurgery on September 22, 2022. The findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS The systematic search yielded 830 publications; 114 were screened full text, and 80 were included for analysis. Among the included studies, 5% dealt with preoperative visualization using AR, with user perception as the most frequently reported metric. The majority (75%) researched AR technology for surgical navigation, with registration accuracy, clinical outcome, and time measurements as the most frequently reported metrics. In addition, 20% studied the use of AR for intraoperative guidance, with registration accuracy, task outcome, and user perception as the most frequently reported metrics. CONCLUSION For quality benchmarking of AR technologies in neurosurgery, evaluation metrics should be specific to the risk profile and clinical objectives of the technology. A key focus should be on using validated questionnaires to assess user perception; ensuring clear and unambiguous reporting of registration accuracy, precision, robustness, and system stability; and accurately measuring task performance in clinical studies. We provide an overview suggesting which evaluation metrics to use per AR application and innovation phase, aiming to improve the assessment of the added value of AR for neurosurgical practice and to facilitate integration into the clinical workflow.
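Registration accuracy, the metric this review reports most often for AR navigation, is commonly quantified as a target registration error (TRE): the residual distance between anatomical target points mapped through the estimated rigid registration and their true positions in patient space. A minimal sketch of that computation follows; the point coordinates and the transform are hypothetical illustrations, not data from the review:

```python
import math

def target_registration_error(rotation, translation, targets_model, targets_patient):
    """Mean and max distance between targets mapped from model space into
    patient space by a rigid transform and their true patient-space positions."""
    def apply(p):
        # Apply the 3x3 rotation matrix and then the translation vector.
        x = sum(rotation[0][j] * p[j] for j in range(3)) + translation[0]
        y = sum(rotation[1][j] * p[j] for j in range(3)) + translation[1]
        z = sum(rotation[2][j] * p[j] for j in range(3)) + translation[2]
        return (x, y, z)
    errors = [math.dist(apply(m), q) for m, q in zip(targets_model, targets_patient)]
    return sum(errors) / len(errors), max(errors)

# Hypothetical example: a registration that is off by a pure 1 mm translation.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
offset = (1.0, 0.0, 0.0)  # estimated translation, in mm
model_targets = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
true_targets = list(model_targets)  # ground truth: identity mapping
mean_tre, max_tre = target_registration_error(identity, offset, model_targets, true_targets)
# Every target is displaced by exactly 1 mm, so mean and max TRE are both 1.0
```

Reporting both the mean and the maximum, as sketched here, is one way to satisfy the review's call for unambiguous accuracy reporting, since a clinically acceptable mean can hide an unacceptable worst case.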
Affiliation(s)
- Tessa M. Kos
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Elisa Colombo
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- L. Wilbert Bartels
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Pierre A. Robe
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P. C. van Doormaal
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
4. Tokunaga T, Sugimoto M, Saito Y, Kashihara H, Yoshikawa K, Nakao T, Nishi M, Takasu C, Wada Y, Waki Y, Yoshimoto T, Noma T, Shimada M. Transanal lateral lymph node dissection with intraoperative hologram support in low rectal cancer. Surg Endosc 2023. [PMID: 37017769] [DOI: 10.1007/s00464-023-09977-w]
Abstract
BACKGROUND In Japan, the standard treatment for stage II/III advanced low rectal cancer is total mesorectal excision plus lateral lymph node dissection (LLND). There are also recent reports on the use of transanal LLND. However, the transanal anatomy is difficult to understand, and additional support tools are required to improve surgical safety. The present study examined the utility of mixed-reality holograms as an intraoperative support tool for assessing the complex pelvic anatomy. METHODS Polygon (stereolithography) files of patients' pelvic organs were created and exported from the SYNAPSE VINCENT imaging system and uploaded into the Holoeyes MD virtual reality software. Three-dimensional images were automatically converted into patient-specific holograms. Each hologram was then installed into a head-mounted display (HoloLens 2), and the surgeons and assistants wore the HoloLens 2 when they performed transanal LLND. Twelve digestive surgeons with prior practice in hologram manipulation evaluated the utility of the intraoperative hologram support by means of a questionnaire. RESULTS Intraoperative hologram support improved the surgical understanding of the lateral lymph node region anatomy. In the questionnaire, 75% of the surgeons answered that the hologram accurately reflected the anatomy, and 92% answered that the anatomy was better understood by simulating the hologram intraoperatively than preoperatively. Moreover, 92% agreed that intraoperative holograms were a useful support tool for improving surgical safety. CONCLUSIONS Intraoperative hologram support improved the surgical understanding of the pelvic anatomy for transanal LLND. Intraoperative holograms may represent a next-generation surgical tool for transanal LLND.
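The pipeline above hinges on exchanging polygon (stereolithography, STL) files between the imaging system and the hologram software. Binary STL has a simple fixed layout (an 80-byte header, a 32-bit little-endian triangle count, then 50 bytes per triangle), which is part of why segmented surfaces move so easily between tools. The following sketch writes and inspects such a file; the file name, the single-triangle mesh, and the helper functions are hypothetical illustrations of the format, not the vendors' actual code:

```python
import os
import struct
import tempfile

def write_binary_stl(path, triangles):
    """Write a minimal binary STL: 80-byte header, uint32 triangle count,
    then per triangle: normal (3 floats), 3 vertices (9 floats), uint16 attribute."""
    with open(path, "wb") as f:
        f.write(b"hypothetical pelvic organ mesh".ljust(80, b"\0"))
        f.write(struct.pack("<I", len(triangles)))
        for normal, v0, v1, v2 in triangles:
            # 12 little-endian floats (48 bytes) + attribute byte count (2 bytes)
            f.write(struct.pack("<12fH", *normal, *v0, *v1, *v2, 0))

def read_triangle_count(path):
    """Read only the 80-byte header and the uint32 triangle count."""
    with open(path, "rb") as f:
        f.read(80)  # skip the header
        (count,) = struct.unpack("<I", f.read(4))
    return count

# Hypothetical single-triangle mesh standing in for a segmented organ surface.
tri = ((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
path = os.path.join(tempfile.mkdtemp(), "organ.stl")
write_binary_stl(path, [tri])
count = read_triangle_count(path)  # → 1
```

Because each triangle record is a fixed 50 bytes, a reader can validate a file cheaply: the size must equal 84 + 50 × count, a useful sanity check before handing a mesh to downstream hologram software.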
Affiliation(s)
- Takuya Tokunaga
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Maki Sugimoto
- Okinaga Research Institute, Teikyo University, Tokyo, Japan
- Yu Saito
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Hideya Kashihara
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Kozo Yoshikawa
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Toshihiro Nakao
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Masaaki Nishi
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Chie Takasu
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Yuma Wada
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Yuhei Waki
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Toshiaki Yoshimoto
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Takayuki Noma
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Mitsuo Shimada
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
5. Winkler AA, Chabuz C, McIntosh CND, Lekakis G. The Need for Innovation in Rhinoplasty. Facial Plast Surg 2022; 38:440-446. [DOI: 10.1055/s-0042-1748954]
Abstract
Rhinoplasty is a challenging surgery, and results are not always perfect. There are many obstacles to achieving optimal results, among them inadequate instrumentation, the unpredictability of healing, and imprecise planning. Furthermore, selecting the patients who can benefit most from surgery is equally important. In this article, some of the more pressing areas of rhinoplasty in need of innovation are discussed. From proper patient selection to advances in education, the standardization of training programs, and the development of sophisticated implants, the future of rhinoplasty surgery lies in continued creativity and innovation.
Affiliation(s)
- Andrew A. Winkler
- Department of Otolaryngology, Head and Neck Surgery, University of Colorado School of Medicine, Aurora, Colorado
- Carolyn Chabuz
- Department of Otolaryngology, Head and Neck Surgery, University of Colorado School of Medicine, Aurora, Colorado
- Garyfalia Lekakis
- Department of Otorhinolaryngology, Head and Neck Surgery, University Hospitals Leuven, Leuven, Belgium
6. Tokunaga T, Sugimoto M, Saito Y, Kashihara H, Yoshikawa K, Nakao T, Nishi M, Takasu C, Wada Y, Yoshimoto T, Yamashita S, Iwakawa Y, Yokota N, Shimada M. Intraoperative holographic image-guided surgery in a transanal approach for rectal cancer. Langenbecks Arch Surg 2022; 407:2579-2584. [PMID: 35840706] [DOI: 10.1007/s00423-022-02607-4]
Abstract
PURPOSE Urethral injury is one of the most important complications of transanal total mesorectal excision (TaTME) in male patients with rectal cancer. The purpose of this study was to investigate holographic image-guided surgery in TaTME. METHODS Polygon (stereolithography) files were created and exported from SYNAPSE VINCENT, and then uploaded into the Holoeyes MD system (Holoeyes Inc., Tokyo, Japan). After uploading the data, the three-dimensional image was automatically converted into a case-specific hologram. The hologram was then installed into the head-mounted display, HoloLens (Microsoft Corporation, Redmond, WA). The surgeons and assistants wore the HoloLens when they performed TaTME. RESULTS In a Wi-Fi-enabled operating room, each surgeon, wearing a HoloLens, shared the same hologram and succeeded in adjusting it with simple hand gestures from their respective angles. The hologram contributed to better comprehension of the positional relationships between the urethra and the surrounding pelvic organs during surgery. All surgeons were able to properly determine the dissection line. CONCLUSIONS This first experience suggests that intraoperative holograms contributed to reducing the risk of urethral injury and to understanding the transanal anatomy. Intraoperative holograms have the potential to become a next-generation surgical support tool for spatial awareness and the sharing of information between surgeons.
Affiliation(s)
- Takuya Tokunaga
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Maki Sugimoto
- Okinaga Research Institute, Teikyo University, Tokyo, Japan
- Yu Saito
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Hideya Kashihara
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Kozo Yoshikawa
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Toshihiro Nakao
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Masaaki Nishi
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Chie Takasu
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Yuma Wada
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Toshiaki Yoshimoto
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Shoko Yamashita
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Yosuke Iwakawa
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Noriko Yokota
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
- Mitsuo Shimada
- Department of Surgery, Tokushima University, 3-18-15 Kuramoto-cho, Tokushima, 770-8503, Japan
7. Architecture of a Hybrid Video/Optical See-through Head-Mounted Display-Based Augmented Reality Surgical Navigation Platform. Information 2022. [DOI: 10.3390/info13020081]
Abstract
In the context of image-guided surgery, augmented reality (AR) represents a ground-breaking improvement, especially when paired with wearability in the case of open surgery. Commercially available AR head-mounted displays (HMDs), designed for general purposes, are increasingly used outside their indications to develop surgical guidance applications with the ambition of demonstrating the potential of AR in surgery. The applications proposed in the literature underline the demand for AR guidance in the operating room, together with the limitations that hinder commercial HMDs from being the answer to such a need. The medical domain demands specifically developed devices that address, together with ergonomics, the achievement of surgical accuracy objectives and compliance with medical device regulations. In the framework of an EU Horizon 2020 project, a hybrid video and optical see-through augmented reality headset, paired with a software architecture, both specifically designed to be seamlessly integrated into the surgical workflow, has been developed. In this paper, the overall architecture of the system is described. The developed AR HMD surgical navigation platform was positively tested on seven patients to aid the surgeon in performing Le Fort 1 osteotomy in cranio-maxillofacial surgery, demonstrating the value of the hybrid approach and the safety and usability of the navigation platform.
8. Neves CA, Leuze C, Gomez AM, Navab N, Blevins N, Vaisbuch Y, McNab JA. Augmented Reality for Retrosigmoid Craniotomy Planning. Skull Base Surg 2021; 83:e564-e573. [DOI: 10.1055/s-0041-1735509]
Abstract
While medical imaging data have traditionally been viewed on two-dimensional (2D) displays, augmented reality (AR) allows physicians to project the medical imaging data onto patients' bodies to locate important anatomy. We present a surgical AR application to plan the retrosigmoid craniotomy, a standard approach to access the posterior fossa and the internal auditory canal. As a simple and accurate alternative to surface landmarks and conventional surgical navigation systems, our AR application augments the surgeon's vision to guide the optimal location of cortical bone removal. In this work, two surgeons performed a retrosigmoid approach 14 times on eight cadaver heads. In each case, the surgeon manually aligned a computed tomography (CT)-derived virtual rendering of the sigmoid sinus on the real cadaveric head using a see-through AR display, allowing the surgeon to plan and perform the craniotomy accordingly. Postprocedure CT scans were acquired to assess the accuracy of the retrosigmoid craniotomies with respect to their intended location relative to the dural sinuses. The two surgeons had mean margins of d_avg = 0.6 ± 4.7 mm and d_avg = 3.7 ± 2.3 mm between the osteotomy border and the dural sinuses over all their cases, respectively, with positive margins in 12 of the 14 cases. The intended surgical approach to the internal auditory canal was successfully achieved in all cases using the proposed method, and the relatively small and consistent margins suggest that our system has the potential to be a valuable tool for planning a variety of similar skull-base procedures.
Affiliation(s)
- Caio A. Neves
- Department of Otolaryngology, Stanford School of Medicine, Stanford, United States
- Faculty of Medicine, University of Brasília, Brasília, Brazil
- Christoph Leuze
- Department of Radiology, Stanford School of Medicine, Stanford, United States
- Alejandro M. Gomez
- Chair for Computer Aided Medical Procedures and Augmented Reality, Department of Informatics, Technical University of Munich, Germany
- Laboratory for Computer Aided Medical Procedures, Whiting School of Engineering, Johns Hopkins University, Baltimore, USA
- Nassir Navab
- Chair for Computer Aided Medical Procedures and Augmented Reality, Department of Informatics, Technical University of Munich, Germany
- Laboratory for Computer Aided Medical Procedures, Whiting School of Engineering, Johns Hopkins University, Baltimore, USA
- Nikolas Blevins
- Department of Otolaryngology, Stanford School of Medicine, Stanford, United States
- Yona Vaisbuch
- Department of Otolaryngology, Stanford School of Medicine, Stanford, United States
- Jennifer A. McNab
- Department of Radiology, Stanford School of Medicine, Stanford, United States
9. Tel A, Arboit L, Sembronio S, Costa F, Nocini R, Robiony M. The Transantral Endoscopic Approach: A Portal for Masses of the Inferior Orbit-Improving Surgeons' Experience Through Virtual Endoscopy and Augmented Reality. Front Surg 2021; 8:715262. [PMID: 34497829] [PMCID: PMC8419325] [DOI: 10.3389/fsurg.2021.715262]
Abstract
In the past years, endoscopic techniques have attracted increasing interest for performing minimally invasive accesses to the orbit, resulting in excellent clinical outcomes with lower morbidity and complication rates. Among endoscopic approaches, the transantral endoscopic approach creates a portal to the orbital floor, representing the most straightforward access to lesions located in the inferior orbital space. However, although endoscopic surgery provides enhanced, magnified vision of the anatomy in a bloodless field, it has several limitations compared with classic open surgery, owing to restricted operative spaces. Virtual surgical planning and anatomical computer-generated models have proved to be of great importance in planning endoscopic surgical approaches, and their role can be widened with the integration of surgical navigation, virtual endoscopy simulation, and augmented reality (AR). This study focuses on the strict conjugation between the technologies that allow the virtualization of surgery in an entirely digital environment, which can be transferred to the patient using intraoperative navigation or to a printed model using AR for pre-surgical analysis. The interaction between different software packages and platforms therefore offers a highly predictive preview of the surgical scenario, contributing to increased orientation, awareness, and effectiveness of maneuvers performed under endoscopic guidance, which can be checked at any time using surgical navigation. In this paper, the authors explore the transantral approach for the excision of masses of the inferior orbital compartment through modern technology. The authors apply this technique to masses located in the inferior orbit and share their clinical results, describing why technological innovation, in particular computer planning, virtual endoscopy, navigation, and AR, can contribute to empowering minimally invasive orbital surgery, while offering a valuable and indispensable tool for pre-surgical analysis and training.
Affiliation(s)
- Alessandro Tel
- Department of Maxillofacial Surgery, University Hospital of Udine, Udine, Italy
- Lorenzo Arboit
- Faculty of Medicine and Surgery, Sant'Anna School of Advanced Studies, Pisa, Italy
- Salvatore Sembronio
- Department of Maxillofacial Surgery, University Hospital of Udine, Udine, Italy
- Fabio Costa
- Department of Maxillofacial Surgery, University Hospital of Udine, Udine, Italy
- Riccardo Nocini
- Department of Otorhinolaryngology, University Hospital of Verona, Verona, Italy
- Massimo Robiony
- Department of Maxillofacial Surgery, University Hospital of Udine, Udine, Italy
10. Leuze C, Zoellner A, Schmidt AR, Cushing RE, Fischer MJ, Joltes K, Zientara GP. Augmented reality visualization tool for the future of tactical combat casualty care. J Trauma Acute Care Surg 2021; 91:S40-S45. [PMID: 33938509] [DOI: 10.1097/ta.0000000000003263]
Abstract
ABSTRACT The objective of this project was to identify and develop software for an augmented reality application that runs on the US Army Integrated Visual Augmentation System (IVAS) to support a medical caregiver during tactical combat casualty care scenarios. In this augmented reality tactical combat casualty care application, human anatomy of individual soldiers obtained predeployment is superimposed on the view of an injured war fighter through the IVAS. This offers insight into the anatomy of the injured war fighter to advance treatment in austere environments. In this article, we describe the various software components required for an augmented reality tactical combat casualty care tool. These include a body pose tracking system to track the patient's body pose, a virtual rendering of a human anatomy avatar, speech input to control the application, and rendering techniques to visualize the virtual anatomy and treatment information on the augmented reality display. We then implemented speech commands and visualization for four common medical scenarios, including injury of a limb, a blast to the pelvis, cricothyrotomy, and a pneumothorax, on the Microsoft HoloLens 1 (Microsoft, Redmond, WA). The software is designed as a forward surgical care tool for the US Army IVAS, with the intention of providing the medical caregiver with a unique ability to quickly assess affected internal anatomy. The current software components still had some limitations with respect to speech recognition reliability in noise and body pose tracking. These will likely improve with the improved hardware of the IVAS, which is based on a modified HoloLens 2.
Affiliation(s)
- Christoph Leuze
- Nakamir Inc. (C.L., A.Z., M.J.F.), Palo Alto, California; Department of Anesthesia, Perioperative and Pain Medicine (A.R.S.), Stanford School of Medicine, Stanford, California; US Army Research Institute of Environmental Medicine (R.E.C., K.J., G.P.Z.), Natick, Massachusetts
11. Davidovic A, Chavaz L, Meling TR, Schaller K, Bijlenga P, Haemmerli J. Evaluation of the effect of standard neuronavigation and augmented reality on the integrity of the perifocal structures during a neurosurgical approach. Neurosurg Focus 2021; 51:E19. [PMID: 34333474] [DOI: 10.3171/2021.5.focus21202]
Abstract
OBJECTIVE Intracranial minimally invasive procedures imply working in a restricted surgical corridor surrounded by critical structures, such as vessels and cranial nerves. Any damage to them may affect patient outcome. Neuronavigation systems may reduce the risk of such complications. In this study, the authors sought to compare standard neuronavigation (NV) and augmented reality (AR)-guided navigation with respect to the integrity of the perifocal structures during a neurosurgical approach, using a novel model imitating intracranial vessels. METHODS A custom-made box, containing crisscrossing hard metal wires, a hidden nail at its bottom, and a wooden top, was scanned, fused, and referenced for the purpose of the study. The metal wires and an aneurysm clip applier were connected to a controller, which counted the number of contacts between them. Twenty-three naive participants were asked to 1) use NV to define an optimal entry point on the top, perform the smallest craniotomy possible on the wooden top, and use a surgical microscope when placing a clip on the nail without touching the metal wires; and 2) use AR to preoperatively define an ideal trajectory, navigate the surgical microscope, and then perform the same task. The primary outcome was the number of contacts made between the metal wires and the clip applier. Secondary outcomes were craniotomy size and trust in NV and AR to help avoid touching the metal wires, as assessed by a 9-level Likert scale. RESULTS The median number of contacts tended to be lower with the use of AR than with NV (AR, median 1 [Q1: 1, Q3: 2]; NV, median 3 [Q1: 1, Q3: 6]; p = 0.074). The size of the target-oriented craniotomy was significantly smaller with the use of AR than with NV (AR, median 4.91 cm2 [Q1: 4.71 cm2, Q3: 7.55 cm2]; NV, median 9.62 cm2 [Q1: 7.07 cm2, Q3: 13.85 cm2]). Participants had more trust in AR than in NV (the differences posttest minus pretest were mean 0.9 [SD 1.2] and mean -0.3 [SD 0.2], respectively; p < 0.05). CONCLUSIONS The results of this study show a trend favoring the use of AR over NV with respect to reducing contact between a clip applier and the perifocal structures during simulated clipping of an intracranial aneurysm. Target-guided craniotomies were smaller with the use of AR. AR may be used not only to localize surgical targets but also to prevent complications associated with damage to structures encountered during the surgical approach.
Affiliation(s)
- Lara Chavaz
- Faculty of Medicine, University of Geneva, Geneva, Switzerland
- Torstein R Meling
- Division of Neurosurgery, Department of Clinical Neurosciences, Geneva University Hospitals; Faculty of Medicine, University of Geneva, Geneva, Switzerland
- Karl Schaller
- Division of Neurosurgery, Department of Clinical Neurosciences, Geneva University Hospitals; Faculty of Medicine, University of Geneva, Geneva, Switzerland
- Philippe Bijlenga
- Division of Neurosurgery, Department of Clinical Neurosciences, Geneva University Hospitals; Faculty of Medicine, University of Geneva, Geneva, Switzerland
- Julien Haemmerli
- Division of Neurosurgery, Department of Clinical Neurosciences, Geneva University Hospitals

12
Winnand P, Ayoub N, Redick T, Gesenhues J, Heitzer M, Peters F, Raith S, Abel D, Hölzle F, Modabber A. Navigation of iliac crest graft harvest using markerless augmented reality and cutting guide technology: A pilot study. Int J Med Robot 2021; 18:e2318. [PMID: 34328700] [DOI: 10.1002/rcs.2318]
Abstract
BACKGROUND Defects of the facial skeleton often require complex reconstruction with vascularized grafts. This trial elucidated the usability, visual perception, and accuracy of markerless augmented reality (AR)-guided navigation for harvesting iliac crest transplants. METHODS Random CT scans were used to virtually plan two common transplant configurations on 10 iliac crest models, each printed four times. The transplants were harvested using projected AR and cutting guides. The duration, and the accuracies of the angulation, distance, and volume between the planned and executed osteotomies, were measured. RESULTS AR was characterized by efficient use of time and accurate rendition of preoperatively planned geometries. However, vertical osteotomies and complex anatomical settings showed significant inferiority of AR guidance compared to cutting guides. CONCLUSIONS This study demonstrated the usability of a markerless AR setup for harvesting iliac crest transplants. The visual perception and accuracy of the AR-guided osteotomies remain weaknesses relative to cutting guide technology.
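The angulation accuracy reported above compares the orientation of a planned osteotomy plane with the executed cut. A minimal sketch of how such an angular deviation can be computed from plane normals (illustrative only; the study does not publish its computation, and the vectors below are made-up values):

```python
import numpy as np

def angular_deviation_deg(n_planned, n_executed):
    """Angle (degrees) between planned and executed osteotomy plane normals."""
    a = np.asarray(n_planned, dtype=float)
    b = np.asarray(n_executed, dtype=float)
    # abs() makes the metric insensitive to which side of the plane the normal points
    cos = abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# hypothetical planned cut vs. an executed cut tilted 10 degrees about the x-axis
planned = [0.0, 0.0, 1.0]
tilt = np.radians(10.0)
executed = [0.0, np.sin(tilt), np.cos(tilt)]
print(round(angular_deviation_deg(planned, executed), 1))  # 10.0
```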
Affiliation(s)
- Philipp Winnand
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Nassim Ayoub
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Tim Redick
- Institute of Automatic Control, RWTH Aachen University, Aachen, Germany
- Jonas Gesenhues
- Institute of Automatic Control, RWTH Aachen University, Aachen, Germany
- Marius Heitzer
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Florian Peters
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Stefan Raith
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Dirk Abel
- Institute of Automatic Control, RWTH Aachen University, Aachen, Germany
- Frank Hölzle
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Ali Modabber
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany

13
Zhao Z, Poyhonen J, Chen Cai X, Sophie Woodley Hooper F, Ma Y, Hu Y, Ren H, Song W, Tsz Ho Tse Z. Augmented reality technology in image-guided therapy: State-of-the-art review. Proc Inst Mech Eng H 2021; 235:1386-1398. [PMID: 34304631] [PMCID: PMC8573682] [DOI: 10.1177/09544119211034357]
Abstract
Image-guided therapies have been on the rise in recent years as they can achieve higher accuracy and are less invasive than traditional methods. By combining augmented reality technology with image-guided therapy, more organs and tissues can be observed by surgeons to improve surgical accuracy. In this review, 233 publications (dated from 2015 to 2020) on the design and application of augmented reality-based systems for image-guided therapy, including both research prototypes and commercial products, were considered. Based on their functions and applications, 16 studies were selected. The engineering specifications and applications were analyzed and summarized for each study. Finally, future directions and existing challenges in the field were summarized and discussed.
Affiliation(s)
- Zhuo Zhao
- School of Electrical and Computer Engineering, University of Georgia, Athens, GA, USA
- Jasmin Poyhonen
- Department of Electronic Engineering, University of York, York, UK
- Xin Chen Cai
- Department of Electronic Engineering, University of York, York, UK
- Yangmyung Ma
- Hull York Medical School, University of York, York, UK
- Yihua Hu
- Department of Electronic Engineering, University of York, York, UK
- Hongliang Ren
- Department of Electronic Engineering, The Chinese University of Hong Kong (CUHK), Hong Kong, China
- Wenzhan Song
- School of Electrical and Computer Engineering, University of Georgia, Athens, GA, USA
- Zion Tsz Ho Tse
- Department of Electronic Engineering, University of York, York, UK

14
Barbieri M, Fantazzini P, Testa C, Bortolotti V, Baruffaldi F, Kogan F, Brizi L. Characterization of Structural Bone Properties through Portable Single-Sided NMR Devices: State of the Art and Future Perspectives. Int J Mol Sci 2021; 22:7318. [PMID: 34298936] [PMCID: PMC8303251] [DOI: 10.3390/ijms22147318]
Abstract
Nuclear Magnetic Resonance (NMR) is a well-suited methodology to study bone composition and structural properties, because NMR parameters, such as the T2 relaxation time, are sensitive to the chemical and physical environment of the 1H nuclei. Although magnetic resonance imaging (MRI) allows bone structure assessment in vivo, its cost limits the suitability of conventional MRI for routine bone screening. Because clinically suitable exams are difficult to access, the diagnosis of bone diseases such as osteoporosis, and the associated fracture risk estimation, is based on the assessment of bone mineral density (BMD) obtained by dual-energy X-ray absorptiometry (DXA). However, integrating information about bone structure with bone mineral density has been shown to improve osteoporosis-related fracture risk estimation. Portable NMR, based on low-field single-sided NMR devices, is a promising and appealing approach to assess NMR properties of biological tissues for medical applications. Since these scanners detect the signal from a sensitive volume external to the magnet, they can perform NMR measurements without the need to fit a sample inside the bore of a magnet, allowing, in principle, in vivo application. Techniques based on single-sided NMR devices have the potential for high impact on clinical routine because of the low purchasing, running, and maintenance costs of such scanners. In this review, the development of new methodologies to investigate structural properties of trabecular bone with single-sided NMR devices is surveyed, and current limitations and future perspectives are discussed.
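The T2 sensitivity that underpins this approach is typically quantified by fitting an exponential decay to a measured echo train. A minimal sketch of a mono-exponential T2 estimate via a log-linear least-squares fit, using noiseless synthetic values rather than data from the paper:

```python
import numpy as np

# Synthetic echo train: S(t) = S0 * exp(-t / T2), amplitudes in arbitrary units
T2_true, S0 = 50.0, 100.0        # T2 in ms (made-up value)
t = np.linspace(1, 200, 40)      # echo times in ms
signal = S0 * np.exp(-t / T2_true)

# Log-linearize: ln S = ln S0 - t / T2, then fit a straight line
slope, intercept = np.polyfit(t, np.log(signal), 1)
T2_fit = -1.0 / slope
print(round(T2_fit, 1))  # 50.0
```

Real single-sided NMR data are noisy and often multi-exponential, so practical pipelines use nonlinear or multi-component fits; the log-linear form above is only the simplest estimator.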
Affiliation(s)
- Marco Barbieri
- Department of Radiology, Stanford University, Stanford, CA 94395, USA
- Department of Physics and Astronomy “Augusto Righi”, University of Bologna, 40127 Bologna, Italy
- Paola Fantazzini
- Department of Physics and Astronomy “Augusto Righi”, University of Bologna, 40127 Bologna, Italy
- Claudia Testa
- Department of Physics and Astronomy “Augusto Righi”, University of Bologna, 40127 Bologna, Italy
- IRCCS Istituto delle Scienze Neurologiche Bologna, Functional and Molecular Neuroimaging Unit, 40139 Bologna, Italy
- Villiam Bortolotti
- Department of Civil, Chemical, Environmental, and Materials Engineering, University of Bologna, 40134 Bologna, Italy
- Fabio Baruffaldi
- Medical Technology Laboratory, IRCCS Istituto Ortopedico Rizzoli, 40136 Bologna, Italy
- Feliks Kogan
- Department of Radiology, Stanford University, Stanford, CA 94395, USA
- Leonardo Brizi
- Department of Physics and Astronomy “Augusto Righi”, University of Bologna, 40127 Bologna, Italy

15
Neves CA, Tran ED, Blevins NH, Hwang PH. Deep learning automated segmentation of middle skull-base structures for enhanced navigation. Int Forum Allergy Rhinol 2021; 11:1694-1697. [PMID: 34185969] [DOI: 10.1002/alr.22856]
Affiliation(s)
- Caio A Neves
- Faculty of Medicine, University of Brasilia, Brasília, Brazil
- Emma D Tran
- Department of Otolaryngology-Head & Neck Surgery, Stanford University School of Medicine, Stanford, California, USA
- Nikolas H Blevins
- Department of Otolaryngology-Head & Neck Surgery, Stanford University School of Medicine, Stanford, California, USA
- Peter H Hwang
- Department of Otolaryngology-Head & Neck Surgery, Stanford University School of Medicine, Stanford, California, USA

16
Rai AT, Deib G, Smith D, Boo S. Teleproctoring for Neurovascular Procedures: Demonstration of Concept Using Optical See-Through Head-Mounted Display, Interactive Mixed Reality, and Virtual Space Sharing-A Critical Need Highlighted by the COVID-19 Pandemic. AJNR Am J Neuroradiol 2021; 42:1109-1115. [PMID: 33707282] [PMCID: PMC8191671] [DOI: 10.3174/ajnr.a7066]
Abstract
BACKGROUND AND PURPOSE Physician training and onsite proctoring are critical for safely introducing new biomedical devices, a process that has been disrupted by the pandemic. A teleproctoring concept using optical see-through head-mounted displays is presented, with a proctor's ability to see and, more important, virtually interact in the operator's visual field. MATERIALS AND METHODS Test conditions were created for simulated proctoring using a bifurcation aneurysm flow model for WEB device deployment. The operator in the angiography suite wore a Magic Leap-1 optical see-through head-mounted display to livestream their FOV to a proctor's computer in an adjacent building. A Web-based application (Spatial) was used for the proctor to virtually interact in the operator's visual space. Tested elements included the quality of the livestream, communication, and the proctor's ability to interact in the operator's environment using mixed reality. A hotspot and a Wi-Fi-based network were tested. RESULTS The operator successfully livestreamed the angiography room environment and their FOV of the monitor to the remotely located proctor. The proctor communicated and guided the operator through the procedure over the optical see-through head-mounted display, a process that was repeated several times. The proctor used mixed reality and virtual space sharing to successfully project images, annotations, and data in the operator's FOV for highlighting any device or procedural aspects. The livestream latency was 0.71 (SD, 0.03) seconds for Wi-Fi and 0.86 (SD, 0.3) seconds for the hotspot (P = .02). The livestream quality was subjectively better over the Wi-Fi. CONCLUSIONS New technologies using head-mounted displays and virtual space sharing could offer solutions applicable to remote proctoring in the neurointerventional space.
Affiliation(s)
- A T Rai
- Department of Interventional Neuroradiology, Rockefeller Neuroscience Institute, West Virginia University School of Medicine, Morgantown, West Virginia
- G Deib
- Department of Interventional Neuroradiology, Rockefeller Neuroscience Institute, West Virginia University School of Medicine, Morgantown, West Virginia
- D Smith
- West Virginia University Reed College of Media, Morgantown, West Virginia
- S Boo
- Department of Interventional Neuroradiology, Rockefeller Neuroscience Institute, West Virginia University School of Medicine, Morgantown, West Virginia

17
Glas HH, Kraeima J, van Ooijen PMA, Spijkervet FKL, Yu L, Witjes MJH. Augmented Reality Visualization for Image-Guided Surgery: A Validation Study Using a Three-Dimensional Printed Phantom. J Oral Maxillofac Surg 2021; 79:1943.e1-1943.e10. [PMID: 34033801] [DOI: 10.1016/j.joms.2021.04.001]
Abstract
BACKGROUND Oral and maxillofacial surgery currently relies on virtual surgery planning based on image data (CT, MRI). Three-dimensional (3D) visualizations are typically used to plan and predict the outcome of complex surgical procedures. To translate the virtual surgical plan to the operating room, it is either converted into physical 3D-printed guides or directly translated using real-time navigation systems. PURPOSE This study aims to improve the translation of the virtual surgery plan to a surgical procedure, such as oncologic or trauma surgery, in terms of accuracy and speed. Here we report an augmented reality visualization technique for image-guided surgery. It describes how surgeons can visualize and interact with the virtual surgery plan and navigation data while in the operating room. User friendliness and usability were assessed in a formal user study that compared our augmented reality assisted technique to the gold standard setup of a perioperative navigation system (Brainlab). Moreover, the accuracy of typical navigation tasks, such as reaching landmarks and following trajectories, is compared. RESULTS Overall, navigation tasks were completed 1.71 times faster using augmented reality (P = .034). Accuracy improved significantly using augmented reality (P < .001); for reaching physical landmarks the effect was weaker (P = .087). Although the participants were relatively unfamiliar with VR/AR (rated 2.25/5) and gesture-based interaction (rated 2/5), they reported that navigation tasks become easier to perform using augmented reality (difficulty Brainlab rated 3.25/5, HoloLens 2.4/5). CONCLUSION The proposed workflow can be used in a wide range of image-guided surgery procedures as an addition to existing verified image guidance systems. Results of this user study imply that our technique enables typical navigation tasks to be performed faster and more accurately compared to the current gold standard.
In addition, qualitative feedback on our augmented reality assisted technique was more positive compared to the standard setup.
Affiliation(s)
- H H Glas
- Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- J Kraeima
- Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- P M A van Ooijen
- Associate Professor, Faculty of Medical Sciences, Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- F K L Spijkervet
- Professor, Oral and Maxillofacial Surgeon, Head of the Department, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- L Yu
- Lecturer, Department of Computer Science and Software Engineering (CSSE), Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- M J H Witjes
- Oral and Maxillofacial Surgeon, Principal Investigator, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands

18
Fully automated preoperative segmentation of temporal bone structures from clinical CT scans. Sci Rep 2021; 11:116. [PMID: 33420386] [PMCID: PMC7794235] [DOI: 10.1038/s41598-020-80619-0]
Abstract
Middle- and inner-ear surgery is a vital treatment option in hearing loss, infections, and tumors of the lateral skull base. Segmentation of otologic structures from computed tomography (CT) has many potential applications for improving surgical planning but can be an arduous and time-consuming task. We propose an end-to-end solution for the automated segmentation of temporal bone CT using convolutional neural networks (CNN). Using 150 manually segmented CT scans, a comparison of 3 CNN models (AH-Net, U-Net, ResNet) was conducted to compare Dice coefficient, Hausdorff distance, and speed of segmentation of the inner ear, ossicles, facial nerve and sigmoid sinus. Using AH-Net, the Dice coefficient was 0.91 for the inner ear; 0.85 for the ossicles; 0.75 for the facial nerve; and 0.86 for the sigmoid sinus. The average Hausdorff distance was 0.25, 0.21, 0.24 and 0.45 mm, respectively. Blinded experts assessed the accuracy of both techniques, and there was no statistical difference between the ratings for the two methods (p = 0.93). Objective and subjective assessment confirm good correlation between automated segmentation of otologic structures and manual segmentation performed by a specialist. This end-to-end automated segmentation pipeline can help to advance the systematic application of augmented reality, simulation, and automation in otologic procedures.
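The Dice coefficient used above to score segmentation overlap is straightforward to compute from binary masks. A minimal sketch with toy masks (not the study's data):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity between two binary masks (1 = structure, 0 = background)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# toy 1-D example: 3 overlapping voxels out of 4 predicted and 4 true
pred  = [1, 1, 1, 1, 0, 0]
truth = [0, 1, 1, 1, 1, 0]
print(dice_coefficient(pred, truth))  # 2*3 / (4+4) = 0.75
```

In practice the masks are 3-D CT label volumes, but the formula is unchanged; the Hausdorff distance reported alongside it is a separate boundary-distance metric.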
19
Amine A, Habashy KJ, Najem E, Abbas R, Moussalem C, Bsat S, Hourany R, Darwish H. Frontal Sinus Morphometry in Relation to Surgically Relevant Landmarks in the Middle East Population: Can We Globalize? World Neurosurg 2020; 148:e87-e93. [PMID: 33309894] [DOI: 10.1016/j.wneu.2020.12.018]
Abstract
BACKGROUND The frontal bone is frequently approached during neurosurgical procedures. Feared complications of such surgeries, including cerebrospinal fluid leak, frequently result from a breach of the frontal sinus. For this reason, the sinus should be avoided when possible. The supraorbital notch (SON) is a reliable and easily identifiable surgical landmark, and its relation to the frontal sinus has been previously studied. However, the frontal sinus shows significant variability in size and shape between populations. METHODS In the present study, we investigate the frontal sinus dimensions and their relation to the SON in the Middle Eastern population. RESULTS The analysis of a set of computed tomography scans reveals a significant variation in size between genders, and we provide neurosurgeons in the region with population-targeted, gender-specific risk maps. CONCLUSIONS We conclude that a 2-cm margin rostral and lateral to the SON is safest.
Affiliation(s)
- Ali Amine
- Department of Neurosurgery, American University of Beirut Medical Center, Beirut, Lebanon
- Karl John Habashy
- Faculty of Medicine, American University of Beirut Medical Center, Beirut, Lebanon
- Elie Najem
- Department of Diagnostic Radiology, American University of Beirut Medical Center, Beirut, Lebanon
- Rawad Abbas
- Faculty of Medicine, American University of Beirut Medical Center, Beirut, Lebanon
- Charbel Moussalem
- Department of Neurosurgery, American University of Beirut Medical Center, Beirut, Lebanon
- Shadi Bsat
- Department of Neurosurgery, American University of Beirut Medical Center, Beirut, Lebanon
- Roula Hourany
- Department of Diagnostic Radiology, American University of Beirut Medical Center, Beirut, Lebanon
- Houssein Darwish
- Department of Neurosurgery, American University of Beirut Medical Center, Beirut, Lebanon