1. Haworth J, Sahu M, Zhu K, Hammond J, Ishida H, Munawar A, Yang R, Taylor R. A cooperatively controlled robotic system with active constraints for enhancing efficacy in bilateral sagittal split osteotomy. Int J Comput Assist Radiol Surg 2025. PMID: 40397231. DOI: 10.1007/s11548-025-03403-3.
Abstract
PURPOSE Precise osteotomies are vital in maxillofacial procedures such as the bilateral sagittal split osteotomy (BSSO), where surgical accuracy and precision directly impact patient outcomes. Conventional freehand drilling can lead to unfavorable splits, negatively impacting surgical outcome. METHODS This paper presents the development of a cooperatively controlled robotic system designed to enhance the efficacy of osteotomies during BSSO. The system features two assistive modes for executing a patient-specific surgical plan: (1) a haptic guidance mode that helps the surgeon align the surgical drill with the planned cutting plane to improve the accuracy of the cut, and (2) an active constraint mode that restricts deviations from the cutting plane to enhance precision during drilling. We validated the system through feasibility experiments involving 36 mandible phantoms and a cadaveric specimen, with a surgeon, a surgical resident, and a medical student performing osteotomies freehand and with robotic assistance. Additionally, NASA-TLX surveys were conducted to assess the perceived ease of use of the robotic system. RESULTS Compared to freehand drilling, the robotic system reduced the mean cutting deviation from 2.16 ± 0.98 mm to 0.71 ± 0.53 mm for the medical student, from 1.74 ± 0.95 mm to 0.53 ± 0.35 mm for the resident, and from 1.64 ± 0.85 mm to 0.63 ± 0.24 mm for the surgeon, while also reducing task load. CONCLUSION Our experimental results demonstrate that the proposed robotic system can enhance the precision of surgical drilling in the BSSO compared to a freehand approach. These findings indicate the potential of robotic systems to reduce errors and enhance patient outcomes in maxillofacial surgery.
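The millimetre figures above are deviations of the executed cut from the planned cutting plane. As an illustration only (not the authors' implementation), a point-to-plane deviation of this kind can be computed from sampled cut-surface points with a few lines of NumPy; the variable names and data below are made up.

```python
import numpy as np

def cut_plane_deviation(cut_points, plane_point, plane_normal):
    """Mean and std of absolute point-to-plane distances (mm) between sampled
    points on the executed cut surface and the planned cutting plane."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)                                   # ensure unit normal
    d = (np.asarray(cut_points, dtype=float) - plane_point) @ n
    return np.abs(d).mean(), np.abs(d).std()

# Illustrative use with synthetic data: points digitized on a cut surface that
# scatters about 0.6 mm around the planned plane z = 0.
rng = np.random.default_rng(0)
plane_point = np.array([0.0, 0.0, 0.0])
plane_normal = np.array([0.0, 0.0, 1.0])
in_plane = rng.uniform(-10, 10, size=(200, 3)) * np.array([1.0, 1.0, 0.0])
scatter = rng.normal(0.0, 0.6, size=(200, 3)) * np.array([0.0, 0.0, 1.0])
cut_points = in_plane + scatter
mean_dev, std_dev = cut_plane_deviation(cut_points, plane_point, plane_normal)
print(f"cut deviation: {mean_dev:.2f} ± {std_dev:.2f} mm")
```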
Affiliation(s)
- Jesse Haworth: Department of Mechanical Engineering, Johns Hopkins University, Baltimore, USA
- Manish Sahu: Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, USA
- Katherine Zhu: Department of Plastic and Reconstructive Surgery, Johns Hopkins Medicine, Baltimore, USA
- Jacob Hammond: Department of Mechanical Engineering, Johns Hopkins University, Baltimore, USA
- Hisashi Ishida: Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, USA
- Adnan Munawar: Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, USA
- Robin Yang: Department of Plastic and Reconstructive Surgery, Johns Hopkins Medicine, Baltimore, USA
- Russell Taylor: Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, USA
2. Strong EB, Patel A, Marston AP, Sadegh C, Potts J, Johnston D, Ahn D, Bryant S, Li M, Raslan O, Lucero SA, Fischer MJ, Zwienenberg M, Sharma N, Thieringer F, El Amm C, Shahlaie K, Metzger M, Strong EB. Augmented Reality Navigation in Craniomaxillofacial/Head and Neck Surgery. OTO Open 2025; 9:e70108. PMID: 40224293. PMCID: PMC11986686. DOI: 10.1002/oto2.70108.
Abstract
Objective This study aims to (1) develop an augmented reality (AR) navigation platform for craniomaxillofacial (CMF) and head and neck surgery; (2) apply it to a range of surgical cases; and (3) evaluate the advantages, disadvantages, and clinical opportunities for AR navigation. Study Design A multi-center retrospective case series. Setting Four tertiary care academic centers. Methods A novel AR navigation platform was collaboratively developed with Xironetic and deployed intraoperatively using only a head-mounted display (Microsoft HoloLens 2). Virtual surgical plans were generated from computed tomography/magnetic resonance imaging data and uploaded onto the AR platform. A reference array was mounted to the patient, and the virtual plan was registered to the patient intraoperatively. A retrospective review of all AR-navigated CMF cases since September 2023 was performed. Results Thirty-three cases were reviewed and classified as trauma, orthognathic, tumor, or craniofacial. The AR platform had several advantages over traditional navigation, including real-time 3D visualization of the surgical plan, identification of critical structures, and real-time tracking. Furthermore, this case series presents the first known examples of (1) AR instrument tracking for midface osteotomies, (2) AR tracking of the zygomaticomaxillary complex during fracture reduction, (3) mandibular tracking in orthognathic surgery, (4) AR fibula cutting guides for mandibular reconstruction, and (5) integration of real-time infrared visualization in an AR headset for vasculature identification. Conclusion While still a developing technology, AR navigation provides several advantages over traditional navigation for CMF and head and neck surgery, including heads-up, interactive 3D visualization of the surgical plan, identification of critical anatomy, and real-time tracking.
Affiliation(s)
- E. Brandon Strong: Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Anuj Patel: Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Alexander P. Marston: Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Cameron Sadegh: Department of Neurological Surgery, University of California, Davis, Davis, California, USA
- Jeffrey Potts: Department of Plastic and Reconstructive Surgery, University of Oklahoma, Oklahoma City, Oklahoma, USA
- Darin Johnston: Department of Oral and Maxillofacial Surgery, David Grant Medical Center, Fairfield, California, USA
- David Ahn: Department of Oral and Maxillofacial Surgery, David Grant Medical Center, Fairfield, California, USA
- Shae Bryant: Department of Oral and Maxillofacial Surgery, David Grant Medical Center, Fairfield, California, USA
- Michael Li: Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
- Osama Raslan: Department of Radiology, University of California, Davis, Davis, California, USA
- Steven A. Lucero: Department of Biomedical Engineering, University of California, Davis, Davis, California, USA
- Marc J. Fischer: Department of Computer Science, Technical University of Munich, Munich, Germany
- Marike Zwienenberg: Department of Neurological Surgery, University of California, Davis, Davis, California, USA
- Neha Sharma: Clinic of Oral and Craniomaxillofacial Surgery, University Hospital Basel, Basel, Switzerland; Medical Additive Manufacturing (Swiss MAM) Research Group, Department of Biomedical Engineering, University of Basel, Basel, Switzerland
- Florian Thieringer: Clinic of Oral and Craniomaxillofacial Surgery, University Hospital Basel, Basel, Switzerland; Medical Additive Manufacturing (Swiss MAM) Research Group, Department of Biomedical Engineering, University of Basel, Basel, Switzerland
- Christian El Amm: Department of Plastic and Reconstructive Surgery, University of Oklahoma, Oklahoma City, Oklahoma, USA
- Kiarash Shahlaie: Department of Neurological Surgery, University of California, Davis, Davis, California, USA
- Marc Metzger: Department of Oral and Maxillofacial Surgery, University Hospital Freiburg, Freiburg, Germany
- E. Bradley Strong: Department of Otolaryngology–Head and Neck Surgery, University of California, Davis, Davis, California, USA
3. Grzybowski G, Stewart MM, Milner TD, Dinur AB, McGee OM, Pakdel A, Tran KL, Fels SS, Hodgson AJ, Prisman E. Intraoperative Real-Time Image-Guided Fibular Harvest and Mandibular Reconstruction: A Feasibility Study on Cadaveric Specimens. Head Neck 2025; 47:640-650. PMID: 39367586. PMCID: PMC11717937. DOI: 10.1002/hed.27954.
Abstract
BACKGROUND This study assesses the feasibility of real-time surgical navigation to plan and guide sequential steps of mandible reconstruction in a series of cadaveric specimens. METHODS An image-guided surgical (IGS) system was designed, including customized mandible and fibula fixation devices with navigation reference frames and accompanying image-guided software. Mandibular and fibular segmental osteotomies were performed using the IGS in all five cadaveric specimens. Procedural time and cephalometric measurements were recorded. RESULTS Five real-time IGS mandibulectomies and fibular reconstructions were successfully performed. The mean Dice score and Hausdorff-95 distance between the planned and actual mandible reconstructions were 0.8 ± 0.08 and 7.29 ± 4.81 mm, respectively. Intercoronoid width, interangle width, and mandible projection differences were 1.15 ± 1.17 mm, 0.9 ± 0.56 mm, and 1.47 ± 1.62 mm, respectively. CONCLUSION This study presents the first demonstration of a comprehensive image-guided workflow for mandibulectomy and fibular flap reconstruction on cadaveric specimens, achieving adequate cephalometric accuracy.
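For readers unfamiliar with the two shape metrics quoted above, the sketch below shows a generic way to compute a Dice score and a 95th-percentile Hausdorff distance from binary voxel masks with NumPy/SciPy; it assumes isotropic voxel spacing and is not the authors' evaluation pipeline.

```python
import numpy as np
from scipy.ndimage import binary_erosion
from scipy.spatial import cKDTree

def dice(a, b):
    """Dice overlap of two boolean voxel masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hd95(a, b, spacing_mm=1.0):
    """Symmetric 95th-percentile Hausdorff distance between mask surfaces (mm)."""
    surface = lambda m: np.argwhere(m & ~binary_erosion(m)) * spacing_mm
    pa, pb = surface(a.astype(bool)), surface(b.astype(bool))
    d_ab = cKDTree(pb).query(pa)[0]   # each planned surface voxel -> nearest actual
    d_ba = cKDTree(pa).query(pb)[0]   # and vice versa
    return np.percentile(np.concatenate([d_ab, d_ba]), 95)

# Toy example: a planned sphere vs. a slightly shifted actual sphere
x, y, z = np.ogrid[:64, :64, :64]
planned = (x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2
actual = (x - 34) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2
print(f"Dice = {dice(planned, actual):.2f}, HD95 = {hd95(planned, actual):.1f} mm")
```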
Affiliation(s)
- Georgia Grzybowski: Department of Mechanical Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, British Columbia, Canada
- Molly Murray Stewart: Department of Mechanical Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, British Columbia, Canada
- Thomas D. Milner: Division of Otolaryngology, Department of Surgery, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Anat Bahat Dinur: Division of Otolaryngology, Department of Surgery, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Orla M. McGee: Department of Mechanical Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, British Columbia, Canada
- Amir Pakdel: Division of Otolaryngology, Department of Surgery, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Khanh Linh Tran: Division of Otolaryngology, Department of Surgery, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Sidney S. Fels: Department of Electrical and Computer Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, British Columbia, Canada
- Antony J. Hodgson: Department of Mechanical Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, British Columbia, Canada
- Eitan Prisman: Division of Otolaryngology, Department of Surgery, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
4. Ye J, Chen Q, Zhong T, Liu J, Gao H. Is Overlain Display a Right Choice for AR Navigation? A Qualitative Study of Head-Mounted Augmented Reality Surgical Navigation on Accuracy for Large-Scale Clinical Deployment. CNS Neurosci Ther 2025; 31:e70217. PMID: 39817491. PMCID: PMC11736426. DOI: 10.1111/cns.70217.
Abstract
BACKGROUND Over the past two decades, head-mounted augmented reality surgical navigation (HMARSN) systems have been increasingly employed in a variety of surgical specialties, driven by advances in augmented reality-related technologies and by surgeons' desire to overcome drawbacks inherent to conventional surgical navigation systems. Most experimental HMARSN systems currently adopt an overlain display (OD) that overlays virtual models and planned routes of surgical tools on the corresponding physical tissues, organs, and lesions in the surgical field, giving surgeons an intuitive, direct view that supports better hand-eye coordination and avoids attention shift and loss of sight (LOS), among other benefits. Yet system accuracy, the most crucial performance indicator of any surgical navigation system, is difficult to ascertain for such systems because it is highly subjective and user-dependent. Therefore, the aim of this study was to review presently available experimental OD HMARSN systems qualitatively, explore how their system accuracy is affected by the overlain display, and determine whether such systems are suited to large-scale clinical deployment. METHOD We searched PubMed and ScienceDirect for the term "head mounted augmented reality surgical navigation," which returned 445 records in total. After screening and eligibility assessment, 60 papers were analyzed. Specifically, we focused on how their accuracies were defined and measured, and on whether those accuracies are stable in clinical practice and competitive with corresponding commercially available systems. RESULTS AND CONCLUSIONS The primary finding is that the system accuracy of OD HMARSN systems is seriously affected by the transformation between the space of the user's eyes and the surgical field, because measurement of this transformation is heavily individualized and user-dependent. Additionally, the transformation itself is potentially subject to change during surgical procedures, and hence unstable. Therefore, OD HMARSN systems are not suitable for large-scale clinical deployment.
Affiliation(s)
- Jian Ye: Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Qingwen Chen: Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Tao Zhong: Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Jian Liu: Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Han Gao: Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
5. Serrano CM, Atenas MJ, Rodriguez PJ, Vervoorn JM. From Virtual Reality to Reality: Fine-Tuning the Taxonomy for Extended Reality Simulation in Dental Education. Eur J Dent Educ 2024. PMID: 39698875. DOI: 10.1111/eje.13064.
Abstract
INTRODUCTION Digital simulation in dental education has evolved substantially, addressing several educational challenges in dentistry. Following global lockdowns and sustainability concerns, dental educators are increasingly adopting digital simulation to enhance or replace traditional training methods. This review aimed to contribute to a uniform taxonomy for extended reality (XR) simulation within dental education. METHODS This scoping review followed the PRISMA and PRISMA-ScR guidelines. PubMed/MEDLINE, EMBASE, Web of Science, and Google Scholar were searched. Eligible studies included English-language publications in indexed journals related to digital simulation in dental/maxillofacial education that provided theoretical descriptions of extended reality (XR) and/or immersive training tools (ITT). The outcomes of the scoping review were used as building blocks for a uniform XR-simulation taxonomy. RESULTS A total of 141 articles from 2004 to 2024 were selected and categorised into Virtual Reality (VR), Mixed Reality (MR), Augmented Reality (AR), Augmented Virtuality (AV), and Computer Simulation (CS). Stereoscopic vision, immersion, interaction, modification, and haptic feedback were identified as recurring features across XR simulation in dentistry. These features formed the basis for a general XR-simulation taxonomy. DISCUSSION While XR-simulation features were consistent in the literature, the variety of definitions and classifications complicated the development of a taxonomy framework; VR was frequently used as an umbrella term. To address this, operational definitions were proposed for each category within the virtuality continuum, clarifying distinctions and commonalities. CONCLUSION This scoping review highlights the need for a uniform taxonomy for XR simulation within dental education. Establishing a consensus on XR-related terminology and definitions facilitates future research, allowing clear evidence reporting and analysis. The proposed taxonomy may also be of use in medical education, promoting alignment and the creation of a comprehensive body of evidence on XR technologies.
Affiliation(s)
- Carlos M Serrano, María J Atenas, Patricio J Rodriguez, Johanna M Vervoorn: Digital Dentistry, Academic Centre for Dentistry Amsterdam (ACTA), Amsterdam, The Netherlands
6. Dastan M, Fiorentino M, Walter ED, Diegritz C, Uva AE, Eck U, Navab N. Co-Designing Dynamic Mixed Reality Drill Positioning Widgets: A Collaborative Approach with Dentists in a Realistic Setup. IEEE Trans Vis Comput Graph 2024; 30:7053-7063. PMID: 39250405. DOI: 10.1109/tvcg.2024.3456149.
Abstract
Mixed Reality (MR) has been shown in the literature to support precise spatial positioning of the dental drill by superimposing 3D widgets. Despite this, knowledge about widget visual design and interactive user feedback is still limited. This study therefore co-designed MR drill-positioning widgets with two expert dentists and three MR experts. The co-design produced two static widgets (SWs), a simple entry point and a target axis, and two dynamic widgets (DWs), variants of dynamic error visualization with and without a target axis (DWTA and DWEP). We evaluated the co-designed widgets in a virtual reality simulation supported by a realistic setup with a tracked phantom patient, a virtual magnifying loupe, and a dentist's foot pedal. The user study involved 35 dentists with various backgrounds and years of experience. The findings were significant: DWs outperformed SWs in positional and rotational precision, especially among younger participants and those with gaming experience. User preference also favored DWs (19) over SWs (16). However, the findings indicated that precision correlates positively with task time, a time trade-off. The post-experience questionnaire (NASA-TLX) showed that DWs increase mental and physical demand, effort, and frustration more than SWs. Comparisons between DWEP and DWTA show that the complexity level of a DW influences time as well as physical and mental demands. The DWs are extensible to diverse medical and industrial scenarios that demand precision.
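NASA-TLX results like those quoted above are typically obtained with the standard weighted scoring procedure: six subscale ratings (0–100) combined with weights from 15 pairwise comparisons. The sketch below illustrates that generic scoring with made-up ratings; whether this particular study used the weighted or the raw variant is an assumption.

```python
# NASA-TLX weighted workload: sum(rating * weight) / 15, where each weight is the
# number of times that dimension was chosen in the 15 pairwise comparisons (0..5).
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_weighted(ratings, weights):
    assert sum(weights.values()) == 15, "weights must come from 15 pairwise comparisons"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical single-participant data for one widget condition
ratings = {"mental": 55, "physical": 40, "temporal": 35,
           "performance": 25, "effort": 50, "frustration": 30}
weights = {"mental": 4, "physical": 3, "temporal": 2,
           "performance": 1, "effort": 3, "frustration": 2}
print(f"weighted TLX = {tlx_weighted(ratings, weights):.1f} / 100")
```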
7. Liu Z, Zhong Y, Lyu X, Zhang J, Huang M, Liu S, Zheng L. Accuracy of the modified tooth-supported 3D printing surgical guides based on CT, CBCT, and intraoral scanning in maxillofacial region: A comparison study. J Stomatol Oral Maxillofac Surg 2024; 125:101853. PMID: 38555078. DOI: 10.1016/j.jormas.2024.101853.
Abstract
BACKGROUND Tooth-supported surgical guides have demonstrated superior accuracy compared with bone-supported guides. This study aimed to modify the fabrication of tooth-supported guides for compatibility with tumor resection procedures and to investigate their accuracy. METHODS Patients with tumors who underwent osteotomy with the assistance of modified tooth-supported or bone-supported surgical guides were included. Virtual surgical planning (VSP) was employed to align three-dimensional (3D) models extracted from intraoperative computed tomography (CT) images. The distance and angular deviations between the actual osteotomy plane and the preoperatively planned plane were recorded. A comparative analysis of osteotomy discrepancies between tooth-supported and bone-supported guides, as well as among tooth-supported guides based on CT, cone-beam CT (CBCT), or intraoral scanning (IOS), was conducted, and the factors influencing the precision of the guides were analyzed. RESULTS Sixty patients with 81 resection planes were included. In the tooth-supported group, the mean deviations in osteotomy plane and angle were 1.39 mm and 4.30°, respectively, whereas those of the bone-supported group were 2.16 mm and 4.95°. Among the tooth-supported isotype guide groups, the mean deviations of the osteotomy plane were 1.39 mm, 1.47 mm, and 1.23 mm for CT, CBCT, and IOS, respectively. The accuracy of the modified tooth-supported guides remained consistent regardless of the number and position of the teeth supporting the guide and the location of the osteotomy lines. CONCLUSIONS The findings indicate that the modified tooth-supported surgical guides demonstrated high accuracy in the maxillofacial region, contributing to a reduction in the amount of surgically detached soft tissue.
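The plane and angular deviations reported here amount to fitting planes to the planned and executed osteotomy surfaces and comparing them. The sketch below illustrates one generic way to do this (least-squares plane fit via SVD, on synthetic points); it is not the authors' measurement software.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points: returns (centroid, unit normal)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[-1]                       # singular vector of smallest value = normal

def osteotomy_deviation(planned_pts, actual_pts):
    c_p, n_p = fit_plane(planned_pts)
    _, n_a = fit_plane(actual_pts)
    # distance deviation: mean absolute distance of actual points to the planned plane
    dist = np.abs((actual_pts - c_p) @ n_p).mean()
    # angular deviation between the two plane normals (sign-invariant)
    ang = np.degrees(np.arccos(np.clip(abs(n_p @ n_a), -1.0, 1.0)))
    return dist, ang

# Synthetic example: a planned cut in the z = 0 plane, executed with a 4° tilt
# about the x-axis and a 1.2 mm offset.
rng = np.random.default_rng(1)
planned = np.c_[rng.uniform(-15, 15, (300, 2)), np.zeros(300)]
tilt = np.deg2rad(4.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(tilt), -np.sin(tilt)],
              [0.0, np.sin(tilt),  np.cos(tilt)]])
actual = planned @ R.T + np.array([0.0, 0.0, 1.2])
d, a = osteotomy_deviation(planned, actual)
print(f"plane deviation ≈ {d:.2f} mm, angular deviation ≈ {a:.1f}°")
```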
Affiliation(s)
- Zezhao Liu, Yiwei Zhong, Xiaoming Lyu, Jie Zhang, Mingwei Huang, Shuming Liu, Lei Zheng: Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, National Center for Stomatology & National Clinical Research Center for Oral Diseases, Beijing & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing 100081, China
8. Al Hamad KQ, Said KN, Engelschalk M, Matoug-Elwerfelli M, Gupta N, Eric J, Ali SA, Ali K, Daas H, Abu Alhaija ES. Taxonomic discordance of immersive realities in dentistry: A systematic scoping review. J Dent 2024; 146:105058. PMID: 38729286. DOI: 10.1016/j.jdent.2024.105058.
Abstract
OBJECTIVES This review aimed to map taxonomy frameworks, descriptions, and applications of immersive technologies in the dental literature. DATA The Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for scoping reviews (PRISMA-ScR) guidelines were followed, and the protocol was registered on the Open Science Framework (https://doi.org/10.17605/OSF.IO/H6N8M). SOURCES A systematic search was conducted in MEDLINE (via PubMed), Scopus, and Cochrane Library databases, complemented by a manual search. STUDY SELECTION A total of 84 articles were included, 81% of them published between 2019 and 2023. Most studies were experimental (62%), including education (25%), protocol feasibility (20%), in vitro (11%), and cadaver (6%) studies. Other study types included clinical reports/technique articles (24%), clinical studies (9%), technical notes/tips to the reader (4%), and randomized controlled trials (1%). Three-quarters of the included studies were published in the oral and maxillofacial surgery (38%), dental education (26%), and implant (12%) disciplines. Methods of display included head-mounted display devices (HMD) (55%), see-through screens (32%), 2D screen displays (11%), and projector displays (2%). Descriptions of immersive realities were fragmented and inconsistent, with no clear taxonomy framework for the umbrella and subset terms, including virtual reality (VR), augmented reality (AR), mixed reality (MR), augmented virtuality (AV), extended reality, and X reality. CONCLUSIONS Immersive reality applications in dentistry are gaining popularity, with a notable surge in the number of publications in the last five years. Ambiguities are apparent in the descriptions of immersive realities. A taxonomy framework based on method of display (full or partial) and reality class (VR, AR, or MR) is proposed. CLINICAL SIGNIFICANCE Understanding the different reality classes can be perplexing because of their blurred boundaries and conceptual overlap. Immersive technologies offer novel educational and clinical applications, and the domain is developing rapidly. Given the current fragmented and inconsistent terminology, a comprehensive taxonomy framework is necessary.
Affiliation(s)
- Khaled Q Al Hamad: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Khalid N Said: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Marcus Engelschalk: Department of Oral and Maxillofacial Surgery, University Medical Center Hamburg-Eppendorf, Germany
- Nidhi Gupta: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Jelena Eric: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Shaymaa A Ali: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Kamran Ali: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Hanin Daas: College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
9. Tang ZN, Hu LH, Yu Y, Zhang WB, Peng X. Mixed Reality Combined with Surgical Navigation in Resection of Micro- and Mini-Tumors of the Parotid Gland: A Pilot Study. Laryngoscope 2024; 134:1670-1678. PMID: 37819631. DOI: 10.1002/lary.31104.
Abstract
OBJECTIVE This study aimed to evaluate the feasibility and outcomes of mixed reality combined with surgical navigation (MRSN) in the resection of parotid micro- and mini-tumors. METHODS Eighteen patients who underwent parotid tumor resection between December 2020 and November 2022 were included. Six patients were enrolled in the MRSN group, in which the surgeons performed the surgery with the help of MRSN technology. The surgical procedure included virtual planning, data transfer between mixed reality and surgical navigation, and tumor localization and resection assisted by surgical navigation in a mixed reality environment. Twelve patients were enrolled in the control group, in which intraoperative tumor localization and resection were performed according to the surgeon's experience. Total surgery time and intraoperative bleeding were recorded, and perioperative complications were recorded during follow-up. RESULTS The mean surgery time of the MRSN group (76.7 ± 14.0 min) and the control group (65.4 ± 21.3 min) showed no significant difference (p = 0.220), nor did intraoperative bleeding (16.0 ± 8.0 mL vs. 16.7 ± 6.6 mL, p = 0.825). No patient in the MRSN group had any complication, whereas one patient in the control group suffered temporary facial paralysis. The mean deviation between the virtually marked and the intraoperatively identified outermost point of the tumor was 3.03 ± 0.83 mm. CONCLUSION MRSN technology can provide real-time three-dimensional visualization of the tumor and has the potential to enhance the safety and accuracy of resection of micro- and mini-tumors of the parotid gland. LEVEL OF EVIDENCE 4 Laryngoscope, 134:1670-1678, 2024.
Affiliation(s)
- Zu-Nan Tang, Lei-Hao Hu, Yao Yu, Wen-Bo Zhang, Xin Peng: Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China
10. Górecka Ż, Idaszek J, Heljak M, Martinez DC, Choińska E, Kulas Z, Święszkowski W. Indocyanine green and iohexol loaded hydroxyapatite in poly(L-lactide-co-caprolactone)-based composite for bimodal near-infrared fluorescence- and X-ray-based imaging. J Biomed Mater Res B Appl Biomater 2024; 112:e35313. PMID: 37596854. DOI: 10.1002/jbm.b.35313.
Abstract
This study aimed to develop a material for multimodal X-ray and near-infrared imaging containing the FDA- and EMA-approved contrast agents iohexol and indocyanine green (ICG). These contrast agents (CAs) are hydrophilic and amphiphilic, respectively, which complicates the fabrication of functional polymeric composites for fiducial markers (FMs) that incorporate them. Therefore, this study explored, for the first time, enhancing the radiopacity and introducing NIR fluorescence of FMs by adsorbing the CAs onto hydroxyapatite (HAp) nanoparticles. The particles were embedded in a poly(L-lactide-co-caprolactone) (P[LAcoCL]) matrix, resulting in a composite material for bimodal near-infrared fluorescence- and X-ray-based imaging. The applied method of material preparation provided a homogeneous distribution of both CAs, high iohexol loading efficiency, and an improved fluorescence signal due to hindered ICG aggregation. The material showed strong contrast properties for both imaging modalities. Its stability was evaluated in vitro in phosphate-buffered saline (PBS) and foetal bovine serum (FBS) solutions. The addition of HAp nanoparticles had a significant effect on the fluorescence signal. The X-ray radiopacity was stable for at least 11 weeks, even though the addition of ICG contributed to a faster release of iohexol. The stiffness of the material was not affected by iohexol or ICG, but incorporation of HAp nanoparticles raised the bending modulus by approximately 70%. Moreover, a cell study revealed that none of the tested materials was cytotoxic. Thus, the developed material can be used successfully for the fabrication of FMs.
Affiliation(s)
- Żaneta Górecka: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, Warsaw, Poland; Centre for Advanced Materials and Technologies CEZAMAT, Warsaw University of Technology, Warsaw, Poland
- Joanna Idaszek: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, Warsaw, Poland
- Marcin Heljak: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, Warsaw, Poland
- Diana C Martinez: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, Warsaw, Poland
- Emilia Choińska: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, Warsaw, Poland
- Zbigniew Kulas: Faculty of Mechanical Engineering, Wroclaw University of Science and Technology, Wroclaw, Poland
- Wojciech Święszkowski: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, Warsaw, Poland
11. Yang B, Yang L, Huang WL, Zhou QZ, He J, Zhao X. Application experience and research progress of different emerging technologies in plastic surgery. World J Clin Cases 2023; 11:4258-4266. PMID: 37449226. PMCID: PMC10336992. DOI: 10.12998/wjcc.v11.i18.4258.
Abstract
The diagnosis and treatment of conditions in plastic surgery involve structural challenges such as positioning, moving, and reconstructing complex three-dimensional structures. Surgeons traditionally operate according to their own experience, and the inability to locate these structures accurately is an important problem in plastic surgery. Emerging digital technologies such as virtual reality, augmented reality, and three-dimensional printing are widely used in the medical field, particularly in plastic surgery. This article reviews the development of these three technologies, introduces the technical elements and specific applications required in plastic surgery, summarizes their current status in plastic surgery, and discusses prospects for future development.
Affiliation(s)
- Bin Yang: Plastic and Cosmetic Department, The Affiliated Calmette Hospital of Kunming Medical University, The First People’s Hospital of Kunming, Calmette Hospital Kunming, Kunming 650224, Yunnan Province, China
- Ling Yang: Radiology Department, The Third Affiliated Hospital of Kunming Medical University (Yunnan Cancer Hospital, Yunnan Cancer Center), Kunming 650118, Yunnan Province, China
- Wen-Li Huang: Plastic and Cosmetic Department, The Affiliated Calmette Hospital of Kunming Medical University, The First People’s Hospital of Kunming, Calmette Hospital Kunming, Kunming 650224, Yunnan Province, China
- Qing-Zhu Zhou: Plastic and Cosmetic Department, The Affiliated Calmette Hospital of Kunming Medical University, The First People’s Hospital of Kunming, Calmette Hospital Kunming, Kunming 650224, Yunnan Province, China
- Jia He: Plastic and Cosmetic Department, The Affiliated Calmette Hospital of Kunming Medical University, The First People’s Hospital of Kunming, Calmette Hospital Kunming, Kunming 650224, Yunnan Province, China
- Xian Zhao: Plastic and Cosmetic Department, The Affiliated Calmette Hospital of Kunming Medical University, The First People’s Hospital of Kunming, Calmette Hospital Kunming, Kunming 650224, Yunnan Province, China
12. Liu S, Liao Y, He B, Dai B, Zhu Z, Shi J, Huang Y, Zou G, Du C, Shi B. Mandibular resection and defect reconstruction guided by a contour registration-based augmented reality system: A preclinical trial. J Craniomaxillofac Surg 2023:S1010-5182(23)00077-X. PMID: 37355367. DOI: 10.1016/j.jcms.2023.05.007.
Abstract
The aim of this study was to verify the feasibility and accuracy of a contour registration-based augmented reality (AR) system in jaw surgery. An AR system was developed to display the interaction between the virtual plan and images of the surgical site in real time. Several trials were performed with the guidance of the AR system and of a surgical guide. The postoperative cone-beam CT (CBCT) data were matched with the preoperatively planned data to evaluate the accuracy of the system by comparing deviations in distance and angle. All procedures were performed successfully. In nine model trials, distance and angular deviations for the mandible, reconstructed fibula, and fixation screws were 1.62 ± 0.38 mm, 1.86 ± 0.43 mm, 1.67 ± 0.70 mm, and 3.68 ± 0.71°, 5.48 ± 2.06°, 7.50 ± 1.39°, respectively. In twelve animal trials, results of the AR system were compared with those of the surgical guide. Distance deviations for the bilateral condylar outer poles were 0.93 ± 0.63 mm and 0.81 ± 0.30 mm, respectively (p = 0.68), and for the bilateral mandibular posterior angles 2.01 ± 2.49 mm and 2.89 ± 1.83 mm, respectively (p = 0.50). Distance and angular deviations for the mandible were 1.41 ± 0.61 mm vs. 1.21 ± 0.18 mm (p = 0.45) and 6.81 ± 2.21° vs. 6.11 ± 2.93° (p = 0.65), respectively. Distance and angular deviations for the reconstructed tibiofibular bones were 0.88 ± 0.22 mm vs. 0.84 ± 0.18 mm (p = 0.70) and 6.47 ± 3.03° vs. 6.90 ± 4.01° (p = 0.84), respectively. This study proposes a contour registration-based AR system that helps surgeons observe the surgical plan intuitively during the operation. The trial results indicate that the system achieves accuracy similar to that of a surgical guide.
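Contour registration of this kind is commonly implemented as an iterative closest point (ICP) loop: match each intraoperative contour point to its nearest neighbour on the preoperative surface, estimate a rigid transform, and repeat. The generic sketch below (NumPy/SciPy, synthetic data) illustrates the idea; the authors' actual registration algorithm may differ.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/Arun)."""
    cs, cd = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(contour, reference, iters=30):
    """Register intraoperative contour points to the preoperative reference surface."""
    tree = cKDTree(reference)
    R_total, t_total = np.eye(3), np.zeros(3)
    moved = contour.copy()
    for _ in range(iters):
        _, idx = tree.query(moved)         # closest reference point per contour point
        R, t = best_rigid(moved, reference[idx])
        moved = moved @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    rms = np.sqrt(((moved - reference[tree.query(moved)[1]]) ** 2).sum(1).mean())
    return R_total, t_total, rms

# Toy example: a smooth reference surface and a rotated/translated subset of it
# standing in for the intraoperatively acquired contour.
xs, ys = np.meshgrid(np.linspace(-30, 30, 61), np.linspace(-30, 30, 61))
reference = np.c_[xs.ravel(), ys.ravel(),
                  3.0 * np.sin(xs.ravel() / 8.0) * np.cos(ys.ravel() / 8.0)]
theta = np.deg2rad(3.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
contour = reference[::7] @ R_true.T + np.array([1.0, -0.8, 0.5])
R_est, t_est, rms = icp(contour, reference)
print(f"post-registration RMS residual ≈ {rms:.2f} mm")
```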
Affiliation(s)
- Shaofeng Liu: Department of Oral and Maxillofacial Surgery, First Affiliated Hospital of Fujian Medical University, Fuzhou, 350005, China; School and Hospital of Stomatology, Fujian Medical University, Fuzhou, 350004, China
- Yunyang Liao: Department of Oral and Maxillofacial Surgery, First Affiliated Hospital of Fujian Medical University, Fuzhou, 350005, China; Laboratory of Facial Plastic and Reconstruction, Fujian Medical University, Fuzhou, 350004, China
- Bingwei He: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, 350108, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, 350108, China
- Bowen Dai: Department of Oral and Maxillofacial Surgery, Second Xiangya Hospital of Central South University, Changsha, 410000, China
- Zhaoju Zhu: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, 350108, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, 350108, China
- Jiafeng Shi: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, 350108, China; Fujian Engineering Research Center of Joint Intelligent Medical Engineering, Fuzhou, 350108, China
- Yue Huang: Department of Oral and Maxillofacial Surgery, First Affiliated Hospital of Fujian Medical University, Fuzhou, 350005, China; Laboratory of Facial Plastic and Reconstruction, Fujian Medical University, Fuzhou, 350004, China
- Gengsen Zou: Department of Oral and Maxillofacial Surgery, First Affiliated Hospital of Fujian Medical University, Fuzhou, 350005, China; Laboratory of Facial Plastic and Reconstruction, Fujian Medical University, Fuzhou, 350004, China
- Chen Du: Department of Oral and Maxillofacial Surgery, First Affiliated Hospital of Fujian Medical University, Fuzhou, 350005, China; School and Hospital of Stomatology, Fujian Medical University, Fuzhou, 350004, China
- Bin Shi: Department of Oral and Maxillofacial Surgery, First Affiliated Hospital of Fujian Medical University, Fuzhou, 350005, China; Laboratory of Facial Plastic and Reconstruction, Fujian Medical University, Fuzhou, 350004, China
13. Lee YJ, Park Y, Ha Y, Kim S. Mandible Angle Resection with the Retroauricular Approach. J Clin Med 2023; 12:2641. PMID: 37048723. PMCID: PMC10094842. DOI: 10.3390/jcm12072641.
Abstract
Square-shaped and large moon-shaped faces are commonly observed in Asians, and the contour of the mandible is associated with the shape of the lower part of the face. Mandible contouring surgery is performed to create a softer impression for East Asians. Currently, most of these surgeries are performed using an intraoral approach. External approaches have rarely been attempted for cosmetic purposes because of possible facial nerve damage and visible scarring, and have been limited to the reduction of mandibular fractures. This study included 42 patients who underwent mandibular angle reduction via a classical intraoral or a retroauricular incision between April 2019 and October 2021. Clinical outcomes were assessed using the Global Aesthetic Improvement Scale and a Visual Analog Scale. Surgery was successful in all cases, with no significant complications, and an appropriate mandibular contour was achieved postoperatively. All patients were satisfied with the outcome. Some patients experienced short-term complications, such as hematoma and wound disruption of the skin above the incision line; however, these resolved within 3 weeks, and no serious long-term complications were observed. Mandible angle resection with the retroauricular approach is a promising alternative for patients, allowing speedy recovery and a prompt return to daily life.
Affiliation(s)
- Yoon Joo Lee: Doctorsmi Aesthetic Plastic Surgical Clinic, Daejeon 35230, Republic of Korea
- Yunsung Park: Department of Plastic and Reconstructive Surgery, College of Medicine, Chungnam National University, Daejeon 34134, Republic of Korea
- Yooseok Ha: Department of Plastic and Reconstructive Surgery, College of Medicine, Chungnam National University, Daejeon 34134, Republic of Korea
- Sunje Kim: Department of Plastic and Reconstructive Surgery, College of Medicine, Chungnam National University, Daejeon 34134, Republic of Korea
14. Zhang J, Lu V, Khanduja V. The impact of extended reality on surgery: a scoping review. Int Orthop 2023; 47:611-621. PMID: 36645474. PMCID: PMC9841146. DOI: 10.1007/s00264-022-05663-z.
Abstract
PURPOSE Extended reality (XR) is defined as a spectrum of technologies ranging from purely virtual environments to enhanced real-world environments. In the past two decades, XR-assisted surgery has seen an increase in use as well as in research and development. This scoping review aims to map the historical trends of these technologies and their future prospects, with an emphasis on reported outcomes and ethical considerations. METHODS A systematic search of PubMed, Scopus, and Embase for literature related to XR-assisted surgery and telesurgery was performed using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for scoping reviews (PRISMA-ScR) guidelines. Primary studies and peer-reviewed articles describing procedures performed by surgeons on human subjects and cadavers, as well as studies describing general surgical education, were included. Non-surgical procedures, bedside procedures, veterinary procedures, procedures performed by medical students, and review articles were excluded. Studies were classified into the following categories: impact on surgery (pre-operative planning and intra-operative navigation/guidance), impact on the patient (pain and anxiety), and impact on the surgeon (surgical training and surgeon confidence). RESULTS One hundred and sixty-eight studies were included for analysis. Thirty-one studies investigating the use of XR for pre-operative planning concluded that virtual reality (VR) enhanced the surgeon's spatial awareness of important anatomical landmarks, leading to shorter operating sessions and less surgical insult. Forty-nine studies explored the use of XR for intra-operative planning and noted that augmented reality (AR) headsets highlight key landmarks, as well as important structures to avoid, which lowers the chance of accidental surgical trauma. Eleven studies investigating patients' pain noted that VR is able to generate a meditative state, which benefits patients by reducing the need for analgesics. Ten studies commenting on patient anxiety suggested that VR is unsuccessful at altering patients' physiological parameters such as mean arterial blood pressure or cortisol levels. Sixty studies investigated surgical training, while seven studies suggested that the use of XR-assisted technology increased surgeon confidence. CONCLUSION The growth of XR-assisted surgery is driven by advances in hardware and software. While augmented virtuality and mixed reality remain underexplored, the use of VR is growing, especially in surgical training and pre-operative planning. Real-time intra-operative guidance is key to surgical precision and is being supplemented with AR technology. XR-assisted surgery is likely to take on a greater role in the near future, given the effect of COVID-19 in limiting physical presence and the increasing complexity of surgical procedures.
Affiliation(s)
- James Zhang: School of Clinical Medicine, University of Cambridge, Cambridge CB2 0SP, UK
- Victor Lu: School of Clinical Medicine, University of Cambridge, Cambridge CB2 0SP, UK
- Vikas Khanduja: Young Adult Hip Service, Department of Trauma and Orthopaedics, Addenbrooke’s Hospital, Cambridge University Hospital, Hills Road, Cambridge CB2 0QQ, UK
15. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. PMID: 36580681. DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these systems in different surgical fields. AR visualization is divided into two categories, in situ visualization and non-in situ visualization, and the rendered content varies widely. Registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. Tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in different surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical studies, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
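Once a virtual model has been registered and the camera pose is tracked, in situ AR overlay reduces to projecting the model's 3D points into the current image with the pinhole model, p ~ K(RX + t). The sketch below illustrates only that projection step; the intrinsics and pose values are invented for the example.

```python
import numpy as np

def project_points(X_world, K, R, t):
    """Project 3D model points (N,3) into pixel coordinates (N,2) with a pinhole camera."""
    X_cam = X_world @ R.T + t                  # world -> camera frame (from tracking)
    uvw = X_cam @ K.T                          # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:3]            # perspective divide

# Hypothetical intrinsics of a 1920x1080 camera and a tracked pose
K = np.array([[1400.0, 0.0, 960.0],
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                                  # camera axes aligned with world axes
t = np.array([0.0, 0.0, 300.0])                # model 300 mm in front of the camera

# A few vertices of a registered virtual model (mm, patient/world frame)
model = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
print(project_points(model, K, R, t))          # pixel locations for overlay rendering
```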
Affiliation(s)
- Longfei Ma, Tianqi Huang, Jie Wang, Hongen Liao: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
16. "Image to patient" equal-resolution surface registration supported by a surface scanner: analysis of algorithm efficiency for computer-aided surgery. Int J Comput Assist Radiol Surg 2023; 18:319-328. PMID: 35831549. PMCID: PMC9889449. DOI: 10.1007/s11548-022-02704-1.
Abstract
PURPOSE The "image to patient" registration procedure is crucial for the accuracy of surgical instrument tracking relative to the medical image during computer-aided surgery. The main aim of this work was to create an equal-resolution surface registration (ERSR) algorithm and analyze its efficiency. METHODS The ERSR algorithm provides two datasets with equal, high resolution and approximately corresponding points. The registered sets are obtained by projecting user-designed, rectangle-shaped uniform clouds of points onto the DICOM and surface-scanner datasets. The algorithm was tested on a phantom with titanium microscrews. We analyzed the influence of DICOM resolution on the performance of the ERSR algorithm and compared ERSR to standard paired-point landmark transform registration. The methods of analysis were target registration error (TRE), distance maps, and their histogram evaluation. RESULTS The mean TRE for ERSR was 0.8 ± 0.3 mm (resolution A), 0.8 ± 0.5 mm (resolution B), and 1.0 ± 0.7 mm (resolution C); the mean values were at least 0.4 mm lower than with landmark transform registration. The distance maps between the model obtained from the scanner and the CT-based model were analyzed by histogram. The frequency of the first histogram bin of the distance map for ERSR was about 0.6 for all three DICOM resolutions, roughly three times higher than for landmark transform registration. The results were analyzed statistically using the Wilcoxon signed-rank test (alpha = 0.05). CONCLUSION The tests showed a statistically significant improvement in efficiency for equal-resolution surface registration compared with the landmark transform algorithm. A lower resolution of the CT DICOM dataset did not degrade the performance of the ERSR algorithm, which showed a significantly smaller response to decreased resolution than paired-point landmark transform registration.
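Target registration error (TRE) is the residual distance at targets (here, the titanium microscrews) not used to drive the registration, and the two methods are compared with the Wilcoxon signed-rank test. The sketch below reproduces both steps on invented target coordinates; it is illustrative only, not the paper's analysis code.

```python
import numpy as np
from scipy.stats import wilcoxon

def tre(registered_targets, reference_targets):
    """Per-target registration error (mm): Euclidean distance after registration."""
    return np.linalg.norm(registered_targets - reference_targets, axis=1)

rng = np.random.default_rng(3)
reference = rng.uniform(-40, 40, (10, 3))                      # microscrew positions in CT space
ersr_result = reference + rng.normal(0, 0.5, reference.shape)   # sub-millimetre residuals
landmark_result = reference + rng.normal(0, 0.9, reference.shape)

tre_ersr = tre(ersr_result, reference)
tre_landmark = tre(landmark_result, reference)
print(f"ERSR TRE     : {tre_ersr.mean():.1f} ± {tre_ersr.std():.1f} mm")
print(f"Landmark TRE : {tre_landmark.mean():.1f} ± {tre_landmark.std():.1f} mm")

# Paired, non-parametric comparison of the two registration methods (alpha = 0.05)
stat, p = wilcoxon(tre_ersr, tre_landmark)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.3f}")
```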
17. Han B, Li R, Huang T, Ma L, Liang H, Zhang X, Liao H. An accurate 3D augmented reality navigation system with enhanced autostereoscopic display for oral and maxillofacial surgery. Int J Med Robot 2022; 18:e2404. DOI: 10.1002/rcs.2404.
Affiliation(s)
- Boxuan Han, Ruiyang Li, Tianqi Huang, Longfei Ma, Hanying Liang, Xinran Zhang, Hongen Liao: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
18. Pu JJ, Hakim SG, Melville JC, Su YX. Current Trends in the Reconstruction and Rehabilitation of Jaw following Ablative Surgery. Cancers (Basel) 2022; 14:3308. PMID: 35884369. PMCID: PMC9320033. DOI: 10.3390/cancers14143308.
Abstract
Simple Summary The maxilla and mandible provide skeletal support for the middle and lower thirds of the face, allowing for the normal functioning of breathing, chewing, swallowing, and speech. Ablative jaw surgery in the past often led to serious disfigurement and disruption of form and function. However, with recent strides in computer-assisted surgery and patient-specific implants, individualized functional reconstruction of the jaw is evolving rapidly, and prompt rehabilitation of both masticatory function and aesthetics after jaw resection has become possible. In the present review, recent advancements in jaw reconstruction technology and future perspectives are discussed. Abstract The reconstruction and rehabilitation of jaws following ablative surgery have been transformed in recent years by the development of computer-assisted surgery and virtual surgical planning. In this narrative literature review, we aim to discuss the current state of the art in jaw reconstruction and to preview potential future developments. The application of patient-specific implants and the “jaw-in-a-day” technique have made fast restoration of the jaw's function and aesthetics possible. The improved efficiency of primary reconstructive surgery allows for the rehabilitation of neurosensory function following ablative surgery. Currently, a great deal of research is being conducted on augmented/mixed reality, artificial intelligence, virtual surgical planning for soft tissue reconstruction, and the rehabilitation of the stomatognathic system. This promises an even more exciting future for the functional reconstruction and rehabilitation of the jaw following ablative surgery.
Affiliation(s)
- Jane J. Pu: Division of Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong
- Samer G. Hakim: Department of Oral and Maxillofacial Surgery, University Hospital of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
- James C. Melville: Department of Oral and Maxillofacial Surgery, University of Texas Health Science Center at Houston, Houston, TX 77030, USA
- Yu-Xiong Su (correspondence): Division of Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong
19. Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022;77:102361. [PMID: 35168103] [PMCID: PMC10466024] [DOI: 10.1016/j.media.2022.102361]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Affiliation(s)
- Manuel Birlo, P J Eddie Edwards, Matthew Clarkson, Danail Stoyanov: Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
20. de Geer A, Brouwer de Koning S, van Alphen M, van der Mierden S, Zuur C, van Leeuwen F, Loeve A, van Veen R, Karakullukcu M. Registration methods for surgical navigation of the mandible: a systematic review. Int J Oral Maxillofac Surg 2022;51:1318-1329. [PMID: 35165005] [DOI: 10.1016/j.ijom.2022.01.017]
21. Tang ZN, Hu LH, Soh HY, Yu Y, Zhang WB, Peng X. Accuracy of Mixed Reality Combined With Surgical Navigation Assisted Oral and Maxillofacial Tumor Resection. Front Oncol 2022;11:715484. [PMID: 35096559] [PMCID: PMC8795771] [DOI: 10.3389/fonc.2021.715484]
Abstract
OBJECTIVE To evaluate the feasibility and accuracy of mixed reality combined with surgical navigation in oral and maxillofacial tumor surgery. METHODS Retrospective analysis of data from seven patients with oral and maxillofacial tumors who underwent surgery between January 2019 and January 2021 using a combination of mixed reality and surgical navigation. The virtual surgical plan and navigation plan were based on preoperative CT datasets. Through an IGT-Link port, the mixed reality workstation was synchronized with the surgical navigation system, and surgical planning data were transferred to the mixed reality workstation. Osteotomy lines were marked with the aid of both surgical navigation and mixed reality images visualized through a HoloLens. Frozen section examination was used to ensure negative surgical margins. Postoperative CT datasets were obtained 1 week after surgery, and chromatographic analysis of the virtual and actual osteotomies was carried out. Patients received standard oncological postoperative follow-up. RESULTS Of the seven patients, four had maxillary tumors and three had mandibular tumors. There were a total of 13 osteotomy planes. Mean deviation between the planned and actual osteotomy planes was 1.68 ± 0.92 mm; the maximum deviation was 3.46 mm. Chromatographic analysis showed an error of ≤3 mm for 80.16% of the points. Mean deviations of the maxillary and mandibular osteotomy lines were similar (1.60 ± 0.93 mm vs. 1.86 ± 0.93 mm). Five patients had benign tumors and two had malignant tumors. Mean deviation of the osteotomy lines was comparable between patients with benign and malignant tumors (1.48 ± 0.74 mm vs. 2.18 ± 0.77 mm). Intraoperative frozen pathology confirmed negative resection margins in all cases. No tumor recurrence or complications occurred during a mean follow-up of 15.7 months (range, 6-26 months). CONCLUSION The combination of mixed reality technology and surgical navigation appears to be feasible, safe, and effective for tumor resection in the oral and maxillofacial region.
Affiliation(s)
- Zu-Nan Tang, Lei-Hao Hu, Yao Yu, Wen-Bo Zhang, Xin Peng: Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China
- Hui Yuh Soh: Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology & National Center of Stomatology & National Clinical Research Center for Oral Diseases & National Engineering Research Center of Oral Biomaterials and Digital Medical Devices, Beijing, China; Department of Oral and Maxillofacial Surgery, Faculty of Dentistry, Universiti Kebangsaan Malaysia, Kuala Lumpur, Malaysia
22. Górecka Ż, Grzelecki D, Paskal W, Choińska E, Gilewicz J, Wrzesień R, Macherzyński W, Tracz M, Budzińska-Wrzesień E, Bedyńska M, Kopka M, Jackowska-Tracz A, Świątek-Najwer E, Włodarski PK, Jaworowski J, Święszkowski W. Biodegradable Fiducial Markers for Bimodal Near-Infrared Fluorescence- and X-ray-Based Imaging. ACS Biomater Sci Eng 2022;8:859-870. [PMID: 35020357] [DOI: 10.1021/acsbiomaterials.1c01259]
Abstract
This study aimed to evaluate, for the first time, implantable, biodegradable fiducial markers (FMs) designed for bimodal, near-infrared fluorescence (NIRF)- and X-ray-based imaging. The developed FMs had poly(l-lactide-co-caprolactone)-based core-shell structures made of radiopaque (core) and fluorescent (shell) composites with a poly(l-lactide-co-caprolactone) matrix. Contrast agents approved for human use were utilized as fillers: indocyanine green was applied to the shell material, whereas iohexol and barium sulfate were compared in the core materials. Moreover, the possibility of tailoring the stability of the core material properties by the addition of hydroxyapatite (HAp) was examined. The in situ (porcine tissue) and in vivo (rat model) experiments confirmed that the developed FMs possessed pronounced contrasting properties in NIRF and X-ray imaging. The presence of HAp improved the radiopacity of the FMs at the initial state. It was also shown that, in iohexol-containing FMs, the presence of HAp slightly decreased the stability of the contrasting properties, while in BaSO4-containing ones the changes were less pronounced. A comprehensive material analysis explaining the differences in the stability of the contrasting properties was also presented. The tissue response around the FMs with composite cores was comparable to that of the FMs with a pristine polymeric core, and the developed composite FMs did not cause serious adverse effects on the surrounding tissues even when irradiated in vivo. The developed FMs ensured good visibility for NIRF image-supported tumor surgery and subsequent X-ray image-guided radiotherapy. Moreover, this study adds to the scant literature on similar biodegradable composite materials with high potential for application.
Affiliation(s)
- Żaneta Górecka: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, 141 Woloska Str., 02-507 Warsaw, Poland; Centre for Advanced Materials and Technologies CEZAMAT, Warsaw University of Technology, 02-822 Warsaw, Poland
- Dariusz Grzelecki: Department of Applied Pharmacy, Medical University of Warsaw, 02-097 Warsaw, Poland; Department of Orthopedics and Rheumoorthopedics, Professor Adam Gruca Teaching Hospital, Centre of Postgraduate Medical Education, 05-400 Otwock, Poland
- Wiktor Paskal: Centre for Preclinical Research, The Department of Methodology, Medical University of Warsaw, 02-091 Warsaw, Poland
- Emilia Choińska: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, 141 Woloska Str., 02-507 Warsaw, Poland
- Joanna Gilewicz: Department of Applied Pharmacy, Medical University of Warsaw, 02-097 Warsaw, Poland
- Robert Wrzesień: Central Laboratory of Experimental Animal, Medical University of Warsaw, 02-097 Warsaw, Poland
- Wojciech Macherzyński: Faculty of Microsystem Electronics and Photonics, Wroclaw University of Science and Technology, 50-372 Wroclaw, Poland
- Michał Tracz: Institute of Veterinary Medicine, Department of Food Hygiene and Public Health Protection, Warsaw University of Life Sciences, 02-776 Warsaw, Poland
- Maria Bedyńska: Department of Applied Pharmacy, Medical University of Warsaw, 02-097 Warsaw, Poland
- Michał Kopka: Centre for Preclinical Research, The Department of Methodology, Medical University of Warsaw, 02-091 Warsaw, Poland
- Agnieszka Jackowska-Tracz: Institute of Veterinary Medicine, Department of Food Hygiene and Public Health Protection, Warsaw University of Life Sciences, 02-776 Warsaw, Poland
- Ewelina Świątek-Najwer: Faculty of Mechanical Engineering, Wroclaw University of Science and Technology, 50-371 Wroclaw, Poland
- Paweł K Włodarski: Centre for Preclinical Research, The Department of Methodology, Medical University of Warsaw, 02-091 Warsaw, Poland
- Janusz Jaworowski: Department of Applied Pharmacy, Medical University of Warsaw, 02-097 Warsaw, Poland
- Wojciech Święszkowski: Division of Materials Design, Faculty of Materials Science and Engineering, Warsaw University of Technology, 141 Woloska Str., 02-507 Warsaw, Poland
23. Modabber A, Ayoub N, Redick T, Gesenhues J, Kniha K, Möhlhenrich SC, Raith S, Abel D, Hölzle F, Winnand P. Comparison of augmented reality and cutting guide technology in assisted harvesting of iliac crest grafts - A cadaver study. Ann Anat 2021;239:151834. [PMID: 34547412] [DOI: 10.1016/j.aanat.2021.151834]
Abstract
BACKGROUND Harvesting vascularized bone grafts with computer-assisted surgery represents the gold standard for mandibular reconstruction. However, current augmented reality (AR) approaches are limited by the need for invasive marker fixation. This trial compared markerless AR-guided real-time navigation with virtually planned, 3D-printed cutting guides for harvesting iliac crest grafts. MATERIAL AND METHODS Two commonly used iliac crest transplant configurations were virtually planned on 10 cadaver hips. Transplant harvest was performed with AR guidance and with cutting guide technology. The harvested transplants were digitized using cone beam CT. Deviations in angulation, distance and volume between the executed and planned osteotomies were measured. RESULTS Both AR and cutting guides accurately rendered the virtually planned transplant volume. However, the cumulative osteotomy plane angulation differed significantly (p = 0.018) between AR (14.99 ± 11.69°) and the cutting guides (8.49 ± 5.42°). The cumulative osteotomy plane distance showed that AR-guided navigation had lower accuracy (2.65 ± 3.32 mm) than the cutting guides (1.47 ± 1.36 mm), although the difference was not significant. CONCLUSION This study demonstrated the clinical usability of markerless AR-guided navigation for harvesting iliac crest grafts. Further improvement in accuracy might bring clinical implementation closer to reality.
Affiliation(s)
- Ali Modabber, Nassim Ayoub, Kristian Kniha, Stefan Raith, Frank Hölzle, Philipp Winnand: Department of Oral and Maxillofacial Surgery, RWTH Aachen University Hospital, Pauwelsstraße 30, D-52074 Aachen, Germany
- Tim Redick, Jonas Gesenhues, Dirk Abel: Institute of Automatic Control, RWTH Aachen University, Campus Boulevard 30, D-52074 Aachen, Germany
24. Winnand P, Ayoub N, Redick T, Gesenhues J, Heitzer M, Peters F, Raith S, Abel D, Hölzle F, Modabber A. Navigation of iliac crest graft harvest using markerless augmented reality and cutting guide technology: A pilot study. Int J Med Robot 2021;18:e2318. [PMID: 34328700] [DOI: 10.1002/rcs.2318]
Abstract
BACKGROUND Defects of the facial skeleton often require complex reconstruction with vascularized grafts. This trial elucidated the usability, visual perception and accuracy of markerless augmented reality (AR)-guided navigation for harvesting iliac crest transplants. METHODS Random CT scans were used to virtually plan two common transplant configurations on 10 iliac crest models, each printed four times. The transplants were harvested using projected AR and cutting guides. The duration was recorded, and the accuracy of angulation, distance and volume between the planned and executed osteotomies was measured. RESULTS AR guidance was time-efficient and accurately rendered the preoperatively planned geometries. However, for vertical osteotomies and in complex anatomical settings, AR guidance was significantly inferior to cutting guides. CONCLUSIONS This study demonstrated the usability of a markerless AR setup for harvesting iliac crest transplants. Visual perception and accuracy of the AR-guided osteotomies remain weaknesses compared with cutting guide technology.
Affiliation(s)
- Philipp Winnand, Nassim Ayoub, Marius Heitzer, Florian Peters, Stefan Raith, Frank Hölzle, Ali Modabber: Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Tim Redick, Jonas Gesenhues, Dirk Abel: Institute of Automatic Control, RWTH Aachen University, Aachen, Germany
25. Benmahdjoub M, van Walsum T, van Twisk P, Wolvius EB. Augmented reality in craniomaxillofacial surgery: added value and proposed recommendations through a systematic review of the literature. Int J Oral Maxillofac Surg 2021;50:969-978. [PMID: 33339731] [DOI: 10.1016/j.ijom.2020.11.015]
Abstract
This systematic review provides an overview of augmented reality (AR) and its benefits in craniomaxillofacial surgery in an attempt to answer the question: Is AR beneficial for craniomaxillofacial surgery? This review includes a description of the studies conducted, the systems used and their technical characteristics. The search was performed in four databases: PubMed, Cochrane Library, Embase, and Web of Science. All journal articles published during the past 11 years related to AR, mixed reality, craniomaxillofacial, and surgery were considered in this study. From a total of 7067 articles identified using AR- and surgery-related keywords, 39 articles were finally selected. Based on these articles, a classification of study types, surgery types, devices used, metrics reported, and benefits was compiled. The findings of this review indicate that AR could provide various benefits, addressing the challenges of conventional navigation systems, such as hand-eye coordination and depth perception. However, three main concerns were raised while performing this study: (1) it is complicated to aggregate the metrics reported in the articles, (2) it is difficult to obtain statistical value from the current studies, and (3) user evaluation studies are lacking. This article concludes with recommendations for future studies that address these points.
Affiliation(s)
- M Benmahdjoub: Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands; Biomedical Imaging Group Rotterdam, Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- T van Walsum: Biomedical Imaging Group Rotterdam, Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- P van Twisk: Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- E B Wolvius: Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
26. Yang WF, Su YX. Artificial intelligence-enabled automatic segmentation of skull CT facilitates computer-assisted craniomaxillofacial surgery. Oral Oncol 2021;118:105360. [PMID: 34045151] [DOI: 10.1016/j.oraloncology.2021.105360]
Abstract
BACKGROUND Image segmentation of skull CT is the cornerstone of computer-assisted craniomaxillofacial surgery in multiple respects. This study aims to introduce an AI-enabled automatic segmentation method and to outline its prospects for facilitating computer-assisted surgery. METHODS Three patients enrolled in a clinical trial of computer-assisted craniomaxillofacial surgery were randomly selected for this study. The preoperative helical CT scans of the head and neck region were subjected to AI-enabled automatic segmentation in Mimics Viewer. The performance of the AI segmentation was evaluated against the requirements of computer-assisted surgery. RESULTS The scans of all three patients were successfully segmented by the AI-enabled automatic segmentation. The performance of the AI segmentation was excellent for key anatomical structures, and the overall quality of the bone surfaces was satisfactory. The median DICE coefficient was 92.4% for the maxilla and 94.9% for the mandible, which fulfilled the requirements of computer-assisted craniomaxillofacial surgery. CONCLUSIONS The AI-enabled automatic segmentation could facilitate preoperative virtual planning and postoperative outcome verification, forming a feedback loop that enhances the current workflow of computer-assisted surgery. More studies are warranted to confirm the robustness of AI segmentation with more cases.
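For readers unfamiliar with the DICE coefficient reported above, it is the standard overlap score 2|A∩B|/(|A|+|B|) between an automatic and a reference segmentation. The minimal Python sketch below is purely illustrative (synthetic masks, not data or code from the cited study) and shows how the score is typically computed from two binary masks.

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DICE = 2 * |A intersect B| / (|A| + |B|) for two binary segmentation masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

# Hypothetical 10x10 masks standing in for an automatic and a reference segmentation.
auto = np.zeros((10, 10), dtype=bool)
auto[2:8, 2:8] = True   # automatic segmentation: 36 voxels
ref = np.zeros((10, 10), dtype=bool)
ref[3:9, 2:8] = True    # reference segmentation: 36 voxels, shifted by one row
print(f"DICE = {dice(auto, ref):.3f}")  # 2*30/(36+36) = 0.833
```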
Affiliation(s)
- Wei-Fa Yang, Yu-Xiong Su: Division of Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong Special Administrative Region
27. Glas HH, Kraeima J, van Ooijen PMA, Spijkervet FKL, Yu L, Witjes MJH. Augmented Reality Visualization for Image-Guided Surgery: A Validation Study Using a Three-Dimensional Printed Phantom. J Oral Maxillofac Surg 2021;79:1943.e1-1943.e10. [PMID: 34033801] [DOI: 10.1016/j.joms.2021.04.001]
Abstract
BACKGROUND Oral and maxillofacial surgery currently relies on virtual surgery planning based on image data (CT, MRI). Three-dimensional (3D) visualizations are typically used to plan and predict the outcome of complex surgical procedures. To translate the virtual surgical plan to the operating room, it is either converted into physical 3D-printed guides or directly translated using real-time navigation systems. PURPOSE This study aims to improve the translation of the virtual surgery plan to a surgical procedure, such as oncologic or trauma surgery, in terms of accuracy and speed. Here we report an augmented reality visualization technique for image-guided surgery and describe how surgeons can visualize and interact with the virtual surgery plan and navigation data while in the operating room. User friendliness and usability were objectified in a formal user study that compared our augmented reality-assisted technique to the gold standard setup of a perioperative navigation system (Brainlab). Moreover, the accuracy of typical navigation tasks, such as reaching landmarks and following trajectories, was compared. RESULTS Overall completion time of navigation tasks was 1.71 times faster using augmented reality (P = .034). Accuracy improved significantly using augmented reality (P < .001); for reaching physical landmarks, a less strong effect was found (P = .087). Although the participants were relatively unfamiliar with VR/AR (rated 2.25/5) and gesture-based interaction (rated 2/5), they reported that navigation tasks became easier to perform using augmented reality (difficulty rated 3.25/5 for Brainlab vs. 2.4/5 for the HoloLens). CONCLUSION The proposed workflow can be used in a wide range of image-guided surgery procedures as an addition to existing verified image guidance systems. The results of this user study imply that our technique enables typical navigation tasks to be performed faster and more accurately compared with the current gold standard. In addition, qualitative feedback on our augmented reality-assisted technique was more positive compared with the standard setup.
Affiliation(s)
- H H Glas: Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- J Kraeima: Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- P M A van Ooijen: Associate Professor, Faculty of Medical Sciences, Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- F K L Spijkervet: Professor, Oral and Maxillofacial Surgeon, Head of the Department, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- L Yu: Lecturer in the Department of Computer Science and Software Engineering (CSSE), Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- M J H Witjes: Oral and Maxillofacial Surgeon, Principal Investigator, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
28. Gsaxner C, Pepe A, Li J, Ibrahimpasic U, Wallner J, Schmalstieg D, Egger J. Augmented Reality for Head and Neck Carcinoma Imaging: Description and Feasibility of an Instant Calibration, Markerless Approach. Comput Methods Programs Biomed 2021;200:105854. [PMID: 33261944] [DOI: 10.1016/j.cmpb.2020.105854]
Abstract
BACKGROUND AND OBJECTIVE Augmented reality (AR) can help to overcome current limitations in computer-assisted head and neck surgery by granting "X-ray vision" to physicians. Still, the acceptance of AR in clinical applications is limited by technical and clinical challenges. We aim to demonstrate the benefit of a marker-free, instant calibration AR system for head and neck cancer imaging, which we hypothesize to be acceptable and practical for clinical use. METHODS We implemented a novel AR system for visualization of medical image data registered with the head or face of the patient prior to intervention. Our system allows the localization of head and neck carcinomas in relation to the outer anatomy. It does not require markers or stationary infrastructure, provides instant calibration and allows 2D and 3D multi-modal visualization for head and neck surgery planning via an AR head-mounted display. We evaluated the system in a pre-clinical user study with eleven medical experts. RESULTS Medical experts rated our application with a system usability scale score of 74.8 ± 15.9, signifying above-average, good usability and clinical acceptance. An average of 12.7 ± 6.6 minutes of training time was needed by physicians before they were able to navigate the application without assistance. CONCLUSIONS Our AR system is characterized by a slim and easy setup, short training time and high usability and acceptance. Therefore, it presents a promising, novel tool for visualizing head and neck cancer imaging and for pre-surgical localization of target structures.
Affiliation(s)
- Christina Gsaxner: Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Antonio Pepe: Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Jianning Li: Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Una Ibrahimpasic: Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria
- Jürgen Wallner: Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria; Department of Cranio-Maxillofacial Surgery, AZ Monica Hospital Antwerp and Antwerp University Hospital, Antwerp, Belgium
- Dieter Schmalstieg: Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
- Jan Egger: Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
29. The use of 3D virtual surgical planning and computer aided design in reconstruction of maxillary surgical defects. Curr Opin Otolaryngol Head Neck Surg 2020;28:122-128. [PMID: 32102008] [DOI: 10.1097/moo.0000000000000618]
Abstract
PURPOSE OF REVIEW The present review describes the latest developments in 3D virtual surgical planning (VSP) and computer aided design (CAD) for reconstruction of maxillary defects, with the aim of full prosthetic rehabilitation. The purpose is to give an overview of different methods that use CAD in maxillary reconstruction in patients with head and neck cancer. RECENT FINDINGS 3D VSP enables preoperative planning of resection margins and osteotomies. The current 3D VSP workflow is expanded with multimodal imaging, merging decision-supportive information. The development of more personalized implants is possible using CAD, individualized virtual muscle modelling and topology optimization. Meanwhile, the translation of the 3D VSP towards surgery is improved by techniques such as intraoperative imaging and augmented reality. Recent improvements in preoperative 3D VSP enable surgical reconstruction and/or prosthetic rehabilitation of the surgical defect in one combined procedure. SUMMARY With the use of 3D VSP and CAD, ablation surgery, reconstructive surgery, and prosthetic rehabilitation can be planned preoperatively. Many reconstruction possibilities exist, and the choice depends on patient characteristics, tumour location and the experience of the surgeon. The overall objective in patients with maxillary defects is to follow a prosthetically driven reconstruction that aims to restore facial form and oral function in accordance with the individual needs of the patient.
30. Merema BBJ, Kraeima J, de Visscher SAHJ, van Minnen B, Spijkervet FKL, Schepman K, Witjes MJH. Novel finite element-based plate design for bridging mandibular defects: Reducing mechanical failure. Oral Dis 2020;26:1265-1274. [PMID: 32176821] [PMCID: PMC7507837] [DOI: 10.1111/odi.13331]
Abstract
INTRODUCTION When the application of a free vascularised flap is not possible, a segmental mandibular defect is often reconstructed using a conventional reconstruction plate. Mechanical failure of such reconstructions is mostly caused by plate fracture and screw pull-out. This study aims to develop a reliable, mechanically superior, yet slender patient-specific reconstruction plate that reduces failure due to these causes. PATIENTS AND METHODS Eight patients were included in the study. Indications were as follows: fractured reconstruction plate (2), loosened screws (1) and primary reconstruction of a mandibular continuity defect (5). Failed conventional reconstructions were studied using finite element analysis (FEA). A 3D virtual surgical plan (3D-VSP) with a novel patient-specific (PS) titanium plate was developed for each patient. Postoperative CBCT scanning was performed to validate reconstruction accuracy. RESULTS All PS plates were placed accurately according to the 3D-VSP. Mean 3D screw entry point deviation was 1.54 mm (SD: 0.85, R: 0.10-3.19), and mean screw angular deviation was 5.76° (SD: 3.27, R: 1.26-16.62). FEA indicated decreased stress and screw pull-out-inducing forces. No mechanical failures occurred (mean follow-up: 16 months, R: 7-29). CONCLUSION Reconstructing mandibular continuity defects with bookshelf reconstruction plates whose design is underpinned by FEA seems to reduce the risk of screw pull-out and plate fracture.
Affiliation(s)
- Bram B. J. Merema, Joep Kraeima, Baucke van Minnen, Fred K. L. Spijkervet, Kees-Pieter Schepman, Max J. H. Witjes: Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, Groningen, The Netherlands
31. A surgical navigated cutting guide for mandibular osteotomies: accuracy and reproducibility of an image-guided mandibular osteotomy. Int J Comput Assist Radiol Surg 2020;15:1719-1725. [PMID: 32725399] [DOI: 10.1007/s11548-020-02234-8]
Abstract
PURPOSE 3D-printed cutting guides are the current standard for translating the virtual surgical plan to the intraoperative setting. The production of these patient-specific cutting guides is time-consuming and costly, and alternative approaches are therefore currently a subject of research. The aim of this study was to assess the accuracy and reproducibility of a novel electromagnetic (EM) navigated surgical cutting guide for performing virtually planned osteotomies in mandible models. METHODS A novel 3D navigated cutting guide (dubbed Bladerunner) was designed and evaluated with a total of 20 osteotomies, performed on plaster mandibular models according to preoperative planning using EM navigation. The pre- and postoperative scans were registered, and the difference between the preoperatively planned osteotomy and the performed osteotomy was expressed as the distance between the planned and performed cutting planes and the yaw and roll angles between the planes. RESULTS The mean difference in distance between the planned and performed osteotomies was 1.1 mm (STD 0.6 mm), the mean yaw was 1.8° (STD 1.4°), and the mean roll was 1.6° (STD 1.3°). CONCLUSION The proposed EM navigated cutting guide for mandibular osteotomies demonstrated accurate positioning of the cutting plane according to the preoperative virtual surgical plan with respect to distance, yaw and roll angles. This novel approach has the potential to make the use of 3D-printed cutting guides obsolete, thereby decreasing the interval between diagnosis and surgery, reducing cost and allowing for adaptation of the virtual plan in case of rapid tumor proliferation or unanticipated in situ deviations from the preoperative CT/MR imaging.
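The accuracy metrics above (distance and angles between a planned and a performed cutting plane) can be illustrated with a small geometric sketch. The Python snippet below is a hypothetical example under the assumption that each plane is given as a point and a unit normal; it is not the evaluation pipeline used in the cited study, and the study's yaw/roll decomposition depends on an axis convention that is not reproduced here.

```python
import numpy as np

def plane_deviation(p_planned, n_planned, p_performed, n_performed):
    """Distance of the performed plane's reference point from the planned plane,
    and the total angle between the two plane normals (degrees)."""
    n_planned = n_planned / np.linalg.norm(n_planned)
    n_performed = n_performed / np.linalg.norm(n_performed)
    distance = abs(np.dot(p_performed - p_planned, n_planned))
    cos_angle = np.clip(abs(np.dot(n_planned, n_performed)), 0.0, 1.0)
    return distance, np.degrees(np.arccos(cos_angle))

# Hypothetical planes: performed cut offset 1 mm along the planned normal and tilted ~2 degrees.
p0, n0 = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
p1 = np.array([0.0, 0.0, 1.0])
n1 = np.array([0.0, np.sin(np.radians(2.0)), np.cos(np.radians(2.0))])
print(plane_deviation(p0, n0, p1, n1))  # ~ (1.0 mm, 2.0 degrees)
```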
32. Evaluating the accuracy of resection planes in mandibular surgery using a preoperative, intraoperative, and postoperative approach. Int J Oral Maxillofac Surg 2020;50:287-293. [PMID: 32682645] [DOI: 10.1016/j.ijom.2020.06.013]
Abstract
In mandibular surgery, three-dimensionally printed patient-specific cutting guides are used to translate the preoperative virtually planned resection planes to the operating room. This study was performed to determine whether cutting guides are positioned according to the virtual plan and to compare the intraoperative position of the cutting guide with the resection performed. Nine patients were included. The exact positions of the resection planes were planned virtually, and a patient-specific cutting guide was designed and printed. After surgical placement of the cutting guide, intraoperative cone beam computed tomography (CBCT) was performed. Postoperative CT was used to obtain the final resection planes. Distances and yaw and pitch angles between the preoperative, intraoperative, and postoperative resection planes were calculated. Cutting guides were positioned on the mandible with millimetre accuracy. Anterior osteotomies were performed more accurately than posterior osteotomies (intraoperatively positioned and final resection planes differed by 1.2 ± 1.0 mm, 4.9 ± 6.6°, and 1.8 ± 1.5°, respectively, and by 2.2 ± 0.9 mm, 9.3 ± 9°, and 8.3 ± 6.5°, respectively). Differences between the intraoperatively planned and final resection planes imply a directional freedom of the saw through the saw slots. Since cutting guides are positioned with millimetre accuracy compared to the virtual plan, the design of the saw slots in the cutting guides needs improvement to allow more accurate resections.
33. Gsaxner C, Wallner J, Chen X, Zemann W, Egger J. Facial model collection for medical augmented reality in oncologic cranio-maxillofacial surgery. Sci Data 2019;6:310. [PMID: 31819060] [PMCID: PMC6901520] [DOI: 10.1038/s41597-019-0327-8]
Abstract
Medical augmented reality (AR) is an increasingly important topic in many medical fields. AR enables x-ray vision to see through real-world objects. In medicine, this offers pre-, intra- or post-interventional visualization of "hidden" structures. In contrast to a classical monitor view, AR applications provide visualization not only on but also in relation to the patient. However, research and development of medical AR applications is challenging because of unique patient-specific anatomies and pathologies. Working with several patients during development for weeks or even months is not feasible. One alternative is commercial patient phantoms, which are very expensive. Hence, this data set provides a unique collection of head and neck cancer patient PET-CT scans with corresponding 3D models, provided as stereolithography (STL) files. The 3D models are optimized for effective 3D printing at low cost. This data set can be used in the development and evaluation of AR applications for head and neck surgery.
Affiliation(s)
- Christina Gsaxner: Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 6/1, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; Institute for Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16c/II, 8010 Graz, Austria
- Jürgen Wallner: Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 6/1, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria
- Xiaojun Chen: Shanghai Jiao Tong University, School of Mechanical Engineering, 800 Dong Chuan Road, Shanghai, 200240, China
- Wolfgang Zemann: Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 6/1, 8036 Graz, Austria
- Jan Egger: Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 6/1, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; Institute for Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16c/II, 8010 Graz, Austria; Shanghai Jiao Tong University, School of Mechanical Engineering, 800 Dong Chuan Road, Shanghai, 200240, China