1. Ceccariglia F, Cercenelli L, Badiali G, Marcelli E, Tarsitano A. Application of Augmented Reality to Maxillary Resections: A Three-Dimensional Approach to Maxillofacial Oncologic Surgery. J Pers Med 2022; 12:2047. [PMID: 36556268] [PMCID: PMC9785494] [DOI: 10.3390/jpm12122047]
Abstract
In the relevant global context, although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only now have technological and scientific advances made them suitable for revolutionizing clinical care and medical settings through the provision of advanced features and improved healthcare services. Over the past fifteen years, tools and applications using augmented reality (AR) have been designed and tested in the context of various surgical and medical disciplines, including maxillofacial surgery. The purpose of this paper is to show how a marker-less AR guidance system using the Microsoft® HoloLens 2 can be applied in mandible and maxillary demolition surgery to guide maxillary osteotomies. We describe three mandibular and maxillary oncologic resections performed during 2021 using AR support. In these three patients, we applied a marker-less tracking method based on recognition of the patient's facial profile. The surgeon, using HoloLens 2 smart glasses, could see the virtual surgical planning superimposed on the patient's anatomy. We showed that performing osteotomies under AR guidance is feasible and viable, as demonstrated by comparison with osteotomies performed using CAD-CAM cutting guides. This technology has advantages and disadvantages. However, further research is needed to improve the stability and robustness of the marker-less tracking method applied to patient face recognition.
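As context for the technique summarized above, the sketch below illustrates, in generic terms only and not as the authors' HoloLens 2 pipeline, how a marker-less registration of a sensed facial surface to a CT-derived model can be computed with a few iterations of point-to-point ICP; the point clouds, names and parameters are hypothetical.

```python
# Minimal rigid surface-registration sketch (illustrative only, not the authors'
# HoloLens 2 implementation). Aligns a sensed face point cloud to a CT-derived
# face model with a few ICP iterations, then reports the residual error.
import numpy as np
from scipy.spatial import cKDTree

def kabsch(src, dst):
    """Best-fit rotation R and translation t mapping src onto dst (Nx3 arrays)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def icp(sensed, model, iters=30):
    """Iterative closest point: returns a 4x4 transform from sensed to model space."""
    tree = cKDTree(model)
    T = np.eye(4)
    cur = sensed.copy()
    for _ in range(iters):
        _, idx = tree.query(cur)               # closest model point for each sensed point
        R, t = kabsch(cur, model[idx])
        cur = cur @ R.T + t
        step = np.eye(4); step[:3, :3] = R; step[:3, 3] = t
        T = step @ T
    rms = np.sqrt(np.mean(np.sum((cur - model[tree.query(cur)[1]]) ** 2, axis=1)))
    return T, rms

if __name__ == "__main__":
    model = np.random.rand(500, 3) * 100                 # stand-in for CT-derived face surface (mm)
    sensed = model[:300] + np.array([2.0, -1.0, 0.5])    # offset subset as the "sensed" cloud
    T, rms = icp(sensed, model)
    print("registration transform:\n", T, "\nresidual RMS (mm):", rms)
```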
Affiliation(s)
- Francesco Ceccariglia: Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy; Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy (corresponding author; Tel.: +39-051-2144197)
- Laura Cercenelli: eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali: Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy; Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Emanuela Marcelli: eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano: Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy; Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
2.
Abstract
Augmented reality (AR) is an innovative system that enhances the real world by superimposing virtual objects on reality. The aim of this study was to analyze the application of AR in medicine and which of its technical solutions are the most used. We carried out a scoping review of the articles published between 2019 and February 2022. The initial search yielded a total of 2649 articles. After applying filters, removing duplicates and screening, we included 34 articles in our analysis. The analysis of the articles highlighted that AR has been traditionally and mainly used in orthopedics in addition to maxillofacial surgery and oncology. Regarding the display application in AR, the Microsoft HoloLens Optical Viewer is the most used method. Moreover, for the tracking and registration phases, the marker-based method with a rigid registration remains the most used system. Overall, the results of this study suggested that AR is an innovative technology with numerous advantages, finding applications in several new surgery domains. Considering the available data, it is not possible to clearly identify all the fields of application and the best technologies regarding AR.
3. Chan HHL, Haerle SK, Daly MJ, Zheng J, Philp L, Ferrari M, Douglas CM, Irish JC. An integrated augmented reality surgical navigation platform using multi-modality imaging for guidance. PLoS One 2021; 16:e0250558. [PMID: 33930063] [PMCID: PMC8087077] [DOI: 10.1371/journal.pone.0250558]
Abstract
We developed an integrated augmented reality (AR) surgical navigation system that potentially improves intra-operative visualization of concealed anatomical structures. Integration of real-time tracking technology with a laser pico-projector allows the surgical surface to be augmented by projecting virtual images of lesions and critical structures created by multimodality imaging. We aimed to quantitatively and qualitatively evaluate the performance of a prototype interactive AR surgical navigation system through a series of pre-clinical studies. Four pre-clinical animal studies using xenograft mouse models were conducted to investigate system performance. A combination of CT, PET, SPECT, and MRI images was used to augment the mouse body during image-guided procedures to assess feasibility. A phantom with machined features was employed to quantitatively estimate the system accuracy. All image-guided procedures were performed successfully. The tracked pico-projector correctly and reliably depicted virtual images on the animal body, highlighting the location of the tumour and anatomical structures. The phantom study demonstrated that the system was accurate to 0.55 ± 0.33 mm. This paper presents a prototype real-time tracking AR surgical navigation system that improves visualization of underlying critical structures by overlaying virtual images onto the surgical site. This proof-of-concept pre-clinical study demonstrated both the clinical applicability and the high precision of the system, which was accurate to <1 mm.
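The projector-based guidance summarized above amounts to chaining the CT-to-patient registration, the tracked poses and a pinhole projector model; the sketch below shows that chain in isolation, with all transforms and intrinsics as hypothetical placeholders rather than values from the study.

```python
# Illustrative sketch of the core projection-AR step (not the authors' system):
# map a lesion point from CT space into projector pixels by chaining the
# CT-to-patient registration, the tracked patient and projector poses, and a
# pinhole projector model. All matrices below are hypothetical placeholders.
import numpy as np

def to_h(p):                                   # 3-vector -> homogeneous 4-vector
    return np.append(p, 1.0)

# Hypothetical rigid transforms (4x4), e.g. from registration / optical tracking.
T_patient_from_ct = np.eye(4)                  # CT space -> patient reference frame
T_tracker_from_patient = np.eye(4)             # patient frame -> tracker frame
T_tracker_from_projector = np.eye(4)           # projector frame -> tracker frame
T_projector_from_tracker = np.linalg.inv(T_tracker_from_projector)

# Hypothetical projector intrinsics (treated like a pinhole camera).
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])

def project_ct_point(p_ct):
    """Return the projector pixel that should illuminate the CT-space point p_ct."""
    p_proj = T_projector_from_tracker @ T_tracker_from_patient @ T_patient_from_ct @ to_h(p_ct)
    x, y, z = p_proj[:3]
    u, v, w = K @ np.array([x, y, z])
    return np.array([u / w, v / w])

if __name__ == "__main__":
    lesion_ct = np.array([12.5, -30.0, 45.0])  # lesion centroid in CT coordinates (mm)
    print("projector pixel:", project_ct_point(lesion_ct))
```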
Affiliation(s)
- Harley H. L. Chan: TECHNA Institute, University Health Network, Toronto, ON, Canada
- Stephan K. Haerle: Center for Head and Neck Surgical Oncology and Reconstructive Surgery, Hirslanden Clinic, Lucerne, Switzerland
- Michael J. Daly: TECHNA Institute, University Health Network, Toronto, ON, Canada
- Jinzi Zheng: TECHNA Institute, University Health Network, Toronto, ON, Canada
- Lauren Philp: Institute of Medical Science, University of Toronto, Toronto, ON, Canada; Department of Obstetrics and Gynecology, University of Toronto, Toronto, ON, Canada
- Marco Ferrari: TECHNA Institute, University Health Network, Toronto, ON, Canada; Department of Otolaryngology-Head and Neck Surgery, University of Toronto, Toronto, ON, Canada; Unit of Otorhinolaryngology–Head and Neck Surgery, University of Brescia, Brescia, Italy
- Catriona M. Douglas: TECHNA Institute, University Health Network, Toronto, ON, Canada; Department of Otolaryngology-Head and Neck Surgery, University of Toronto, Toronto, ON, Canada; Department of Surgical Oncology, Princess Margaret Cancer Centre, University Health Network, Toronto, ON, Canada
- Jonathan C. Irish: TECHNA Institute, University Health Network, Toronto, ON, Canada; Department of Otolaryngology-Head and Neck Surgery, University of Toronto, Toronto, ON, Canada; Department of Surgical Oncology, Princess Margaret Cancer Centre, University Health Network, Toronto, ON, Canada
4. Manni F, Mamprin M, Holthuizen R, Shan C, Burström G, Elmi-Terander A, Edström E, Zinger S, de With PHN. Multi-view 3D skin feature recognition and localization for patient tracking in spinal surgery applications. Biomed Eng Online 2021; 20:6. [PMID: 33413426] [PMCID: PMC7792004] [DOI: 10.1186/s12938-020-00843-7]
Abstract
BACKGROUND Minimally invasive spine surgery is dependent on accurate navigation. Computer-assisted navigation is increasingly used in minimally invasive surgery (MIS), but current solutions require the use of reference markers in the surgical field for both patient and instrument tracking. PURPOSE To improve reliability and facilitate clinical workflow, this study proposes a new marker-free tracking framework based on skin feature recognition. METHODS Maximally Stable Extremal Regions (MSER) and Speeded Up Robust Feature (SURF) algorithms are applied for skin feature detection. The proposed tracking framework is based on a multi-camera setup for obtaining multi-view acquisitions of the surgical area. Features can then be accurately detected using MSER and SURF and afterwards localized by triangulation. The triangulation error is used for assessing the localization quality in 3D. RESULTS The framework was tested on a cadaver dataset and in eight clinical cases. The detected features across the entire patient dataset had an overall triangulation error of 0.207 mm for MSER and 0.204 mm for SURF. The localization accuracy was compared to a system with conventional markers, serving as a ground truth. An average accuracy of 0.627 and 0.622 mm was achieved for MSER and SURF, respectively. CONCLUSIONS This study demonstrates that skin feature localization for patient tracking in a surgical setting is feasible. The technology shows promising results in terms of detected features and localization accuracy. In the future, the framework may be further improved by exploiting extended feature processing using modern optical imaging techniques for clinical applications where patient tracking is crucial.
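As an illustration of the two building blocks named above (feature detection and triangulation), the following OpenCV sketch detects MSER regions and triangulates matched points between two calibrated views; it reports a mean reprojection error as a simple quality proxy, whereas the paper evaluates a 3D triangulation error, and SURF (available only in opencv-contrib builds) is omitted. The projection matrices and matched points are assumed inputs.

```python
# Sketch of skin-feature detection plus two-view triangulation (illustrative only;
# the published framework also uses SURF and a calibrated multi-camera rig).
# P1 and P2 are assumed 3x4 camera projection matrices from prior calibration,
# and pts1/pts2 are matched feature positions (Nx2) in the two views.
import numpy as np
import cv2

def detect_skin_features(gray_img):
    """Detect MSER regions and return their centroids as an Nx2 float32 array."""
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray_img)
    return np.array([r.mean(axis=0) for r in regions], dtype=np.float32)

def triangulate_with_error(P1, P2, pts1, pts2):
    """Triangulate matched 2D points; report mean reprojection error (pixels)."""
    X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)      # 4xN homogeneous
    X = (X_h[:3] / X_h[3]).T                                  # Nx3 in world units
    errors = []
    for P, pts in ((P1, pts1), (P2, pts2)):
        proj = P @ np.hstack([X, np.ones((len(X), 1))]).T
        proj = (proj[:2] / proj[2]).T
        errors.append(np.linalg.norm(proj - pts, axis=1))
    return X, float(np.mean(errors))
```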
Affiliation(s)
- Francesca Manni: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Marco Mamprin: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Caifeng Shan: Shandong University of Science and Technology, Qingdao, China
- Gustav Burström: Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
- Adrian Elmi-Terander: Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
- Erik Edström: Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
- Svitlana Zinger: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Peter H N de With: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
5. Pérez-Pachón L, Poyade M, Lowe T, Gröning F. Image Overlay Surgery Based on Augmented Reality: A Systematic Review. Adv Exp Med Biol 2020; 1260:175-195. [PMID: 33211313] [DOI: 10.1007/978-3-030-47483-6_10]
Abstract
Augmented Reality (AR) applied to surgical guidance is gaining relevance in clinical practice. AR-based image overlay surgery (i.e. the accurate overlay of patient-specific virtual images onto the body surface) helps surgeons to transfer image data produced during the planning of the surgery (e.g. the correct resection margins of tissue flaps) to the operating room, thus increasing accuracy and reducing surgery times. We systematically reviewed 76 studies published between 2004 and August 2018 to explore which existing tracking and registration methods and technologies allow healthcare professionals and researchers to develop and implement these systems in-house. Most studies used non-invasive markers to automatically track a patient's position, as well as customised algorithms, tracking libraries or software development kits (SDKs) to compute the registration between patient-specific 3D models and the patient's body surface. Few studies combined the use of holographic headsets, SDKs and user-friendly game engines, and described portable and wearable systems that combine tracking, registration, hands-free navigation and direct visibility of the surgical site. Most accuracy tests included a low number of subjects and/or measurements and did not normally explore how these systems affect surgery times and success rates. We highlight the need for more procedure-specific experiments with a sufficient number of subjects and measurements and including data about surgical outcomes and patients' recovery. Validation of systems combining the use of holographic headsets, SDKs and game engines is especially interesting as this approach facilitates an easy development of mobile AR applications and thus the implementation of AR-based image overlay surgery in clinical practice.
Affiliation(s)
- Laura Pérez-Pachón: School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Matthieu Poyade: School of Simulation and Visualisation, Glasgow School of Art, Glasgow, UK
- Terry Lowe: School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK; Head and Neck Oncology Unit, Aberdeen Royal Infirmary (NHS Grampian), Aberdeen, UK
- Flora Gröning: School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
6. Pellegrino G, Mangano C, Mangano R, Ferri A, Taraschi V, Marchetti C. Augmented reality for dental implantology: a pilot clinical report of two cases. BMC Oral Health 2019; 19:158. [PMID: 31324246] [PMCID: PMC6642526] [DOI: 10.1186/s12903-019-0853-y]
Abstract
BACKGROUND Despite the limited number of articles dedicated to its use, augmented reality (AR) is an emerging technology that has been shown to have increasing applications across multiple medical sectors. These include, but are not limited to, the maxillofacial and dental disciplines. In these medical specialties, the focus of AR technology is to achieve a more visible surgical field during an operation. Currently, this goal is achieved by an accurate display of either static or dynamic diagnostic images via a visor or specific glasses. The objective of this study is to evaluate the feasibility of using a virtual display for dynamic navigation via AR. The secondary outcome is to evaluate whether the use of this technology could affect the accuracy of dynamic navigation. CASE PRESENTATION Two patients, both needing implant rehabilitation in the upper premolar area, were treated with flapless surgery. Prior to the procedure, the position of each implant was virtually planned using the patients' previous scans. This planning fed a dynamic navigation system that was displayed on AR glasses, allowing a computer-aided/image-guided procedure to be performed. Dedicated software for surface superimposition was then used to match the planned position of the implant and the real one obtained from the postoperative scan. Accuracy was evaluated by measuring the deviation between the real and planned positions of the implants. For both surgeries it was possible to proceed using the AR technology as planned. The deviations for the first implant were 0.53 mm at the entry point and 0.50 mm at the apical point, and for the second implant 0.46 mm at the entry point and 0.48 mm at the apical point. The angular deviations were 3.05° and 2.19°, respectively. CONCLUSIONS The results of this pilot study suggest that AR can be useful in dental implantology for displaying dynamic navigation systems. While this technology did not seem to noticeably affect the accuracy of the procedure, specific software applications should further optimize the results.
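The accuracy figures quoted above (entry, apical and angular deviations) reduce to simple vector geometry once the planned and placed implants are expressed in one superimposed frame; the sketch below shows that computation with hypothetical coordinates.

```python
# Minimal sketch of the accuracy metrics reported above: entry-point deviation,
# apical deviation, and angular deviation between a planned and a placed implant.
# Coordinates are assumed to be in a common (superimposed) reference frame; the
# numbers below are hypothetical.
import numpy as np

def implant_deviations(planned_entry, planned_apex, actual_entry, actual_apex):
    entry_dev = np.linalg.norm(actual_entry - planned_entry)       # mm
    apex_dev = np.linalg.norm(actual_apex - planned_apex)          # mm
    a_planned = planned_apex - planned_entry                       # planned implant axis
    a_actual = actual_apex - actual_entry                          # placed implant axis
    cosang = np.dot(a_planned, a_actual) / (np.linalg.norm(a_planned) * np.linalg.norm(a_actual))
    ang_dev = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))    # degrees
    return entry_dev, apex_dev, ang_dev

if __name__ == "__main__":
    planned_entry = np.array([10.0, 20.0, 5.0])
    planned_apex = np.array([10.0, 20.0, -6.0])
    actual_entry = np.array([10.4, 20.3, 5.0])
    actual_apex = np.array([10.2, 20.6, -6.1])
    print(implant_deviations(planned_entry, planned_apex, actual_entry, actual_apex))
```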
Affiliation(s)
- Gerardo Pellegrino: Oral and Maxillofacial Surgery Unit, DIBINEM, University of Bologna, Via San Vitale 59, 40125 Bologna, Italy
- Carlo Mangano: Digital Dentistry Section, University San Raffaele, Milan, Italy
- Agnese Ferri: Oral and Maxillofacial Surgery Unit, DIBINEM, University of Bologna, Via San Vitale 59, 40125 Bologna, Italy
- Valerio Taraschi: School of Life Sciences, University of Technology Sydney, Sydney, Australia
- Claudio Marchetti: Chief of Oral and Maxillofacial Surgery Unit, DIBINEM, University of Bologna, Bologna, Italy
7. A Novel Noninvasive Patient-Specific Navigation Method for Orbital Reconstructive Surgery: A Phantom Study Using Patient Data. Plast Reconstr Surg 2019; 143:602e-612e. [PMID: 30601235] [DOI: 10.1097/prs.0000000000005381]
Abstract
BACKGROUND The correction of orbital deformities is an ongoing challenge in maxillofacial surgery. Computer-assisted navigation can improve surgical outcomes. However, conventional registration methods for navigation are not appropriate for orbital reconstructive surgery. This study proposes an accurate, noninvasive, patient-specific navigation method and demonstrates its feasibility. METHODS A noninvasive, patient-specific registration frame based on the external auditory canals and upper front teeth was designed using software developed in-house. A three-dimensional craniofacial model was segmented from patient computed tomographic data for the registration frame. A customized craniofacial phantom was also made using this three-dimensional model, with 20 embedded target points on the orbital model and 21 landmark points on the reference standard model. The proposed method was compared with two conventional registration methods: the dental splint-based method and the invasive marker frame-based method. Twenty trials were conducted for evaluation. Target registration error and surface registration error were computed to measure accuracy. RESULTS The proposed method showed a target registration error of 1.05 ± 0.52 mm, with greater accuracy than conventional methods (dental splint, 2.10 ± 0.63 mm; invasive marker frame, 1.22 ± 0.46 mm). The proposed method yielded the best results for surface registration error, with 0.38 mm of deviation (dental splint, 0.82 mm; invasive marker frame, 0.60 mm). CONCLUSION The proposed noninvasive patient-specific registration method demonstrated superior results for both target registration error and surface registration error compared with other conventional registration methods for computer-assisted navigation in orbital reconstructive surgery. CLINICAL QUESTION/LEVEL OF EVIDENCE Therapeutic, V.
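Target registration error, the main accuracy metric above, is the residual distance at independent target points after applying the registration estimated from the frame, splint or markers; a minimal sketch with placeholder data follows.

```python
# Sketch of the accuracy metric used above: target registration error (TRE) is the
# residual distance at target points after the registration transform (estimated
# from the frame/splint/markers, not from the targets themselves) has been applied.
# T_reg and the point arrays are hypothetical placeholders.
import numpy as np

def apply_rigid(T, pts):
    """Apply a 4x4 rigid transform to Nx3 points."""
    return (pts @ T[:3, :3].T) + T[:3, 3]

def target_registration_error(T_reg, targets_image, targets_physical):
    """Per-target error (mm): distance between mapped image-space targets and
    their physically measured positions."""
    mapped = apply_rigid(T_reg, targets_image)
    return np.linalg.norm(mapped - targets_physical, axis=1)

if __name__ == "__main__":
    T_reg = np.eye(4)                                       # stand-in registration result
    targets_image = np.random.rand(20, 3) * 50              # 20 embedded target points (mm)
    targets_physical = targets_image + np.random.normal(0, 0.5, (20, 3))
    tre = target_registration_error(T_reg, targets_image, targets_physical)
    print(f"TRE: {tre.mean():.2f} ± {tre.std():.2f} mm")
```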
8. Ahn J, Choi H, Hong J, Hong J. Tracking Accuracy of a Stereo Camera-Based Augmented Reality Navigation System for Orthognathic Surgery. J Oral Maxillofac Surg 2019; 77:1070.e1-1070.e11. [DOI: 10.1016/j.joms.2018.12.032]
9. Hussain R, Lalande A, Guigou C, Bozorg Grayeli A. Contribution of Augmented Reality to Minimally Invasive Computer-Assisted Cranial Base Surgery. IEEE J Biomed Health Inform 2019; 24:2093-2106. [DOI: 10.1109/jbhi.2019.2954003]
10. Bosc R, Fitoussi A, Hersant B, Dao TH, Meningaud JP. Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies. Int J Oral Maxillofac Surg 2019; 48:132-139. [DOI: 10.1016/j.ijom.2018.09.010]
11. Fida B, Cutolo F, di Franco G, Ferrari M, Ferrari V. Augmented reality in open surgery. Updates Surg 2018; 70:389-400. [PMID: 30006832] [DOI: 10.1007/s13304-018-0567-8]
Abstract
Augmented reality (AR) has been successfully providing surgeons with extensive visual information about surgical anatomy to assist them throughout the procedure. AR allows surgeons to view the surgical field through a superimposed 3D virtual model of anatomical details. However, open surgery presents new challenges. This study provides a comprehensive overview of the available literature regarding the use of AR in open surgery, in both clinical and simulated settings. In this way, we aim to analyze the current trends and solutions to help developers and end users discuss and understand the benefits and shortcomings of these systems in open surgery. We performed a PubMed search of the available literature updated to January 2018 using the terms (1) "augmented reality" AND "open surgery", (2) "augmented reality" AND "surgery" NOT "laparoscopic" NOT "laparoscope" NOT "robotic", (3) "mixed reality" AND "open surgery", (4) "mixed reality" AND "surgery" NOT "laparoscopic" NOT "laparoscope" NOT "robotic". The aspects evaluated were the following: real data source, virtual data source, visualization processing modality, tracking modality, registration technique, and AR display type. The initial search yielded 502 studies. After removing duplicates and reading abstracts, a total of 13 relevant studies were chosen. In 1 of the 13 studies, in vitro experiments were performed, while the rest were carried out in a clinical setting, including pancreatic, hepatobiliary, and urogenital surgeries. AR systems in open surgery appear to be versatile and reliable tools in the operating room. However, some technological limitations need to be addressed before implementing them into routine practice.
Affiliation(s)
- Benish Fida: Department of Information Engineering, University of Pisa, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy
- Fabrizio Cutolo: Department of Information Engineering, University of Pisa, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy
- Gregorio di Franco: General Surgery Unit, Department of Surgery, Translational and New Technologies, University of Pisa, Pisa, Italy
- Mauro Ferrari: Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy; Vascular Surgery Unit, Cisanello University Hospital AOUP, Pisa, Italy
- Vincenzo Ferrari: Department of Information Engineering, University of Pisa, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, EndoCAS Center, University of Pisa, Pisa, Italy
12. Robust and Accurate Algorithm for Wearable Stereoscopic Augmented Reality with Three Indistinguishable Markers. Electronics 2016. [DOI: 10.3390/electronics5030059]
13. Soft tissue coverage on the segmentation accuracy of the 3D surface-rendered model from cone-beam CT. Clin Oral Investig 2016; 21:921-930. [PMID: 27206862] [PMCID: PMC5360826] [DOI: 10.1007/s00784-016-1844-x]
Abstract
OBJECTIVES The aim of this study is to investigate the effect of soft tissue presence on the segmentation accuracy of the 3D hard tissue models from cone-beam computed tomography (CBCT). MATERIALS AND METHODS Seven pairs of CBCT Digital Imaging and Communication in Medicine (DICOM) datasets, containing data of human cadaver heads and their respective dry skulls, were used. The effect of the soft tissue presence on the accuracy of the segmented models was evaluated by performing linear and angular measurements and by superimposition and color mapping of the surface discrepancies after splitting the mandible and maxillo-facial complex in the midsagittal plane. RESULTS The linear and angular measurements showed significant differences for the more posterior transversal measurements on the mandible (p < 0.01). By splitting and superimposing the maxillo-facial complex, the mean root-mean-square error (RMSE) as a measurement of inaccuracy decreased insignificantly from 0.936 to 0.922 mm (p > 0.05). The RMSE value for the mandible, however, significantly decreased from 1.240 to 0.981 mm after splitting (p < 0.01). CONCLUSIONS The soft tissue presence seems to affect the accuracy of the 3D hard tissue model obtained from a cone-beam CT, below a generally accepted level of clinical significance of 1 mm. However, this level of accuracy may not meet the requirement for applications where high precision is paramount. CLINICAL RELEVANCE Accuracy of CBCT-based 3D surface-rendered models, especially of the hard tissues, are crucial in several dental and medical applications, such as implant planning and virtual surgical planning on patients undergoing orthognathic and navigational surgeries. When used in applications where high precision is paramount, the effect of soft tissue presence should be taken into consideration during the segmentation process.
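The RMSE-based surface comparison described above can be approximated by closest-point distances between superimposed surfaces; the sketch below uses hypothetical vertex sets and a nearest-neighbour query as a stand-in for a full mesh-to-mesh comparison.

```python
# Sketch of the surface-discrepancy metric described above: after superimposing two
# surface models, per-vertex distances to the reference surface are summarized as a
# root-mean-square error (RMSE). Nearest-neighbour distances serve as a simple
# surface-to-surface approximation; the vertex sets below are hypothetical.
import numpy as np
from scipy.spatial import cKDTree

def surface_rmse(test_vertices, reference_vertices):
    """Overall RMSE (same units as input, e.g. mm) and per-vertex distance map."""
    d, _ = cKDTree(reference_vertices).query(test_vertices)
    return float(np.sqrt(np.mean(d ** 2))), d

if __name__ == "__main__":
    ref = np.random.rand(2000, 3) * 80                     # reference (dry-skull) surface
    test = ref + np.random.normal(0, 0.6, ref.shape)       # cadaver-head-derived surface
    rmse, per_vertex = surface_rmse(test, ref)
    print(f"RMSE: {rmse:.3f} mm  (max deviation {per_vertex.max():.3f} mm)")
```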
14. Mandibular angle split osteotomy based on a novel augmented reality navigation using specialized robot-assisted arms: a feasibility study. J Craniomaxillofac Surg 2016; 44:215-223. [DOI: 10.1016/j.jcms.2015.10.024]
15. Badiali G, Ferrari V, Cutolo F, Freschi C, Caramella D, Bianchi A, Marchetti C. Augmented reality as an aid in maxillofacial surgery: validation of a wearable system allowing maxillary repositioning. J Craniomaxillofac Surg 2014; 42:1970-1976. [DOI: 10.1016/j.jcms.2014.09.001]
16. Deng W, Li F, Wang M, Song Z. Easy-to-use augmented reality neuronavigation using a wireless tablet PC. Stereotact Funct Neurosurg 2013; 92:17-24. [PMID: 24216673] [DOI: 10.1159/000354816]
Abstract
BACKGROUND/AIMS Augmented reality (AR) technology solves the problem of view switching in traditional image-guided neurosurgery systems by integrating computer-generated objects into the actual scene. However, the state-of-the-art AR solution using head-mounted displays has not been widely accepted in clinical applications because it causes some inconvenience for the surgeon during surgery. METHODS In this paper, we present a Tablet-AR system that transmits navigation information to a movable tablet PC via a wireless local area network and overlays this information on the tablet screen, which simultaneously displays the actual scene captured by its back-facing camera. With this system, the surgeon can directly observe the intracranial anatomical structure of the patient with the overlaid virtual projection images to guide the surgery. RESULTS The alignment errors in the skull specimen study and clinical experiment were 4.6 pixels (approx. 1.6 mm) and 6 pixels (approx. 2.1 mm), respectively. The system was also used for navigation in 2 actual clinical cases of neurosurgery, which demonstrated its feasibility in a clinical application. CONCLUSIONS The easy-to-use Tablet-AR system presented in this study is accurate and feasible in clinical applications and has the potential to become a routine device in AR neuronavigation.
Affiliation(s)
- Weiwei Deng: Digital Medical Research Center, Shanghai Medical School, Fudan University, Shanghai, PR China
17. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study. Int J Oral Sci 2013; 5:98-102. [PMID: 23703710] [PMCID: PMC3707071] [DOI: 10.1038/ijos.2013.26]
Abstract
This study evaluated the feasibility and accuracy of a three-dimensional augmented reality system incorporating integral videography for imaging oral and maxillofacial regions, based on preoperative computed tomography data. Three-dimensional surface models of the jawbones, based on the computed tomography data, were used to create the integral videography images of a subject's maxillofacial area. The three-dimensional augmented reality system (integral videography display, computed tomography, a position tracker and a computer) was used to generate a three-dimensional overlay that was projected on the surgical site via a half-silvered mirror. Thereafter, a feasibility study was performed on a volunteer. The accuracy of this system was verified on a solid model while simulating bone resection. Positional registration was attained by identifying and tracking the patient's and surgical instrument's positions. Thus, integral videography images of jawbones, teeth and the surgical tool were superimposed in the correct position. Stereoscopic images viewed from various angles were accurately displayed. Change in the viewing angle did not negatively affect the surgeon's ability to simultaneously observe the three-dimensional images and the patient, without special glasses. The difference in the three-dimensional position of each measuring point between the solid model and the augmented reality navigation was almost negligible (<1 mm), indicating that the system was highly accurate. This augmented reality system was highly accurate and effective for surgical navigation and for overlaying a three-dimensional computed tomography image on a patient's surgical area, enabling the surgeon to understand the positional relationship between the preoperative image and the actual surgical site with the naked eye.
18. Kersten-Oertel M, Jannin P, Collins DL. The state of the art of visualization in mixed reality image guided surgery. Comput Med Imaging Graph 2013; 37:98-112. [PMID: 23490236] [DOI: 10.1016/j.compmedimag.2013.01.009]
Abstract
This paper presents a review of the state of the art of visualization in mixed reality image guided surgery (IGS). We used the DVV (data, visualization processing, view) taxonomy to classify a large unbiased selection of publications in the field. The goal of this work was not only to give an overview of current visualization methods and techniques in IGS but more importantly to analyze the current trends and solutions used in the domain. In surveying the current landscape of mixed reality IGS systems, we identified a strong need to assess which of the many possible data sets should be visualized at particular surgical steps, to focus on novel visualization processing techniques and interface solutions, and to evaluate new systems.
Affiliation(s)
- Marta Kersten-Oertel: Department of Biomedical Engineering, McGill University, McConnell Brain Imaging Center, Montreal Neurological Institute, Montréal, Canada
19. A Realistic Test and Development Environment for Mixed Reality in Neurosurgery. Augmented Environments for Computer-Assisted Interventions 2012. [DOI: 10.1007/978-3-642-32630-1_2]
20. Kockro RA, Tsai YT, Ng I, Hwang P, Zhu C, Agusanto K, Hong LX, Serra L. Dex-ray: augmented reality neurosurgical navigation with a handheld video probe. Neurosurgery 2010; 65:795-807; discussion 807-808. [PMID: 19834386] [DOI: 10.1227/01.neu.0000349918.36700.1c]
Abstract
OBJECTIVE We developed an augmented reality system that enables intraoperative image guidance by using 3-dimensional (3D) graphics overlaid on a video stream. We call this system DEX-Ray and report on its development and the initial intraoperative experience in 12 cases. METHODS DEX-Ray consists of a tracked handheld probe that integrates a lipstick-size video camera. The camera looks over the probe's tip into the surgical field. The camera's video stream is augmented with coregistered, multimodality 3D graphics and landmarks obtained during neurosurgical planning with 3D workstations. The handheld probe functions as a navigation device to view and point and as an interaction device to adjust the 3D graphics. We tested the system's accuracy in the laboratory and evaluated it intraoperatively with a series of tumor and vascular cases. RESULTS DEX-Ray provided accurate and real-time video-based augmented reality display. The system could be seamlessly integrated into the surgical workflow. The see-through effect revealing 3D information below the surgically exposed surface proved to be of significant value, especially during the macroscopic phase of an operation, providing easily understandable structural navigational information. Navigation in deep and narrow surgical corridors was limited by the camera resolution and light sensitivity. CONCLUSION The system was perceived as an improved navigational experience because the augmented see-through effect allowed direct understanding of the surgical anatomy beyond the visible surface and direct guidance toward surgical targets.
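The see-through overlay described above ultimately depends on projecting coregistered 3D planning data into the tracked camera's image; the following sketch shows that step with OpenCV, using hypothetical intrinsics, pose and planning points rather than anything from the DEX-Ray system.

```python
# Sketch of the basic video-AR overlay step behind a handheld-probe system like the
# one described above (illustrative only): project coregistered 3D planning points
# into the camera image using the tracked camera pose and intrinsics, then draw
# them on the current video frame. Pose, intrinsics and points are hypothetical.
import numpy as np
import cv2

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                        # camera intrinsics
dist = np.zeros(5)                                     # assume negligible distortion
rvec = np.zeros(3)                                     # camera pose from the tracking system
tvec = np.array([0.0, 0.0, 400.0])                     # (rotation as Rodrigues vector, mm)

def overlay_planning_points(frame, points_world):
    """Draw planning points (Nx3, patient/world space) onto the BGR video frame."""
    px, _ = cv2.projectPoints(points_world.astype(np.float32), rvec, tvec, K, dist)
    for (u, v) in px.reshape(-1, 2):
        if 0 <= u < frame.shape[1] and 0 <= v < frame.shape[0]:
            cv2.circle(frame, (int(u), int(v)), 4, (0, 255, 0), -1)
    return frame

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)    # stand-in for a captured video frame
    tumour_outline = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [-10.0, 0.0, 0.0]])
    cv2.imwrite("overlay_preview.png", overlay_planning_points(frame, tumour_outline))
```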
Affiliation(s)
- Ralf A Kockro: Department of Neurosurgery, University Hospital Zürich, Zürich, Switzerland
21. Ghanai S, Marmulla R, Wiechnik J, Mühling J, Kotrikova B. Computer-assisted three-dimensional surgical planning: 3D virtual articulator: technical note. Int J Oral Maxillofac Surg 2010; 39:75-82. [DOI: 10.1016/j.ijom.2009.10.023]
22. Vikal S, U-Thainual P, Carrino JA, Iordachita I, Fischer GS, Fichtinger G. Perk Station: percutaneous surgery training and performance measurement platform. Comput Med Imaging Graph 2009; 34:19-32. [PMID: 19539446] [DOI: 10.1016/j.compmedimag.2009.05.001]
Abstract
MOTIVATION Image-guided percutaneous (through the skin) needle-based surgery has become part of routine clinical practice in performing procedures such as biopsies, injections and therapeutic implants. A novice physician typically performs needle interventions under the supervision of a senior physician; a slow and inherently subjective training process that lacks objective, quantitative assessment of the surgical skill and performance. Shortening the learning curve and increasing procedural consistency are important factors in assuring high-quality medical care. METHODS This paper describes a laboratory validation system, called Perk Station, for standardized training and performance measurement under different assistance techniques for needle-based surgical guidance systems. The initial goal of the Perk Station is to assess and compare different techniques: 2D image overlay, biplane laser guide, laser protractor and conventional freehand. The main focus of this manuscript is the planning and guidance software system developed on the 3D Slicer platform, a free, open source software package designed for visualization and analysis of medical image data. RESULTS The prototype Perk Station has been successfully developed, the associated needle insertion phantoms were built, and the graphical user interface was fully implemented. The system was inaugurated in undergraduate teaching and a wide array of outreach activities. Initial results, experiences, ongoing activities and future plans are reported.
23. Krempien R, Hoppe H, Kahrs L, Daeuber S, Schorr O, Eggers G, Bischof M, Munter MW, Debus J, Harms W. Projector-based augmented reality for intuitive intraoperative guidance in image-guided 3D interstitial brachytherapy. Int J Radiat Oncol Biol Phys 2007; 70:944-952. [PMID: 18164834] [DOI: 10.1016/j.ijrobp.2007.10.048]
Abstract
PURPOSE The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow an intuitive real-time intraoperative orientation. METHODS AND MATERIALS The developed system consists of a common video projector, two high-resolution charge coupled device cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by means of stereoscopically tracking passive markers attached to the patient. RESULTS In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm by moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. Mean deviation of all needles of an implant was 1.4 mm (range, 0.3-2.7 mm). CONCLUSIONS The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation.
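Coded-light registration of the kind mentioned above typically relies on projecting a sequence of binary stripe patterns and decoding, per camera pixel, which projector column illuminated it; the sketch below generates and decodes Gray-code patterns as a self-contained illustration of that general idea, not as the authors' implementation.

```python
# Sketch of the coded-light principle (illustrative only): the projector shows a
# sequence of Gray-code stripe patterns; decoding the on/off sequence observed at
# each camera pixel yields the projector column illuminating it, which provides
# dense projector-camera correspondences for surface registration.
import numpy as np

def gray_code_patterns(width, n_bits):
    """Return n_bits binary stripe images (1 x width), most significant bit first."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)
    return [((gray >> (n_bits - 1 - b)) & 1).astype(np.uint8)[None, :] for b in range(n_bits)]

def decode_columns(bit_images):
    """Recover the projector column index seen at each pixel from thresholded captures."""
    gray = np.zeros_like(bit_images[0], dtype=np.int64)
    for img in bit_images:                     # accumulate bits, MSB first
        gray = (gray << 1) | img.astype(np.int64)
    binary, mask = gray.copy(), gray >> 1
    while mask.any():                          # Gray code -> binary
        binary ^= mask
        mask >>= 1
    return binary

if __name__ == "__main__":
    n_bits, width = 10, 1024
    patterns = gray_code_patterns(width, n_bits)
    # In a real system the camera observes these patterns on the patient's skin;
    # here the ideal patterns are decoded directly as a self-check.
    recovered = decode_columns(patterns)
    assert np.array_equal(recovered[0], np.arange(width))
    print("decoded columns match projected columns")
```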
Affiliation(s)
- Robert Krempien: Department of Radiation Oncology, University of Heidelberg, Heidelberg, Germany
24. Widmann G. Image-guided surgery and medical robotics in the cranial area. Biomed Imaging Interv J 2007; 3:e11. [PMID: 21614255] [PMCID: PMC3097655] [DOI: 10.2349/biij.3.1.e11]
Abstract
Surgery in the cranial area includes complex anatomic situations with high-risk structures and high demands for functional and aesthetic results. Conventional surgery requires that the surgeon transfers complex anatomic and surgical planning information, using spatial sense and experience. The surgical procedure depends entirely on the manual skills of the operator. The development of image-guided surgery provides new revolutionary opportunities by integrating presurgical 3D imaging and intraoperative manipulation. Augmented reality, mechatronic surgical tools, and medical robotics may continue to progress in surgical instrumentation, and ultimately, surgical care. The aim of this article is to review and discuss state-of-the-art surgical navigation and medical robotics, image-to-patient registration, aspects of accuracy, and clinical applications for surgery in the cranial area.
Affiliation(s)
- G Widmann: Department of Radiology, Innsbruck Medical University, Anichstr, Austria
25.
Abstract
PURPOSE OF REVIEW Patients with advanced head and neck cancer are being treated with chemo-radiotherapy, and life is being prolonged, with or without persistent disease, for longer than previously. Hypercalcaemia may present in patients with advanced or disseminated head and neck cancer, and, as such, these patients may present to a wide variety of clinicians for advice concerning their symptoms and illness. Modes of presentation of hypercalcaemia and treatment strategies are reviewed. RECENT FINDINGS There have previously been few large series of head and neck cancer patients diagnosed with hypercalcaemia, which may or may not have been related to the cancer being treated. Investigations, by way of blood/serum calcium level, may identify such patients. Patients with cancer-related hypercalcaemia have a poor prognosis, but many may respond temporarily to treatment when offered, with an improvement in their quality of life and death. SUMMARY Hypercalcaemia should be considered in all patients who have, or may have, a diagnosis of head and neck cancer and who present unwell with symptoms of fatigue, lethargy and somnolence. Investigation must include serum calcium (corrected for serum albumin binding) and parathyroid hormone level. Patients may be treated by a combination of rehydration and bisphosphonate therapy until the serum calcium is reduced to a level below 3 mmol/l. The majority of patients diagnosed with hypercalcaemia due to head and neck malignancy die of their disease in the short term, but some may enjoy a prolongation of life with reasonable quality if diagnosed and treated aggressively.
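The recommendation above to correct serum calcium for albumin binding is commonly implemented with an adjustment of the following form; this is a widely used convention, not a formula given in the abstract, and local laboratories may use different coefficients.

```python
# Albumin-adjusted calcium (a common convention, not taken from this paper;
# laboratories may apply different coefficients or reference albumin values).
def albumin_corrected_calcium(total_ca_mmol_l: float, albumin_g_l: float) -> float:
    """Adjusted calcium (mmol/L): add 0.02 mmol/L per g/L of albumin below 40 g/L."""
    return total_ca_mmol_l + 0.02 * (40.0 - albumin_g_l)

if __name__ == "__main__":
    corrected = albumin_corrected_calcium(2.9, 25.0)    # hypothetical patient values
    note = " (above the 3 mmol/L treatment threshold mentioned above)" if corrected > 3.0 else ""
    print(f"corrected calcium: {corrected:.2f} mmol/L{note}")
```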
Affiliation(s)
- Patrick J Bradley: Department of Oto-Rhino-Laryngology, Head and Neck Surgery, University Hospital, Nottingham, UK