1. Lebovic J, Galetta MS, Sardar ZM, Goytan M, Daniels AH, Miyanji F, Smith JS, Burton DC, Protopsaltis TS. Enabling technology in adult spinal deformity. Spine Deform 2025. [PMID: 40234366; DOI: 10.1007/s43390-025-01086-z]
Abstract
This review analyzes enabling technology in adult spinal deformity (ASD) surgery, with a focus on optimizing safety and teaching. The prevalence of ASD is rising, and recent technological advancements can empower surgeons to improve outcomes for ASD patients, although each technology comes with specific challenges. The paper highlights opportunities and potential obstacles in effective technology integration and assesses key enabling technologies, including surgical planning software, machine learning, three-dimensional printing, augmented and virtual reality, patient-specific instrumentation, and navigation and robotics.
Affiliation(s)
- Jordan Lebovic: Department of Orthopedic Surgery, NYU Langone Orthopedic Hospital, New York, NY, USA
- Matthew S Galetta: Department of Orthopedic Surgery, NYU Langone Orthopedic Hospital, New York, NY, USA
- Zeeshan M Sardar: Department of Orthopaedic Surgery, Columbia University Medical Center, The Spine Hospital at New York Presbyterian, New York, USA
- Michael Goytan: Winnipeg Spine Program, Health Sciences Centre, University of Manitoba, Winnipeg, MB, Canada
- Alan H Daniels: Department of Orthopedics, Warren Alpert Medical School of Brown University, East Providence, RI, USA
- Firoz Miyanji: Department of Orthopaedics, British Columbia Children's Hospital, Vancouver, BC, Canada
- Justin S Smith: Department of Neurosurgery, University of Virginia Medical Center, Charlottesville, VA, USA
- Douglas C Burton: Department of Orthopaedic Surgery, University of Kansas Medical Center, Kansas City, KS, USA

2. Jang Y, Lim S, Lee S, Je LG, Kim T, Joo S, Seo J, Lee D, Koh JC. Clinical Application of an Augmented Reality Navigation System for Transforaminal Epidural Injection: A Randomized Controlled Trial. J Clin Med 2024; 13:1992. [PMID: 38610758; PMCID: PMC11012780; DOI: 10.3390/jcm13071992]
Abstract
Objectives: Augmented reality (AR) navigation systems are emerging to simplify and enhance the precision of medical procedures. Lumbosacral transforaminal epidural injection is a commonly performed procedure for the treatment and diagnosis of radiculopathy, but accurate needle placement while avoiding critical structures remains a challenge. We therefore conducted a randomized controlled trial of our augmented reality navigation system. Methods: This randomized controlled study involved 28 patients, split between a traditional C-arm guided group (control) and an AR navigation guided group (AR-NAVI), to compare procedure efficiency and radiation exposure. The AR-NAVI group used a real-time tracking system displaying the spinal structure and needle position on an AR head-mounted display. The procedural time and C-arm usage (radiation exposure) were measured. Results: All patients underwent successful procedures without complications. The AR-NAVI group demonstrated significantly reduced time (58.57 ± 33.31 vs. 124.91 ± 41.14, p < 0.001) and C-arm usage (3.79 ± 1.97 vs. 8.86 ± 3.94, p < 0.001) for needle entry to the target point. Conclusions: The AR navigation system significantly improved procedure efficiency and safety by reducing time and radiation exposure, suggesting a promising direction for future enhancements and validation.
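The group comparison above can be checked from the reported summary statistics alone. The sketch below is an illustration, not the authors' analysis code; it assumes an even 14/14 split of the 28 patients (the paper reports only the total) and computes Welch's t statistic for the needle-entry times:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent groups, from summary statistics."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)  # standard error of the difference
    return (mean2 - mean1) / se

# Needle-entry time: AR-NAVI 58.57 +/- 33.31 vs. C-arm 124.91 +/- 41.14
t = welch_t(58.57, 33.31, 14, 124.91, 41.14, 14)
print(round(t, 2))  # 4.69 -- with ~25 degrees of freedom this gives p < 0.001
```

A t of about 4.7 at roughly 25 degrees of freedom is consistent with the reported p < 0.001.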
Affiliation(s)
- Yookyung Jang: Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Sunghwan Lim: Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
- Sunhee Lee: Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Lee Gyeong Je: Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Taesan Kim: Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Subin Joo: Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu 42994, Republic of Korea
- Joonho Seo: Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu 42994, Republic of Korea
- Deukhee Lee: Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
- Jae Chul Koh: Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea

3. Youssef S, McDonnell JM, Wilson KV, Turley L, Cunniffe G, Morris S, Darwish S, Butler JS. Accuracy of augmented reality-assisted pedicle screw placement: a systematic review. Eur Spine J 2024; 33:974-984. [PMID: 38177834; DOI: 10.1007/s00586-023-08094-5]
Abstract
OBJECTIVE Conventional freehand methods of pedicle screw placement are associated with significant complications due to the close proximity of neural and vascular structures. Recent advances in augmented reality surgical navigation (ARSN) have led to its adoption into spine surgery, yet little is known regarding its overall accuracy. The purpose of this study is to delineate the overall accuracy of ARSN pedicle screw placement across various models. METHODS A systematic review was conducted of the Medline/PubMed, Cochrane, and Embase databases according to the PRISMA guidelines. Relevant data extracted included reports of pedicle screw placement accuracy and breaches, as defined by the Gertzbein-Robbins classification, in addition to deviation from the pre-planned trajectory and entry point. Accuracy was defined as the summation of grade 0 and grade 1 events per the Gertzbein-Robbins classification. RESULTS Twenty studies reported clinically accurately placed screws, with per-study accuracy ranging from 26.3% to 100%; overall, 2095 screws (93.1%) were deemed clinically accurate. Furthermore, 5.4% (112/2088) of screws were reported as grade 2 breaches, 1.6% (33/2088) as grade 3 breaches, 3.1% (29/926) as medial breaches, and 2.3% (21/926) as lateral breaches. Mean linear deviation ranged from 1.3 to 5.99 mm, while mean angular/trajectory deviation ranged from 1.6° to 5.88°. CONCLUSION The results of this study highlight the overall accuracy of ARSN pedicle screw placement. However, further robust prospective studies are needed to compare it accurately with conventional methods of pedicle screw placement.
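The review's accuracy definition (grade A plus grade B, i.e. breach under 2 mm, under Gertzbein-Robbins) is a simple proportion over pooled screws. A minimal sketch of that computation, assuming a common reading of the grade thresholds and a hypothetical sample of per-screw breach distances (not data from any cited study):

```python
def gertzbein_robbins_grade(breach_mm):
    """Map a pedicle-wall breach distance (mm) to a Gertzbein-Robbins grade."""
    if breach_mm == 0:
        return 0  # grade A: screw fully contained within the pedicle
    elif breach_mm < 2:
        return 1  # grade B: breach < 2 mm
    elif breach_mm < 4:
        return 2  # grade C: breach < 4 mm
    elif breach_mm < 6:
        return 3  # grade D: breach < 6 mm
    return 4      # grade E: breach >= 6 mm

def clinical_accuracy(breaches_mm):
    """Share of screws graded A or B -- the review's 'clinically accurate' definition."""
    ok = sum(1 for b in breaches_mm if gertzbein_robbins_grade(b) <= 1)
    return ok / len(breaches_mm)

# hypothetical per-screw breach distances in mm
sample = [0, 0, 1.2, 0, 3.1, 0, 1.8, 5.2, 0, 0]
print(f"{clinical_accuracy(sample):.0%}")  # 80%
```

Applied to a pooled dataset, this yields figures directly comparable to the 93.1% overall accuracy quoted above.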
Affiliation(s)
- Salma Youssef: School of Medicine, University College Dublin, Belfield, Dublin, Ireland
- Jake M McDonnell: National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland; Trinity Biomedical Sciences Institute, Trinity College Dublin, Dublin, Ireland
- Kielan V Wilson: School of Medicine, University College Dublin, Belfield, Dublin, Ireland; National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Luke Turley: Department of Orthopaedics, Tallaght University Hospital, Tallaght, Dublin, Ireland
- Gráinne Cunniffe: National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Seamus Morris: School of Medicine, University College Dublin, Belfield, Dublin, Ireland; National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland
- Stacey Darwish: National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland; Department of Orthopaedics, St. Vincent's University Hospital, Dublin, Ireland
- Joseph S Butler: School of Medicine, University College Dublin, Belfield, Dublin, Ireland; National Spinal Injuries Unit, Mater Misericordiae University Hospital, Dublin, Ireland

4. Adida S, Legarreta AD, Hudson JS, McCarthy D, Andrews E, Shanahan R, Taori S, Lavadi RS, Buell TJ, Hamilton DK, Agarwal N, Gerszten PC. Machine Learning in Spine Surgery: A Narrative Review. Neurosurgery 2024; 94:53-64. [PMID: 37930259; DOI: 10.1227/neu.0000000000002660]
Abstract
Artificial intelligence and machine learning (ML) offer the potential for revolutionary advances in spine surgery. Within the past 5 years, novel applications of ML have assisted in surgical decision-making, intraoperative imaging and navigation, and optimization of clinical outcomes. ML has the capacity to address many different clinical needs and improve diagnostic and surgical techniques. This review discusses current applications of ML in spine surgery, breaking down its implementation preoperatively, intraoperatively, and postoperatively. Ethical considerations and challenges in ML implementation must be addressed to maximally benefit patients, spine surgeons, and the healthcare system. Areas for future research in augmented and mixed reality, along with limitations in generalizability and bias, are also highlighted.
Affiliation(s)
- Samuel Adida, Andrew D Legarreta, Joseph S Hudson, David McCarthy, Edward Andrews, Regan Shanahan, Suchet Taori, Raj Swaroop Lavadi, Thomas J Buell, D Kojo Hamilton, and Peter C Gerszten: Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Nitin Agarwal: Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA; Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, USA

5. Lin Z, Lei C, Yang L. Modern Image-Guided Surgery: A Narrative Review of Medical Image Processing and Visualization. Sensors (Basel) 2023; 23:9872. [PMID: 38139718; PMCID: PMC10748263; DOI: 10.3390/s23249872]
Abstract
Medical image analysis forms the basis of image-guided surgery (IGS) and many of its fundamental tasks. Driven by the growing number of medical imaging modalities, the medical imaging research community has developed methods and achieved functionality breakthroughs. However, given the overwhelming pool of information in the literature, it has become increasingly challenging for researchers to extract context-relevant information for specific applications, especially when many widely used methods exist in a variety of versions optimized for their respective application domains. Further equipped with sophisticated three-dimensional (3D) medical image visualization and digital reality technology, medical experts could enhance their performance in IGS several-fold. The goal of this narrative review is to organize the key components of IGS, in the aspects of medical image processing and visualization, with new perspective and insights. The literature search was conducted using mainstream academic search engines with a combination of keywords relevant to the field up until mid-2022. This survey systematically summarizes basic, mainstream, and state-of-the-art medical image processing methods, as well as how visualization technologies such as augmented/mixed/virtual reality (AR/MR/VR) are enhancing performance in IGS. Further, we hope that this survey will shed some light on the future of IGS in the face of challenges and opportunities for the research directions of medical image processing and visualization.
Affiliation(s)
- Zhefan Lin: School of Mechanical Engineering, Zhejiang University, Hangzhou 310030, China; ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China
- Chen Lei: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China
- Liangjing Yang: School of Mechanical Engineering, Zhejiang University, Hangzhou 310030, China; ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China

6. Poullay Silven M, Nicoletti GF, Iacopino DG. Letter to the Editor Regarding "Augmented Reality in Minimally Invasive Spinal Surgery: A Narrative Review of Available Technology". World Neurosurg 2023; 180:259-260. [PMID: 38115389; DOI: 10.1016/j.wneu.2023.08.095]
Affiliation(s)
- Manikon Poullay Silven: Department of Biomedicine, Neurosciences and Advanced Diagnostics, Neurosurgical Clinic, AOUP "Paolo Giaccone", Post Graduate Residency Program in Neurologic Surgery, School of Medicine, University of Palermo, Palermo, Italy
- Domenico Gerardo Iacopino: Department of Biomedicine, Neurosciences and Advanced Diagnostics, Neurosurgical Clinic, AOUP "Paolo Giaccone", Post Graduate Residency Program in Neurologic Surgery, School of Medicine, University of Palermo, Palermo, Italy

7. Suter D, Hodel S, Liebmann F, Fürnstahl P, Farshad M. Factors affecting augmented reality head-mounted device performance in real OR. Eur Spine J 2023; 32:3425-3433. [PMID: 37552327; DOI: 10.1007/s00586-023-07826-x]
Abstract
PURPOSE Over the last years, interest and efforts to implement augmented reality (AR) in orthopedic surgery through head-mounted devices (HMDs) have increased. However, the majority of experiments have been preclinical, within a controlled laboratory environment. The operating room (OR) is a more challenging environment, with various confounding factors potentially affecting the performance of an AR-HMD. The aim of this study was to assess the performance of an AR-HMD in a real-life OR setting. METHODS An established AR application using the HoloLens 2 HMD was tested in an OR and in a laboratory by two users. The accuracy of the hologram overlay, the time to complete the trial, the number of rejected registration attempts, the delay in live overlay of the hologram, and the number of completely failed runs were recorded. Further, different OR setting parameters (light condition, setting up partitions, movement of personnel, and anchor placement) were modified and compared. RESULTS Time for full registration was longer in the OR (48 s, IQR 24 s) than in the laboratory (33 s, IQR 10 s; p < 0.001). The other investigated parameters did not differ significantly when an optimal OR setup was used. Within the OR, the strongest influence on AR-HMD performance was the light condition, with direct illumination of the situs being the least favorable. CONCLUSION AR-HMDs are affected by different OR setups. Standardization measures for better AR-HMD performance include avoiding direct light illumination on the situs, setting up partitions, and minimizing the movement of personnel.
Affiliation(s)
- Daniel Suter: Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008 Zurich, Switzerland; Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Sandro Hodel: Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008 Zurich, Switzerland; Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Florentin Liebmann: Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008 Zurich, Switzerland
- Philipp Fürnstahl: Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008 Zurich, Switzerland
- Mazda Farshad: Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland; Spine Division, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland

8. Bhatt FR, Orosz LD, Tewari A, Boyd D, Roy R, Good CR, Schuler TC, Haines CM, Jazini E. Augmented Reality-Assisted Spine Surgery: An Early Experience Demonstrating Safety and Accuracy with 218 Screws. Global Spine J 2023; 13:2047-2052. [PMID: 35000409; PMCID: PMC10556900; DOI: 10.1177/21925682211069321]
Abstract
STUDY DESIGN Prospective cohort study. OBJECTIVES In spine surgery, accurate screw guidance is critical to achieving satisfactory fixation. Augmented reality (AR) is a novel technology to assist in screw placement and has shown promising results in early studies. This study aims to provide our early experience evaluating safety and efficacy with a Food and Drug Administration-approved head-mounted augmented reality (HMD-AR) device. METHODS Consecutive adult patients undergoing AR-assisted thoracolumbar fusion between October 2020 and August 2021 with 2-week follow-up were included. Preoperative, intraoperative, and postoperative data were collected, including demographics, complications, revision surgeries, and AR performance. Intraoperative 3D imaging was used to assess screw accuracy using the Gertzbein-Robbins (G-R) grading scale. RESULTS Thirty-two patients (40.6% male) were included, with a total of 222 screws executed using HMD-AR. Intraoperatively, 4 (1.8%) were deemed misplaced and revised using AR or freehand technique. The remaining 218 (98.2%) screws were placed accurately. There were no intraoperative adverse events or complications, and AR was not abandoned in any case. Of the 208 AR-placed screws with 3D imaging confirmation, 97.1% were considered clinically accurate (91.8% grade A, 5.3% grade B). There were no early postoperative surgical complications or revision surgeries during the 2-week follow-up. CONCLUSIONS This early experience study reports an overall G-R accuracy of 97.1% across 218 AR-guided screws with no intraoperative or early postoperative complications. This shows that HMD-AR-assisted spine surgery is a safe and accurate tool for pedicle, cortical, and pelvic fixation. Larger studies are needed to continue to support this compelling evolution in spine surgery.
Affiliation(s)
- Anant Tewari: National Spine Health Foundation, Reston, VA, USA
- David Boyd: Reston Radiology Consultants, Reston, VA, USA
- Rita Roy: National Spine Health Foundation, Reston, VA, USA

9. Cao B, Yuan B, Xu G, Zhao Y, Sun Y, Wang Z, Zhou S, Xu Z, Wang Y, Chen X. A Pilot Human Cadaveric Study on Accuracy of the Augmented Reality Surgical Navigation System for Thoracolumbar Pedicle Screw Insertion Using a New Intraoperative Rapid Registration Method. J Digit Imaging 2023; 36:1919-1929. [PMID: 37131064; PMCID: PMC10406793; DOI: 10.1007/s10278-023-00840-x]
Abstract
This study evaluated the feasibility and accuracy of AR-assisted pedicle screw placement in cadavers using a new intraoperative rapid registration method that combines preoperative CT scanning with intraoperative C-arm 2D fluoroscopy. Five cadavers with intact thoracolumbar spines were employed. Intraoperative registration was performed using anteroposterior and lateral views from preoperative CT scanning and intraoperative 2D fluoroscopic images. Patient-specific targeting guides were used for pedicle screw placement from Th1 to L5, totaling 166 screws. Instrumentation for each side was randomized (augmented reality surgical navigation (ARSN) vs. C-arm) with an equal distribution of 83 screws per group. CT was performed to evaluate the accuracy of both techniques by assessing the screw positions and the deviations between the inserted screws and the planned trajectories. Postoperative CT showed that 98.80% (82/83) of screws in the ARSN group and 72.29% (60/83) in the C-arm group were within the 2-mm safe zone (p < 0.001). The mean time for instrumentation per level was significantly shorter in the ARSN group than in the C-arm group (56.17 ± 3.33 s vs. 99.22 ± 9.03 s, p < 0.001). The overall intraoperative registration time was 17.2 ± 3.5 s per segment. AR-based navigation can provide surgeons with accurate guidance for pedicle screw insertion and save operative time when used with this rapid registration method combining preoperative CT and intraoperative C-arm 2D fluoroscopy.
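The headline comparison (82/83 vs. 60/83 screws inside the 2-mm safe zone) can be reproduced from the reported counts with a two-proportion z-test. This is a sketch, not the authors' analysis; it assumes the pooled-proportion form of the test:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z statistic using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# ARSN: 82/83 screws within the 2-mm safe zone; C-arm: 60/83
z = two_proportion_z(82, 83, 60, 83)
print(round(z, 2))  # |z| > 3.29 corresponds to two-sided p < 0.001
```

A z of roughly 4.9 is well past the 0.001 significance threshold, matching the reported p < 0.001.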
Affiliation(s)
- Bing Cao, Bo Yuan, Guofeng Xu, Yin Zhao, Yanqing Sun, Zhiwei Wang, Shengyuan Zhou, Zheng Xu, and Xiongsheng Chen: Spine Center, Department of Orthopaedics, Shanghai Changzheng Hospital, Second Military Medical University, 415 Fengyang Road, Huangpu District, Shanghai, China
- Yao Wang: Linyan Medical Technology Company Limited, 528 Ruiqing Road, Pudong New District, Shanghai, China

10. Taghian A, Abo-Zahhad M, Sayed MS, Abd El-Malek AH. Virtual and augmented reality in biomedical engineering. Biomed Eng Online 2023; 22:76. [PMID: 37525193; PMCID: PMC10391968; DOI: 10.1186/s12938-023-01138-3]
Abstract
BACKGROUND In the future, extended reality technology will be widely used: people will employ virtual reality (VR) and augmented reality (AR) technologies in their daily lives, hobbies, many types of entertainment, and work. Medical augmented reality has evolved, with applications ranging from medical education to image-guided surgery. The bulk of research is focused on clinical applications, with the majority devoted to surgery or intervention, followed by rehabilitation and treatment applications. Numerous studies have also examined the use of augmented reality in medical education and training. METHODS Using the databases Semantic Scholar, Web of Science, Scopus, IEEE Xplore, and ScienceDirect, a scoping review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria. A manual search was also carried out in Google Scholar. This study presents work from the previous 14 years (2009 to 2023) in detail, classified into the following categories: (1) AR and VR in surgery (A: MR in neurosurgery; B: spine surgery; C: oral and maxillofacial surgery; D: AR-enhanced human-robot interaction); (2) AR and VR in medical education (A: medical training; B: schools and curriculum; C: XR in biomedicine); (3) AR and VR for rehabilitation (A: stroke rehabilitation during COVID-19; B: cancer and VR); and (4) millimeter-wave and MIMO systems for AR and VR. RESULTS In total, 77 publications were selected based on the inclusion criteria. Four distinct application groups could be differentiated: AR and VR in surgery (N = 21), AR and VR in medical education (N = 30), AR and VR for rehabilitation (N = 15), and millimeter-wave and MIMO systems for AR and VR (N = 7), where N is the number of cited studies. We found that the majority of research is devoted to medical training and education, with surgical or interventional applications second, followed by rehabilitation, therapy, and clinical applications. The application of XR in MIMO systems has also been the subject of numerous studies. CONCLUSION This review displays examples of these diverse fields of application: (1) AR and VR in surgery; (2) AR and VR in medical education; (3) AR and VR for rehabilitation; and (4) millimeter-wave and MIMO systems for AR and VR.
Affiliation(s)
- Aya Taghian: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Mohammed Abo-Zahhad: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt; Department of Electrical Engineering, Assiut University, Assiut, Egypt
- Mohammed S Sayed: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt; Department of Electronics and Communications Engineering, Zagazig University, Zagazig, Ash Sharqia, Egypt
- Ahmed H Abd El-Malek: Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt

11. Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023; 23:6202. [PMID: 37448050; DOI: 10.3390/s23136202]
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the focus of most devices remains on improving end-effector dexterity and precision, as well as access to minimally invasive surgery. This paper provides a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions for increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and depth perception during two-dimensional medical imaging. A number of systems described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while proposing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the case for surgical clearance of ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul: Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee: Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK; School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis: School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK; Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
12
Medress ZA, Bobrow A, Tigchelaar SS, Henderson T, Parker JJ, Desai A. Augmented Reality-Assisted Resection of a Large Presacral Ganglioneuroma: 2-Dimensional Operative Video. Oper Neurosurg (Hagerstown) 2023;24:e284-e285. [PMID: 36701554] [DOI: 10.1227/ons.0000000000000542]
Affiliation(s)
- Zachary A Medress: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Seth S Tigchelaar: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jonathon J Parker: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Atman Desai: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
13
Tigchelaar SS, Medress ZA, Quon J, Dang P, Barbery D, Bobrow A, Kin C, Louis R, Desai A. Augmented Reality Neuronavigation for En Bloc Resection of Spinal Column Lesions. World Neurosurg 2022;167:102-110. [PMID: 36096393] [DOI: 10.1016/j.wneu.2022.08.143]
Abstract
BACKGROUND Primary tumors involving the spine are relatively rare, but their resection is a surgically challenging procedure with high patient morbidity. En bloc resection of these tumors necessitates large exposures and wide tumor margins, and poses risks to functionally relevant anatomical structures. Augmented reality neuronavigation (ARNV) represents a paradigm shift in neuronavigation, allowing on-demand visualization of 3D navigation data in real time, directly in line with the operative field. METHODS Here, we describe the first application of ARNV to perform distal sacrococcygectomies for the en bloc removal of sacral and retrorectal lesions involving the coccyx in 2 patients, as well as a thoracic 9-11 laminectomy with costotransversectomy for en bloc removal of a schwannoma in a third patient. RESULTS In our experience, ARNV allowed our teams to minimize the length of the incision, reduce the extent of bony resection, and enhance visualization of critical adjacent anatomy. All tumors were resected en bloc, and the patients recovered well postoperatively, with no known complications. Pathologic analysis confirmed the en bloc removal of these lesions with negative margins. CONCLUSIONS We conclude that ARNV is an effective strategy for the precise, en bloc removal of spinal lesions, including both sacrococcygeal tumors involving the retrorectal space and thoracic schwannomas.
Affiliation(s)
- Seth S Tigchelaar: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Zachary A Medress: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jennifer Quon: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Phuong Dang: Surgical Theater, Inc., Cleveland, Ohio, USA
- Cindy Kin: Department of Surgery, Stanford University Medical Center, Stanford, California, USA
- Robert Louis: The Brain and Spine Center, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA; Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA
- Atman Desai: Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
14
Hagan MJ, Remacle T, Leary OP, Feler J, Shaaya E, Ali R, Zheng B, Bajaj A, Traupe E, Kraus M, Zhou Y, Fridley JS, Lewandrowski KU, Telfeian AE. Navigation Techniques in Endoscopic Spine Surgery. Biomed Res Int 2022;2022:8419739. [PMID: 36072476] [PMCID: PMC9444441] [DOI: 10.1155/2022/8419739]
Abstract
Endoscopic spine surgery (ESS) advances the principles of minimally invasive surgery, including minimal collateral tissue damage, reduced blood loss, and faster recovery times. ESS allows for direct access to the spine through small incisions and direct visualization of spinal pathology via an endoscope. While this technique has many applications, there is a steep learning curve when adopting ESS into a surgeon's practice. Two types of navigation, optical and electromagnetic, may allow for widespread adoption of ESS by improving orientation to surgical anatomy and reducing complication rates. The present review discusses these two available navigation technologies and their application in endoscopic procedures by providing case examples. Furthermore, we report on future directions of navigation within the discipline of ESS.
Affiliation(s)
- Matthew J. Hagan: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA
- Thibault Remacle: Department of Neurosurgery, CHR Citadelle, Bd du 12eme de Ligne, 1, 4000 Liege, Belgium
- Owen P. Leary: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA; Department of Neurosurgery, Warren Alpert School of Medicine of Brown University, 593 Eddy Street, APC 6, Providence, RI 02903, USA
- Joshua Feler: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA; Department of Neurosurgery, Warren Alpert School of Medicine of Brown University, 593 Eddy Street, APC 6, Providence, RI 02903, USA
- Elias Shaaya: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA; Department of Neurosurgery, Warren Alpert School of Medicine of Brown University, 593 Eddy Street, APC 6, Providence, RI 02903, USA
- Rohaid Ali: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA; Department of Neurosurgery, Warren Alpert School of Medicine of Brown University, 593 Eddy Street, APC 6, Providence, RI 02903, USA
- Bryan Zheng: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA
- Ankush Bajaj: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA
- Erik Traupe: Helios Weißeritztal Clinics, Bürgerstraße 7, 01705 Freital, Germany
- Michael Kraus: ORTHix Zentrum für Orthopädie, Stadtberger Str. 21, 86157 Augsburg, Germany
- Yue Zhou: Department of Orthopaedics, Xinqiao Hospital, Third Military Medical University, Chongqing 400037, China
- Jared S. Fridley: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA; Department of Neurosurgery, Warren Alpert School of Medicine of Brown University, 593 Eddy Street, APC 6, Providence, RI 02903, USA
- Kai-Uwe Lewandrowski: Center for Advanced Spine Care of Southern Arizona, The Surgical Institute of Tucson, 4787 E Camp Lowell Dr, Tucson, AZ 85712, USA
- Albert E. Telfeian: Warren Alpert School of Medicine of Brown University, 222 Richmond Street, Providence, RI 02903, USA
15
Boaro A, Moscolo F, Feletti A, Polizzi G, Nunes S, Siddi F, Broekman M, Sala F. Visualization, navigation, augmentation. The ever-changing perspective of the neurosurgeon. Brain Spine 2022;2:100926. [PMID: 36248169] [PMCID: PMC9560703] [DOI: 10.1016/j.bas.2022.100926]
Abstract
Introduction The evolution of neurosurgery coincides with the evolution of visualization and navigation. Augmented reality technologies, with their ability to bring digital information into the real environment, have the potential to provide a new, revolutionary perspective to the neurosurgeon. Research question To provide an overview of the historical and technical aspects of visualization and navigation in neurosurgery, and to provide a systematic review of augmented reality (AR) applications in neurosurgery. Material and methods We provided an overview of the main historical milestones and technical features of visualization and navigation tools in neurosurgery. We systematically searched the PubMed and Scopus databases for AR applications in neurosurgery and specifically discussed their relationship with current visualization and navigation systems, as well as their main limitations. Results The evolution of visualization in neurosurgery is embodied by four magnification systems: surgical loupes, the endoscope, the surgical microscope, and more recently the exoscope, each with independent features in terms of magnification capabilities, eye-hand coordination, and the possibility to implement additional functions. With regard to navigation, two independent systems have been developed: frame-based and frameless systems. The most frequent application setting for AR is brain surgery (71.6%), specifically neuro-oncology (36.2%) and microscope-based applications (29.2%), even though in the majority of cases AR applications presented their own visualization supports (66%). Discussion and conclusions The evolution of visualization and navigation in neurosurgery allowed for the development of more precise instruments; the development and clinical validation of AR applications have the potential to be the next breakthrough, making surgery safer, improving the surgical experience, and reducing costs.
Affiliation(s)
- A. Boaro: Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- F. Moscolo: Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- A. Feletti: Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- G.M.V. Polizzi: Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- S. Nunes: Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- F. Siddi: Department of Neurosurgery, Haaglanden Medical Center, The Hague, Zuid-Holland, the Netherlands
- M.L.D. Broekman: Department of Neurosurgery, Haaglanden Medical Center, The Hague, Zuid-Holland, the Netherlands; Department of Neurosurgery, Leiden University Medical Center, Leiden, Zuid-Holland, the Netherlands
- F. Sala: Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
16
Mandelka E, Gierse J, Gruetzner PA, Franke J, Vetter SY. First Clinical Experience with a Novel 3D C-Arm-Based System for Navigated Percutaneous Thoracolumbar Pedicle Screw Placement. Medicina (Kaunas) 2022;58:1111. [PMID: 36013578] [PMCID: PMC9414596] [DOI: 10.3390/medicina58081111]
Abstract
Background and Objectives: Navigated pedicle screw placement is becoming increasingly popular, as it has been shown to reduce the rate of screw misplacement. We present our intraoperative workflow and initial experience in terms of safety, efficiency, and clinical feasibility with a novel system for 3D C-arm cone beam computed-tomography-based navigation of thoracolumbar pedicle screws. Materials and Methods: The first 20 consecutive cases of C-arm cone beam computed-tomography-based percutaneous pedicle screw placement using a novel navigation system were included in this study. Procedural data, including screw placement time and patient radiation dose, were prospectively collected. Final pedicle screw accuracy was assessed using the Gertzbein-Robbins grading system. Results: In total, 156 screws were placed. The screw accuracy was 94.9%. All pedicle breaches occurred on the lateral pedicle wall, and none caused clinical complications. On average, 2:42 min was required to place a screw. The mean intraoperative patient radiation exposure was 7.46 mSv. Conclusions: The investigated C-arm CBCT-based navigation system proved to be easy to implement and highly reliable. It facilitates accurate and efficient percutaneous placement of pedicle screws in the thoracolumbar spine. Careful use of intraoperative imaging keeps the patient's intraoperative radiation exposure at a moderate level.
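Several entries in this list grade screw accuracy with the Gertzbein-Robbins system, which bins each screw by the magnitude of pedicle-wall breach. A minimal sketch of the conventional thresholds follows; boundary handling varies between studies, so treat `gertzbein_robbins_grade` and `clinical_accuracy` as illustrative helpers, not code from any of the cited papers:

```python
def gertzbein_robbins_grade(breach_mm: float) -> str:
    """Grade a pedicle screw by cortical breach distance (common convention):
    A = fully intrapedicular, B = breach < 2 mm, C = 2-4 mm, D = 4-6 mm, E = >= 6 mm."""
    if breach_mm <= 0:
        return "A"
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"

def clinical_accuracy(grades) -> float:
    """Percentage of screws graded A or B, the usual 'clinically acceptable' cutoff."""
    acceptable = sum(g in ("A", "B") for g in grades)
    return 100.0 * acceptable / len(grades)
```

Under this convention, 148 acceptable screws out of 156 would be consistent with the 94.9% accuracy quoted above.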
17
Kitaguchi D, Lee Y, Hayashi K, Nakajima K, Kojima S, Hasegawa H, Takeshita N, Mori K, Ito M. Development and Validation of a Model for Laparoscopic Colorectal Surgical Instrument Recognition Using Convolutional Neural Network-Based Instance Segmentation and Videos of Laparoscopic Procedures. JAMA Netw Open 2022;5:e2226265. [PMID: 35984660] [PMCID: PMC9391983] [DOI: 10.1001/jamanetworkopen.2022.26265]
Abstract
IMPORTANCE Deep learning-based automatic surgical instrument recognition is an indispensable technology for surgical research and development. However, pixel-level recognition with high accuracy is required to make it suitable for surgical automation. OBJECTIVE To develop a deep learning model that can simultaneously recognize 8 types of surgical instruments frequently used in laparoscopic colorectal operations and evaluate its recognition performance. DESIGN, SETTING, AND PARTICIPANTS This quality improvement study was conducted at a single institution with a multi-institutional data set. Laparoscopic colorectal surgical videos recorded between April 1, 2009, and December 31, 2021, were included in the video data set. Deep learning-based instance segmentation, an image recognition approach that recognizes each object individually and pixel by pixel instead of roughly enclosing it with a bounding box, was performed for 8 types of surgical instruments. MAIN OUTCOMES AND MEASURES Average precision, calculated from the area under the precision-recall curve, was used as the evaluation metric; it is computed from the numbers of true-positive, false-positive, and false-negative results, and the mean average precision value for the 8 types of surgical instruments was calculated. Five-fold cross-validation was used as the validation method. The annotation data set was split into 5 segments, of which 4 were used for training and the remainder for validation. The data set was split at the per-case level instead of the per-frame level; thus, the images extracted from an intraoperative video in the training set never appeared in the validation set. Validation was performed for all 5 validation sets, and the average mean average precision was calculated. RESULTS In total, 337 laparoscopic colorectal surgical videos were used. Pixel-by-pixel annotation was manually performed for 81 760 labels on 38 628 static images, constituting the annotation data set. The mean average precisions of the instance segmentation for surgical instruments were 90.9% for 3 instruments, 90.3% for 4 instruments, 91.6% for 6 instruments, and 91.8% for 8 instruments. CONCLUSIONS AND RELEVANCE A deep learning-based instance segmentation model that simultaneously recognizes 8 types of surgical instruments with high accuracy was successfully developed. The accuracy was maintained even when the number of types of surgical instruments increased. This model can be applied to surgical innovations, such as intraoperative navigation and surgical automation.
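The headline metric above, mean average precision, is the area under the precision-recall curve averaged over instrument classes. The sketch below assumes per-class confidence/label lists and all-point rectangle integration; it illustrates the metric itself, not the authors' evaluation code:

```python
def average_precision(scores, is_tp, n_ground_truth):
    """AP for one class: sweep detections by descending confidence and
    integrate precision over recall (all-point rectangle rule)."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    ap = prev_recall = 0.0
    for i in order:
        if is_tp[i]:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / n_ground_truth
        ap += precision * (recall - prev_recall)  # rectangle under the PR curve
        prev_recall = recall
    return ap

def mean_average_precision(per_class):
    """Mean AP over (scores, is_tp, n_ground_truth) tuples, one per instrument type."""
    return sum(average_precision(*c) for c in per_class) / len(per_class)
```

A perfectly ranked class scores 1.0; interleaving one false positive between two true positives drops AP to 5/6.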
Affiliation(s)
- Daichi Kitaguchi: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan; Department of Colorectal Surgery, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Younae Lee: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Kazuyuki Hayashi: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Kei Nakajima: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan; Department of Colorectal Surgery, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Shigehiro Kojima: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan; Department of Colorectal Surgery, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Hiro Hasegawa: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan; Department of Colorectal Surgery, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Nobuyoshi Takeshita: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan; Department of Colorectal Surgery, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
- Kensaku Mori: Graduate School of Informatics, Nagoya University, Nagoya, Aichi, Japan
- Masaaki Ito: Surgical Device Innovation Office, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan; Department of Colorectal Surgery, National Cancer Center Hospital East, Kashiwanoha, Kashiwa, Chiba, Japan
18
Beisemann N, Gierse J, Mandelka E, Hassel F, Grützner PA, Franke J, Vetter SY. Comparison of three imaging and navigation systems regarding accuracy of pedicle screw placement in a sawbone model. Sci Rep 2022;12:12344. [PMID: 35853991] [PMCID: PMC9296669] [DOI: 10.1038/s41598-022-16709-y]
Abstract
3D-navigated pedicle screw placement is increasingly performed, as its accuracy has been shown to be considerably higher than that of fluoroscopy guidance. While different imaging and navigation devices can be used, there are few studies comparing these under similar conditions. Thus, the objective of this study was to compare the accuracy of the two combinations most used in the literature for spinal navigation and a recently approved combination of imaging device and navigation system. With each combination of imaging system and navigation interface, 160 navigated screws were placed percutaneously in spine levels T11-S1 in ten artificial spine models. 470 screws were included in the final evaluation. Two blinded observers classified screw placement according to the Gertzbein-Robbins grading system. Grades A and B were considered acceptable and Grades C-E unacceptable. Weighted kappa was used to calculate reliability between the observers. Mean accuracy was 94.9% (149/157) for iCT/Curve, 97.5% (154/158) for C-arm CBCT/Pulse, and 89.0% (138/155) for CBCT/StealthStation. The differences between the combinations were not statistically significant, except for the comparison of C-arm CBCT/Pulse and CBCT/StealthStation (p = 0.003). Relevant perforations of the medial pedicle wall were only seen in the CBCT group. Under quasi-identical conditions, higher screw accuracy was achieved with the combinations iCT/Curve and C-arm CBCT/Pulse compared with CBCT/StealthStation. However, the exact reasons for the difference in accuracy remain unclear. Weighted interrater reliability for Gertzbein-Robbins grading was moderate for C-arm CBCT (0.424), substantial for CBCT (0.709), and almost perfect for iCT (0.896).
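The interrater statistic used above, weighted kappa, corrects observed agreement for chance while penalizing larger grade disagreements more heavily. The sketch below uses linear weights over the ordered A-E grades; the abstract does not state the weighting scheme, so the `weight` choice is an assumption:

```python
def weighted_kappa(rater1, rater2, categories="ABCDE", weight="linear"):
    """Cohen's weighted kappa for two raters over ordered categories."""
    n, k = len(rater1), len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Joint distribution of (rater1, rater2) grades.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1.0 / n
    p1 = [sum(row) for row in obs]                              # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # rater-2 marginals

    def w(i, j):  # disagreement weight, 0 on the diagonal
        d = abs(i - j) / (k - 1)
        return d if weight == "linear" else d * d

    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected
```

Perfect agreement returns 1.0; values near 0.9, 0.7, and 0.4 fall into the "almost perfect", "substantial", and "moderate" bands (Landis-Koch convention) quoted above.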
Affiliation(s)
- Nils Beisemann: Research Group Medical Imaging and Navigation in Trauma and Orthopedic Surgery (MINTOS), Berufsgenossenschaftliche Unfallklinik (BG Trauma Center) Ludwigshafen, Ludwig-Guttmann-Strasse 13, 67071 Ludwigshafen, Germany
- Jula Gierse: Research Group Medical Imaging and Navigation in Trauma and Orthopedic Surgery (MINTOS), Berufsgenossenschaftliche Unfallklinik (BG Trauma Center) Ludwigshafen, Ludwig-Guttmann-Strasse 13, 67071 Ludwigshafen, Germany
- Eric Mandelka: Research Group Medical Imaging and Navigation in Trauma and Orthopedic Surgery (MINTOS), Berufsgenossenschaftliche Unfallklinik (BG Trauma Center) Ludwigshafen, Ludwig-Guttmann-Strasse 13, 67071 Ludwigshafen, Germany
- Frank Hassel: Department of Spine Surgery, Loretto Hospital, Mercystrasse 6, 79100 Freiburg im Breisgau, Germany
- Paul A Grützner: Research Group Medical Imaging and Navigation in Trauma and Orthopedic Surgery (MINTOS), Berufsgenossenschaftliche Unfallklinik (BG Trauma Center) Ludwigshafen, Ludwig-Guttmann-Strasse 13, 67071 Ludwigshafen, Germany
- Jochen Franke: Research Group Medical Imaging and Navigation in Trauma and Orthopedic Surgery (MINTOS), Berufsgenossenschaftliche Unfallklinik (BG Trauma Center) Ludwigshafen, Ludwig-Guttmann-Strasse 13, 67071 Ludwigshafen, Germany
- Sven Y Vetter: Research Group Medical Imaging and Navigation in Trauma and Orthopedic Surgery (MINTOS), Berufsgenossenschaftliche Unfallklinik (BG Trauma Center) Ludwigshafen, Ludwig-Guttmann-Strasse 13, 67071 Ludwigshafen, Germany
19
Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI. Appl Sci (Basel) 2022;12:4295. [DOI: 10.3390/app12094295]
Abstract
Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen human perception. Among Human-Machine Interaction (HMI) applications, medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on orthopedic surgery techniques based on AR was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support the research and development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. To improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and according to the surgical procedure used. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. Further research should address hardware limitations, the lack of accurate registration and tracking systems, and the absence of security protocols.
20
Liu Y, Lee MG, Kim JS. Spine Surgery Assisted by Augmented Reality: Where Have We Been? Yonsei Med J 2022;63:305-316. [PMID: 35352881] [PMCID: PMC8965436] [DOI: 10.3349/ymj.2022.63.4.305]
Abstract
This systematic review examines the spine surgery literature supporting augmented reality (AR) technology and summarizes its current status in spinal surgery. The PubMed, Web of Science, Cochrane Library, and Embase databases were searched from the earliest records to April 1, 2021. Our review briefly examines the history of AR and enumerates different device application workflows in a variety of spinal surgeries. We also sort out the pros and cons of current mainstream AR devices and the latest updates. A total of 45 articles are included in our review. The most prevalent surgical applications included the augmented reality surgical navigation system and the head-mounted display. The most popular application of AR is pedicle screw instrumentation in spine surgery, and the primary surgical levels involved are thoracic and lumbar. AR guidance systems show high potential value in practical clinical applications for the spine. The overall number of cases in AR-related studies is still small compared to traditional surgically assisted techniques, and long-term clinical efficacy data and robust surgery-related statistics are lacking. Changing healthcare laws as well as the increasing prevalence of spinal surgery are generating critical data that will determine the value of AR technology.
Affiliation(s)
- Yanting Liu: Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Min-Gi Lee: Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Jin-Sung Kim: Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
21
Intraoperative Navigation in Plastic Surgery with Augmented Reality: A Preclinical Validation Study. Plast Reconstr Surg 2022;149:573e-580e. [PMID: 35196700] [DOI: 10.1097/prs.0000000000008875]
Abstract
BACKGROUND Augmented reality allows users to visualize and interact with digital images including three-dimensional holograms in the real world. This technology may have value intraoperatively by improving surgical decision-making and precision but relies on the ability to accurately align a hologram to a patient. This study aims to quantify the accuracy with which a hologram of soft tissue can be aligned to a patient and used to guide intervention. METHODS A mannequin's face was marked in a standardized fashion with 14 incision patterns in red and nine reference points in blue. A three-dimensional photograph was then taken, converted into a hologram, and uploaded to HoloLens (Verto Studio LLC, San Diego, Calif.), a wearable augmented reality device. The red markings were then erased, leaving only the blue points. The hologram was then viewed through the HoloLens in augmented reality and aligned onto the mannequin. The user then traced the overlaid red markings present on the hologram. Three-dimensional photographs of the newly marked mannequin were then taken and compared with the baseline three-dimensional photographs of the mannequin for accuracy of the red markings. This process was repeated for 15 trials (n = 15). RESULTS The accuracy of the augmented reality-guided intervention, when considering all trials, was 1.35 ± 0.24 mm. Markings that were positioned laterally on the face were significantly more difficult to reproduce than those centered around the facial midline. CONCLUSIONS Holographic markings can be accurately translated onto a mannequin with an average error of less than 1.4 mm. These data support the notion that augmented reality navigation may be practical and reliable for clinical integration in plastic surgery.
22
Farshad M, Spirig JM, Suter D, Hoch A, Burkhard MD, Liebmann F, Farshad-Amacker NA, Fürnstahl P. Operator independent reliability of direct augmented reality navigated pedicle screw placement and rod bending. N Am Spine Soc J 2022;8:100084. [PMID: 35141649] [PMCID: PMC8819958] [DOI: 10.1016/j.xnsj.2021.100084]
Abstract
Background AR-based navigation of spine surgeries may provide not only accurate surgical execution but also operator independence by compensating for potential skill deficits. “Direct” AR navigation, namely superposing trajectories on anatomy directly, has not been investigated regarding its accuracy and operator dependence. The purpose of this study was to prove operator-independent reliability and accuracy of both AR-assisted pedicle screw navigation and AR-assisted rod bending in a cadaver setting. Methods Two experienced spine surgeons and two biomedical engineers (laymen) independently performed pedicle screw instrumentation from L1-L5 in a total of eight lumbar cadaver specimens (20 screws/operator) using a fluoroscopy-free AR-based navigation method. Screw-fitting rods from L1 to S2-Ala-Ileum were bent bilaterally using an AR-based rod bending navigation method (4 rods/operator). Outcome measures were pedicle perforations, accuracy compared to the preoperative plan, registration time, navigation time, total rod bending time, and operator satisfaction with these procedures. Results 97.5% of all screws were safely placed (<2 mm perforation); the overall mean deviation from the planned trajectory was 6.8±3.9°, the deviation from the planned entry point was 4±2.7 mm, registration time per vertebra was 2:25 min (00:56 to 10:00 min), navigation time per screw was 1:07 min (00:15 to 12:43 min), rod bending time per rod was 4:22 min (02:07 to 10:39 min), and operator satisfaction with AR-based screw and rod navigation was 5.38±0.67 (1 to 6, 6 being the best rating). Comparison of surgeons and laymen revealed a significant difference in navigation time (1:01 min; 00:15 to 3:00 min vs. 01:37 min; 00:23 to 12:43 min; p = 0.004) but not in pedicle perforation rate. Conclusions Direct AR-based screw and rod navigation using a surface digitization registration technique is reliable and independent of surgical experience. The accuracy of pedicle screw insertion in the lumbar spine is comparable with current standard techniques.
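The two accuracy outcomes above reduce to elementary geometry: the angle between the planned and executed screw axes, and the Euclidean distance between the planned and actual entry points. A minimal sketch, with function names that are illustrative rather than taken from the study:

```python
import math

def trajectory_deviation_deg(planned_axis, actual_axis):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(p * a for p, a in zip(planned_axis, actual_axis))
    norm_p = math.sqrt(sum(p * p for p in planned_axis))
    norm_a = math.sqrt(sum(a * a for a in actual_axis))
    # Clamp to guard against rounding just outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, dot / (norm_p * norm_a)))
    return math.degrees(math.acos(cos_angle))

def entry_point_offset_mm(planned_pt, actual_pt):
    """Euclidean distance between planned and actual entry points (mm)."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(planned_pt, actual_pt)))
```

Averaging these two quantities over all 80 screws would yield summary figures of the kind reported above (6.8° trajectory deviation, 4 mm entry-point offset).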
Collapse
Affiliation(s)
- Mazda Farshad
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - José Miguel Spirig
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Daniel Suter
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland.,ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - Armando Hoch
- ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - Marco D Burkhard
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Florentin Liebmann
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Nadja A Farshad-Amacker
- Radiology, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Philipp Fürnstahl
- ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| |
Collapse
|
23
|
XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J Clin Med 2022; 11:jcm11020470. [PMID: 35054164 PMCID: PMC8779726 DOI: 10.3390/jcm11020470] [Citation(s) in RCA: 40] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 01/01/2022] [Accepted: 01/11/2022] [Indexed: 02/06/2023] Open
Abstract
In recent years, with the rapid advancement and consumerization of virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR) technology, the use of XR technology in spine medicine has become increasingly popular. The rising use of XR technology in spine medicine has been accelerated by the recent wave of digital transformation (i.e., case-specific three-dimensional medical images and holograms, wearable sensors, video cameras, fifth-generation (5G) networks, artificial intelligence, and head-mounted displays), and further accelerated by the COVID-19 pandemic and the increase in minimally invasive spine surgery. The COVID-19 pandemic has had a negative impact on society, but positive impacts can also be expected, including the continued spread and adoption of telemedicine services (i.e., tele-education, tele-surgery, tele-rehabilitation) that promote digital transformation. The purpose of this narrative review is to describe the accelerators of XR (VR, AR, MR) technology in spine medicine and then to provide a comprehensive review of the use of XR technology in spine medicine, including surgery, consultation, education, and rehabilitation, as well as to identify its limitations and future perspectives (status quo and quo vadis).
Collapse
|
24
|
Feasibility and Accuracy of Thoracolumbar Pedicle Screw Placement Using an Augmented Reality Head Mounted Device. Sensors 2022; 22:s22020522. [PMID: 35062483 PMCID: PMC8779462 DOI: 10.3390/s22020522] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 12/30/2021] [Accepted: 01/06/2022] [Indexed: 02/06/2023]
Abstract
Background: To investigate the accuracy of augmented reality (AR) navigation using the Magic Leap head-mounted device (HMD), pedicle screws were minimally invasively placed in four spine phantoms. Methods: AR navigation provided by a conventional navigation system integrated with the Magic Leap head-mounted device (AR-HMD) was used. Forty-eight screws were planned and inserted into Th11-L4 of the phantoms using the AR-HMD and navigated instruments. Postprocedural CT scans were used to grade the technical (deviation from the plan) and clinical (Gertzbein grade) accuracy of the screws. The time for each screw placement was recorded. Results: The mean deviation between the navigation plan and the screw position was 1.9 ± 0.7 mm (1.9 [0.3–4.1] mm) at the entry point and 1.4 ± 0.8 mm (1.2 [0.1–3.9] mm) at the screw tip. The angular deviation was 3.0 ± 1.4° (2.7 [0.4–6.2]°), and the mean time for screw placement was 130 ± 55 s (108 [58–437] s). The clinical accuracy was 94% according to the Gertzbein grading scale. Conclusion: The combination of an AR-HMD with a conventional navigation system for accurate minimally invasive screw placement is feasible and can combine the benefits of AR from the perspective of the surgeon with the reliability of a conventional navigation system.
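The technical accuracy metrics reported above (entry-point and tip deviation in millimetres, angular deviation between planned and executed trajectories) reduce to simple 3D geometry; a minimal sketch follows, using hypothetical coordinates in place of real navigation data.

```python
import math

def angular_deviation_deg(planned, actual):
    """Angle in degrees between two 3D trajectory direction vectors."""
    dot = sum(p * a for p, a in zip(planned, actual))
    norm_p = math.sqrt(sum(p * p for p in planned))
    norm_a = math.sqrt(sum(a * a for a in actual))
    # Clamp to [-1, 1] to guard against floating-point round-off.
    cos_angle = max(-1.0, min(1.0, dot / (norm_p * norm_a)))
    return math.degrees(math.acos(cos_angle))

def point_deviation_mm(planned_pt, actual_pt):
    """Euclidean distance (mm) between planned and executed points."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(planned_pt, actual_pt)))

# Hypothetical planned vs. executed screw trajectory directions.
planned_dir = (0.0, 0.0, 1.0)
actual_dir = (0.05, 0.0, 1.0)

# Hypothetical entry points in navigation-system coordinates (mm).
planned_entry = (10.0, 20.0, 30.0)
actual_entry = (11.0, 21.5, 30.5)

print(f"angular deviation: {angular_deviation_deg(planned_dir, actual_dir):.2f} deg")
print(f"entry-point deviation: {point_deviation_mm(planned_entry, actual_entry):.2f} mm")
```

The same two functions cover all three reported metrics: tip deviation is `point_deviation_mm` applied to the screw-tip coordinates instead of the entry points.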
Collapse
|
25
|
Marker-free Surgical Navigation of Rod Bending using a Stereo Neural Network and Augmented Reality in Spinal Fusion. Med Image Anal 2022; 77:102365. [DOI: 10.1016/j.media.2022.102365] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2021] [Revised: 11/16/2021] [Accepted: 01/10/2022] [Indexed: 11/20/2022]
|
26
|
Augmented Reality (AR) in Orthopedics: Current Applications and Future Directions. Curr Rev Musculoskelet Med 2021; 14:397-405. [PMID: 34751894 DOI: 10.1007/s12178-021-09728-1] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 09/27/2021] [Indexed: 01/05/2023]
Abstract
PURPOSE OF REVIEW Imaging technologies (X-ray, CT, MRI, and ultrasound) have revolutionized orthopedic surgery, allowing for more efficient diagnosis, monitoring, and treatment of musculoskeletal ailments. The current review investigates recent literature surrounding the impact of augmented reality (AR) imaging technologies on orthopedic surgery. In particular, it investigates the impact that AR technologies may have on provider cognitive burden, operative times, occupational radiation exposure, and surgical precision and outcomes. RECENT FINDINGS Many AR technologies have been shown to lower provider cognitive burden and reduce operative time and radiation exposure while improving surgical precision in pre-clinical cadaveric and sawbones models. So far, only a few platforms focusing on pedicle screw placement have been approved by the FDA. These technologies have been implemented clinically with mixed results when compared to traditional free-hand approaches. It remains to be seen whether current AR technologies can deliver upon their multitude of promises, and the ability to do so seems contingent upon continued technological progress. Additionally, the impact of these platforms will likely be highly conditional on clinical indication and provider type. It remains unclear whether AR will be broadly accepted and utilized or reserved for niche indications where it adds significant value. One thing is clear: orthopedics' high utilization of pre- and intra-operative imaging, combined with the relative ease of tracking rigid structures like bone as compared to soft tissues, has made it the clear beachhead market for AR technologies in medicine.
Collapse
|
27
|
Augmented and virtual reality in spine surgery, current applications and future potentials. Spine J 2021; 21:1617-1625. [PMID: 33774210 DOI: 10.1016/j.spinee.2021.03.018] [Citation(s) in RCA: 91] [Impact Index Per Article: 22.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Accepted: 03/17/2021] [Indexed: 02/03/2023]
Abstract
BACKGROUND CONTEXT The field of artificial intelligence (AI) is rapidly advancing, especially with recent improvements in deep learning (DL) techniques. Augmented reality (AR) and virtual reality (VR) are finding their place in healthcare, and spine surgery is no exception. The unique capabilities and advantages of AR and VR devices include their low cost, flexible integration with other technologies, user-friendly features, and their application in navigation systems, which makes them beneficial across different aspects of spine surgery. Despite the use of AR for pedicle screw placement, targeted cervical foraminotomy, bone biopsy, osteotomy planning, and percutaneous intervention, the current applications of AR and VR in spine surgery remain limited. PURPOSE The primary goal of this study was to provide spine surgeons and clinical researchers with general information about the current applications, future potential, and accessibility of AR and VR systems in spine surgery. STUDY DESIGN/SETTING We reviewed the titles of more than 250 journal papers from Google Scholar and PubMed with the search words augmented reality, virtual reality, spine surgery, and orthopaedic, out of which 89 related papers were selected for abstract review. Finally, the full text of 67 papers was analyzed and reviewed. METHODS The papers were divided into four groups: technological papers, applications in surgery, applications in spine education and training, and general applications in orthopaedics. A team of two reviewers performed the paper reviews and a thorough web search to ensure that the most updated state of the art in each of the four groups was captured in the review. RESULTS In this review we discuss the current state of the art in AR and VR hardware, their preoperative applications, and their surgical applications in spine surgery. Finally, we discuss the future potential of AR and VR and their integration with AI, robotic surgery, gaming, and wearables.
CONCLUSIONS AR and VR are promising technologies that will soon become part of the standard of care in spine surgery.
Collapse
|
28
|
Wagner CR, Phillips T, Roux S, Corrigan JP. Future Directions in Robotic Neurosurgery. Oper Neurosurg (Hagerstown) 2021; 21:173-180. [PMID: 34051701 DOI: 10.1093/ons/opab135] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2020] [Accepted: 12/18/2020] [Indexed: 12/20/2022] Open
Abstract
In this paper, we highlight promising technologies in each phase of a robotic neurosurgery operation, and identify key factors affecting how quickly these technologies will mature into products in the operating room. We focus on specific technology trends in image-guided cranial and spinal procedures, including advances in imaging, machine learning, robotics, and novel interfaces. For each technology, we discuss the required effort to overcome safety or implementation challenges, as well as identifying example regulatory approved products in related fields for comparison. The goal is to provide a roadmap for clinicians as to which robotic and automation technologies are in the developmental pipeline, and which ones are likely to impact their practice sooner, rather than later.
Collapse
Affiliation(s)
| | | | - Serge Roux
- Cambridge Consultants Ltd, Cambridge, UK
| | | |
Collapse
|
29
|
Wessels L, Komm B, Bohner G, Vajkoczy P, Hecht N. Spinal alignment shift between supine and prone CT imaging occurs frequently and regardless of the anatomic region, risk factors, or pathology. Neurosurg Rev 2021; 45:855-863. [PMID: 34379226 PMCID: PMC8827393 DOI: 10.1007/s10143-021-01618-x] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2021] [Revised: 07/13/2021] [Accepted: 07/25/2021] [Indexed: 11/05/2022]
Abstract
Computer-assisted spine surgery based on preoperative CT imaging may be hampered by sagittal alignment shifts due to an intraoperative switch from supine to prone. In the present study, we systematically analyzed the occurrence and pattern of sagittal spinal alignment shift between corresponding preoperative (supine) and intraoperative (prone) CT imaging in patients that underwent navigated posterior instrumentation between 2014 and 2017. Sagittal alignment across the levels of instrumentation was determined according to the C2 fracture gap (C2-F) and C2 translation (C2-T) in odontoid type 2 fractures, next to the modified Cobb angle (CA), plumbline (PL), and translation (T) in subaxial pathologies. One-hundred and twenty-one patients (C1/C2: n = 17; C3-S1: n = 104) with degenerative (39/121; 32%), oncologic (35/121; 29%), traumatic (34/121; 28%), or infectious (13/121; 11%) pathologies were identified. In the subaxial spine, significant shift occurred in 104/104 (100%) cases (CA: *p = .044; T: *p = .021) compared to only 10/17 (59%) cases that exhibited shift at the C1/C2 level (C2-F: **p = .002; C2-T: *p < .016). The degree of shift was not affected by the anatomic region or pathology but significantly greater in cases with an instrumentation length > 5 segments (“∆PL > 5 segments”: 4.5 ± 1.8 mm; “∆PL ≤ 5 segments”: 2 ± 0.6 mm; *p = .013) or in revision surgery with pre-existing instrumentation (“∆PL presence”: 5 ± 2.6 mm; “∆PL absence”: 2.4 ± 0.7 mm; **p = .007). Interestingly, typical morphological instability risk factors did not influence the degree of shift. In conclusion, intraoperative spinal alignment shift due to a change in patient position should be considered as a cause for inaccuracy during computer-assisted spine surgery and when correcting spinal alignment according to parameters that were planned in other patient positions.
Collapse
Affiliation(s)
- Lars Wessels
- Department of Neurosurgery, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117, Berlin, Germany
| | - Bettina Komm
- Department of Neurosurgery, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117, Berlin, Germany
| | - Georg Bohner
- Department of Neuroradiology, Charité-Universitätsmedizin Berlin, Berlin, Germany
| | - Peter Vajkoczy
- Department of Neurosurgery, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117, Berlin, Germany
| | - Nils Hecht
- Department of Neurosurgery, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117, Berlin, Germany.
| |
Collapse
|
30
|
Augmented reality-navigated pedicle screw placement: a cadaveric pilot study. Eur Spine J 2021; 30:3731-3737. [PMID: 34350487 DOI: 10.1007/s00586-021-06950-w] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/08/2020] [Revised: 11/04/2020] [Accepted: 07/25/2021] [Indexed: 10/20/2022]
Abstract
PURPOSE Augmented reality (AR) is an emerging technology with great potential for surgical navigation through its ability to provide 3D holographic projection of otherwise hidden anatomical information. This pilot cadaver study investigated the feasibility and accuracy of one of the first holographic navigation techniques for lumbar pedicle screw placement. METHODS Lumbar computed tomography (CT) scans of two cadaver specimens and their reconstructed 3D models were used for pedicle screw trajectory planning. Planned trajectories and 3D models were subsequently uploaded to an AR head-mounted device. Randomly, k-wires were placed either into the left or the right pedicle of each vertebra (L1-5) with or without AR navigation (by holographic projection of the planned trajectory). CT scans were subsequently performed to assess the accuracy of both techniques. RESULTS A total of 18 k-wires could be placed (8 navigated, 10 freehand) by two experienced spine surgeons. In two vertebrae, the AR navigation was aborted because the registration of the preoperative plan with the intraoperative anatomy was imprecise due to a technical failure. The average difference of the screw entry points between planning and execution was 4.74 ± 2.37 mm in the freehand technique and 5.99 ± 3.60 mm in the AR-navigated technique (p = 0.39). The average deviation from the planned trajectories was 11.21° ± 7.64° in the freehand technique and 5.88° ± 3.69° in the AR-navigated technique (p = 0.09). CONCLUSION This pilot study demonstrates improved angular precision in one of the first AR-navigated pedicle screw placement studies worldwide. Technical shortcomings need to be eliminated before potential clinical applications.
Collapse
|
31
|
Chan J, Pangal DJ, Cardinal T, Kugener G, Zhu Y, Roshannai A, Markarian N, Sinha A, Anandkumar A, Hung A, Zada G, Donoho DA. A systematic review of virtual reality for the assessment of technical skills in neurosurgery. Neurosurg Focus 2021; 51:E15. [PMID: 34333472 DOI: 10.3171/2021.5.focus21210] [Citation(s) in RCA: 21] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2021] [Accepted: 05/19/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE Virtual reality (VR) and augmented reality (AR) systems are increasingly available to neurosurgeons. These systems may provide opportunities for technical rehearsal and assessments of surgeon performance. The assessment of neurosurgeon skill in VR and AR environments and the validity of VR and AR feedback has not been systematically reviewed. METHODS A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was conducted through MEDLINE and PubMed. Studies published in English between January 1990 and February 2021 describing the use of VR or AR to quantify surgical technical performance of neurosurgeons without the use of human raters were included. The types and categories of automated performance metrics (APMs) from each of these studies were recorded. RESULTS Thirty-three VR studies were included in the review; no AR studies met inclusion criteria. VR APMs were categorized as either distance to target, force, kinematics, time, blood loss, or volume of resection. Distance and time were the most well-studied APM domains, although all domains were effective at differentiating surgeon experience levels. Distance was successfully used to track improvements with practice. Examining volume of resection demonstrated that attending surgeons removed less simulated tumor but preserved more normal tissue than trainees. More recently, APMs have been used in machine learning algorithms to predict level of training with a high degree of accuracy. Key limitations to enhanced-reality systems include limited AR usage for automated surgical assessment and lack of external and longitudinal validation of VR systems. CONCLUSIONS VR has been used to assess surgeon performance across a wide spectrum of domains. The VR environment can be used to quantify surgeon performance, assess surgeon proficiency, and track training progression. 
AR systems have not yet been used to provide metrics for surgeon performance assessment despite potential for intraoperative integration. VR-based APMs may be especially useful for metrics that are difficult to assess intraoperatively, including blood loss and extent of resection.
Collapse
Affiliation(s)
- Justin Chan
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Dhiraj J Pangal
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Tyler Cardinal
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Guillaume Kugener
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Yichao Zhu
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Arman Roshannai
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Nicholas Markarian
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Aditya Sinha
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Anima Anandkumar
- 2Computing + Mathematical Sciences, California Institute of Technology, Pasadena, California
| | - Andrew Hung
- 3USC Department of Urology, Keck School of Medicine of the University of Southern California, Los Angeles, California; and
| | - Gabriel Zada
- 1USC Department of Neurosurgery, Keck School of Medicine of the University of Southern California, Los Angeles, California
| | - Daniel A Donoho
- 4Texas Children's Hospital, Baylor College of Medicine, Houston, Texas
| |
Collapse
|
32
|
Skyrman S, Lai M, Edström E, Burström G, Förander P, Homan R, Kor F, Holthuizen R, Hendriks BHW, Persson O, Elmi-Terander A. Augmented reality navigation for cranial biopsy and external ventricular drain insertion. Neurosurg Focus 2021; 51:E7. [PMID: 34333469 DOI: 10.3171/2021.5.focus20813] [Citation(s) in RCA: 23] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/11/2020] [Accepted: 05/17/2021] [Indexed: 11/06/2022]
Abstract
OBJECTIVE The aim of this study was to evaluate the accuracy (deviation from the target or intended path) and efficacy (insertion time) of an augmented reality surgical navigation (ARSN) system for insertion of biopsy needles and external ventricular drains (EVDs), two common neurosurgical procedures that require high precision. METHODS The hybrid operating room-based ARSN system, comprising a robotic C-arm with intraoperative cone-beam CT (CBCT) and integrated video tracking of the patient and instruments using nonobtrusive adhesive optical markers, was used. A 3D-printed skull phantom with a realistic gelatinous brain model containing air-filled ventricles and 2-mm spherical biopsy targets was obtained. After initial CBCT acquisition for target registration and planning, ARSN was used for 30 cranial biopsies and 10 EVD insertions. Needle positions were verified by CBCT. RESULTS The mean accuracy of the biopsy needle insertions (n = 30) was 0.8 mm ± 0.43 mm. The median path length was 39 mm (range 16-104 mm) and did not correlate to accuracy (p = 0.15). The median device insertion time was 149 seconds (range 87-233 seconds). The mean accuracy for the EVD insertions (n = 10) was 2.9 mm ± 0.8 mm at the tip with a 0.7° ± 0.5° angular deviation compared with the planned path, and the median insertion time was 188 seconds (range 135-400 seconds). CONCLUSIONS This study demonstrated that ARSN can be used for navigation of percutaneous cranial biopsies and EVDs with high accuracy and efficacy.
Collapse
Affiliation(s)
- Simon Skyrman
- 1Department of Neurosurgery, Karolinska University Hospital, and Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
| | - Marco Lai
- 2Philips Research, High Tech Campus 34, Eindhoven.,3Eindhoven University of Technology (TU/e), Eindhoven
| | - Erik Edström
- 1Department of Neurosurgery, Karolinska University Hospital, and Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
| | - Gustav Burström
- 1Department of Neurosurgery, Karolinska University Hospital, and Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
| | - Petter Förander
- 1Department of Neurosurgery, Karolinska University Hospital, and Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
| | | | - Flip Kor
- 5Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
| | | | - Benno H W Hendriks
- 2Philips Research, High Tech Campus 34, Eindhoven.,5Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
| | - Oscar Persson
- 1Department of Neurosurgery, Karolinska University Hospital, and Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
| | - Adrian Elmi-Terander
- 1Department of Neurosurgery, Karolinska University Hospital, and Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
| |
Collapse
|
33
|
Godzik J, Farber SH, Urakov T, Steinberger J, Knipscher LJ, Ehredt RB, Tumialán LM, Uribe JS. "Disruptive Technology" in Spine Surgery and Education: Virtual and Augmented Reality. Oper Neurosurg (Hagerstown) 2021; 21:S85-S93. [PMID: 34128065 DOI: 10.1093/ons/opab114] [Citation(s) in RCA: 20] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2021] [Accepted: 03/04/2021] [Indexed: 01/09/2023] Open
Abstract
BACKGROUND Technological advancements are the drivers of modern-day spine care. With the growing pressure to deliver faster and better care, surgical-assist technology is needed to harness computing power and enable the surgeon to improve outcomes. Virtual reality (VR) and augmented reality (AR) represent the pinnacle of emerging technology, not only to deliver higher quality education through simulated care, but also to provide valuable intraoperative information to assist in more efficient and more precise surgeries. OBJECTIVE To describe how the disruptive technologies of VR and AR interface in spine surgery and education. METHODS We review the relevance of VR and AR technologies in spine care, and describe the feasibility and limitations of the technologies. RESULTS We discuss potential future applications, and provide a case study demonstrating the feasibility of a VR program for neurosurgical spine education. CONCLUSION Initial experiences with VR and AR technologies demonstrate their applicability and ease of implementation. However, further prospective studies through multi-institutional and industry-academic partnerships are necessary to solidify the future of VR and AR in spine surgery education and clinical practice.
Collapse
Affiliation(s)
- Jakub Godzik
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
| | - S Harrison Farber
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
| | - Timur Urakov
- Department of Neurosurgery, University of Miami, Miami, Florida, USA
| | - Jeremy Steinberger
- Department of Neurosurgery, Mount Sinai Health System, New York, New York, USA
| | - Liza J Knipscher
- Neuroscience Publications, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
| | - Ryan B Ehredt
- Neuroscience Publications, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
| | - Luis M Tumialán
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
| | - Juan S Uribe
- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph's Hospital and Medical Center, Phoenix, Arizona, USA
| |
Collapse
|
34
|
Lak AM, Zaidi HA. Commentary: Minimally Invasive Posterior Cervical Foraminotomy Using 3-Dimensional Total Navigation: 2-Dimensional Operative Video. Oper Neurosurg (Hagerstown) 2021; 20:E139-E140. [PMID: 33294921 DOI: 10.1093/ons/opaa358] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2020] [Accepted: 09/01/2020] [Indexed: 11/14/2022] Open
Affiliation(s)
- Asad M Lak
- Department of Neurosurgery, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
| | - Hasan A Zaidi
- Department of Neurosurgery, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
| |
Collapse
|
35
|
Rousseau J, Dreuil S, Bassinet C, Cao S, Elleaume H. Surgivisio® and O-arm®O2 cone beam CT mobile systems for guidance of lumbar spine surgery: Comparison of patient radiation dose. Phys Med 2021; 85:192-199. [PMID: 34111631 DOI: 10.1016/j.ejmp.2021.04.018] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/26/2021] [Revised: 04/14/2021] [Accepted: 04/19/2021] [Indexed: 10/21/2022] Open
Abstract
PURPOSE To compare patient radiation doses in cone beam computed tomography (CBCT) of two mobile systems used for navigation-assisted minimally invasive orthopedic surgery: O-arm®O2 and Surgivisio®. METHODS The study focused on imaging of the spine. Thermoluminescent dosimeters were used to measure organ and effective doses (ED) during CBCT. An ionization chamber and a solid-state sensor were used to measure the incident air kerma (Ki) at the center of the CBCT field of view and Ki during 2D imaging, respectively. The PCXMC software was used to calculate patient ED in 2D and CBCT configurations. The image quality in CBCT was evaluated with the CATPHAN phantom. RESULTS The experimental ED estimates for the low-dose 3D modes were 2.41 and 0.35 mSv with O-arm®O2 (Low Dose 3D-small-abdomen) and Surgivisio® (3DSU-91 images), respectively. PCXMC results were consistent: 1.54 and 0.30 mSv. Organ doses were 5 to 12 times lower with Surgivisio®. Ki at the patient's skin was comparable on lateral 2D imaging (0.5 mGy), but lower with O-arm®O2 on anteroposterior imaging (0.3 versus 0.9 mGy). Both systems show poor low-contrast resolution and similar high-contrast spatial resolution (7 line-pairs/cm). CONCLUSIONS This study is the first to evaluate patient ED and organ doses with Surgivisio®. A significant difference in organ doses was observed between the CBCT systems. The study demonstrates that Surgivisio® used on the spine delivers approximately five to six times less patient ED than O-arm®O2 in low-dose 3D modes. Doses in 2D mode preceding CBCT were higher with Surgivisio®, but negligible compared to CBCT doses under the experimental conditions tested.
Collapse
Affiliation(s)
- Julia Rousseau
- Pôle Imagerie, CHU Grenoble Alpes, Avenue Maquis du Grésivaudan, 38700 La Tronche, France.
| | - Serge Dreuil
- Institut de Radioprotection et de Sûreté Nucléaire (IRSN), 31 Avenue de la Division Leclerc, 92260 Fontenay-aux-Roses, France.
| | - Céline Bassinet
- Institut de Radioprotection et de Sûreté Nucléaire (IRSN), 31 Avenue de la Division Leclerc, 92260 Fontenay-aux-Roses, France.
| | - Sophie Cao
- Pôle Coordination des Gestes Chirurgicaux et Interventionnels, CHU Grenoble Alpes, Avenue Maquis du Grésivaudan, 38700 La Tronche, France.
| | - Hélène Elleaume
- INSERM UA07 Team STROBE, ESRF 71 Avenue des Martyrs, 38000 Grenoble, France.
| |
Collapse
|
36
|
Farshad M, Fürnstahl P, Spirig JM. First in man in-situ augmented reality pedicle screw navigation. N Am Spine Soc J 2021; 6:100065. [PMID: 35141630 PMCID: PMC8819976 DOI: 10.1016/j.xnsj.2021.100065] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/05/2021] [Revised: 04/17/2021] [Accepted: 04/20/2021] [Indexed: 12/29/2022]
Abstract
Background Augmented reality (AR) is a rising technology gaining increasing utility in medicine. By superimposing the surgical site and the operator's visual field with computer-generated information, it has the potential to enhance the cognitive skills of surgeons. This is the report of the first in man case with "direct holographic navigation" as part of a randomized controlled trial. Case description A pointing instrument was equipped with a sterile fiducial marker, which was used to obtain a digital representation of the intraoperative bony anatomy of the lumbar spine. Subsequently, a previously validated registration method was applied to superimpose the surgery plan with the intraoperative anatomy. The registration result was shown in situ as a 3D AR hologram of the preoperative 3D vertebra model with the planned screw trajectory and entry point for validation and approval by the surgeon. After achieving alignment with the surgery plan, a borehole was drilled and the pedicle screw placed. Postoperative computed tomography was used to measure the accuracy of this novel method for surgical navigation. Outcome Correct screw positions entirely within bone were documented on postoperative CT, with an accuracy similar to current standard-of-care methods for surgical navigation. The patient was mobilized uneventfully on the first postoperative day with little pain medication and discharged on the fourth postoperative day. Conclusion This first in man report of direct AR navigation demonstrates feasibility in vivo. The continuation of this randomized controlled study will evaluate the value of this novel technology.
Affiliation(s)
- Mazda Farshad
- Spine Division, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
- Corresponding author.
| | - Philipp Fürnstahl
- ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - José Miguel Spirig
- Spine Division, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| |
37
Pojskić M, Bopp M, Saß B, Kirschbaum A, Nimsky C, Carl B. Intraoperative Computed Tomography-Based Navigation with Augmented Reality for Lateral Approaches to the Spine. Brain Sci 2021;11:646. PMID: 34063546; PMCID: PMC8156391; DOI: 10.3390/brainsci11050646.
Abstract
Background. Lateral approaches to the spine have gained increased popularity because they enable minimally invasive access to the spine with less blood loss, decreased operative time, and less postoperative pain. The objective of the study was to analyze the use of intraoperative computed tomography (iCT) with navigation and the implementation of augmented reality in facilitating a lateral approach to the spine. Methods. We prospectively analyzed all patients who underwent surgery with a lateral approach to the spine from September 2016 to January 2021 using a 32-slice movable intraoperative CT scanner, which was used for automatic navigation registration. Sixteen patients, with a median age of 64.3 years, were operated on using a lateral approach to the thoracic and lumbar spine with intraoperative CT-based navigation. Indications included a herniated disc (six patients), tumors (seven), instability following fracture of a thoracic or lumbar vertebra (two), and spondylodiscitis (one). Results. Automatic registration using intraoperative CT resulted in high accuracy (target registration error: 0.84 ± 0.10 mm). The effective dose (ED) of the registration CT scans was 6.16 ± 3.91 mSv. In seven patients, a control iCT scan was performed for resection and implant control, with an ED of 4.51 ± 2.48 mSv. Augmented reality (AR) was used to support surgery in 11 cases by visualizing the tumor outline, pedicle screws, herniated discs, and surrounding structures. Corpectomy with implantation of an expandable cage was performed in six patients, and one patient underwent discectomy using the XLIF technique. One patient experienced perioperative complications, and one patient died in the early postoperative course due to severe cardiorespiratory failure. Ten patients had improved and five had unchanged neurological status at the 3-month follow-up. Conclusions.
Intraoperative computed tomography with navigation facilitates the application of lateral approaches to the spine for a variety of indications, including fusion procedures, tumor resection, and herniated disc surgery.
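The target registration error reported above (0.84 ± 0.10 mm) is conventionally computed as the Euclidean distance between corresponding points after registration, summarized as mean ± standard deviation. A minimal sketch of that computation (function name illustrative):

```python
import numpy as np

def target_registration_error(mapped, reference):
    """Mean and standard deviation (in mm) of the per-point Euclidean distance
    between registered (mapped) target points and their reference positions.

    mapped, reference: (n, 3) arrays of corresponding 3D points.
    """
    err = np.linalg.norm(mapped - reference, axis=1)
    return err.mean(), err.std()
```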
Affiliation(s)
- Mirza Pojskić
- Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany; (M.B.); (B.S.); (C.N.); (B.C.)
- Correspondence: ; Tel.: +49-64215869848
| | - Miriam Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany; (M.B.); (B.S.); (C.N.); (B.C.)
- Marburg Center for Mind, Brain and Behavior (MCMBB), 35043 Marburg, Germany
| | - Benjamin Saß
- Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany; (M.B.); (B.S.); (C.N.); (B.C.)
| | - Andreas Kirschbaum
- Department of Visceral, Thoracic and Vascular Surgery, University of Marburg, 35043 Marburg, Germany;
| | - Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany; (M.B.); (B.S.); (C.N.); (B.C.)
- Marburg Center for Mind, Brain and Behavior (MCMBB), 35043 Marburg, Germany
| | - Barbara Carl
- Department of Neurosurgery, University of Marburg, Baldingerstraße, 35043 Marburg, Germany; (M.B.); (B.S.); (C.N.); (B.C.)
- Department of Neurosurgery, Helios Dr. Horst Schmidt Kliniken, 65199 Wiesbaden, Germany
| |
38
Maharjan N, Alsadoon A, Prasad PWC, Abdullah S, Rashid TA. A novel visualization system of using augmented reality in knee replacement surgery: Enhanced bidirectional maximum correntropy algorithm. Int J Med Robot 2021;17:e2223. PMID: 33421286; DOI: 10.1002/rcs.2223.
Abstract
BACKGROUND AND AIM Image registration and alignment are the main limitations of augmented reality (AR)-based knee replacement surgery. This research aims to decrease the registration error, eliminate outcomes trapped in local minima to improve alignment, handle occlusion, and maximize the overlapping parts. METHODOLOGY A markerless image registration method was used for AR-based knee replacement surgery to guide and visualize the surgical operation, while a weighted least-squares algorithm was used to enhance stereo camera-based tracking by filling border occlusion in the right-to-left direction and non-border occlusion in the left-to-right direction. RESULTS This study improved video precision to a 0.57-0.61 mm alignment error. Furthermore, with the use of bidirectional (forward and backward) point clouds, the number of iterations in image registration was decreased, which also improved the processing time of video frames to 7.4-11.74 frames per second. CONCLUSIONS The proposed system focuses on overcoming the misalignment caused by patient movement and enhancing AR visualization during knee replacement surgery. It eliminates alignment error by ascertaining the optimal rigid transformation between two point clouds and removing outliers and non-Gaussian noise, and it supports accurate visualization and navigation of knee anatomy such as the femur, tibia, cartilage, and blood vessels.
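The maximum-correntropy idea referenced in the title can be illustrated simply: residuals are weighted by a Gaussian kernel, so correspondences with large (outlier or non-Gaussian) errors contribute almost nothing to the rigid-transform estimate. This is a generic sketch of one correntropy-weighted alignment step under an assumed kernel width `sigma`, not the authors' enhanced bidirectional algorithm.

```python
import numpy as np

def correntropy_weights(src, dst, sigma=1.0):
    """Gaussian-kernel (correntropy) weights per correspondence: residuals far
    beyond sigma get weights near zero, suppressing outliers."""
    r = np.linalg.norm(src - dst, axis=1)
    return np.exp(-r**2 / (2.0 * sigma**2))

def weighted_rigid_step(src, dst, w):
    """One weighted Kabsch step: rigid (R, t) minimizing the weighted residuals."""
    w = w / w.sum()
    src_c = (w[:, None] * src).sum(axis=0)     # weighted centroids
    dst_c = (w[:, None] * dst).sum(axis=0)
    H = (src - src_c).T @ (w[:, None] * (dst - dst_c))
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # avoid reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dst_c - R @ src_c
```

In an iterative scheme, weights and transform would be re-estimated alternately until convergence.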
Affiliation(s)
- Nitish Maharjan
- School of Computing and Mathematics, Charles Sturt University (CSU), Sydney Campus, Wagga Wagga, Australia
| | - Abeer Alsadoon
- School of Computing and Mathematics, Charles Sturt University (CSU), Sydney Campus, Wagga Wagga, Australia.,School of Computer Data and Mathematical Sciences, University of Western Sydney (UWS), Sydney, Australia.,School of Information Technology, Southern Cross University (SCU), Sydney, Australia.,Asia Pacific International College (APIC), Information Technology Department, Sydney, Australia.,Kent Institute Australia, Sydney, Australia
| | - P W C Prasad
- School of Computing and Mathematics, Charles Sturt University (CSU), Sydney Campus, Wagga Wagga, Australia
| | - Salma Abdullah
- Department of Computer Engineering, University of Technology, Baghdad, Iraq
| | - Tarik A Rashid
- Asia Pacific International College (APIC), Information Technology Department, Sydney, Australia
| |
39
Burström G, Persson O, Edström E, Elmi-Terander A. Augmented reality navigation in spine surgery: a systematic review. Acta Neurochir (Wien) 2021;163:843-852. PMID: 33506289; PMCID: PMC7886712; DOI: 10.1007/s00701-021-04708-3.
Abstract
BACKGROUND Conventional spinal navigation solutions have been criticized for their negative impact on operating-room time and workflow. Augmented reality (AR) navigation could potentially alleviate some of these concerns while retaining the benefits of navigated spine surgery. The objective of this study is to summarize the current evidence for using AR navigation in spine surgery. METHODS We performed a systematic review to explore the current evidence for using AR navigation in spine surgery. PubMed and Web of Science were searched from database inception to November 27, 2020, for data on AR navigation solutions; the reported efficacy of the systems; and their impact on workflow, radiation, and cost-benefit relationships. RESULTS In this systematic review, 28 studies were included in the final analysis. The main findings were superior workflow and non-inferior accuracy when comparing AR to free-hand (FH) or conventional surgical navigation techniques. A limited number of studies indicated decreased use of radiation. There were no studies reporting mortality, morbidity, or cost-benefit relationships. CONCLUSIONS AR provides a meaningful addition to FH surgery and traditional navigation methods for spine surgery. However, the current evidence base is limited, and prospective studies on clinical outcomes and cost-benefit relationships are needed.
40
Manni F, Mamprin M, Holthuizen R, Shan C, Burström G, Elmi-Terander A, Edström E, Zinger S, de With PHN. Multi-view 3D skin feature recognition and localization for patient tracking in spinal surgery applications. Biomed Eng Online 2021;20:6. PMID: 33413426; PMCID: PMC7792004; DOI: 10.1186/s12938-020-00843-7.
Abstract
BACKGROUND Minimally invasive spine surgery is dependent on accurate navigation. Computer-assisted navigation is increasingly used in minimally invasive surgery (MIS), but current solutions require the use of reference markers in the surgical field for both patient and instrument tracking. PURPOSE To improve reliability and facilitate clinical workflow, this study proposes a new marker-free tracking framework based on skin feature recognition. METHODS Maximally Stable Extremal Regions (MSER) and Speeded-Up Robust Features (SURF) algorithms are applied for skin feature detection. The proposed tracking framework is based on a multi-camera setup for obtaining multi-view acquisitions of the surgical area. Features can then be accurately detected using MSER and SURF and afterward localized by triangulation. The triangulation error is used for assessing the localization quality in 3D. RESULTS The framework was tested on a cadaver dataset and in eight clinical cases. The detected features for the entire patient datasets were found to have an overall triangulation error of 0.207 mm for MSER and 0.204 mm for SURF. The localization accuracy was compared to a system with conventional markers, serving as a ground truth. An average accuracy of 0.627 and 0.622 mm was achieved for MSER and SURF, respectively. CONCLUSIONS This study demonstrates that skin feature localization for patient tracking in a surgical setting is feasible. The technology shows promising results in terms of detected features and localization accuracy. In the future, the framework may be further improved by exploiting extended feature processing using modern optical imaging techniques for clinical applications where patient tracking is crucial.
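The localization-by-triangulation step described above can be sketched with a linear (DLT) two-view triangulation: a detected 2D feature in two calibrated cameras yields a 3D point as the null vector of a small linear system. This is a generic textbook sketch under assumed normalized 3x4 camera matrices, not the authors' multi-camera pipeline.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature observed in two views.

    P1, P2: (3, 4) camera projection matrices.
    x1, x2: (2,) pixel coordinates of the feature in each view.
    Returns the 3D point in Euclidean coordinates.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],   # each observation contributes two
        x1[1] * P1[2] - P1[1],   # linear constraints on the 3D point
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                   # null vector = homogeneous 3D point
    return X[:3] / X[3]
```

With more than two cameras, additional row pairs are stacked into `A`; the residual of the system is one way to quantify a triangulation error like the 0.2 mm figures reported above.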
Affiliation(s)
- Francesca Manni
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands.
| | - Marco Mamprin
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
| | | | - Caifeng Shan
- Shandong University of Science and Technology, Qingdao, China
| | - Gustav Burström
- Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
| | - Adrian Elmi-Terander
- Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
| | - Erik Edström
- Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
| | - Svitlana Zinger
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
| | - Peter H N de With
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
| |
41
Yuk FJ, Maragkos GA, Sato K, Steinberger J. Current innovation in virtual and augmented reality in spine surgery. Ann Transl Med 2021;9:94. PMID: 33553387; PMCID: PMC7859743; DOI: 10.21037/atm-20-1132.
Abstract
In spinal surgery, outcomes are directly related both to patient and procedure selection and to the accuracy and precision of the instrumentation placed. Poorly placed instrumentation can lead to spinal cord, nerve root, or vascular injury. Traditionally, spine surgery was performed by open methods with placement of instrumentation under direct visualization. However, minimally invasive surgery (MIS) has seen substantial advances in the spine, with an ever-increasing range of indications and procedures. For these reasons, novel methods to visualize anatomy and precisely guide surgery, such as intraoperative navigation, are extremely useful in this field. In this review, we present the recent advances and innovations utilizing simulation methods in spine surgery. The application of these techniques is still relatively new but is quickly being integrated in and outside the operating room. These include virtual reality (VR), where the entire simulation is virtual; mixed reality (MR), a combination of virtual and physical components; and augmented reality (AR), the superimposition of a virtual component onto physical reality. VR and MR have primarily found applications in a teaching and preparatory role, while AR is mainly applied in hands-on surgical settings. The present review attempts to provide an overview of the latest advances and applications of these methods in the neurosurgical spine setting.
Affiliation(s)
- Frank J Yuk
- Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
| | - Georgios A Maragkos
- Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
| | - Kosuke Sato
- Hospital for Special Surgery, New York, NY, USA
| | - Jeremy Steinberger
- Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
| |
42
Dibble CF, Molina CA. Device profile of the XVision-spine (XVS) augmented-reality surgical navigation system: overview of its safety and efficacy. Expert Rev Med Devices 2020;18:1-8. PMID: 33322948; DOI: 10.1080/17434440.2021.1865795.
Abstract
Introduction: The field of augmented reality-mediated spine surgery is growing rapidly and holds great promise for improving surgical capabilities and patient outcomes. Augmented reality can assist with complex or atypical cases involving challenging anatomy. As neuronavigation evolves, fundamental technical limitations remain in line-of-sight interruption and operator attention shift, which this novel augmented reality technology helps to address. Areas covered: XVision is a recently FDA-approved head-mounted display for intraoperative neuronavigation, compatible with all current conventional pedicle screw technology. The device is a wireless, customizable headset with an integrated surgical tracking system and transparent retinal display. This review discusses the available literature on the safety and efficacy of XVision, as well as the current state of augmented reality technology in spine surgery. Expert opinion: Augmented-reality spine surgery is an emerging technology that may increase the precision, efficiency, and safety of manual and robotic computer-navigated pedicle screw insertion techniques while decreasing radiation exposure. The initial clinical experience with XVision has shown good outcomes and positive operator feedback. Now that initial clinical safety and efficacy have been demonstrated, ongoing experience must be studied to empirically validate this technology and generate further innovation in this rapidly evolving field.
Affiliation(s)
- Christopher F Dibble
- Department of Neurosurgery, Washington University School of Medicine, Saint Louis, USA
| | - Camilo A Molina
- Department of Neurosurgery, Washington University School of Medicine, Saint Louis, USA
| |
43
Roß T, Reinke A, Full PM, Wagner M, Kenngott H, Apitz M, Hempe H, Mindroc-Filimon D, Scholz P, Tran TN, Bruno P, Arbeláez P, Bian GB, Bodenstedt S, Bolmgren JL, Bravo-Sánchez L, Chen HB, González C, Guo D, Halvorsen P, Heng PA, Hosgor E, Hou ZG, Isensee F, Jha D, Jiang T, Jin Y, Kirtac K, Kletz S, Leger S, Li Z, Maier-Hein KH, Ni ZL, Riegler MA, Schoeffmann K, Shi R, Speidel S, Stenzel M, Twick I, Wang G, Wang J, Wang L, Wang L, Zhang Y, Zhou YJ, Zhu L, Wiesenfarth M, Kopp-Schneider A, Müller-Stich BP, Maier-Hein L. Comparative validation of multi-instance instrument segmentation in endoscopy: Results of the ROBUST-MIS 2019 challenge. Med Image Anal 2020;70:101920. PMID: 33676097; DOI: 10.1016/j.media.2020.101920.
Abstract
Intraoperative tracking of laparoscopic instruments is often a prerequisite for computer and robotic-assisted interventions. While numerous methods for detecting, segmenting and tracking of medical instruments based on endoscopic video images have been proposed in the literature, key limitations remain to be addressed: Firstly, robustness, that is, the reliable performance of state-of-the-art methods when run on challenging images (e.g. in the presence of blood, smoke or motion artifacts). Secondly, generalization; algorithms trained for a specific intervention in a specific hospital should generalize to other interventions or institutions. In an effort to promote solutions for these limitations, we organized the Robust Medical Instrument Segmentation (ROBUST-MIS) challenge as an international benchmarking competition with a specific focus on the robustness and generalization capabilities of algorithms. For the first time in the field of endoscopic image processing, our challenge included a task on binary segmentation and also addressed multi-instance detection and segmentation. The challenge was based on a surgical data set comprising 10,040 annotated images acquired from a total of 30 surgical procedures from three different types of surgery. The validation of the competing methods for the three tasks (binary segmentation, multi-instance detection and multi-instance segmentation) was performed in three different stages with an increasing domain gap between the training and the test data. The results confirm the initial hypothesis, namely that algorithm performance degrades with an increasing domain gap. While the average detection and segmentation quality of the best-performing algorithms is high, future research should concentrate on detection and segmentation of small, crossing, moving and transparent instrument(s) (parts).
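Multi-instance detection, one of the challenge tasks described above, is typically scored by matching each predicted instrument mask to a ground-truth mask via intersection-over-union (IoU). The following is a minimal, generic sketch of greedy IoU matching under an assumed threshold, not the challenge's official evaluation code.

```python
import numpy as np

def instance_iou(a, b):
    """IoU between two boolean masks of the same shape."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def match_instances(preds, gts, thr=0.3):
    """Greedily match predicted instrument masks to ground-truth masks by IoU.

    Returns a list of (pred_index, gt_index, iou) for matches above thr;
    unmatched predictions count as false positives, unmatched ground truths
    as false negatives in a detection score.
    """
    matches, used = [], set()
    for i, p in enumerate(preds):
        best, best_iou = None, thr
        for j, g in enumerate(gts):
            if j in used:
                continue
            iou = instance_iou(p, g)
            if iou >= best_iou:
                best, best_iou = j, iou
        if best is not None:
            used.add(best)
            matches.append((i, best, best_iou))
    return matches
```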
Affiliation(s)
- Tobias Roß
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany; University of Heidelberg, Germany, Seminarstraße 2, 69117 Heidelberg, Germany.
| | - Annika Reinke
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany; University of Heidelberg, Germany, Seminarstraße 2, 69117 Heidelberg, Germany
| | - Peter M Full
- University of Heidelberg, Germany, Seminarstraße 2, 69117 Heidelberg, Germany; Division of Medical Image Computing (MIC), Im Neuenheimer Feld 223, 69120 Heidelberg, Germany
| | - Martin Wagner
- Department for General, Visceral and Transplantation Surgery, Heidelberg University Hospital, Im Neuenheimer Feld 110, 69120 Heidelberg, Germany
| | - Hannes Kenngott
- Department for General, Visceral and Transplantation Surgery, Heidelberg University Hospital, Im Neuenheimer Feld 110, 69120 Heidelberg, Germany
| | - Martin Apitz
- Department for General, Visceral and Transplantation Surgery, Heidelberg University Hospital, Im Neuenheimer Feld 110, 69120 Heidelberg, Germany
| | - Hellena Hempe
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany
| | - Diana Mindroc-Filimon
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany
| | - Patrick Scholz
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany; HIDSS4Health - Helmholtz Information and Data Science School for Health, Im Neuenheimer Feld 223, 69120 Heidelberg, Germany
| | - Thuy Nuong Tran
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany
| | - Pierangela Bruno
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany; Department of Mathematics and Computer Science, University of Calabria, 87036 Rende, Italy
| | - Pablo Arbeláez
- Universidad de los Andes, Cra. 1 No 18A - 12, 111711 Bogotá, Colombia
| | - Gui-Bin Bian
- University of Chinese Academy Sciences, 52 Sanlihe Rd., Beijing, China; State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, 100864 Beijing, China
| | - Sebastian Bodenstedt
- National Center for Tumor Diseases (NCT), Partner Site Dresden, Germany: German Cancer Research Center, Im Neuenheimer Feld 460, 69120 Heidelberg, Germany; Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Fetscherstraße 74, 01307 Dresden, Germany; Helmholtz Association/Helmholtz-Zentrum Dresden - Rossendorf (HZDR), Bautzner Landstraße 400, 01328 Dresden, Germany
| | | | | | - Hua-Bin Chen
- University of Chinese Academy Sciences, 52 Sanlihe Rd., Beijing, China; State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, 100864 Beijing, China
| | - Cristina González
- Universidad de los Andes, Cra. 1 No 18A - 12, 111711 Bogotá, Colombia
| | - Dong Guo
- School of Mechanical and Electrical Engineering, University of Electronic Science and Technology of China, Shahe Campus:No.4, Section 2, North Jianshe Road, 610054
- Qingshuihe Campus:No.2006, Xiyuan Ave, West Hi-Tech Zone, 611731, Chengdu, China
| | - Pål Halvorsen
- SimulaMet, Pilestredet 52, 0167 Oslo, Norway; Oslo Metropolitan University (OsloMet), Pilestredet 52, 0167 Oslo, Norway
| | - Pheng-Ann Heng
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Chung Chi Rd, Ma Liu Shui, Hong Kong, China
| | - Enes Hosgor
- caresyntax, Komturstraße 18A, 12099 Berlin, Germany
| | - Zeng-Guang Hou
- University of Chinese Academy Sciences, 52 Sanlihe Rd., Beijing, China; State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, 100864 Beijing, China
| | - Fabian Isensee
- University of Heidelberg, Germany, Seminarstraße 2, 69117 Heidelberg, Germany; Division of Medical Image Computing (MIC), Im Neuenheimer Feld 223, 69120 Heidelberg, Germany
| | - Debesh Jha
- SimulaMet, Pilestredet 52, 0167 Oslo, Norway; Department of Informatics, UIT The Arctic University of Norway, Hansine Hansens vei 54, 9037 Tromsø, Norway
| | - Tingting Jiang
- Institute of Digital Media (NELVT), Peking University, 5 Yiheyuan Rd, Haidian District, 100871 Peking, China
| | - Yueming Jin
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Chung Chi Rd, Ma Liu Shui, Hong Kong, China
| | - Kadir Kirtac
- caresyntax, Komturstraße 18A, 12099 Berlin, Germany
| | - Sabrina Kletz
- Institute of Information Technology, Klagenfurt University, Universitätsstraße 65-67, 9020 Klagenfurt, Austria
| | - Stefan Leger
- National Center for Tumor Diseases (NCT), Partner Site Dresden, Germany: German Cancer Research Center, Im Neuenheimer Feld 460, 69120 Heidelberg, Germany; Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Fetscherstraße 74, 01307 Dresden, Germany; Helmholtz Association/Helmholtz-Zentrum Dresden - Rossendorf (HZDR), Bautzner Landstraße 400, 01328 Dresden, Germany
| | - Zhixuan Li
- Institute of Digital Media (NELVT), Peking University, 5 Yiheyuan Rd, Haidian District, 100871 Peking, China
| | - Klaus H Maier-Hein
- Division of Medical Image Computing (MIC), Im Neuenheimer Feld 223, 69120 Heidelberg, Germany
| | - Zhen-Liang Ni
- University of Chinese Academy Sciences, 52 Sanlihe Rd., Beijing, China; State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, 100864 Beijing, China
| | | | - Klaus Schoeffmann
- Institute of Information Technology, Klagenfurt University, Universitätsstraße 65-67, 9020 Klagenfurt, Austria
| | - Ruohua Shi
- Institute of Digital Media (NELVT), Peking University, 5 Yiheyuan Rd, Haidian District, 100871 Peking, China
| | - Stefanie Speidel
- National Center for Tumor Diseases (NCT), Partner Site Dresden, Germany: German Cancer Research Center, Im Neuenheimer Feld 460, 69120 Heidelberg, Germany; Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Fetscherstraße 74, 01307 Dresden, Germany; Helmholtz Association/Helmholtz-Zentrum Dresden - Rossendorf (HZDR), Bautzner Landstraße 400, 01328 Dresden, Germany
| | | | | | - Gutai Wang
- School of Mechanical and Electrical Engineering, University of Electronic Science and Technology of China, Shahe Campus:No.4, Section 2, North Jianshe Road, 610054
- Qingshuihe Campus:No.2006, Xiyuan Ave, West Hi-Tech Zone, 611731, Chengdu, China
| | - Jiacheng Wang
- Department of Computer Science, School of Informatics, Xiamen University, 422 Siming South Road, 361005 Xiamen, China
| | - Liansheng Wang
- Department of Computer Science, School of Informatics, Xiamen University, 422 Siming South Road, 361005 Xiamen, China
| | - Lu Wang
- School of Mechanical and Electrical Engineering, University of Electronic Science and Technology of China, Shahe Campus:No.4, Section 2, North Jianshe Road, 610054
- Qingshuihe Campus:No.2006, Xiyuan Ave, West Hi-Tech Zone, 611731, Chengdu, China
| | - Yujie Zhang
- Department of Computer Science, School of Informatics, Xiamen University, 422 Siming South Road, 361005 Xiamen, China
| | - Yan-Jie Zhou
- University of Chinese Academy Sciences, 52 Sanlihe Rd., Beijing, China; State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, 100864 Beijing, China
| | - Lei Zhu
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Chung Chi Rd, Ma Liu Shui, Hong Kong, China
| | - Manuel Wiesenfarth
- Division of Biostatistics, German Cancer Research Center, Im Neuenheimer Feld 581, Heidelberg, Germany
| | - Annette Kopp-Schneider
- Division of Biostatistics, German Cancer Research Center, Im Neuenheimer Feld 581, Heidelberg, Germany
| | - Beat P Müller-Stich
- Department for General, Visceral and Transplantation Surgery, Heidelberg University Hospital, Im Neuenheimer Feld 110, 69120 Heidelberg, Germany
| | - Lena Maier-Hein
- Computer Assisted Medical Interventions (CAMI), German Cancer Research Center, Im Neuenheimer Feld 223, 69120, Heidelberg, Germany
| |
44
Frameless Patient Tracking With Adhesive Optical Skin Markers for Augmented Reality Surgical Navigation in Spine Surgery. Spine (Phila Pa 1976) 2020;45:1598-1604. PMID: 32756274; DOI: 10.1097/brs.0000000000003628.
Abstract
STUDY DESIGN Observational study. OBJECTIVE The aim of this study was to evaluate the accuracy of a new frameless reference marker system for patient tracking by analyzing the effect of vertebral position within the surgical field. SUMMARY OF BACKGROUND DATA Most modern navigation systems for spine surgery rely on a dynamic reference frame attached to a vertebra for tracking the patient. This solution has the drawback of being bulky and obstructing the surgical field, while requiring that the dynamic reference frame be moved between vertebrae to maintain accuracy. METHODS An augmented reality surgical navigation (ARSN) system with intraoperative cone beam computed tomography (CBCT) capability was installed in a hybrid operating room. The ARSN system used input from four video cameras to track adhesive skin markers placed around the surgical field. The frameless reference marker system was evaluated first in four human cadavers and then in 20 patients undergoing navigated spine surgery. In each CBCT, the impact of vertebral position in the surgical field on technical accuracy was analyzed. The technical accuracy of the inserted pedicle devices was determined by measuring the distance between the planned and placed positions of each pedicle device at the bone entry point. RESULTS The overall mean technical accuracy was 1.65 ± 1.24 mm at the bone entry point (n = 366). There was no statistically significant difference in technical accuracy between levels within CBCTs (P ≥ 0.12 for all comparisons). Linear regressions showed that the number of levels away from the index vertebra explained a null to negligible part of the effect on technical accuracy (r ≤ 0.007 for all, β ≤ 0.071 for all). CONCLUSION The frameless reference marker system based on adhesive skin markers is unobtrusive and affords the ARSN system high accuracy throughout the navigated surgical field, independent of vertebral position. LEVEL OF EVIDENCE 3.
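The accuracy analysis above reduces to two simple computations: the per-screw Euclidean distance between planned and placed entry points, and a linear regression of that error against the distance (in vertebral levels) from the index vertebra. A minimal numpy sketch under those assumptions (function names illustrative, not the study's analysis code):

```python
import numpy as np

def entry_point_errors(planned, placed):
    """Technical accuracy per device: Euclidean distance (mm) between planned
    and placed pedicle-device positions at the bone entry point."""
    return np.linalg.norm(planned - placed, axis=1)

def accuracy_vs_level(errors, levels_from_index):
    """Least-squares slope and intercept of error vs. number of levels away
    from the index vertebra; a near-zero slope indicates accuracy is
    independent of vertebral position."""
    A = np.vstack([levels_from_index, np.ones_like(levels_from_index)]).T
    slope, intercept = np.linalg.lstsq(A, errors, rcond=None)[0]
    return slope, intercept
```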
45
Fan N, Yuan S, Du P, Zhu W, Li L, Hai Y, Ding H, Wang G, Zang L. Design of a robot-assisted system for transforaminal percutaneous endoscopic lumbar surgeries: study protocol. J Orthop Surg Res 2020;15:479. PMID: 33076965; PMCID: PMC7569762; DOI: 10.1186/s13018-020-02003-y.
Abstract
Background Transforaminal percutaneous endoscopic lumbar surgeries (PELS) for lumbar disc herniation and spinal stenosis are growing in popularity. However, establishing the working channel and performing foraminoplasty carry risks such as nerve and blood vessel injury, greater radiation exposure, and a steep learning curve. Rapid technological advancements have allowed robotic technology to assist surgeons in improving the accuracy and safety of surgeries. Therefore, the purpose of this study is to develop a robot-assisted system for transforaminal PELS that can provide navigation and foraminoplasty. Methods The robot-assisted system consists of three subsystems: a preoperative planning system, a navigation system, and a foraminoplasty system. In the preoperative planning system, 3D visualization of the surgical segment and surrounding tissues is achieved using multimodal image fusion of computed tomography and magnetic resonance imaging, and the working channel is planned to reduce the risk of injury to vital blood vessels and nerves. In the navigation system, the robot obtains visual perception from a visual receptor and automatically adjusts the robotic platform and robot arm to the appropriate positions according to the patient's position and the preoperative plan. In addition, the robot automatically registers the surgical levels through intraoperative fluoroscopy. The robot then provides navigation using the 6 degree-of-freedom (DOF) robot arm according to the preoperative plan and guides the surgeon in establishing the working channel. In the foraminoplasty system, according to the foraminoplasty plan defined preoperatively, the robot performs foraminoplasty automatically using the high-speed burr at the end of the robot arm.
The system provides real-time feedback on the working status of the burr through multimodal sensors measuring force, position, and acceleration. Finally, a prototype of the system will be constructed and performance tests conducted. Discussion Our study will develop a robot-assisted system for transforaminal PELS; with further research, the system could also be used for other percutaneous endoscopic spinal surgeries, such as interlaminar PELS and percutaneous endoscopic cervical and thoracic surgeries. The development of this robot-assisted system can be of great significance. First, the robot can improve the accuracy and efficiency of endoscopic spinal surgeries. In addition, it can avoid repeated intraoperative fluoroscopy, minimize radiation exposure to both patients and surgical staff, shorten operative time, and improve the learning curve for beginners, which is beneficial to the popularization of percutaneous endoscopic spinal surgeries.
Affiliation(s)
- Ning Fan
- Department of Orthopedics, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China; Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China
| | - Shuo Yuan
- Department of Orthopedics, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China; Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China
| | - Peng Du
- Department of Orthopedics, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China; Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China
| | - Wenyi Zhu
- Department of Orthopedics, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China; Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China
| | - Liang Li
- Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China; Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
| | - Yong Hai
- Department of Orthopedics, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China; Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China
| | - Hui Ding
- Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China; Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
| | - Guangzhi Wang
- Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China; Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
| | - Lei Zang
- Department of Orthopedics, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China; Chaoyang-Tsinghua Digitization & Artificial Intelligence Orthopedic Laboratory, Beijing, China
| |
|
46
|
Intraoperative cone beam computed tomography is as reliable as conventional computed tomography for identification of pedicle screw breach in thoracolumbar spine surgery. Eur Radiol 2020; 31:2349-2356. [PMID: 33006659 PMCID: PMC7979653 DOI: 10.1007/s00330-020-07315-5] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2020] [Revised: 08/04/2020] [Accepted: 09/17/2020] [Indexed: 12/01/2022]
Abstract
Objectives To test the hypothesis that intraoperative cone beam computed tomography (CBCT) using the Allura augmented reality surgical navigation (ARSN) system in a dedicated hybrid operating room (OR) matches computed tomography (CT) for identification of pedicle screw breach during spine surgery. Methods Twenty patients treated with spinal fixation surgery (260 screws) underwent intraoperative CBCT as well as conventional postoperative CT scans (median 12 months after surgery) to identify and grade the degree of pedicle screw breach on both scan types, according to the Gertzbein grading scale. Blinded assessments were performed by three independent spine surgeons, with CT serving as the standard of reference. Screws graded as Gertzbein 0 or 1 were considered clinically accurate, while grades 2 or 3 were considered inaccurate. Sensitivity, specificity, and negative predictive value were the primary metrics of diagnostic performance. Results For this patient group, the negative predictive value of intraoperative CBCT for ruling out pedicle screw breach was 99.6% (CI 97.75–99.99%). Among 10 screws graded as inaccurate on CT, 9 were graded as such on CBCT, giving a sensitivity of 90.0% (CI 55.5–99.75%). Among the 250 screws graded as accurate on CT, 244 were graded as such on CBCT, giving a specificity of 97.6% (CI 94.85–99.11%). Conclusions CBCT, performed intraoperatively with the Allura ARSN system, is non-inferior to a conventional postoperative CT scan for ruling out misplaced pedicle screws in spinal deformity cases, eliminating the need for a postoperative CT. Key Points • Intraoperative CBCT using the Allura ARSN system is comparable with conventional CT for ruling out pedicle screw breaches after spinal fixation surgery. • Intraoperative CBCT can be used to assess the need for pedicle screw revision, making routine postoperative CT scans unnecessary.
• Using CBCT, the specificity was 97.6% and the sensitivity was 90% for detecting pedicle screw breaches, and the negative predictive value for ruling out a breach was 99.6%.
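The reported metrics follow directly from the screw counts given in the abstract; a minimal sketch of the arithmetic, treating "screw breach" (Gertzbein grade 2–3 on CT) as the positive class:

```python
# Diagnostic metrics reconstructed from the counts in the abstract:
# 10 screws inaccurate on CT, 9 of them flagged on CBCT; 250 accurate
# on CT, 244 of them also graded accurate on CBCT.
tp = 9          # breaches on CT also detected on CBCT
fn = 10 - 9     # breach on CT missed by CBCT
tn = 244        # accurate on CT, also graded accurate on CBCT
fp = 250 - 244  # accurate on CT, graded inaccurate on CBCT

sensitivity = tp / (tp + fn)  # 9/10
specificity = tn / (tn + fp)  # 244/250
npv = tn / (tn + fn)          # 244/245

print(f"sensitivity = {sensitivity:.1%}")  # 90.0%
print(f"specificity = {specificity:.1%}")  # 97.6%
print(f"NPV         = {npv:.1%}")          # 99.6%
```

The high negative predictive value is what supports the paper's conclusion: a CBCT scan showing no breach rules one out with ~99.6% certainty in this cohort.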
|
47
|
Does Augmented Reality Navigation Increase Pedicle Screw Density Compared to Free-Hand Technique in Deformity Surgery? Single Surgeon Case Series of 44 Patients. Spine (Phila Pa 1976) 2020; 45:E1085-E1090. [PMID: 32355149 DOI: 10.1097/brs.0000000000003518] [Citation(s) in RCA: 25] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/01/2023]
Abstract
STUDY DESIGN Retrospective comparison between an interventional and a control cohort. OBJECTIVE The aim of this study was to investigate whether the use of an augmented reality surgical navigation (ARSN) system for pedicle screw (PS) placement in deformity cases could alter the total implant density and the PS-to-hook ratio compared to the free-hand (FH) technique. SUMMARY OF BACKGROUND DATA Surgical navigation in deformity surgery makes it possible to place PS in small and deformed pedicles where hooks would otherwise have been placed, thereby achieving a higher screw density in the constructs, which may result in better long-term patient outcomes. METHODS Fifteen deformity cases treated with ARSN were compared to 29 cases treated by FH. All surgeries were performed by the same orthopedic spine surgeon. PS, hook, and combined implant density were primary outcomes. Procedure time, deformity correction, length of hospital stay, and blood loss were secondary outcomes. The surgeries in the ARSN group were performed in a hybrid operating room (OR) with a ceiling-mounted robotic C-arm with integrated video cameras for AR navigation. The FH group was operated with or without fluoroscopy as deemed necessary by the surgeon. RESULTS Both groups had an overall high-density construct (>80% total implant density). The ARSN group had a significantly higher PS density, 86.3% ± 14.6% versus 74.7% ± 13.9% in the FH group (P < 0.05), whereas the hook density was 2.2% ± 3.0% versus 9.7% ± 9.6% (P < 0.001). Neither the total procedure time (431 ± 98 vs. 417 ± 145 min) nor the deformity correction (59.3% ± 16.6% vs. 60.1% ± 17.8%) differed significantly between the groups. CONCLUSION This study indicates that ARSN enables the surgeon to increase the PS density and thereby minimize the use of hooks in deformity surgery without prolonging OR time. This may result in better constructs with possible long-term advantages and less need for revision surgery. LEVEL OF EVIDENCE 3.
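Implant density figures like those above are conventionally derived as implants placed per available anchor site, with two pedicle sites per instrumented vertebra. A minimal sketch under that conventional definition (the counts in the example are hypothetical, not taken from the study):

```python
# Implant density = implants placed / available anchor sites,
# assuming the common convention of 2 sites per instrumented vertebra.
def implant_density(n_implants: int, n_instrumented_vertebrae: int) -> float:
    """Fraction of available anchor sites (2 per vertebra) actually used."""
    sites = 2 * n_instrumented_vertebrae
    return n_implants / sites

# Hypothetical example: 24 pedicle screws over a 13-vertebra construct
# (26 available pedicle sites).
density = implant_density(24, 13)
print(f"pedicle screw density = {density:.1%}")  # 92.3%
```

Under this definition, replacing hooks with screws at the same levels raises the PS density while lowering the hook density, which is exactly the shift the study reports for the ARSN group.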
|
48
|
Chang M, Canseco JA, Nicholson KJ, Patel N, Vaccaro AR. The Role of Machine Learning in Spine Surgery: The Future Is Now. Front Surg 2020; 7:54. [PMID: 32974382 PMCID: PMC7472375 DOI: 10.3389/fsurg.2020.00054] [Citation(s) in RCA: 52] [Impact Index Per Article: 10.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2020] [Accepted: 07/13/2020] [Indexed: 12/12/2022] Open
Abstract
The recent influx of machine learning-centered investigations in the spine surgery literature has generated enthusiasm about the prospect of using artificial intelligence to create clinical decision support tools, optimize postoperative outcomes, and improve technologies used in the operating room. However, the methodology underlying machine learning in spine research is often overlooked, as the subject matter is quite novel and may be foreign to practicing spine surgeons. Improper application of machine learning is a significant bioethics challenge, given the potential consequences of over- or underestimating the results of such studies for clinical decision-making. Proper peer review of these publications requires a baseline familiarity with the language of machine learning and how it differs from classical statistical analyses. This narrative review first introduces the overall field of machine learning and its role in artificial intelligence, and defines basic terminology. In addition, common modalities for applying machine learning, including classification and regression decision trees, support vector machines, and artificial neural networks, are examined in the context of examples gathered from the spine literature. Lastly, the ethical challenges associated with adapting machine learning for research related to patient care, as well as future perspectives on the potential use of machine learning in spine surgery, are discussed.
Affiliation(s)
- Michael Chang
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| | - Jose A. Canseco
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| | | | - Neil Patel
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| | - Alexander R. Vaccaro
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| |
|
49
|
Manni F, Elmi-Terander A, Burström G, Persson O, Edström E, Holthuizen R, Shan C, Zinger S, van der Sommen F, de With PHN. Towards Optical Imaging for Spine Tracking without Markers in Navigated Spine Surgery. Sensors (Basel) 2020; 20:E3641. [PMID: 32610555 PMCID: PMC7374436 DOI: 10.3390/s20133641] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/19/2020] [Revised: 06/13/2020] [Accepted: 06/22/2020] [Indexed: 12/18/2022]
Abstract
Surgical navigation systems are increasingly used for complex spine procedures to avoid neurovascular injuries and minimize the risk of reoperation. Accurate patient tracking is one of the prerequisites for optimal motion compensation and navigation. Most current optical tracking systems use dynamic reference frames (DRFs) attached to the spine for patient movement tracking. However, the spine itself is subject to intrinsic movements that can impact the accuracy of the navigation system. In this study, we aimed to detect actual spine features in different image views captured by the optical cameras of an augmented reality surgical navigation (ARSN) system. Using optical images from open spinal surgery cases, acquired by two gray-scale cameras, spinal landmarks were identified and matched across camera views. A computer vision framework was created for preprocessing the spine images and for detecting and matching local invariant image regions. We compared four feature detection algorithms, Speeded Up Robust Features (SURF), Maximally Stable Extremal Regions (MSER), Features from Accelerated Segment Test (FAST), and Oriented FAST and Rotated BRIEF (ORB), to elucidate the best approach. The framework was validated in 23 patients, and the 3D triangulation error of the matched features was <0.5 mm. The findings thus indicate that spine feature detection can be used for accurate tracking in navigated surgery.
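The cross-view matching step the abstract describes can be sketched as follows: ORB/BRIEF-style detectors produce binary descriptors that are compared by Hamming distance, and a correspondence is kept only if the two features are each other's nearest neighbour ("cross-check"). This is a toy illustration with hypothetical 8-bit descriptors, not the authors' pipeline:

```python
# Toy cross-check matcher for binary descriptors (Hamming distance),
# illustrating the matching step used with ORB/BRIEF-style features.
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def cross_check_match(desc_a, desc_b):
    """Return index pairs (i, j) that are mutual nearest neighbours."""
    best_ab = {i: min(range(len(desc_b)), key=lambda j: hamming(da, desc_b[j]))
               for i, da in enumerate(desc_a)}
    best_ba = {j: min(range(len(desc_a)), key=lambda i: hamming(db, desc_a[i]))
               for j, db in enumerate(desc_b)}
    return [(i, j) for i, j in best_ab.items() if best_ba[j] == i]

cam1 = [0b10110010, 0b01010101, 0b11110000]  # descriptors from camera view 1
cam2 = [0b01010111, 0b10110011, 0b00001111]  # descriptors from camera view 2
print(cross_check_match(cam1, cam2))  # [(0, 1), (1, 0)]
```

Matched pairs across the two calibrated camera views are what allow the 3D position of each landmark to be triangulated and its error measured, as in the 23-patient validation.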
Affiliation(s)
- Francesca Manni
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
| | - Adrian Elmi-Terander
- Department of Clinical Neuroscience, Karolinska Institutet, SE-171 46 Stockholm, Sweden; Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden
| | - Gustav Burström
- Department of Clinical Neuroscience, Karolinska Institutet, SE-171 46 Stockholm, Sweden; Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden
| | - Oscar Persson
- Department of Clinical Neuroscience, Karolinska Institutet, SE-171 46 Stockholm, Sweden; Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden
| | - Erik Edström
- Department of Clinical Neuroscience, Karolinska Institutet, SE-171 46 Stockholm, Sweden; Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden
| | | | - Caifeng Shan
- Philips Research, High Tech Campus 36, 5656 AE Eindhoven, The Netherlands
| | - Svitlana Zinger
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
| | - Fons van der Sommen
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
| | - Peter H. N. de With
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
| |
|
50
|
Hyperspectral Imaging for Skin Feature Detection: Advances in Markerless Tracking for Spine Surgery. Appl Sci (Basel) 2020. [DOI: 10.3390/app10124078] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
Abstract
In spinal surgery, surgical navigation is an essential tool for safe intervention, including the placement of pedicle screws without injury to nerves and blood vessels. Commercially available systems typically rely on tracking a dynamic reference frame attached to the spine of the patient. However, the reference frame can be dislodged or obscured during the surgical procedure, resulting in loss of navigation. Hyperspectral imaging (HSI) captures a large number of spectral bands across the electromagnetic spectrum, providing image information unseen by the human eye. We aim to exploit HSI to detect skin features in a novel methodology for tracking patient position in navigated spinal surgery. In our approach, we adopt two local feature detection methods, a conventional handcrafted local feature detector and a deep learning-based feature detection method, which are compared to estimate the feature displacement between frames due to motion. To demonstrate the ability of the system to track skin features, we acquired hyperspectral images of the skin of 17 healthy volunteers. Deep-learned skin features were detected and localized with an average error of only 0.25 mm, outperforming the handcrafted local features with respect to a ground truth based on optical markers.
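The 0.25 mm figure above is an average localization error: the mean Euclidean distance between detected feature positions and their marker-based ground truth. A minimal sketch of that metric (the coordinates below are hypothetical, not from the study):

```python
# Mean Euclidean localization error between detected feature positions
# and marker-based ground-truth positions (same units as the inputs).
import math

def mean_localization_error(detected, ground_truth):
    """Average point-to-point distance over paired feature positions."""
    dists = [math.dist(p, q) for p, q in zip(detected, ground_truth)]
    return sum(dists) / len(dists)

detected     = [(10.1, 4.9), (20.0, 15.2), (5.3, 8.0)]  # mm, hypothetical
ground_truth = [(10.0, 5.0), (20.0, 15.0), (5.0, 8.0)]  # mm, hypothetical
err = mean_localization_error(detected, ground_truth)
print(f"mean error = {err:.2f} mm")  # 0.21 mm
```

The same metric generalizes to 3D coordinates unchanged, since `math.dist` accepts points of any dimension.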
|