1. Wang G, Chen C, Jiang Z, Li G, Wu C, Li S. Efficient Use of Biological Data in the Web 3.0 Era by Applying Nonfungible Token Technology. J Med Internet Res 2024; 26:e46160. PMID: 38805706; PMCID: PMC11167317; DOI: 10.2196/46160. Open access.
Abstract
CryptoKitties, a trendy game on Ethereum (an open-source public blockchain platform with smart contract functionality), brought nonfungible tokens (NFTs) into the public eye in 2017. NFTs are popular because of their nonfungible properties: each token is unique and irreplaceable, like one-of-a-kind objects in the real world. The embryonic form of NFTs can be traced back to a peer-to-peer network protocol, developed in 2012 as an improvement on Bitcoin, that enables decentralized digital asset transactions. NFTs have recently attracted wide attention and shown an unprecedented, explosive growth trend. Herein, the concept of digital asset NFTs is introduced into the medical and health field for a disruptive rethinking of biobank operations. By converting biomedical data into NFTs, the collection and circulation of samples can be accelerated, and the translation of resources can be promoted. In conclusion, the biobank can achieve sustainable development through "decentralization."
Affiliation(s)
- Guanyi Wang
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Chen Chen
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Ziyu Jiang
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Gang Li
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Can Wu
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Sheng Li
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
2. Qi Z, Jin H, Xu X, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. Head model dataset for mixed reality navigation in neurosurgical interventions for intracranial lesions. Sci Data 2024; 11:538. PMID: 38796526; PMCID: PMC11127921; DOI: 10.1038/s41597-024-03385-y. Open access.
Abstract
Mixed reality navigation (MRN) technology is emerging as an increasingly significant and interesting topic in neurosurgery. MRN enables neurosurgeons to "see through" the head with an interactive, hybrid visualization environment that merges virtual- and physical-world elements. Offering immersive, intuitive, and reliable guidance for preoperative and intraoperative intervention of intracranial lesions, MRN showcases its potential as an economically efficient and user-friendly alternative to standard neuronavigation systems. However, the clinical research and development of MRN systems present challenges: recruiting a sufficient number of patients within a limited timeframe is difficult, and acquiring low-cost, commercially available, medically significant head phantoms is equally challenging. To accelerate the development of novel MRN systems and surmount these obstacles, the study presents a dataset designed for MRN system development and testing in neurosurgery. It includes CT and MRI data from 19 patients with intracranial lesions and derived 3D models of anatomical structures and validation references. The models are available in Wavefront object (OBJ) and Stereolithography (STL) formats, supporting the creation and assessment of neurosurgical MRN applications.
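For readers building MRN tooling against such datasets: binary STL files (one of the two formats the dataset provides) have a simple fixed layout that can be read without any mesh library. The sketch below is illustrative only, not code from the paper; it parses a tiny in-memory STL blob constructed for demonstration.

```python
import struct

def read_binary_stl(data: bytes):
    """Parse a binary STL blob into a list of triangles.

    Each triangle is (normal, (v0, v1, v2)) with 3-float tuples.
    Binary STL layout: 80-byte header, uint32 triangle count,
    then 50 bytes per triangle (12 little-endian floats plus a
    2-byte attribute count).
    """
    n_triangles = struct.unpack_from("<I", data, 80)[0]
    triangles = []
    offset = 84
    for _ in range(n_triangles):
        floats = struct.unpack_from("<12f", data, offset)
        normal = floats[0:3]
        verts = (floats[3:6], floats[6:9], floats[9:12])
        triangles.append((normal, verts))
        offset += 50  # 48 bytes of floats + 2-byte attribute count
    return triangles

# Build a one-triangle binary STL in memory to exercise the parser.
header = b"\x00" * 80
count = struct.pack("<I", 1)
tri = struct.pack("<12f", 0, 0, 1,  0, 0, 0,  1, 0, 0,  0, 1, 0) + b"\x00\x00"
mesh = read_binary_stl(header + count + tri)
```

In practice one would read the bytes from one of the dataset's `.stl` files instead of constructing them by hand; ASCII STL and OBJ files need a different (text-based) parser.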
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- NCO School, Army Medical University, 050081, Shijiazhuang, China
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Ruochu Xiong
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, 920-8641, Kanazawa, Ishikawa, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Medical School of Chinese PLA General Hospital, 100853, Beijing, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, 100853, Beijing, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043, Marburg, Germany
- Miriam H A Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043, Marburg, Germany
3. Canton SP, Austin CN, Steuer F, Dadi S, Sharma N, Kass NM, Fogg D, Clayton E, Cunningham O, Scott D, LaBaze D, Andrews EG, Biehl JT, Hogan MV. Feasibility and Usability of Augmented Reality Technology in the Orthopaedic Operating Room. Curr Rev Musculoskelet Med 2024; 17:117-128. PMID: 38607522; PMCID: PMC11068703; DOI: 10.1007/s12178-024-09888-w.
Abstract
PURPOSE OF REVIEW Augmented reality (AR) has gained popularity in various sectors, including gaming, entertainment, and healthcare. The desire for improved surgical navigation within orthopaedic surgery has led to evaluation of the feasibility and usability of AR in the operating room (OR). However, the safe and effective use of AR technology in the OR requires a proper understanding of its capabilities and limitations. This review aims to describe the fundamental elements of AR, highlight limitations for use within the field of orthopaedic surgery, and discuss potential areas for development. RECENT FINDINGS To date, studies have demonstrated that AR technology can be used to enhance navigation and performance in orthopaedic procedures. General hardware and software limitations of the technology include the registration process, ergonomics, and battery life. Other limitations relate to human response factors such as inattentional blindness, which may lead surgeons to miss complications within the surgical field. Furthermore, prolonged use of AR can cause eye strain and headache due to phenomena such as the vergence-accommodation conflict. AR technology may prove to be a better alternative to current orthopaedic surgery navigation systems. However, the current limitations should be mitigated to further improve the feasibility and usability of AR in the OR setting. It is important for both non-clinicians and clinicians to work in conjunction to guide the development of future iterations of AR technology and its implementation into the OR workflow.
Affiliation(s)
- Stephen P Canton
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Fritz Steuer
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Srujan Dadi
- Rowan-Virtua School of Osteopathic Medicine, Stratford, NJ, USA
- Nikhil Sharma
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Nicolás M Kass
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- David Fogg
- Texas Tech University Health Sciences Center El Paso, El Paso, TX, USA
- Elizabeth Clayton
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Onaje Cunningham
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Devon Scott
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Dukens LaBaze
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Edward G Andrews
- Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA, USA
- Jacob T Biehl
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
- MaCalus V Hogan
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
4. Sharma N, Mallela AN, Khan T, Canton SP, Kass NM, Steuer F, Jardini J, Biehl J, Andrews EG. Evolution of the meta-neurosurgeon: A systematic review of the current technical capabilities, limitations, and applications of augmented reality in neurosurgery. Surg Neurol Int 2024; 15:146. PMID: 38742013; PMCID: PMC11090549; DOI: 10.25259/sni_167_2024. Open access.
Abstract
Background Augmented reality (AR) applications in neurosurgery have expanded over the past decade with the introduction of headset-based platforms. Many studies have focused on either preoperative planning, tailoring the approach to the patient's anatomy and pathology, or intraoperative surgical navigation, primarily realized as AR navigation through microscope oculars. Additional efforts have been made to validate AR in trainee and patient education and to investigate novel surgical approaches. Our objective was to provide a systematic overview of AR in neurosurgery, outline the current limitations of this technology, and highlight several applications of AR in neurosurgery. Methods We performed a literature search in PubMed/Medline to identify papers that addressed the use of AR in neurosurgery. The authors screened 375 papers, of which 57 were selected, analyzed, and included in this systematic review. Results AR has made significant inroads in neurosurgery, particularly in neuronavigation. In spinal neurosurgery, this has primarily been used for pedicle screw placement. AR-based neuronavigation also has significant applications in cranial neurosurgery, including neurovascular surgery, neurosurgical oncology, and skull base neurosurgery. Other potential applications include operating room streamlining, trainee and patient education, and telecommunications. Conclusion AR has already made a significant impact in neurosurgery in the above domains and has the potential to be a paradigm-altering technology. Future development in AR should focus on both validating these applications and extending the role of AR.
Affiliation(s)
- Nikhil Sharma
- School of Medicine, University of Pittsburgh, Pittsburgh, United States
- Arka N. Mallela
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, United States
- Talha Khan
- Department of Computing and Information, University of Pittsburgh, Pittsburgh, United States
- Stephen Paul Canton
- Department of Orthopaedic Surgery, University of Pittsburgh Medical Center, Pittsburgh, United States
- Fritz Steuer
- School of Medicine, University of Pittsburgh, Pittsburgh, United States
- Jacquelyn Jardini
- Department of Biology, Haverford College, Haverford, Pennsylvania, United States
- Jacob Biehl
- Department of Computing and Information, University of Pittsburgh, Pittsburgh, United States
- Edward G. Andrews
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, United States
5. von Atzigen M, Liebmann F, Cavalcanti NA, Anh Baran T, Wanivenhaus F, Spirig JM, Rauter G, Snedeker J, Farshad M, Fürnstahl P. Reducing residual forces in spinal fusion using a custom-built rod bending machine. Comput Methods Programs Biomed 2024; 247:108096. PMID: 38447314; DOI: 10.1016/j.cmpb.2024.108096.
Abstract
BACKGROUND AND OBJECTIVE As part of spinal fusion surgery, shaping the rod implant to align with the anatomy is a tedious, error-prone, and time-consuming manual process. Inadequately contoured rod implants introduce stress on the screw-bone interface of the pedicle screws, potentially leading to screw loosening or even pull-out. METHODS We propose the first fully automated solution to the rod bending problem by leveraging the advantages of augmented reality and robotics. Augmented reality not only enables the surgeons to intraoperatively digitize the screw positions but also provides a human-computer interface to the wirelessly integrated custom-built rod bending machine. Furthermore, we introduce custom-built test rigs to quantify per-screw absolute tensile/compressive residual forces on the screw-bone interface. Besides residual forces, we evaluated the required bending times and reducer engagements, and compared our method to the freehand gold standard. RESULTS The bending machine achieved a significant reduction in the average absolute residual forces relative to the freehand gold standard (p=0.0015), and it also reduced the average time to instrumentation per screw. Reducer engagements per rod were significantly decreased from an average of 1.00±1.14 to 0.11±0.32 (p=0.0037). CONCLUSION The combination of augmented reality and robotics has the potential to improve surgical outcomes while minimizing the dependency on individual surgeon skill and dexterity.
Affiliation(s)
- Marco von Atzigen
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Florentin Liebmann
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Nicola A Cavalcanti
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- The Anh Baran
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Computer Aided Medical Procedures (CAMP), Technical University of Munich, Munich, Germany
- Florian Wanivenhaus
- Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; University Spine Center Zurich, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- José Miguel Spirig
- Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; University Spine Center Zurich, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Georg Rauter
- Bio-Inspired RObots for MEDicine-Lab, University of Basel, Basel, Switzerland
- Jess Snedeker
- Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland; Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Mazda Farshad
- Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; University Spine Center Zurich, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
6. Li F, Gao Q, Wang N, Greene N, Song T, Dianat O, Azimi E. Mixed reality guided root canal therapy. Healthc Technol Lett 2024; 11:167-178. PMID: 38638496; PMCID: PMC11022218; DOI: 10.1049/htl2.12077. Open access.
Abstract
Root canal therapy (RCT) is a widely performed procedure in dentistry, with over 25 million individuals undergoing it annually. The procedure is carried out to address inflammation or infection within the root canal system of affected teeth. However, accurately aligning CT scan information with the patient's tooth has posed challenges, leading to errors in tool positioning and potential negative outcomes. To overcome these challenges, a mixed reality application was developed using an optical see-through head-mounted display (OST-HMD). The application incorporates visual cues, an augmented mirror, and dynamically updated multi-view CT slices to address depth perception issues and achieve accurate tooth localization, comprehensive canal exploration, and prevention of perforation during RCT. A preliminary experimental assessment showed significant improvements in procedural accuracy: with the system, positional accuracy improved from 1.4 to 0.4 mm (more than a 70% gain) using an optical tracker (NDI) and from 2.8 to 2.4 mm using the HMD, thereby achieving submillimeter accuracy with NDI. Six participants were enrolled in the user study, which found an average displacement on the crown plane of 1.27 ± 0.83 cm, an average depth error of 0.90 ± 0.72 cm, and an average angular deviation of 1.83 ± 0.83°. Our error analysis further highlights the impact of HMD spatial localization and head motion on the registration and calibration process. By seamlessly integrating CT image information with the patient's tooth, the mixed reality application assists dentists in achieving precise tool placement. This advancement has the potential to elevate the quality of root canal procedures, ensuring better accuracy and enhancing overall treatment outcomes.
Affiliation(s)
- Fangjie Li
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, USA
- Qingying Gao
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Nengyu Wang
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Nicholas Greene
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Tianyu Song
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Omid Dianat
- School of Dentistry, University of Maryland, Baltimore, Maryland, USA
- Ehsan Azimi
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
7. Frisk H, Burström G, Persson O, El-Hajj VG, Coronado L, Hager S, Edström E, Elmi-Terander A. Automatic image registration on intraoperative CBCT compared to Surface Matching registration on preoperative CT for spinal navigation: accuracy and workflow. Int J Comput Assist Radiol Surg 2024 (online ahead of print). PMID: 38378987; DOI: 10.1007/s11548-024-03076-4.
Abstract
INTRODUCTION Spinal navigation solutions have been slower to develop compared to cranial ones. To facilitate greater adoption and use of spinal navigation, the relatively cumbersome registration processes need to be improved upon. This study aims to validate a new solution for automatic image registration and compare it to a traditional Surface Matching method. METHODS Adult patients undergoing spinal surgery requiring navigation were enrolled after providing consent. A registration matrix, Universal AIR (Automatic Image Registration), was placed in the surgical field and used for automatic registration based on intraoperative 3D imaging. A standard Surface Matching method was used for comparison. Accuracy measurements were obtained by comparing planned and acquired coordinates on the vertebrae. RESULTS Thirty-nine patients with 42 datasets were included. The mean accuracy of Universal AIR registration was 1.20 ± 0.42 mm, while the mean accuracy of Surface Matching registration was 1.94 ± 0.64 mm. Universal AIR registration was non-inferior to Surface Matching registration, and post hoc analysis showed significantly greater accuracy for Universal AIR registration. User-related errors, such as incorrect identification of the vertebral level, were seen with Surface Matching but not with automatic registration. CONCLUSION Automatic image registration for spinal navigation using Universal AIR and intraoperative 3D imaging provided improved accuracy compared to Surface Matching registration. In addition, it minimizes user errors and offers a standardized workflow, making it a reliable registration method for navigated spinal procedures.
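The accuracy measure described here, comparing planned and acquired coordinates on the vertebrae, amounts to a mean Euclidean landmark error (a target registration error). A minimal sketch of that computation, using made-up illustrative coordinates rather than study data:

```python
import math

def target_registration_error(planned, acquired):
    """Mean Euclidean distance between paired planned and acquired
    landmark coordinates (same length unit as the inputs, e.g. mm)."""
    if len(planned) != len(acquired):
        raise ValueError("landmark lists must be paired")
    dists = [math.dist(p, a) for p, a in zip(planned, acquired)]
    return sum(dists) / len(dists)

# Illustrative 3D coordinates in mm, not data from the study.
planned = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
acquired = [(1.0, 0.0, 0.0), (10.0, 2.0, 0.0)]
tre = target_registration_error(planned, acquired)  # (1.0 + 2.0) / 2 = 1.5
```

Per-landmark distances rather than the mean would be used for the non-inferiority statistics reported in the study.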
Affiliation(s)
- Henrik Frisk
- Department of Clinical Neuroscience, Karolinska Institutet, 171 77, Stockholm, Sweden
- Gustav Burström
- Department of Clinical Neuroscience, Karolinska Institutet, 171 77, Stockholm, Sweden
- Oscar Persson
- Department of Clinical Neuroscience, Karolinska Institutet, 171 77, Stockholm, Sweden
- Erik Edström
- Department of Clinical Neuroscience, Karolinska Institutet, 171 77, Stockholm, Sweden
- Capio Spine Center Stockholm, Löwenströmska Hospital, Upplands-Väsby, Sweden
- Adrian Elmi-Terander
- Department of Clinical Neuroscience, Karolinska Institutet, 171 77, Stockholm, Sweden
- Capio Spine Center Stockholm, Löwenströmska Hospital, Upplands-Väsby, Sweden
- Department of Surgical Sciences, Uppsala University, Uppsala, Sweden
8. Morley CT, Arreola DM, Qian L, Lynn AL, Veigulis ZP, Osborne TF. Mixed Reality Surgical Navigation System: Positional Accuracy Based on Food and Drug Administration Standard. Surg Innov 2024; 31:48-57. PMID: 38019844; PMCID: PMC10773158; DOI: 10.1177/15533506231217620.
Abstract
BACKGROUND Computer-assisted surgical navigation systems are designed to improve outcomes by providing clinicians with procedural guidance information. New technologies, such as mixed reality, offer the potential for more intuitive, efficient, and accurate procedural guidance. The goal of this study is to assess the positional accuracy and consistency of a clinical mixed reality system that utilizes commercially available wireless head-mounted displays (HMDs), custom software, and localization instruments. METHODS Independent teams using second-generation Microsoft HoloLens hardware, Medivis SurgicalAR software, and localization instruments tested the accuracy of the combined system at different institutions, times, and locations. The ASTM F2554-18 consensus standard for computer-assisted surgical systems, as recognized by the U.S. FDA, was used to measure performance; 288 tests were performed. RESULTS The system demonstrated consistent results, with an average accuracy better than one millimeter (0.75 ± 0.37 mm SD). CONCLUSION Independently acquired positional tracking accuracies exceed conventional in-market surgical navigation tracking systems and FDA standards. Importantly, this performance was achieved at two different institutions, using an international testing standard, and with a system that included a commercially available off-the-shelf wireless head-mounted display and software.
Affiliation(s)
- David M. Arreola
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Zachary P. Veigulis
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Department of Business Analytics, Tippie College of Business, University of Iowa, Iowa City, IA, USA
- Thomas F. Osborne
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Department of Radiology, Stanford University School of Medicine, Stanford, CA, USA
9. Qi Z, Jin H, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. The Feasibility and Accuracy of Holographic Navigation with Laser Crosshair Simulator Registration on a Mixed-Reality Display. Sensors (Basel) 2024; 24:896. PMID: 38339612; PMCID: PMC10857152; DOI: 10.3390/s24030896.
Abstract
Addressing conventional neurosurgical navigation systems' high costs and complexity, this study explores the feasibility and accuracy of a simplified, cost-effective mixed reality navigation (MRN) system based on a laser crosshair simulator (LCS). A new automatic registration method was developed, featuring coplanar laser emitters and a recognizable target pattern. The workflow was integrated into Microsoft's HoloLens-2 for practical application. The study assessed the system's precision by utilizing life-sized 3D-printed head phantoms based on computed tomography (CT) or magnetic resonance imaging (MRI) data from 19 patients (female/male: 7/12, average age: 54.4 ± 18.5 years) with intracranial lesions. Six to seven CT/MRI-visible scalp markers were used as reference points per case. The LCS-MRN's accuracy was evaluated through landmark-based and lesion-based analyses, using metrics such as target registration error (TRE) and Dice similarity coefficient (DSC). The system demonstrated immersive capabilities for observing intracranial structures across all cases. Analysis of 124 landmarks showed a TRE of 3.0 ± 0.5 mm, consistent across various surgical positions. The DSC of 0.83 ± 0.12 correlated significantly with lesion volume (Spearman rho = 0.813, p < 0.001). Therefore, the LCS-MRN system is a viable tool for neurosurgical planning, highlighting its low user dependency, cost-efficiency, and accuracy, with prospects for future clinical application enhancements.
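The Dice similarity coefficient (DSC) used in the lesion-based analysis is a standard overlap measure between two segmentations, DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch on voxel index sets, with illustrative data rather than anything from the study:

```python
def dice_similarity(a: set, b: set) -> float:
    """Dice similarity coefficient between two segmentations
    represented as sets of voxel indices: 2|A ∩ B| / (|A| + |B|)."""
    if not a and not b:
        return 1.0  # two empty segmentations overlap perfectly by convention
    return 2 * len(a & b) / (len(a) + len(b))

# Illustrative voxel index sets (i, j, k), not study data.
seg_a = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 1, 1)}
seg_b = {(0, 0, 0), (0, 0, 1), (1, 0, 0)}
dsc = dice_similarity(seg_a, seg_b)  # 2*2 / (4 + 3)
```

On real image volumes the segmentations would come from binary masks (e.g. the sets of nonzero voxel coordinates), but the overlap formula is the same.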
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Ruochu Xiong
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, Kanazawa 920-8641, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
10. Zhao X, Zhao H, Zheng W, Gohritz A, Shen Y, Xu W. Clinical evaluation of augmented reality-based 3D navigation system for brachial plexus tumor surgery. World J Surg Oncol 2024; 22:20. PMID: 38233922; PMCID: PMC10792838; DOI: 10.1186/s12957-023-03288-z. Open access.
Abstract
BACKGROUND Augmented reality (AR), a form of 3D imaging technology, has been preliminarily applied in tumor surgery of the head and spine, both of which are rigid bodies. However, research evaluating the clinical value of AR in tumor surgery of the brachial plexus, a non-rigid body whose anatomical position varies with patient posture, is lacking.
METHODS Prior to surgery in 8 patients diagnosed with brachial plexus tumors, conventional MRI scans were performed to obtain conventional 2D MRI images. The MRI data were then differentiated automatically and converted into AR-based 3D models. After point-to-point relocation and registration, the 3D models were projected onto the patient's body using a head-mounted display for navigation. To evaluate the clinical value of the AR-based 3D models compared to the conventional 2D MRI images, 2 senior hand surgeons completed questionnaires on the evaluation of anatomical structures (tumor, arteries, veins, nerves, bones, and muscles), with ratings ranging from 1 (strongly disagree) to 5 (strongly agree).
RESULTS Surgeons rated the AR-based 3D models as superior to conventional MRI images for all anatomical structures, including tumors. Furthermore, the AR-based 3D models were preferred for preoperative planning and intraoperative navigation, demonstrating their added value. The mean positional error between the 3D models and intraoperative findings was approximately 1 cm.
CONCLUSIONS This study evaluated, for the first time, the clinical value of an AR-based 3D navigation system in preoperative planning and intraoperative navigation for brachial plexus tumor surgery. By providing more direct spatial visualization than conventional 2D MRI images, this 3D navigation system significantly improved the clinical accuracy and safety of tumor surgery in non-rigid bodies.
Affiliation(s)
- Xuanyu Zhao
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- Huali Zhao
- Department of Radiology, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Wanling Zheng
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- Andreas Gohritz
- Department of Plastic, Reconstructive, Aesthetic and Hand Surgery, University Hospital Basel, University of Basel, Basel, Switzerland
- Yundong Shen
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- The National Clinical Research Center for Aging and Medicine, Fudan University, Shanghai, China
- Wendong Xu
- Department of Hand and Upper Extremity Surgery, Jing'an District Central Hospital, Branch of Huashan Hospital, Fudan University, Shanghai, China
- Department of Hand Surgery, Huashan Hospital, Fudan University, Shanghai, China
- The National Clinical Research Center for Aging and Medicine, Fudan University, Shanghai, China
- Institute of Brain Science, State Key Laboratory of Medical Neurobiology and Collaborative Innovation Center for Brain Science, Fudan University, Shanghai, China
- Research Unit of Synergistic Reconstruction of Upper and Lower Limbs after Brain Injury, Chinese Academy of Medical Sciences, Beijing, China
11
Liebmann F, von Atzigen M, Stütz D, Wolf J, Zingg L, Suter D, Cavalcanti NA, Leoty L, Esfandiari H, Snedeker JG, Oswald MR, Pollefeys M, Farshad M, Fürnstahl P. Automatic registration with continuous pose updates for marker-less surgical navigation in spine surgery. Med Image Anal 2024; 91:103027. [PMID: 37992494] [DOI: 10.1016/j.media.2023.103027]
Abstract
Established surgical navigation systems for pedicle screw placement have been proven to be accurate, but still reveal limitations in registration or surgical guidance. Registration of preoperative data to the intraoperative anatomy remains a time-consuming, error-prone task that includes exposure to harmful radiation. Surgical guidance through conventional displays has well-known drawbacks, as information cannot be presented in-situ and from the surgeon's perspective. Consequently, radiation-free and more automatic registration methods with subsequent surgeon-centric navigation feedback are desirable. In this work, we present a marker-less approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner. A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models, which is then refined for each vertebra individually and updated in real time with GPU acceleration while handling surgeon occlusions. Intuitive surgical guidance is provided through integration into an augmented reality-based navigation system. The registration method was verified on a public dataset with a median of 100% successful registrations, a median target registration error of 2.7 mm, a median screw trajectory error of 1.6°, and a median screw entry point error of 2.3 mm. Additionally, the whole pipeline was validated in an ex-vivo surgery, yielding 100% screw accuracy and a median target registration error of 1.0 mm. Our results meet clinical demands and emphasize the potential of RGB-D data for fully automatic registration approaches in combination with augmented reality guidance.
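The target registration error reported above is defined over corresponding landmark sets. As a minimal illustration only (not the authors' marker-less, deep-learning pipeline), a least-squares rigid registration and its TRE can be sketched with the Kabsch algorithm; all point sets below are invented toy data:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q (Kabsch algorithm)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

def target_registration_error(R, t, targets_moving, targets_fixed):
    """Mean Euclidean distance between mapped targets and their known positions."""
    mapped = targets_moving @ R.T + t
    return float(np.linalg.norm(mapped - targets_fixed, axis=1).mean())

# Toy example: recover a known rotation + translation from 4 landmarks.
rng = np.random.default_rng(0)
P = rng.normal(size=(4, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -5.0, 2.0])
Q = P @ R_true.T + t_true
R_est, t_est = kabsch(P, Q)
tre = target_registration_error(R_est, t_est, P, Q)  # near zero for noise-free data
```

With noisy intraoperative landmarks, the same TRE quantity becomes the accuracy metric reported in millimeters.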
Affiliation(s)
- Florentin Liebmann
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland.
- Marco von Atzigen
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Dominik Stütz
- Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland
- Julian Wolf
- Product Development Group, ETH Zurich, Zurich, Switzerland
- Lukas Zingg
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Daniel Suter
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Nicola A Cavalcanti
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Laura Leoty
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Hooman Esfandiari
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Jess G Snedeker
- Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Martin R Oswald
- Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Computer Vision Lab, University of Amsterdam, Amsterdam, Netherlands
- Marc Pollefeys
- Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Microsoft Mixed Reality and AI Zurich Lab, Zurich, Switzerland
- Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
12
von Niederhäusern PA, Seppi C, Sandkühler R, Nicolas G, Haerle SK, Cattin PC. Augmented reality for sentinel lymph node biopsy. Int J Comput Assist Radiol Surg 2024; 19:171-180. [PMID: 37747574] [PMCID: PMC10770201] [DOI: 10.1007/s11548-023-03014-w]
Abstract
INTRODUCTION Sentinel lymph node biopsy for oral and oropharyngeal squamous cell carcinoma is a well-established staging method. One variation is to inject a radioactive tracer near the primary tumor of the patient. After a few minutes, audio feedback from an external hand-held γ-detection probe can monitor the uptake into the lymphatic system. Such probes place a high cognitive load on the surgeon during the biopsy, as they require the simultaneous use of both hands and the skill to correlate the audio signal with the location of tracer accumulation in the lymph nodes. An augmented reality (AR) approach that directly visualizes and thus discriminates nearby lymph nodes would therefore greatly reduce the surgeon's cognitive load.
MATERIALS AND METHODS We present a proof of concept of an AR approach for sentinel lymph node biopsy in ex vivo experiments. The 3D position of the radioactive γ-sources is reconstructed from a single γ-image, acquired by a stationary, table-attached multi-pinhole γ-detector. The position of the sources is then visualized using Microsoft's HoloLens. We further investigate the performance of our SLNF algorithm for a single source, two sources, and two sources with a hot background.
RESULTS In our ex vivo experiments, a single γ-source and its AR representation show good correlation with known locations, with a maximum error of 4.47 mm. The SLNF algorithm performs well when only one source is reconstructed, with a maximum error of 7.77 mm. For the more challenging case of reconstructing two sources, the errors vary between 2.23 mm and 75.92 mm.
CONCLUSION This proof of concept shows promising results in reconstructing and displaying one γ-source. Two simultaneously recorded sources are more challenging and require further algorithmic optimization.
Affiliation(s)
- Carlo Seppi
- Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
- Robin Sandkühler
- Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
- Guillaume Nicolas
- Nuclear Medicine Clinic, University Hospital Basel, Basel, Switzerland
- Philippe C Cattin
- Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
13
Qi Z, Bopp MHA, Nimsky C, Chen X, Xu X, Wang Q, Gan Z, Zhang S, Wang J, Jin H, Zhang J. A Novel Registration Method for a Mixed Reality Navigation System Based on a Laser Crosshair Simulator: A Technical Note. Bioengineering (Basel) 2023; 10:1290. [PMID: 38002414] [PMCID: PMC10669875] [DOI: 10.3390/bioengineering10111290]
Abstract
Mixed Reality Navigation (MRN) is pivotal in augmented reality-assisted intelligent neurosurgical interventions. However, existing MRN registration methods face challenges in concurrently achieving low user dependency, high accuracy, and clinical applicability. This study proposes a novel registration method based on a laser crosshair simulator and evaluates its feasibility and accuracy. The simulator is designed to replicate the scanner frame's position on the patient, and the system autonomously calculates the transformation mapping coordinates from the tracking space to the reference image space. A mathematical model and workflow for registration were designed, and a Universal Windows Platform (UWP) application was developed on HoloLens-2. Finally, a head phantom was used to measure the system's target registration error (TRE). The proposed method was successfully implemented, obviating the need for user interactions with virtual objects during the registration process. Regarding accuracy, the average deviation was 3.7 ± 1.7 mm. This method shows encouraging results in efficiency and intuitiveness and marks a valuable advancement in low-cost, easy-to-use MRN systems. Its potential for enhancing accuracy and adaptability in intervention procedures makes this approach promising for improving surgical outcomes.
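The core computation described above, mapping coordinates from the tracking space to the reference image space once the simulator reproduces the scanner frame's pose, reduces to composing rigid transforms. A minimal sketch with homogeneous 4×4 matrices; the frame names and pose values are hypothetical, not taken from the paper:

```python
import numpy as np

def rt_to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical poses: the tracked simulator yields the scanner frame's pose in
# tracking space (T_track_frame); the reference image defines the same frame's
# pose in image space (T_img_frame). Chaining them maps tracking -> image
# without any user interaction with virtual objects.
T_track_frame = rt_to_homogeneous(np.eye(3), np.array([100.0, 50.0, 0.0]))
T_img_frame = rt_to_homogeneous(np.eye(3), np.array([0.0, 0.0, 30.0]))
T_img_track = T_img_frame @ np.linalg.inv(T_track_frame)

p_track = np.array([110.0, 55.0, 5.0])   # a point observed in tracking space
p_img = apply(T_img_track, p_track)      # -> [10. 5. 35.] in image space
```

In practice each pose carries a full rotation; the identity rotations above only keep the arithmetic easy to follow.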
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany;
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Medical School of Chinese PLA, Beijing 100853, China
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Medical School of Chinese PLA, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; (X.C.); (X.X.); (Q.W.); (Z.G.); (S.Z.); (J.W.); (H.J.)
14
Shaikh HJF, Hasan SS, Woo JJ, Lavoie-Gagne O, Long WJ, Ramkumar PN. Exposure to Extended Reality and Artificial Intelligence-Based Manifestations: A Primer on the Future of Hip and Knee Arthroplasty. J Arthroplasty 2023; 38:2096-2104. [PMID: 37196732] [DOI: 10.1016/j.arth.2023.05.015]
Abstract
BACKGROUND Software-infused services, from robot-assisted and wearable technologies to artificial intelligence (AI)-laden analytics, continue to augment clinical orthopaedics, namely hip and knee arthroplasty. Extended reality (XR) tools, which encompass augmented reality, virtual reality, and mixed reality technology, represent a new frontier for expanding surgical horizons to maximize technical education, expertise, and execution. The purpose of this review is to critically detail and evaluate the recent developments surrounding XR in the field of hip and knee arthroplasty and to address potential future applications as they relate to AI.
METHODS In this narrative review surrounding XR, we discuss (1) definitions, (2) techniques, (3) studies, (4) current applications, and (5) future directions. We highlight the XR subsets (augmented reality, virtual reality, and mixed reality) as they relate to AI in the increasingly digitized ecosystem within hip and knee arthroplasty.
RESULTS XR developments across the orthopaedic ecosystem are summarized, with specific emphasis on hip and knee arthroplasty. XR as a tool for education, preoperative planning, and surgical execution is discussed, with future applications dependent upon AI to potentially obviate the need for robotic assistance and preoperative advanced imaging without sacrificing accuracy.
CONCLUSION In a field where exposure is critical to clinical success, XR represents a novel stand-alone software-infused service that optimizes technical education, execution, and expertise but necessitates integration with AI and previously validated software solutions to offer opportunities that improve surgical precision with or without the use of robotics and computed tomography-based imaging.
Affiliation(s)
- Sayyida S Hasan
- Donald and Barbara Zucker School of Medicine at Hofstra, Uniondale, New York
- Prem N Ramkumar
- Hospital for Special Surgery, New York, New York; Long Beach Orthopaedic Institute, Long Beach, California
15
Suter D, Hodel S, Liebmann F, Fürnstahl P, Farshad M. Factors affecting augmented reality head-mounted device performance in real OR. Eur Spine J 2023; 32:3425-3433. [PMID: 37552327] [DOI: 10.1007/s00586-023-07826-x]
Abstract
PURPOSE Over the last years, interest and efforts to implement augmented reality (AR) in orthopedic surgery through head-mounted devices (HMD) have increased. However, the majority of experiments were preclinical and within a controlled laboratory environment. The operating room (OR) is a more challenging environment with various confounding factors potentially affecting the performance of an AR-HMD. The aim of this study was to assess the performance of an AR-HMD in a real-life OR setting.
METHODS An established AR application using the HoloLens 2 HMD was tested in an OR and in a laboratory by two users. The accuracy of the hologram overlay, the time to complete the trial, the number of rejected registration attempts, the delay in live overlay of the hologram, and the number of completely failed runs were recorded. Further, different OR setting parameters (light condition, setting up partitions, movement of personnel, and anchor placement) were modified and compared.
RESULTS Time for full registration was higher at 48 s (IQR 24 s) in the OR versus 33 s (IQR 10 s) in the laboratory setting (p < 0.001). The other investigated parameters did not differ significantly when an optimal OR setting was used. Within the OR, the strongest influence on performance of the AR-HMD was different light conditions, with direct light illumination on the situs being the least favorable.
CONCLUSION AR-HMDs are affected by different OR setups. Standardization measures for better AR-HMD performance include avoiding direct light illumination on the situs, setting up partitions, and minimizing the movement of personnel.
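Registration times above are summarized as median with interquartile range, the usual choice for small, skewed timing samples. A small helper showing how such summaries are computed; the timing samples below are invented, not the study's data:

```python
import numpy as np

def median_iqr(samples):
    """Return (median, IQR) of a sample, where IQR = Q3 - Q1."""
    q1, med, q3 = np.percentile(samples, [25, 50, 75])
    return float(med), float(q3 - q1)

# Hypothetical registration times in seconds for two settings:
or_times = [30, 40, 48, 55, 70]
lab_times = [25, 30, 33, 37, 45]
or_med, or_iqr = median_iqr(or_times)     # (48.0, 15.0)
lab_med, lab_iqr = median_iqr(lab_times)  # (33.0, 7.0)
```

A nonparametric test such as the Mann-Whitney U test would then be the typical way to compare the two groups.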
Affiliation(s)
- Daniel Suter
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland.
- Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland.
- Sandro Hodel
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Florentin Liebmann
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Balgrist Campus, Lengghalde 5, 8008, Zurich, Switzerland
- Mazda Farshad
- Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Spine Division, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
16
Tu P, Wang H, Joskowicz L, Chen X. A multi-view interactive virtual-physical registration method for mixed reality based surgical navigation in pelvic and acetabular fracture fixation. Int J Comput Assist Radiol Surg 2023; 18:1715-1724. [PMID: 37031310] [DOI: 10.1007/s11548-023-02884-4]
Abstract
PURPOSE The treatment of pelvic and acetabular fractures remains technically demanding, and traditional surgical navigation systems suffer from hand-eye mis-coordination. This paper describes a multi-view interactive virtual-physical registration method to enhance the surgeon's depth perception and a mixed reality (MR)-based surgical navigation system for pelvic and acetabular fracture fixation.
METHODS First, the pelvic structure is reconstructed by segmentation in a preoperative CT scan, and an insertion path for the percutaneous LC-II screw is computed. A custom hand-held registration cube is used for virtual-physical registration. Three strategies are proposed to improve the surgeon's depth perception: vertices alignment, tremble compensation, and multi-view averaging. During navigation, distance and angular deviation visual cues are updated to help the surgeon with the guide wire insertion. The methods have been integrated into an MR module in a surgical navigation system.
RESULTS Phantom experiments were conducted. Ablation experiments demonstrated the effectiveness of each strategy in the virtual-physical registration method. The proposed method achieved the best accuracy in comparison with related works. For percutaneous guide wire placement, our system achieved a mean bony entry point error of 2.76 ± 1.31 mm, a mean bony exit point error of 4.13 ± 1.74 mm, and a mean angular deviation of 3.04 ± 1.22°.
CONCLUSIONS The proposed method can improve the virtual-physical fusion accuracy. The developed MR-based surgical navigation system has clinical application potential. Cadaver and clinical experiments will be conducted in the future.
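Of the three depth-perception strategies listed, multi-view averaging is the most direct to sketch: observations of the registration cube's vertices from several viewpoints are averaged to suppress per-view tracking noise before solving the virtual-physical registration. The vertex coordinates and per-view offsets below are invented for illustration:

```python
import numpy as np

def multi_view_average(views):
    """Average corresponding vertex observations across views.
    `views` has shape (n_views, n_vertices, 3); returns (n_vertices, 3)."""
    return np.asarray(views).mean(axis=0)

# Hypothetical true positions of 4 cube vertices:
true_vertices = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])

# Three views, each perturbed by a small systematic offset; the offsets are
# chosen to sum to zero so the averaging cancels them exactly in this toy case.
offsets = np.array([[ 0.03,  0.00, 0.0],
                    [-0.02,  0.01, 0.0],
                    [-0.01, -0.01, 0.0]])
views = np.stack([true_vertices + o for o in offsets])
averaged = multi_view_average(views)   # ≈ true_vertices: the noise cancels
```

With zero-mean random noise, averaging n views reduces the expected per-vertex error roughly by a factor of sqrt(n), which is the rationale for the strategy.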
Affiliation(s)
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Huixiang Wang
- Department of Orthopedics, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
17
Bhatt FR, Orosz LD, Tewari A, Boyd D, Roy R, Good CR, Schuler TC, Haines CM, Jazini E. Augmented Reality-Assisted Spine Surgery: An Early Experience Demonstrating Safety and Accuracy with 218 Screws. Global Spine J 2023; 13:2047-2052. [PMID: 35000409] [PMCID: PMC10556900] [DOI: 10.1177/21925682211069321]
Abstract
STUDY DESIGN Prospective cohort study.
OBJECTIVES In spine surgery, accurate screw guidance is critical to achieving satisfactory fixation. Augmented reality (AR) is a novel technology to assist in screw placement and has shown promising results in early studies. This study aims to provide our early experience evaluating the safety and efficacy of a Food and Drug Administration-approved head-mounted device augmented reality (HMD-AR) system.
METHODS Consecutive adult patients undergoing AR-assisted thoracolumbar fusion between October 2020 and August 2021 with 2-week follow-up were included. Preoperative, intraoperative, and postoperative data were collected, including demographics, complications, revision surgeries, and AR performance. Intraoperative 3D imaging was used to assess screw accuracy using the Gertzbein-Robbins (G-R) grading scale.
RESULTS Thirty-two patients (40.6% male) were included, with a total of 222 screws executed using HMD-AR. Intraoperatively, 4 (1.8%) were deemed misplaced and revised using AR or freehand. The remaining 218 (98.2%) screws were placed accurately. There were no intraoperative adverse events or complications, and AR was not abandoned in any case. Of the 208 AR-placed screws with 3D imaging confirmation, 97.1% were considered clinically accurate (91.8% Grade A, 5.3% Grade B). There were no early postoperative surgical complications or revision surgeries during the 2-week follow-up.
CONCLUSIONS This early experience study reports an overall G-R accuracy of 97.1% across 218 AR-guided screws with no intraoperative or early postoperative complications. This shows that HMD-AR-assisted spine surgery is a safe and accurate tool for pedicle, cortical, and pelvic fixation. Larger studies are needed to continue to support this compelling evolution in spine surgery.
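The Gertzbein-Robbins scale referenced above grades each screw by the magnitude of pedicle-wall breach, and Grades A and B are commonly counted as clinically accurate. A sketch of the grading and the accuracy aggregation; the 2 mm breach bands below follow the commonly cited convention, though boundary handling varies between reports, and the breach values are invented:

```python
def gertzbein_robbins_grade(breach_mm: float) -> str:
    """Map a pedicle-wall breach distance (mm) to a Gertzbein-Robbins grade.
    A = fully intrapedicular; B/C/D = breach < 2 / < 4 / < 6 mm; E = >= 6 mm."""
    if breach_mm <= 0:
        return "A"
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"

# Hypothetical breach measurements for five screws:
breaches = [0.0, 0.0, 1.2, 2.5, 0.0]
grades = [gertzbein_robbins_grade(b) for b in breaches]      # A, A, B, C, A
accuracy = sum(g in ("A", "B") for g in grades) / len(grades)  # 0.8
```

The study's reported 97.1% accuracy corresponds to exactly this Grade A + Grade B fraction over the imaged screws.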
Affiliation(s)
- Anant Tewari
- National Spine Health Foundation, Reston, VA, USA
- David Boyd
- Reston Radiology Consultants, Reston, VA, USA
- Rita Roy
- National Spine Health Foundation, Reston, VA, USA
18
Ackermann J, Hoch A, Snedeker JG, Zingg PO, Esfandiari H, Fürnstahl P. Automatic 3D Postoperative Evaluation of Complex Orthopaedic Interventions. J Imaging 2023; 9:180. [PMID: 37754944] [PMCID: PMC10532700] [DOI: 10.3390/jimaging9090180]
Abstract
In clinical practice, image-based postoperative evaluation is still performed without state-of-the-art computer methods, as these are not sufficiently automated. In this study, we propose a fully automatic 3D postoperative outcome quantification method for the relevant steps of orthopaedic interventions, using Periacetabular Osteotomy of Ganz (PAO) as an example. A typical orthopaedic intervention involves cutting bone, anatomy manipulation and repositioning, as well as implant placement. Our method includes a segmentation-based deep learning approach for detection and quantification of the cuts. Furthermore, anatomy repositioning was quantified through a multi-step registration method, which entailed a coarse alignment of the pre- and postoperative CT images followed by a fine fragment alignment of the repositioned anatomy. Implant (i.e., screw) position was identified by 3D Hough transform for line detection combined with fast voxel traversal based on ray tracing. The feasibility of our approach was investigated on 27 interventions and compared against manually performed 3D outcome evaluations. The results show that our method can accurately assess the quality and accuracy of the surgery. Our evaluation of the fragment repositioning showed a cumulative error for the coarse and fine alignment of 2.1 mm. Our evaluation of screw placement accuracy resulted in a distance error of 1.32 mm for the screw head location and an angular deviation of 1.1° for the screw axis. As a next step, we will explore generalisation capabilities by applying the method to different interventions.
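The screw-axis angular deviation reported above is the angle between the planned and detected screw axes. The paper detects axes with a 3D Hough transform; as a simpler stand-in for straight voxel runs, the sketch below fits an axis to voxel coordinates by PCA and measures the angle between two axes (all coordinates are synthetic):

```python
import numpy as np

def fit_axis(points):
    """Fit a 3D line direction to point/voxel coordinates via PCA
    (first principal component of the centered coordinates)."""
    centered = points - points.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    v = Vt[0]
    return v / np.linalg.norm(v)

def angular_deviation_deg(axis_a, axis_b):
    """Angle between two undirected 3D axes, in degrees (sign-invariant)."""
    c = abs(np.dot(axis_a, axis_b)) / (np.linalg.norm(axis_a) * np.linalg.norm(axis_b))
    return float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))

# Synthetic voxels sampled along a planned axis and a slightly tilted achieved axis:
t = np.linspace(0.0, 40.0, 50)[:, None]
planned = t * np.array([0.0, 0.0, 1.0])
achieved_dir = np.array([0.0, np.sin(np.radians(1.1)), np.cos(np.radians(1.1))])
achieved = t * achieved_dir
dev = angular_deviation_deg(fit_axis(planned), fit_axis(achieved))  # ≈ 1.1°
```

Taking the absolute dot product makes the measure independent of which end of the screw the fitted direction points to, which is the sensible convention for an axis.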
Affiliation(s)
- Joëlle Ackermann
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Laboratory for Orthopaedic Biomechanics, ETH Zurich, 8093 Zurich, Switzerland
- Armando Hoch
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Jess Gerrit Snedeker
- Laboratory for Orthopaedic Biomechanics, ETH Zurich, 8093 Zurich, Switzerland
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Patrick Oliver Zingg
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Hooman Esfandiari
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
19
Pierzchajlo N, Stevenson TC, Huynh H, Nguyen J, Boatright S, Arya P, Chakravarti S, Mehrki Y, Brown NJ, Gendreau J, Lee SJ, Chen SG. Augmented Reality in Minimally Invasive Spinal Surgery: A Narrative Review of Available Technology. World Neurosurg 2023; 176:35-42. [PMID: 37059357] [DOI: 10.1016/j.wneu.2023.04.030]
Abstract
INTRODUCTION Spine surgery has undergone significant changes in approach and technique. With the adoption of intraoperative navigation, minimally invasive spinal surgery (MISS) has arguably become the gold standard. Augmented reality (AR) has now emerged as a front-runner in anatomical visualization and narrower operative corridors. In effect, AR is poised to revolutionize surgical training and operative outcomes. Our study examines the current literature on AR-assisted MISS, synthesizes findings, and creates a narrative highlighting the history and future of AR in spine surgery.
MATERIAL AND METHODS Relevant literature was gathered using the PubMed (Medline) database from 1975 to 2023. Pedicle screw placement models were the primary intervention in AR. These were compared to the outcomes of traditional MISS.
RESULTS We found that AR devices on the market show promising clinical outcomes in preoperative training and intraoperative use. Three prominent systems were XVision, HoloLens, and ImmersiveTouch. In the studies, surgeons, residents, and medical students had opportunities to operate AR systems, showcasing their educational potential across each phase of learning. Specifically, one facet described training with cadaver models to gauge accuracy in pedicle screw placement. AR-MISS exceeded free-hand methods without unique complications or contraindications.
CONCLUSIONS While still in its infancy, AR has already proven beneficial for educational training and intraoperative MISS applications. We believe that with continued research and advancement of this technology, AR is poised to become a dominant player within the fundamentals of surgical education and MISS operative technique.
Affiliation(s)
- Huey Huynh
- Mercer University, School of Medicine, Savannah, GA, USA
- Jimmy Nguyen
- Mercer University, School of Medicine, Savannah, GA, USA
- Priya Arya
- Mercer University, School of Medicine, Savannah, GA, USA
- Yusuf Mehrki
- Department of Neurosurgery, University of Florida, Jacksonville, FL, USA
- Nolan J Brown
- Department of Neurosurgery, University of California Irvine, Orange, CA, USA
- Julian Gendreau
- Department of Biomedical Engineering, Johns Hopkins Whiting School of Engineering, Baltimore, MD, USA
- Seung Jin Lee
- Department of Neurosurgery, Mayo Clinic, Jacksonville, FL, USA
- Selby G Chen
- Department of Neurosurgery, Mayo Clinic, Jacksonville, FL, USA
20
Li H, Zhang P, Wang G, Liu H, Yang X, Wang G, Sun Z. Real-Time Navigation with Guide Template for Pedicle Screw Placement Using an Augmented Reality Head-Mounted Device: A Proof-of-Concept Study. Indian J Orthop 2023;57:776-781. PMID: 37128571; PMCID: PMC10147887; DOI: 10.1007/s43465-023-00859-w.
Abstract
Objective: This study aims to explore real-time navigation with a guide template using an augmented reality head-mounted device (ARHMD) for pedicle screw placement. Methods: The spatial coordinate relationships between augmented reality images and real objects were established through a custom-made guide template, and registration and tracking were completed using an ARHMD. The feasibility and accuracy of this method were verified by pedicle screw placement in 2 lumbar models. The accuracy of pedicle screw placement was assessed according to the Gertzbein-Robbins grading scale, and navigation errors were estimated by measuring the deviations of the entry point and the trajectory angle. Results: A total of 20 pedicle K-wires were successfully placed into L1-L5 in the 2 lumbar models, with an average time of 11.5 min per model and 69 s per screw. The overall K-wire placement accuracy was 100% (20 screws). The navigation error was 2.77 ± 0.82 mm for the entry-point deviation and 3.03° ± 0.94° for the trajectory-angle deviation. Conclusions: The application of an ARHMD combined with a guide template for pedicle screw placement is a promising navigation approach.
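The two navigation-error metrics reported here (entry-point deviation in mm and trajectory-angle deviation in degrees) can be computed directly from planned and achieved screw geometry; a minimal sketch with hypothetical coordinates, assuming each trajectory is described by an entry point and a direction vector (not the authors' measurement pipeline):

```python
import math

def entry_point_deviation(planned_entry, actual_entry):
    """Euclidean distance (mm) between planned and achieved entry points."""
    return math.dist(planned_entry, actual_entry)

def trajectory_angle_deviation(planned_dir, actual_dir):
    """Angle (degrees) between planned and achieved trajectory direction vectors."""
    dot = sum(p * a for p, a in zip(planned_dir, actual_dir))
    norm = math.hypot(*planned_dir) * math.hypot(*actual_dir)
    # Clamp to guard against floating-point round-off outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical planned vs. achieved K-wire (coordinates in mm).
print(round(entry_point_deviation((0.0, 0.0, 0.0), (2.0, 1.0, 2.0)), 2))        # -> 3.0
print(round(trajectory_angle_deviation((0.0, 0.0, 1.0), (0.05, 0.0, 1.0)), 2))  # -> 2.86
```

Averaging these per-screw values over all placements yields the mean ± SD figures of the kind reported in the abstract.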
Affiliation(s)
- Haowei Li
- Tsinghua University School of Medicine, Beijing, 100091 China
- Peihai Zhang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing, 102218 China
- Guangzhi Wang
- Tsinghua University School of Medicine, Beijing, 100091 China
- Huiting Liu
- Peking Union Medical College Hospital, Beijing, 100730 China
- Xuejun Yang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing, 102218 China
- Guihuai Wang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing, 102218 China
- Zhenxing Sun
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing, 102218 China
21
Matinfar S, Salehi M, Suter D, Seibold M, Dehghani S, Navab N, Wanivenhaus F, Fürnstahl P, Farshad M, Navab N. Sonification as a reliable alternative to conventional visual surgical navigation. Sci Rep 2023;13:5930. PMID: 37045878; PMCID: PMC10097653; DOI: 10.1038/s41598-023-32778-z.
Abstract
Despite their undeniable accuracy advantages, image-guided surgical assistance systems have not yet fully met surgeons' needs or expectations regarding usability, time efficiency, and integration into the surgical workflow. On the other hand, perceptual studies have shown that presenting independent but causally correlated information via multimodal feedback involving different sensory modalities can improve task performance. This article investigates an alternative method for computer-assisted surgical navigation, introduces a novel sonification methodology for navigated pedicle screw placement, and discusses advanced solutions based on multisensory feedback. The proposed method is a sonification solution for alignment tasks in four degrees of freedom, based on frequency modulation synthesis. We compared the accuracy and execution time of the proposed sonification method with those of visual navigation, currently considered the state of the art. In a phantom study, 17 surgeons executed the pedicle screw placement task in the lumbar spine, guided by either the proposed sonification-based method or the traditional visual navigation method. The results demonstrated that the proposed method is as accurate as the state of the art while reducing the surgeon's need to focus on visual navigation displays, preserving the natural focus on surgical tools and targeted anatomy during task execution.
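Frequency modulation synthesis, on which the proposed sonification is based, modulates the phase of a carrier tone and can map a task parameter (here, an alignment error) onto the modulation depth; a toy illustration of that idea, not the authors' implementation, with made-up carrier/modulator settings:

```python
import math

def fm_tone(duration_s, sample_rate, carrier_hz, modulator_hz, mod_index):
    """FM-synthesized tone: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t))."""
    n = int(duration_s * sample_rate)
    return [
        math.sin(2 * math.pi * carrier_hz * (i / sample_rate)
                 + mod_index * math.sin(2 * math.pi * modulator_hz * (i / sample_rate)))
        for i in range(n)
    ]

def error_to_mod_index(error_mm, max_error_mm=10.0, max_index=8.0):
    """Map an alignment error to a modulation index: larger error -> rougher tone."""
    return max_index * min(error_mm, max_error_mm) / max_error_mm

# A 2 mm alignment error produces a mildly modulated 440 Hz tone.
samples = fm_tone(0.1, 8000, 440.0, 110.0, error_to_mod_index(2.0))
```

As the alignment error shrinks, the modulation index falls toward zero and the tone approaches a pure carrier, giving the listener a continuous auditory cue without looking at a display.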
Affiliation(s)
- Sasan Matinfar
- Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
- Nuklearmedizin rechts der Isar, Technical University of Munich, 81675, Munich, Germany
- Mehrdad Salehi
- Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
- Daniel Suter
- Department of Orthopaedics, Balgrist University Hospital, 8008, Zurich, Switzerland
- Matthias Seibold
- Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
- Research in Orthopedic Computer Science (ROCS), Balgrist University Hospital, University of Zurich, Balgrist Campus, 8008, Zurich, Switzerland
- Shervin Dehghani
- Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
- Nuklearmedizin rechts der Isar, Technical University of Munich, 81675, Munich, Germany
- Navid Navab
- Topological Media Lab, Concordia University, Montreal, H3G 2W1, Canada
- Florian Wanivenhaus
- Department of Orthopaedics, Balgrist University Hospital, 8008, Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science (ROCS), Balgrist University Hospital, University of Zurich, Balgrist Campus, 8008, Zurich, Switzerland
- Mazda Farshad
- Department of Orthopaedics, Balgrist University Hospital, 8008, Zurich, Switzerland
- Nassir Navab
- Computer Aided Medical Procedures (CAMP), Technical University of Munich, 85748, Munich, Germany
22
Medress ZA, Bobrow A, Tigchelaar SS, Henderson T, Parker JJ, Desai A. Augmented Reality-Assisted Resection of a Large Presacral Ganglioneuroma: 2-Dimensional Operative Video. Oper Neurosurg (Hagerstown) 2023;24:e284-e285. PMID: 36701554; DOI: 10.1227/ons.0000000000000542.
Affiliation(s)
- Zachary A Medress
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Seth S Tigchelaar
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jonathon J Parker
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Atman Desai
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
23
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023;85:102757. PMID: 36706637; DOI: 10.1016/j.media.2023.102757.
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main driver of the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, for whom AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, paving the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
24
How different augmented reality visualizations for drilling affect trajectory deviation, visual attention, and user experience. Int J Comput Assist Radiol Surg 2023. PMID: 36808552; PMCID: PMC10363038; DOI: 10.1007/s11548-022-02819-5.
Abstract
PURPOSE Previous work has demonstrated the high accuracy of augmented reality (AR) head-mounted displays for pedicle screw placement in spinal fusion surgery. An important question that remains unanswered is how pedicle screw trajectories should be visualized in AR to best assist the surgeon. METHODOLOGY We compared five AR visualizations displaying the drill trajectory via Microsoft HoloLens 2 with different configurations of abstraction level (abstract or anatomical), position (overlay or small offset), and dimensionality (2D or 3D) against standard navigation on an external screen. We tested these visualizations in a study with 4 expert surgeons and 10 novices (residents in orthopedic surgery) on lumbar spine models covered by Plasticine. We assessed trajectory deviations ([Formula: see text]) from the preoperative plan, dwell times (%) on areas of interest, and the user experience. RESULTS Two AR visualizations resulted in significantly lower trajectory deviations (mixed-effects ANOVA, p<0.0001 and p<0.05) compared to standard navigation, whereas no significant differences were found between participant groups. The best ratings for ease of use and cognitive load were obtained with an abstract visualization displayed peripherally around the entry point and with a 3D anatomical visualization displayed with some offset. For visualizations displayed with some offset, participants spent on average only 20% of their time examining the entry point area. CONCLUSION Our results show that real-time feedback provided by navigation can level task performance between experts and novices, and that the design of a visualization has a significant impact on task performance, visual attention, and user experience. Both abstract and anatomical visualizations can be suitable for navigation when not directly occluding the execution area. Our results shed light on how AR visualizations guide visual attention and the benefits of anchoring information in the peripheral field around the entry point.
25
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023;68. PMID: 36580681; DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization is divided into two categories, in situ visualization and non-in situ visualization, with varied rendering content. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
26
Seibold M, Spirig JM, Esfandiari H, Farshad M, Fürnstahl P. Translation of Medical AR Research into Clinical Practice. J Imaging 2023;9:44. PMID: 36826963; PMCID: PMC9961816; DOI: 10.3390/jimaging9020044.
Abstract
Translational research aims to turn discoveries from basic science into results that advance patient treatment. The translation of technical solutions into clinical use is a complex, iterative process that involves different stages of design, development, and validation, such as the identification of unmet clinical needs, technical conception, development, verification and validation, regulatory matters, and ethics. For this reason, many promising technical developments at the interface of technology, informatics, and medicine remain research prototypes without finding their way into clinical practice. Augmented reality is a technology that is now making its breakthrough into patient care, even though it has been available for decades. In this work, we explain the translational process for medical AR devices and present associated challenges and opportunities. To the best of the authors' knowledge, this concept paper is the first to present a guideline for the translation of medical AR research into clinical practice.
Affiliation(s)
- Matthias Seibold
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Computer Aided Medical Procedures and Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- José Miguel Spirig
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Hooman Esfandiari
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
27
Eagleson R, Joskowicz L. Verification, Evaluation, and Validation: Which, How & Why, in Medical Augmented Reality System Design. J Imaging 2023;9:20. PMID: 36826939; PMCID: PMC9965271; DOI: 10.3390/jimaging9020020.
Abstract
This paper presents a discussion of the fundamental principles of analysis of augmented and virtual reality (AR/VR) systems for medical imaging and computer-assisted interventions. The three key concepts of analysis (verification, evaluation, and validation) are introduced, defined, and illustrated with examples of working AR/VR systems. The concepts of system specifications, measurement accuracy, uncertainty, and observer variability are defined and related to the analysis principles.
Affiliation(s)
- Roy Eagleson
- AI and Software Engineering Program, The University of Western Ontario, London, ON N6A 5B9, Canada
- Leo Joskowicz
- School of Computer Science and Engineering, Edmond J. Safra Campus, The Hebrew University of Jerusalem, Givat Ram, Jerusalem 9190401, Israel
28
Fan X, Zhu Q, Tu P, Joskowicz L, Chen X. A review of advances in image-guided orthopedic surgery. Phys Med Biol 2023;68. PMID: 36595258; DOI: 10.1088/1361-6560/acaae9.
Abstract
Orthopedic surgery remains technically demanding due to complex anatomical structures and cumbersome surgical procedures. The introduction of image-guided orthopedic surgery (IGOS) has significantly decreased surgical risk and improved operative results. This review focuses on the application of recent advances in artificial intelligence (AI), deep learning (DL), augmented reality (AR) and robotics to image-guided spine surgery, joint arthroplasty, fracture reduction and bone tumor resection. For the preoperative stage, key technologies of AI- and DL-based medical image segmentation, 3D visualization and surgical planning procedures are systematically reviewed. For the intraoperative stage, the development of novel image registration, surgical tool calibration and real-time navigation is reviewed. Furthermore, the combination of surgical navigation systems with AR and robotic technology is also discussed. Finally, the current issues and prospects of IGOS systems are discussed, with the goal of establishing a reference and providing guidance for the surgeons, engineers, and researchers involved in the research and development of this area.
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiyang Zhu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
29
Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023;61:19-27. PMID: 36513525; DOI: 10.1016/j.bjoms.2022.08.007.
Abstract
Augmented-reality (AR) head-mounted devices (HMD) allow the wearer to have digital images superposed on to their field of vision. They are being used to superpose annotations on to the surgical field akin to a navigation system. This review examines published validation studies on HMD-AR systems, their reported protocols, and outcomes. The aim was to establish commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles between January 2015 and January 2021. Studies that examined the registration of AR content using a HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration, were recorded. A meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using HoloLens (Microsoft) (n = 22) and nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), and four reported registration errors using surface markers (mean (SD) 3.8 (3.7) mm). Three studies reported registration errors using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies in this review used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered as a minimum acceptable standard. It should be taken into consideration when procedural applications are selected.
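A meta-analysis of mean registration errors of the kind described above requires combining per-study means, SDs, and sample sizes into a pooled estimate; a minimal sketch with invented numbers, assuming a simple merged-sample pooling rather than the authors' exact model:

```python
import math

def pooled_mean_sd(studies):
    """Combine per-study (n, mean, sd) tuples into an overall mean and SD.

    Uses the standard merged-sample identity: the total sum of squares is the
    within-study sum of squares plus the between-study sum of squares.
    """
    n_total = sum(n for n, _, _ in studies)
    mean = sum(n * m for n, m, _ in studies) / n_total
    ss = sum((n - 1) * sd**2 + n * (m - mean) ** 2 for n, m, sd in studies)
    return mean, math.sqrt(ss / (n_total - 1))

# Invented example: three studies reporting registration error in mm.
studies = [(10, 2.6, 1.8), (8, 3.8, 3.7), (5, 2.2, 1.3)]
mean, sd = pooled_mean_sd(studies)
```

The pooled mean then serves as the kind of "minimum acceptable standard" benchmark the review proposes; weighting by sample size prevents small studies from dominating the estimate.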
Affiliation(s)
- Soudeh Chegini
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Eddie Edwards
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Mark McGurk
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Matthew Clarkson
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Clare Schilling
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
30
Kriechling P, Loucas R, Loucas M, Casari F, Fürnstahl P, Wieser K. Augmented reality through head-mounted display for navigation of baseplate component placement in reverse total shoulder arthroplasty: a cadaveric study. Arch Orthop Trauma Surg 2023;143:169-175. PMID: 34213578; PMCID: PMC9886637; DOI: 10.1007/s00402-021-04025-5.
Abstract
BACKGROUND: To achieve an optimal clinical outcome in reverse total shoulder arthroplasty (RSA), accurate placement of the components is essential. The recently introduced navigation technology of augmented reality (AR) through head-mounted displays (HMD) offers a promising new approach to visualizing the anatomy and navigating component positioning in various orthopedic surgeries. We hypothesized that AR through an HMD is feasible, reliable, and accurate for guidewire placement in RSA baseplate positioning. METHODS: Twelve human cadaver shoulders were scanned with computed tomography (CT) and RSA baseplate positioning was planned in 3D using dedicated software. The shoulders were prepared through a deltopectoral approach and an augmented reality hologram was superimposed using the Microsoft HoloLens HMD. The central guidewire was then navigated through the HMD to achieve the planned entry point and trajectory. Postoperatively, the shoulders were CT-scanned a second time and the deviation from the plan was calculated. RESULTS: The mean deviation of the entry point was 3.5 mm ± 1.7 mm (95% CI 2.4 mm; 4.6 mm). The mean deviation from the planned trajectory was 3.8° ± 1.7° (95% CI 2.6°; 4.9°). CONCLUSION: Augmented reality appears feasible and reliable for baseplate guidewire positioning in reverse total shoulder arthroplasty, with accurate placement values.
Affiliation(s)
- Philipp Kriechling
- Department of Orthopaedics, Balgrist University Hospital, Zurich, Switzerland
- Rafael Loucas
- Department of Orthopaedics, Balgrist University Hospital, Zurich, Switzerland
- Marios Loucas
- Department of Orthopaedics, Balgrist University Hospital, Zurich, Switzerland
- Fabio Casari
- Department of Orthopaedics, Balgrist University Hospital, Zurich, Switzerland
- Philipp Fürnstahl
- Computer Assisted Research and Development Group, Balgrist University Hospital, Zurich, Switzerland
- Karl Wieser
- Department of Orthopaedics, Balgrist University Hospital, Zurich, Switzerland
31
Schütz L, Weber E, Niu W, Daniel B, McNab J, Navab N, Leuze C. Audiovisual augmentation for coil positioning in transcranial magnetic stimulation. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization 2022. DOI: 10.1080/21681163.2022.2154277.
Affiliation(s)
- Laura Schütz
- Wu Tsai Visualization Lab, Stanford University, Stanford, California, USA
- Chair for Computer Aided Medical Procedures and Augmented Reality, Department of Informatics, Technical University of Munich, Munich, Germany
- Emmanuelle Weber
- Wu Tsai Visualization Lab, Stanford University, Stanford, California, USA
- McNab Lab, Department of Radiology, Stanford University, Stanford, CA, USA
- Wally Niu
- Wu Tsai Visualization Lab, Stanford University, Stanford, California, USA
- Incubator for Medical Mixed and Extended Reality at Stanford, Department of Radiology, Stanford University, Stanford, CA, USA
- Bruce Daniel
- Incubator for Medical Mixed and Extended Reality at Stanford, Department of Radiology, Stanford University, Stanford, CA, USA
- Jennifer McNab
- Wu Tsai Visualization Lab, Stanford University, Stanford, California, USA
- McNab Lab, Department of Radiology, Stanford University, Stanford, CA, USA
- Nassir Navab
- Chair for Computer Aided Medical Procedures and Augmented Reality, Department of Informatics, Technical University of Munich, Munich, Germany
- Christoph Leuze
- Wu Tsai Visualization Lab, Stanford University, Stanford, California, USA
- Incubator for Medical Mixed and Extended Reality at Stanford, Department of Radiology, Stanford University, Stanford, CA, USA
32
Rong K, Wu X, Xia Q, Chen J, Fei T, Li X, Jiang W. A Systematic Study to Compare the Precise Implantation of Hololens 2 Assisted with Acetabular Prosthesis for Total Hip Replacement. J Biomater Tiss Eng 2022. DOI: 10.1166/jbt.2022.3212.
Abstract
This study aims to evaluate the accuracy of HoloLens 2-assisted acetabular prosthesis implantation in total hip replacement. A total of 80 orthopaedic doctors from our hospital were enrolled and divided into four groups according to their experience treating orthopaedic patients and the use of HoloLens 2 assistance: experienced with HoloLens 2, experienced without HoloLens 2, inexperienced with HoloLens 2, and inexperienced without HoloLens 2. The abduction angle, the anteversion angle, and the deviations in the abduction and anteversion angles were recorded for each group and used to evaluate the accuracy of implantation. All data were collected and analyzed. The outcomes differed significantly between the experienced groups with and without HoloLens 2, and between the inexperienced groups with and without HoloLens 2; no other pairwise comparison showed a significant difference. HoloLens 2 assistance can improve the accuracy of acetabular prosthesis implantation in total hip replacement.
Affiliation(s)
- Ke Rong, Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow 215006, China
- Xuhua Wu, Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai 201199, China
- Qingquan Xia, Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai 201199, China
- Jie Chen, Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow 215006, China
- Teng Fei, Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai 201199, China
- Xujun Li, Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai 201199, China
- Weimin Jiang, Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow 215006, China

33
Tigchelaar SS, Medress ZA, Quon J, Dang P, Barbery D, Bobrow A, Kin C, Louis R, Desai A. Augmented Reality Neuronavigation for En Bloc Resection of Spinal Column Lesions. World Neurosurg 2022; 167:102-110. [PMID: 36096393] [DOI: 10.1016/j.wneu.2022.08.143]
Abstract
BACKGROUND Primary tumors involving the spine are relatively rare but represent surgically challenging procedures with high patient morbidity. En bloc resection of these tumors necessitates large exposures and wide tumor margins, and poses risks to functionally relevant anatomical structures. Augmented reality neuronavigation (ARNV) represents a paradigm shift in neuronavigation, allowing on-demand visualization of 3D navigation data in real time directly in line with the operative field. METHODS Here, we describe the first application of ARNV to perform distal sacrococcygectomies for the en bloc removal of sacral and retrorectal lesions involving the coccyx in 2 patients, as well as a thoracic 9-11 laminectomy with costotransversectomy for en bloc removal of a schwannoma in a third patient. RESULTS In our experience, ARNV allowed our teams to minimize the length of the incision, reduce the extent of bony resection, and enhance visualization of critical adjacent anatomy. All tumors were resected en bloc, and the patients recovered well postoperatively, with no known complications. Pathologic analysis confirmed the en bloc removal of these lesions with negative margins. CONCLUSIONS We conclude that ARNV is an effective strategy for the precise, en bloc removal of spinal lesions, including both sacrococcygeal tumors involving the retrorectal space and thoracic schwannomas.
Affiliation(s)
- Seth S Tigchelaar, Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Zachary A Medress, Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jennifer Quon, Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Phuong Dang, Surgical Theater, Inc., Cleveland, Ohio, USA
- Cindy Kin, Department of Surgery, Stanford University Medical Center, Stanford, California, USA
- Robert Louis, The Brain and Spine Center, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA; Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA
- Atman Desai, Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA

34
Zhang D, Aoude A, Driscoll M. Development and model form assessment of an automatic subject-specific vertebra reconstruction method. Comput Biol Med 2022; 150:106158. [PMID: 37859278] [DOI: 10.1016/j.compbiomed.2022.106158]
Abstract
BACKGROUND Current spine models for analog bench models, surgical navigation, and training platforms are conventionally based on 3D models from anatomical human body polygon databases or on time-consuming manually labelled data. This work proposes a workflow for quick and accurate subject-specific vertebra reconstruction and quantifies the reconstructed model accuracy and model form errors. METHODS Four different neural networks were customized for vertebra segmentation. To validate the workflow in clinical applications, an excised human lumbar vertebra was scanned via CT and reconstructed into 3D CAD models using the four refined networks. A reverse engineering solution was proposed to obtain the high-precision geometry of the excised vertebra as the gold standard. 3D model evaluation metrics and a finite element analysis (FEA) method were designed to reflect the model accuracy and model form errors. RESULTS The automatic segmentation networks achieved a best Dice score of 94.20% on the validation datasets. The accuracy of the reconstructed models was quantified with a best 3D Dice index of 92.80%, 3D IoU of 86.56%, and Hausdorff distance of 1.60 mm; heatmaps and histograms were used for error visualization. The FEA results showed the impact of the different geometries and reflected partial surface accuracy of the reconstructed vertebra under biomechanical loads, with a closest percentage error of 4.2710% compared to the gold standard model. CONCLUSIONS In this work, an automatic subject-specific vertebra reconstruction workflow was proposed and the errors in geometry and FEA were quantified. Such errors should be considered when leveraging subject-specific modelling towards the development and improvement of treatments.
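The reconstruction metrics quoted above (3D Dice index, 3D IoU) are simple set-overlap ratios between predicted and ground-truth segmentations. A minimal sketch, assuming NumPy binary masks rather than the paper's actual pipeline:

```python
import numpy as np

def dice_iou(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """Dice coefficient and intersection-over-union for binary masks
    (works identically for 2D segmentations or 3D volumes)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / np.logical_or(pred, gt).sum()
    return float(dice), float(iou)

# Toy 1D example: 2 overlapping voxels out of 3 predicted / 3 true
d, i = dice_iou(np.array([1, 1, 1, 0, 0]), np.array([0, 1, 1, 1, 0]))
# d = 2*2/(3+3) ≈ 0.667, i = 2/4 = 0.5
```

Dice rewards overlap relative to the two masks' sizes, while IoU normalizes by their union; for the same prediction, Dice is always at least as large as IoU, which matches the 92.80% vs 86.56% figures above.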
Collapse
Affiliation(s)
- Dingzhong Zhang
- Musculoskeletal Biomechanics Research Lab, Department of Mechanical Engineering, McGill University, 845 Sherbrooke St. W, Montréal, Quebec, H3A 0G4, Canada.
| | - Ahmed Aoude
- Orthopaedic Research Laboratory, Research Institute of McGill University Health Centre, Montreal General Hospital, 1650 Cedar Avenue, Montréal, Québec, H3G 1A4, Canada.
| | - Mark Driscoll
- Musculoskeletal Biomechanics Research Lab, Department of Mechanical Engineering, McGill University, 845 Sherbrooke St. W, Montréal, Quebec, H3A 0G4, Canada; Orthopaedic Research Laboratory, Research Institute of McGill University Health Centre, Montreal General Hospital, 1650 Cedar Avenue, Montréal, Québec, H3G 1A4, Canada.
| |
Collapse
|
35
|
Jecklin S, Jancik C, Farshad M, Fürnstahl P, Esfandiari H. X23D-Intraoperative 3D Lumbar Spine Shape Reconstruction Based on Sparse Multi-View X-ray Data. J Imaging 2022; 8:271. [PMID: 36286365] [PMCID: PMC9604813] [DOI: 10.3390/jimaging8100271]
Abstract
Visual assessment based on intraoperative 2D X-rays remains the predominant aid for intraoperative decision-making, surgical guidance, and error prevention. However, correctly assessing the 3D shape of complex anatomies, such as the spine, based on planar fluoroscopic images remains a challenge even for experienced surgeons. This work proposes a novel deep learning-based method to intraoperatively estimate the 3D shape of patients' lumbar vertebrae directly from sparse, multi-view X-ray data. High-quality and accurate 3D reconstructions were achieved with a learned multi-view stereo machine approach capable of incorporating the X-ray calibration parameters into the neural network. This strategy allowed a priori knowledge of the spinal shape to be acquired while preserving patient specificity and achieving higher accuracy than the state of the art. Our method was trained and evaluated on 17,420 fluoroscopy images that were digitally reconstructed from the public CTSpine1K dataset. Evaluated on unseen data, we achieved an 88% average F1 score and a 71% surface score. Furthermore, by utilizing the calibration parameters of the input X-rays, our method outperformed a state-of-the-art counterpart by 22% in terms of surface score. This increase in accuracy opens new possibilities for surgical navigation and intraoperative decision-making based solely on intraoperative data, especially in surgical applications where the acquisition of 3D image data is not part of the standard clinical workflow.
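Shape-reconstruction F1 scores of the kind reported above are commonly computed from point-wise precision and recall at a distance threshold. The sketch below illustrates that general idea only; the point sets, the threshold `tau`, and the brute-force distance computation are assumptions, not the authors' exact evaluation protocol:

```python
import numpy as np

def f1_at_threshold(pred_pts: np.ndarray, gt_pts: np.ndarray, tau: float) -> float:
    """Precision = fraction of predicted points within tau of the ground
    truth; recall = fraction of ground-truth points within tau of the
    prediction; F1 = their harmonic mean."""
    def min_dists(a, b):
        # All pairwise distances; fine for small point sets.
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return d.min(axis=1)

    precision = (min_dists(pred_pts, gt_pts) <= tau).mean()
    recall = (min_dists(gt_pts, pred_pts) <= tau).mean()
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

Because F1 is a harmonic mean, a method must both cover the true surface and avoid spurious geometry to score well, which is why it is a stricter summary than either precision or recall alone.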
Affiliation(s)
- Sascha Jecklin, Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Carla Jancik, Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Mazda Farshad, Department of Orthopedics, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Philipp Fürnstahl, Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
- Hooman Esfandiari, Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland

36
Lin W, Xie F, Zhao S, Lin S, He C, Wang Z. Novel Pedicle Navigator Based on Micro Inertial Navigation System (MINS) and Bioelectric Impedance Analysis (BIA) to Facilitate Pedicle Screw Placement in Spine Surgery: Study in a Porcine Model. Spine (Phila Pa 1976) 2022; 47:1172-1178. [PMID: 35238856] [PMCID: PMC9348817] [DOI: 10.1097/brs.0000000000004348]
Abstract
STUDY DESIGN A porcine model. OBJECTIVE This study aimed to design a novel pedicle navigator based on a micro-inertial navigation system (MINS) and bioelectrical impedance analysis (BIA) to assist pedicle screw placement, and to validate the system's utility in enhancing placement accuracy. SUMMARY OF BACKGROUND DATA The incidence of pedicle screw malpositioning in complicated spinal surgery remains high. Procedures such as computed tomography image-guided navigation and robot-assisted surgery have been used to improve the precision of pedicle screw placement, but an unmet clinical need remains. METHODS The miniaturized integrated framework containing the MINS was mounted inside the hollow handle of the pedicle finder. The inner core was complemented by a high-intensity electrode for measuring bioelectric impedance. Twelve healthy male Wuzhishan minipigs of similar age and weight were randomized to the MINS-BIA or freehand (FH) group. Pedicle screw placement was graded according to the modified Gertzbein-Robbins system on computed tomography images. An impedance detected by the probe equal to the baseline value for soft tissue was defined as cortical bone perforation. RESULTS A total of 216 screws were placed in the 12 minipigs. There were 15 pedicle breaches in the navigator group and 31 in the FH group; the detection rates of these breaches were 14 of 15 (93.3%) and 25 of 31 (80.6%), respectively, a statistically significant difference. The mean offsets between the planned and postoperatively measured tilt angles of the screw trajectory were 4.5° ± 5.5° in the axial plane and 4.8° ± 3.3° in the sagittal plane with the navigator system, versus 7.0° ± 5.1° and 7.7° ± 4.7°, respectively, with the FH technique; the differences were statistically significant.
CONCLUSION A novel and portable navigator based on MINS and BIA could help improve or maintain accuracy while reducing overall radiation exposure.
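The breach criterion above (probe impedance dropping to the soft-tissue baseline, signalling cortical perforation) can be illustrated with a minimal sketch; the relative tolerance `tol` and the sample impedance values are illustrative assumptions, not values from the study:

```python
def detect_breach(impedances, soft_tissue_baseline, tol=0.05):
    """Flag samples whose impedance lies within a relative tolerance of
    the soft-tissue baseline: cortical bone has a much higher impedance,
    so a reading at the baseline suggests the tip has perforated bone
    and re-entered soft tissue."""
    return [abs(z - soft_tissue_baseline) / soft_tissue_baseline <= tol
            for z in impedances]

# Toy trace: in-bone reading (high impedance), then two near-baseline
# readings that would be flagged as a perforation.
flags = detect_breach([900.0, 400.0, 410.0], soft_tissue_baseline=400.0)
# flags == [False, True, True]
```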
Affiliation(s)
- Wentao Lin, Department of Spine Surgery, Shunde Hospital, Southern Medical University (The First People's Hospital of Shunde Foshan), Foshan, Guangdong, China
- Faqin Xie, Department of Spine Surgery, Shunde Hospital, Southern Medical University (The First People's Hospital of Shunde Foshan), Foshan, Guangdong, China
- Shuofeng Zhao, School of Ophthalmology and Optometry, School of Biomedical Engineering, Wenzhou Medical University, Wenzhou, Zhejiang, China
- Songhui Lin, Department of Spine Surgery, Shunde Hospital, Southern Medical University (The First People's Hospital of Shunde Foshan), Foshan, Guangdong, China
- Chaoqin He, Department of Spine Surgery, Shunde Hospital, Southern Medical University (The First People's Hospital of Shunde Foshan), Foshan, Guangdong, China
- Zhiyun Wang, Department of Spine Surgery, Shunde Hospital, Southern Medical University (The First People's Hospital of Shunde Foshan), Foshan, Guangdong, China

37
Naik RR, Hoblidar A, Bhat SN, Ampar N, Kundangar R. A Hybrid 3D-2D Image Registration Framework for Pedicle Screw Trajectory Registration between Intraoperative X-ray Image and Preoperative CT Image. J Imaging 2022; 8:185. [PMID: 35877629] [PMCID: PMC9324544] [DOI: 10.3390/jimaging8070185]
Abstract
Pedicle screw insertion is considered a complex procedure among orthopaedic surgeons. To prevent postoperative complications associated with pedicle screw insertion, various image intensity registration-based navigation systems have been developed. These systems are computation-intensive, have a small capture range, and suffer from local maxima issues. Deep learning-based techniques, on the other hand, lack registration generalizability and are data-dependent. To overcome these limitations, a patient-specific hybrid 3D-2D registration framework was designed to map a pedicle screw trajectory between the intraoperative X-ray image and the preoperative CT image. An anatomical landmark-based 3D-2D Iterative Control Point (ICP) registration was performed to register a pedicular marker pose between the X-ray images and axial preoperative CT images. The registration framework was clinically validated by generating projection images possessing an optimal match with intraoperative X-ray images at the corresponding control point registration. The effectiveness of the registered trajectory was evaluated in terms of displacement and directional errors after reprojecting its position onto the 2D radiographic planes. The mean Euclidean distances of the head and tail ends of the reprojected trajectory from the actual trajectory in the AP and lateral planes were 0.6–0.8 mm and 0.5–1.6 mm, respectively, and the corresponding mean directional errors were 4.9° and 2°. The mean trajectory length difference between the actual and registered trajectories was 2.67 mm. The approximate intraoperative time required to axially map the marker position for a single vertebra was 3 min. Utilizing markerless registration techniques, the designed framework functions like a screw navigation tool and assures the quality of the surgery being performed by limiting the need for postoperative CT.
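At the core of landmark-based registration frameworks like the one above is a least-squares rigid fit of paired points, classically solved in closed form via SVD (the Kabsch solution). The sketch below shows that 3D-3D core step only; it is not the authors' 3D-2D implementation:

```python
import numpy as np

def rigid_fit(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
    for paired landmark sets of shape (n, 3), via the Kabsch/SVD method."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t
```

In an iterative scheme, this closed-form fit is alternated with re-estimating point correspondences until the alignment converges.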
Affiliation(s)
- Roshan Ramakrishna Naik, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal 576104, India
- Anitha Hoblidar, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal 576104, India (corresponding author)
- Shyamasunder N. Bhat, Kasturba Medical College, Manipal Academy of Higher Education, Manipal 576104, India (corresponding author)
- Nishanth Ampar, Kasturba Medical College, Manipal Academy of Higher Education, Manipal 576104, India
- Raghuraj Kundangar, Kasturba Medical College, Manipal Academy of Higher Education, Manipal 576104, India

38
von Haxthausen F, Moreta-Martinez R, Pose Díez de la Lastra A, Pascau J, Ernst F. UltrARsound: in situ visualization of live ultrasound images using HoloLens 2. Int J Comput Assist Radiol Surg 2022; 17:2081-2091. [PMID: 35776399] [PMCID: PMC9515035] [DOI: 10.1007/s11548-022-02695-z]
Abstract
Purpose Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images in the 3D anatomy of the patient. This work describes and evaluates a novel approach to track retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy. Methods The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy, frequency and latency of displayed images. Results Tracking is performed with a median accuracy of 1.98 mm/1.81° for the static setting when using the Kalman filter. In a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms. Conclusions In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. For tracking, no additional hardware nor modifications to HoloLens 2 are required, making it a cheap and easy-to-use approach. Moreover, a minimal latency of displayed images enables real-time perception for the sonographer.
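The Kalman filtering used above to smooth the depth camera's noisy marker measurements can be illustrated with a scalar constant-position filter; the process and measurement noise parameters `q` and `r` below are illustrative assumptions, not the values used in UltrARsound:

```python
def kalman_1d(measurements, q=1e-3, r=0.05, x0=0.0, p0=1.0):
    """Scalar constant-position Kalman filter.

    q: process noise variance (how much the true position may drift),
    r: measurement noise variance (sensor noise),
    x0, p0: initial state estimate and its variance.
    Returns the filtered estimate after each measurement."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                     # predict: uncertainty grows
        k = p / (p + r)            # Kalman gain
        x += k * (z - x)           # update: blend prediction and measurement
        p *= (1 - k)               # uncertainty shrinks after the update
        estimates.append(x)
    return estimates
```

With small `q` relative to `r`, the gain settles at a low value and the filter averages heavily over past samples (smooth but laggy); raising `q` makes it trust new measurements more, which is the usual static-vs-dynamic tracking trade-off.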
Affiliation(s)
- Felix von Haxthausen, Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain; Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Schleswig-Holstein, Germany
- Rafael Moreta-Martinez, Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain
- Alicia Pose Díez de la Lastra, Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain
- Javier Pascau, Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain
- Floris Ernst, Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Schleswig-Holstein, Germany

39
Harel R, Anekstein Y, Raichel M, Molina CA, Ruiz-Cardozo MA, Orrú E, Khan M, Mirovsky Y, Smorgick Y. The XVS System During Open Spinal Fixation Procedures in Patients Requiring Pedicle Screw Placement in the Lumbosacral Spine. World Neurosurg 2022; 164:e1226-e1232. [PMID: 35671991] [DOI: 10.1016/j.wneu.2022.05.134]
Abstract
OBJECTIVE This pilot study was undertaken to evaluate the safety, performance, and usability of the Xvision-Spine (XVS) System (Augmedics, Arlington Heights, IL) during open spinal fixation procedures in patients requiring pedicle screw placement in the lumbosacral spine. METHODS The XVS System is an augmented reality head-mounted display (HMD)-based computer navigation system designed to assist surgeons in accurately placing pedicle screws. It uses an HMD-mounted tracking camera for optical tracking and provides the surgeon a translucent, direct near-eye display of the navigated surgical instrument's location relative to the computed tomographic image. We report the preliminary results of a prospective series of all consecutive patients who underwent augmented reality-assisted pedicle screw placement in the lumbosacral vertebrae at 3 institutions. Clinical accuracy for each pedicle screw was graded with Gertzbein-Robbins scores by 2 independent and blinded neuroradiologists. RESULTS The 19 study participants included 8 men and 11 women with mean ages of 59.13 ± 12.09 and 59.91 ± 12.89 years, respectively. Seventeen procedures were successfully completed with the XVS System; two procedures were not completed due to technical issues with the system's intraoperative scanner. A total of 86 screws were inserted. The accuracy of the XVS System was 97.7%. CONCLUSIONS The XVS System achieved an overall accuracy of 97.7% in the placement of pedicle screws in the lumbosacral vertebrae. These preliminary results are comparable to the accuracy of other manual computer-assisted navigation systems reported in the literature.
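Clinical accuracy figures like the 97.7% above are conventionally derived from Gertzbein-Robbins grades, with grades A and B (breach < 2 mm) counted as clinically acceptable. A minimal sketch under that common convention (whether this study used exactly this criterion is an assumption):

```python
def gertzbein_robbins_accuracy(grades: list[str]) -> float:
    """Clinical accuracy (%) = share of screws graded A or B on the
    Gertzbein-Robbins scale (A: fully intrapedicular; B: breach < 2 mm)."""
    acceptable = sum(g in ("A", "B") for g in grades)
    return 100.0 * acceptable / len(grades)

# Illustrative cohort: 84 acceptable screws out of 86 gives ~97.7%
accuracy = gertzbein_robbins_accuracy(["A"] * 84 + ["C"] * 2)
```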
Affiliation(s)
- Ran Harel, Department of Neurosurgery and the Spine Unit, Sheba Medical Center, Tel Hashomer, Israel, affiliated with the Sackler Faculty of Medicine, Tel-Aviv University, Tel-Aviv, Israel
- Yoram Anekstein, Department of Orthopedic Surgery and the Spine Unit, Shamir (Assaf Harofeh) Medical Center, Zerifin, Israel, affiliated with the Sackler Faculty of Medicine, Tel-Aviv University, Tel-Aviv, Israel
- Michael Raichel, Department of Orthopedic Surgery and the Spine Unit, Haemek Medical Center, Afula, Israel
- Camilo A Molina, Department of Neurosurgery, Washington University School of Medicine in St Louis, St Louis, Missouri, USA
- Miguel A Ruiz-Cardozo, Department of Neurosurgery, Washington University School of Medicine in St Louis, St Louis, Missouri, USA
- Emanuele Orrú, Department of Neuroradiology, The Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Majid Khan, Department of Neurosurgery, Washington University School of Medicine in St Louis, St Louis, Missouri, USA
- Yigal Mirovsky, Department of Orthopedic Surgery and the Spine Unit, Shamir (Assaf Harofeh) Medical Center, Zerifin, Israel, affiliated with the Sackler Faculty of Medicine, Tel-Aviv University, Tel-Aviv, Israel
- Yossi Smorgick, Department of Orthopedic Surgery and the Spine Unit, Shamir (Assaf Harofeh) Medical Center, Zerifin, Israel, affiliated with the Sackler Faculty of Medicine, Tel-Aviv University, Tel-Aviv, Israel

40
Augmented Reality in Orthopedic Surgery and Its Application in Total Joint Arthroplasty: A Systematic Review. Appl Sci (Basel) 2022. [DOI: 10.3390/app12105278]
Abstract
The development of augmented reality (AR) and its application in total joint arthroplasty (TJA) aim at improving the accuracy and precision of implant component positioning, hopefully leading to improved outcomes and survivorship. However, this field is far from being thoroughly explored. We therefore performed a systematic review of the literature in order to examine the application, the results, and the different AR systems available in TJA. A systematic review of the literature according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines was performed. A comprehensive search of PubMed, MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews was conducted for English articles on the application of augmented reality in total joint arthroplasty, using various combinations of keywords, from the inception of each database to 31 March 2022. Accuracy was defined as the mean error from the targeted positioning angle and compared as mean values and standard deviations. In all, 14 articles met the inclusion criteria. Among them, four studies reported on the application of AR in total knee arthroplasty, six on total hip arthroplasty, three on reverse shoulder arthroplasty, and one on total elbow arthroplasty. Nine of the included studies were preclinical (sawbones or cadaveric), while five reported results of AR's clinical application. The main common feature was high accuracy and precision when implant positioning was compared with preoperatively targeted angles, with errors ≤2 mm and/or ≤2°. Despite the promising results in terms of increased accuracy and precision, this technology is far from being widely adopted in daily clinical practice. However, the recent exponential growth in machine learning techniques and technologies may eventually lead to the resolution of the ongoing limitations, including depth perception and high system complexity, favorably encouraging the widespread usage of AR systems.
41
Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI. Appl Sci (Basel) 2022. [DOI: 10.3390/app12094295]
Abstract
Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen human perception. Among Human–Machine Interaction (HMI) applications, medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on orthopedic surgery techniques based on AR was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support research and the development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. To improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and according to the surgical procedure involved. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. Further research can be guided by this review to address problems related to hardware limitations, the lack of accurate registration and tracking systems, and the absence of security protocols.
42
Liu Y, Lee MG, Kim JS. Spine Surgery Assisted by Augmented Reality: Where Have We Been? Yonsei Med J 2022; 63:305-316. [PMID: 35352881] [PMCID: PMC8965436] [DOI: 10.3349/ymj.2022.63.4.305]
Abstract
This systematic review examines the spine surgery literature supporting augmented reality (AR) technology and summarizes its current status in spinal surgery. Records were retrieved from PubMed, Web of Science, Cochrane Library, and Embase, from the earliest records to April 1, 2021. Our review briefly examines the history of AR and enumerates different device application workflows in a variety of spinal surgeries. We also sort out the pros and cons of current mainstream AR devices and the latest updates. A total of 45 articles are included in our review. The most prevalent surgical applications are augmented reality surgical navigation systems and head-mounted displays. The most popular application of AR is pedicle screw instrumentation in spine surgery, and the surgical levels primarily addressed are thoracic and lumbar. AR guidance systems show high potential value in practical clinical applications for the spine. The overall number of cases in AR-related studies is still small compared with traditional surgically assisted techniques, and long-term clinical efficacy and robust surgery-related statistical data are lacking. Changing healthcare laws as well as the increasing prevalence of spinal surgery are generating critical data that will determine the value of AR technology.
Affiliation(s)
- Yanting Liu, Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Min-Gi Lee, Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Jin-Sung Kim, Department of Neurosurgery, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea

43
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103] [PMCID: PMC10466024] [DOI: 10.1016/j.media.2022.102361]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Collapse
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
| | - P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| |
Collapse
|
44
|
Uhl C, Hatzl J, Meisenbacher K, Zimmer L, Hartmann N, Böckler D. Mixed-Reality-Assisted Puncture of the Common Femoral Artery in a Phantom Model. J Imaging 2022; 8:jimaging8020047. [PMID: 35200749 PMCID: PMC8874567 DOI: 10.3390/jimaging8020047] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2021] [Revised: 02/12/2022] [Accepted: 02/14/2022] [Indexed: 12/15/2022] Open
Abstract
Percutaneous femoral arterial access is daily practice in a variety of medical specialties and enables physicians worldwide to perform endovascular interventions. The reported incidence of percutaneous femoral arterial access complications is 3–18% and often results from suboptimal puncture location due to insufficient visualization of the target vessel. The purpose of this proof-of-concept study was to evaluate the feasibility and the positional error of a mixed-reality (MR)-assisted puncture of the common femoral artery in a phantom model using a commercially available navigation system. In total, 15 MR-assisted punctures were performed. Cone-beam computed tomography angiography (CTA) was used following each puncture to allow quantification of positional error of needle placements in the axial and sagittal planes. Technical success was achieved in 14/15 cases (93.3%) with a median axial positional error of 1.0 mm (IQR 1.3) and a median sagittal positional error of 1.1 mm (IQR 1.6). The median duration of the registration process and needle insertion was 2 min (IQR 1.0). MR-assisted puncture of the common femoral artery is feasible with acceptable positional errors in a phantom model. Future studies should aim to measure and reduce the positional error resulting from MR registration.
Collapse
|
45
|
Tu P, Qin C, Guo Y, Li D, Lungu AJ, Wang H, Chen X. Ultrasound image guided and mixed reality-based surgical system with real-time soft tissue deformation computing for robotic cervical pedicle screw placement. IEEE Trans Biomed Eng 2022; 69:2593-2603. [PMID: 35157575 DOI: 10.1109/tbme.2022.3150952] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Cervical pedicle screw (CPS) placement surgery remains technically demanding due to the complicated anatomy with neurovascular structures. State-of-the-art surgical navigation and robotic systems still suffer from problems of hand-eye coordination and soft tissue deformation. In this study, we aim to track intraoperative soft tissue deformation, construct a virtual-physical fusion surgical scene, and integrate both into a robotic system for CPS placement surgery. First, we propose a real-time deformation computation method based on a prior shape model and intraoperative partial information acquired from ultrasound images. According to the generated posterior shape, the structural representation of the deformed target tissue is updated continuously. Second, a hand tremor compensation method is proposed to improve the accuracy and robustness of the virtual-physical calibration procedure, and a mixed reality-based surgical scene is then constructed for CPS placement surgery. Third, we integrate the soft tissue deformation method and the virtual-physical fusion method into our previously proposed surgical robotic system and introduce the surgical workflow for CPS placement surgery. We conducted phantom and animal experiments to evaluate the feasibility and accuracy of the proposed system. Our system yielded a mean surface distance error of 1.52 ± 0.43 mm for soft tissue deformation computation and an average distance deviation of 1.04 ± 0.27 mm for CPS placement. The results demonstrate that our system has considerable potential for clinical application and improves the efficiency and safety of CPS placement surgery.
Collapse
|
46
|
Farshad M, Spirig JM, Suter D, Hoch A, Burkhard MD, Liebmann F, Farshad-Amacker NA, Fürnstahl P. Operator independent reliability of direct augmented reality navigated pedicle screw placement and rod bending. NORTH AMERICAN SPINE SOCIETY JOURNAL 2022; 8:100084. [PMID: 35141649 PMCID: PMC8819958 DOI: 10.1016/j.xnsj.2021.100084] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/04/2021] [Revised: 09/21/2021] [Accepted: 10/02/2021] [Indexed: 12/17/2022]
Abstract
Background AR-based navigation of spine surgeries may provide not only accurate surgical execution but also operator independence by compensating for potential skill deficits. "Direct" AR navigation, namely superimposing trajectories directly on the anatomy, has not been investigated with regard to accuracy and operator dependence. The purpose of this study was to demonstrate operator-independent reliability and accuracy of both AR-assisted pedicle screw navigation and AR-assisted rod bending in a cadaver setting. Methods Two experienced spine surgeons and two biomedical engineers (laymen) independently performed pedicle screw instrumentation from L1 to L5 in a total of eight lumbar cadaver specimens (20 screws/operator) using a fluoroscopy-free AR-based navigation method. Screw-fitting rods from L1 to S2-Ala-Ileum were bent bilaterally using an AR-based rod bending navigation method (4 rods/operator). Outcome measures were pedicle perforations, accuracy compared to the preoperative plan, registration time, navigation time, total rod bending time, and operator satisfaction with these procedures. Results 97.5% of all screws were safely placed (<2 mm perforation); overall mean deviation from the planned trajectory was 6.8±3.9°, deviation from the planned entry point was 4±2.7 mm, registration time per vertebra was 2:25 min (00:56 to 10:00 min), navigation time per screw was 1:07 min (00:15 to 12:43 min), rod bending time per rod was 4:22 min (02:07 to 10:39 min), and operator satisfaction with AR-based screw and rod navigation was 5.38±0.67 (1 to 6, 6 being the best rating). Comparison of surgeons and laymen revealed a significant difference in navigation time (1:01 min; 00:15 to 3:00 min vs. 01:37 min; 00:23 to 12:43 min; p = 0.004) but not in pedicle perforation rate. Conclusions Direct AR-based screw and rod navigation using a surface digitization registration technique is reliable and independent of surgical experience. The accuracy of pedicle screw insertion in the lumbar spine is comparable to that of current standard techniques.
Collapse
Affiliation(s)
- Mazda Farshad
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - José Miguel Spirig
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Daniel Suter
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland.,ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - Armando Hoch
- ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - Marco D Burkhard
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Florentin Liebmann
- University Spine Center Zürich, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Nadja A Farshad-Amacker
- Radiology, Balgrist University Hospital, University of Zürich, Forchstrasse 340, 8008 Zürich, Switzerland
| | - Philipp Fürnstahl
- ROCS: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
| |
Collapse
|
47
|
Carrillo F, Esfandiari H, Müller S, von Atzigen M, Massalimova A, Suter D, Laux CJ, Spirig JM, Farshad M, Fürnstahl P. Surgical Process Modeling for Open Spinal Surgeries. Front Surg 2022; 8:776945. [PMID: 35145990 PMCID: PMC8821818 DOI: 10.3389/fsurg.2021.776945] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2021] [Accepted: 12/30/2021] [Indexed: 11/13/2022] Open
Abstract
Modern operating rooms are becoming increasingly advanced thanks to emerging medical technologies and cutting-edge surgical techniques. Current surgeries are transitioning into complex processes that involve information and actions from multiple resources. When designing context-aware medical technologies for a given intervention, it is of utmost importance to have a deep understanding of the underlying surgical process. This is essential to develop technologies that can correctly address clinical needs and adapt to the existing workflow. Surgical Process Modeling (SPM) is a relatively recent discipline that focuses on achieving a profound understanding of the surgical workflow and providing a model that explains the elements of a given surgery as well as their sequence and hierarchy, in both a quantitative and a qualitative manner. To date, a significant body of work has been dedicated to the development of comprehensive SPMs for minimally invasive laparoscopic and endoscopic surgeries, while such models are missing for open spinal surgeries. In this paper, we provide SPMs for common open spinal interventions in orthopedics. Direct video observations of surgeries conducted in our institution were used to derive temporal and transitional information about the surgical activities. This information was then used to develop detailed SPMs that model the primary surgical steps and highlight the frequency of transitions between the surgical activities within each step. Given the recent emergence of advanced techniques tailored to open spinal surgeries (e.g., artificial intelligence methods for intraoperative guidance and navigation), we believe that the SPMs provided in this study can serve as the basis for further advancement of next-generation algorithms dedicated to open spinal interventions that require a profound understanding of the surgical workflow (e.g., automatic surgical activity recognition and surgical skill evaluation). Furthermore, the models provided in this study can potentially benefit the clinical community through standardization of the surgery, which is essential for surgical training.
Collapse
Affiliation(s)
- Fabio Carrillo
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - Hooman Esfandiari
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- *Correspondence: Hooman Esfandiari ;
| | - Sandro Müller
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - Marco von Atzigen
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Laboratory for Orthopaedic Biomechanics, Institute for Biomechanics, Swiss Federal Institute of Technology (ETH), Zurich, Switzerland
| | - Aidana Massalimova
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - Daniel Suter
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - Christoph J. Laux
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - José M. Spirig
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - Mazda Farshad
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| | - Philipp Fürnstahl
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
| |
Collapse
|
48
|
XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J Clin Med 2022; 11:jcm11020470. [PMID: 35054164 PMCID: PMC8779726 DOI: 10.3390/jcm11020470] [Citation(s) in RCA: 28] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 01/01/2022] [Accepted: 01/11/2022] [Indexed: 02/06/2023] Open
Abstract
In recent years, with the rapid advancement and consumerization of virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR) technology, the use of XR technology in spine medicine has become increasingly popular. The rising use of XR technology in spine medicine has been accelerated by the recent wave of digital transformation (i.e., case-specific three-dimensional medical images and holograms, wearable sensors, video cameras, fifth-generation (5G) networks, artificial intelligence, and head-mounted displays), and further accelerated by the COVID-19 pandemic and the increase in minimally invasive spine surgery. The COVID-19 pandemic has had a negative impact on society, but positive impacts can also be expected, including the continued spread and adoption of telemedicine services (i.e., tele-education, tele-surgery, tele-rehabilitation) that promote digital transformation. The purpose of this narrative review is to describe the accelerators of XR (VR, AR, MR) technology in spine medicine and then to provide a comprehensive review of the use of XR technology in spine medicine, including surgery, consultation, education, and rehabilitation, as well as to identify its limitations and future perspectives (status quo and quo vadis).
Collapse
|
49
|
Feasibility and Accuracy of Thoracolumbar Pedicle Screw Placement Using an Augmented Reality Head Mounted Device. SENSORS 2022; 22:s22020522. [PMID: 35062483 PMCID: PMC8779462 DOI: 10.3390/s22020522] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 12/30/2021] [Accepted: 01/06/2022] [Indexed: 02/06/2023]
Abstract
Background: To investigate the accuracy of augmented reality (AR) navigation using the Magic Leap head mounted device (HMD), pedicle screws were minimally invasively placed in four spine phantoms. Methods: AR navigation provided by a combination of a conventional navigation system integrated with the Magic Leap head mounted device (AR-HMD) was used. Forty-eight screws were planned and inserted into Th11-L4 of the phantoms using the AR-HMD and navigated instruments. Postprocedural CT scans were used to grade the technical (deviation from the plan) and clinical (Gertzbein grade) accuracy of the screws. The time for each screw placement was recorded. Results: The mean deviation between navigation plan and screw position was 1.9 ± 0.7 mm (1.9 [0.3–4.1] mm) at the entry point and 1.4 ± 0.8 mm (1.2 [0.1–3.9] mm) at the screw tip. The angular deviation was 3.0 ± 1.4° (2.7 [0.4–6.2]°) and the mean time for screw placement was 130 ± 55 s (108 [58–437] s). The clinical accuracy was 94% according to the Gertzbein grading scale. Conclusion: The combination of an AR-HMD with a conventional navigation system for accurate minimally invasive screw placement is feasible and can exploit the benefits of AR in the perspective of the surgeon with the reliability of a conventional navigation system.
Collapse
|
50
|
Visualization, navigation, augmentation. The ever-changing perspective of the neurosurgeon. BRAIN AND SPINE 2022; 2:100926. [PMID: 36248169 PMCID: PMC9560703 DOI: 10.1016/j.bas.2022.100926] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/06/2022] [Revised: 07/23/2022] [Accepted: 08/10/2022] [Indexed: 11/22/2022]
|