1
Ha HG, Gu K, Jeung D, Hong J, Lee H. Simulated augmented reality-based calibration of optical see-through head-mounted display for surgical navigation. Int J Comput Assist Radiol Surg 2024;19:1647-1657. PMID: 38777946. DOI: 10.1007/s11548-024-03164-5.
Abstract
PURPOSE Calibration of an optical see-through head-mounted display is critical for augmented reality-based surgical navigation. While conventional methods have advanced, calibration errors remain significant. Moreover, prior research has focused primarily on calibration accuracy and procedure, neglecting the impact on the overall surgical navigation system. Consequently, these enhancements do not necessarily translate to accurate augmented reality in the optical see-through head-mounted display, because of systemic errors that include those in calibration. METHOD This study introduces a simulated augmented reality-based calibration to address these issues. By replicating the augmented reality displayed in the optical see-through head-mounted display, the method achieves a calibration that compensates for augmented reality errors, thereby reducing them. The process involves two distinct calibration approaches, followed by adjustment of the transformation matrix to minimize displacement in the simulated augmented reality. RESULTS The efficacy of this method was assessed through two accuracy evaluations: registration accuracy and augmented reality accuracy. Experimental results showed an average translational error of 2.14 mm and rotational error of 1.06° across axes in both approaches. Additionally, augmented reality accuracy, measured by the ratio of overlay regions, increased to approximately 95%. These findings confirm the enhancement in both calibration and augmented reality accuracy with the proposed method. CONCLUSION The study presents a calibration method using simulated augmented reality, which minimizes augmented reality errors. This approach, requiring minimal manual intervention, offers a more robust and precise calibration technique for augmented reality applications in surgical navigation.
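The final step described in this abstract, adjusting a transformation matrix to minimize displacement in the simulated augmented reality, is in its general form a rigid point-set alignment. A minimal sketch using the standard Kabsch (Procrustes) least-squares solver; this is an illustrative stand-in, not the authors' algorithm, and the toy data are invented:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q.

    Generic Kabsch solver: illustrates adjusting a transformation to
    minimize displacement between rendered overlay points (P) and their
    intended targets (Q); not the paper's exact calibration procedure.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflection
    t = cQ - R @ cP
    return R, t

# toy check: recover a known rotation about z and a translation
rng = np.random.default_rng(0)
P = rng.normal(size=(10, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = kabsch(P, Q)
residual = np.abs(Q - (P @ R.T + t)).max()
```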
Affiliation(s)
- Ho-Gun Ha
- Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 42988, Republic of Korea
- Kyeongmo Gu
- Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 42988, Republic of Korea
- Deokgi Jeung
- Department of Robotics and Mechatronics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 42988, Republic of Korea
- Jaesung Hong
- Department of Robotics and Mechatronics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 42988, Republic of Korea
- Hyunki Lee
- Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 42988, Republic of Korea
2
Ding H, Sun W, Zheng G. Robot-Assisted Augmented Reality (AR)-Guided Surgical Navigation for Periacetabular Osteotomy. Sensors (Basel) 2024;24:4754. PMID: 39066150. PMCID: PMC11280818. DOI: 10.3390/s24144754.
Abstract
Periacetabular osteotomy (PAO) is an effective approach for the surgical treatment of developmental dysplasia of the hip (DDH). However, due to the complex anatomical structure around the hip joint and the limited field of view (FoV) during the surgery, it is challenging for surgeons to perform a PAO surgery. To solve this challenge, we propose a robot-assisted, augmented reality (AR)-guided surgical navigation system for PAO. The system mainly consists of a robot arm, an optical tracker, and a Microsoft HoloLens 2 headset, which is a state-of-the-art (SOTA) optical see-through (OST) head-mounted display (HMD). For AR guidance, we propose an optical marker-based AR registration method to estimate a transformation from the optical tracker coordinate system (COS) to the virtual space COS such that the virtual models can be superimposed on the corresponding physical counterparts. Furthermore, to guide the osteotomy, the developed system automatically aligns a bone saw with osteotomy planes planned in preoperative images. Then, it provides surgeons with not only virtual constraints to restrict movement of the bone saw but also AR guidance for visual feedback without sight diversion, leading to higher surgical accuracy and improved surgical safety. Comprehensive experiments were conducted to evaluate both the AR registration accuracy and osteotomy accuracy of the developed navigation system. The proposed AR registration method achieved an average mean absolute distance error (mADE) of 1.96 ± 0.43 mm. The robotic system achieved an average center translation error of 0.96 ± 0.23 mm, an average maximum distance of 1.31 ± 0.20 mm, and an average angular deviation of 3.77 ± 0.85°. Experimental results demonstrated both the AR registration accuracy and the osteotomy accuracy of the developed system.
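The AR registration accuracy above is reported as a mean absolute distance error (mADE). A minimal sketch of one plausible reading of such a metric, assuming paired virtual and physical landmark positions; the pairing and toy values are assumptions, not the paper's data:

```python
import numpy as np

def mean_absolute_distance_error(virtual_pts, physical_pts):
    """mADE: mean Euclidean distance (mm) between paired virtual points
    and their physical counterparts after registration (assumed form)."""
    diffs = np.asarray(virtual_pts) - np.asarray(physical_pts)
    return float(np.linalg.norm(diffs, axis=1).mean())

# toy example: every virtual landmark sits 2 mm off along x
physical = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
virtual = physical + np.array([2.0, 0.0, 0.0])
made = mean_absolute_distance_error(virtual, physical)
```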
Affiliation(s)
- Guoyan Zheng
- Institute of Medical Robotics, School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China; (H.D.); (W.S.)
3
Wang G, Chen C, Jiang Z, Li G, Wu C, Li S. Efficient Use of Biological Data in the Web 3.0 Era by Applying Nonfungible Token Technology. J Med Internet Res 2024;26:e46160. PMID: 38805706. PMCID: PMC11167317. DOI: 10.2196/46160.
Abstract
CryptoKitties, a popular game on Ethereum (an open-source public blockchain platform with smart contract functionality), brought nonfungible tokens (NFTs) into the public eye in 2017. NFTs are popular because of their nonfungible properties and their unique and irreplaceable nature in the real world. The embryonic form of NFTs can be traced back to a peer-to-peer network protocol, built on Bitcoin in 2012, that enabled decentralized digital asset transactions. NFTs have recently gained much attention and have shown an unprecedented explosive growth trend. Herein, the concept of digital asset NFTs is introduced into the medical and health field to conduct a subversive discussion on biobank operations. By converting biomedical data into NFTs, the collection and circulation of samples can be accelerated, and the transformation of resources can be promoted. In conclusion, the biobank can achieve sustainable development through "decentralization."
Affiliation(s)
- Guanyi Wang
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Chen Chen
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Ziyu Jiang
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Gang Li
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Can Wu
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
- Sheng Li
- Department of Urology, Cancer Precision Diagnosis and Treatment and Translational Medicine Hubei Engineering Research Center, Zhongnan Hospital, Wuhan University, Wuhan, China
4
Canton SP, Austin CN, Steuer F, Dadi S, Sharma N, Kass NM, Fogg D, Clayton E, Cunningham O, Scott D, LaBaze D, Andrews EG, Biehl JT, Hogan MV. Feasibility and Usability of Augmented Reality Technology in the Orthopaedic Operating Room. Curr Rev Musculoskelet Med 2024;17:117-128. PMID: 38607522. PMCID: PMC11068703. DOI: 10.1007/s12178-024-09888-w.
Abstract
PURPOSE OF REVIEW Augmented reality (AR) has gained popularity in various sectors, including gaming, entertainment, and healthcare. The desire for improved surgical navigation within orthopaedic surgery has led to the evaluation of the feasibility and usability of AR in the operating room (OR). However, the safe and effective use of AR technology in the OR necessitates a proper understanding of its capabilities and limitations. This review aims to describe the fundamental elements of AR, highlight limitations for use within the field of orthopaedic surgery, and discuss potential areas for development. RECENT FINDINGS To date, studies have provided evidence that AR technology can be used to enhance navigation and performance in orthopaedic procedures. General hardware and software limitations of the technology include the registration process, ergonomics, and battery life. Other limitations relate to human response factors such as inattentional blindness, which may lead to an inability to notice complications within the surgical field. Furthermore, prolonged use of AR can cause eye strain and headache due to phenomena such as the vergence-accommodation conflict. AR technology may prove to be a better alternative to current orthopaedic surgery navigation systems. However, the current limitations should be mitigated to further improve the feasibility and usability of AR in the OR setting. It is important for non-clinicians and clinicians to work in conjunction to guide the development of future iterations of AR technology and its implementation into the OR workflow.
Affiliation(s)
- Stephen P Canton
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Fritz Steuer
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Srujan Dadi
- Rowan-Virtua School of Osteopathic Medicine, Stratford, NJ, USA
- Nikhil Sharma
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Nicolás M Kass
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- David Fogg
- Texas Tech University Health Sciences Center El Paso, El Paso, TX, USA
- Elizabeth Clayton
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Onaje Cunningham
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Devon Scott
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Dukens LaBaze
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
- Edward G Andrews
- Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA, USA
- Jacob T Biehl
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
- MaCalus V Hogan
- Department of Orthopaedic Surgery, University of Pittsburgh, 3471 Fifth Ave, Pittsburgh, PA, 15213, USA
5
Morley CT, Arreola DM, Qian L, Lynn AL, Veigulis ZP, Osborne TF. Mixed Reality Surgical Navigation System: Positional Accuracy Based on Food and Drug Administration Standard. Surg Innov 2024;31:48-57. PMID: 38019844. PMCID: PMC10773158. DOI: 10.1177/15533506231217620.
Abstract
BACKGROUND Computer assisted surgical navigation systems are designed to improve outcomes by providing clinicians with procedural guidance information. The use of new technologies, such as mixed reality, offers the potential for more intuitive, efficient, and accurate procedural guidance. The goal of this study is to assess the positional accuracy and consistency of a clinical mixed reality system that utilizes commercially available wireless head-mounted displays (HMDs), custom software, and localization instruments. METHODS Independent teams using second-generation Microsoft HoloLens hardware, Medivis SurgicalAR software, and localization instruments tested the accuracy of the combined system at different institutions, times, and locations. The ASTM F2554-18 consensus standard for computer-assisted surgical systems, as recognized by the U.S. FDA, was utilized to measure the performance. A total of 288 tests were performed. RESULTS The system demonstrated consistent results, with an average accuracy better than one millimeter (0.75 ± 0.37 mm SD). CONCLUSION The independently acquired positional tracking accuracy exceeded that of conventional in-market surgical navigation tracking systems and the FDA-recognized standard. Importantly, this performance was achieved at two different institutions, using an international testing standard, and with a system that included a commercially available off-the-shelf wireless head-mounted display and software.
Affiliation(s)
- David M. Arreola
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Zachary P. Veigulis
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Department of Business Analytics, Tippie College of Business, University of Iowa, Iowa City, IA, USA
- Thomas F. Osborne
- US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Department of Radiology, Stanford University School of Medicine, Stanford, CA, USA
6
Tao B, Fan X, Wang F, Chen X, Shen Y, Wu Y. Comparison of the accuracy of dental implant placement using dynamic and augmented reality-based dynamic navigation: An in vitro study. J Dent Sci 2024;19:196-202. PMID: 38303816. PMCID: PMC10829549. DOI: 10.1016/j.jds.2023.05.006.
Abstract
Background/purpose Augmented reality has been gradually applied in dental implant surgery. However, whether the dynamic navigation system integrated with augmented reality technology will further improve the accuracy is still unknown. The purpose of this study is to investigate the accuracy of dental implant placement using dynamic navigation and augmented reality-based dynamic navigation systems. Materials and methods Thirty-two cone-beam CT (CBCT) scans from clinical patients were collected and used to generate 64 phantoms that were allocated to the augmented reality-based dynamic navigation (ARDN) group or the conventional dynamic navigation (DN) group. The primary outcomes were global coronal, apical and angular deviations, and they were measured after image fusion. A linear mixed model with a random intercept was used. A P value < 0.05 was considered to indicate statistical significance. Results A total of 242 dental implants were placed in two groups. The global coronal, apical and angular deviations of the ARDN and DN groups were 1.31 ± 0.67 mm vs. 1.18 ± 0.59 mm, 1.36 ± 0.67 mm vs. 1.39 ± 0.55 mm, and 3.72 ± 2.13° vs. 3.1 ± 1.56°, respectively. No significant differences were found with regard to coronal and apical deviations (P = 0.16 and 0.6, respectively), but the DN group had a significantly lower angular deviation than the ARDN group (P = 0.02). Conclusion The augmented reality-based dynamic navigation system yielded a similar accuracy to the conventional dynamic navigation system for dental implant placement in coronal and apical points, but the augmented reality-based dynamic navigation system yielded a higher angular deviation.
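The outcome measures in this abstract (global coronal, apical and angular deviations) compare a planned implant axis with the placed one. A generic formulation of these metrics; the entry/apex naming and the toy geometry are illustrative, not the study's measurement protocol:

```python
import numpy as np

def implant_deviations(planned_entry, planned_apex, actual_entry, actual_apex):
    """Coronal/apical deviations (mm) as point-to-point distances, and
    angular deviation (degrees) between the planned and placed axes."""
    coronal = float(np.linalg.norm(actual_entry - planned_entry))
    apical = float(np.linalg.norm(actual_apex - planned_apex))
    u = planned_apex - planned_entry          # planned implant axis
    v = actual_apex - actual_entry            # placed implant axis
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    angular = float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))
    return coronal, apical, angular

# a parallel 1 mm lateral offset: 1 mm coronal and apical error, 0 degrees
cor, ap, ang = implant_deviations(
    np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 10.0]),
    np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.0, 10.0]))
```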
Affiliation(s)
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Feng Wang
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
7
Zhang J, Yang Z, Jiang S, Zhou Z. A spatial registration method based on 2D-3D registration for an augmented reality spinal surgery navigation system. Int J Med Robot 2023:e2612. PMID: 38113328. DOI: 10.1002/rcs.2612.
Abstract
BACKGROUND A spatial registration method is proposed to provide accurate and reliable image guidance for augmented reality (AR) spinal surgery navigation. METHODS In the AR spinal surgery navigation system, grayscale-based 2D/3D registration technology was used to register preoperative computed tomography images with intraoperative X-ray images to complete the spatial registration, after which the virtual image was fused with the real spine. RESULTS In the image registration experiment, the success rate of spine model registration was 90%. In the spinal model verification experiment, the surface registration error of the spinal model ranged from 0.361 to 0.612 mm, and the overall average surface registration error was 0.501 mm. CONCLUSION The spatial registration method based on 2D/3D registration technology can be used in AR spinal surgery navigation systems and is highly accurate and minimally invasive.
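Grayscale-based 2D/3D registration of this kind typically renders digitally reconstructed radiographs (DRRs) from the preoperative CT at candidate poses and optimizes an intensity similarity against the intraoperative X-ray. A toy sketch of the similarity side only, with `np.roll` standing in for the DRR renderer and a one-parameter pose search; everything here is a simplification, not the paper's pipeline:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two images, the kind of
    grayscale similarity an intensity-based 2D/3D registration maximizes."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

# toy search over a single pose parameter (horizontal shift)
rng = np.random.default_rng(1)
xray = rng.normal(size=(32, 32))          # stand-in intraoperative image

def render(shift):
    """Fake DRR renderer: a real system would project the CT volume."""
    return np.roll(xray, shift, axis=1)

best_shift = max(range(-5, 6), key=lambda s: ncc(render(s), render(3)))
```

In a real system the search runs over all six rigid-pose parameters with a nonlinear optimizer rather than this exhaustive one-dimensional scan.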
Affiliation(s)
- Jingqi Zhang
- School of Mechanical Engineering, Tianjin University, Tianjin, China
- Zhiyong Yang
- School of Mechanical Engineering, Tianjin University, Tianjin, China
- Shan Jiang
- School of Mechanical Engineering, Tianjin University, Tianjin, China
- Zeyang Zhou
- School of Mechanical Engineering, Tianjin University, Tianjin, China
8
Fan X, Tao B, Tu P, Shen Y, Wu Y, Chen X. A novel mixed reality-guided dental implant placement navigation system based on virtual-actual registration. Comput Biol Med 2023;166:107560. PMID: 37847946. DOI: 10.1016/j.compbiomed.2023.107560.
Abstract
BACKGROUND The key to successful dental implant surgery is to place the implants accurately along the preoperatively planned paths. Surgical navigation systems can significantly improve the safety and accuracy of implantation. However, conventional navigation forces the surgeon to shift views frequently between the surgical site and the computer screen; mixed reality technology is expected to solve this problem, as wearing a HoloLens device aligns the virtual three-dimensional (3D) image with the actual surgical site in the same field of view. METHODS This study utilized mixed reality technology to enhance dental implant surgery navigation. Our first step was reconstructing a virtual 3D model from pre-operative cone-beam CT (CBCT) images. We then obtained the relative position between objects using the navigation device and the HoloLens camera. Via virtual-actual registration algorithms, the transformation matrixes between the HoloLens devices and the navigation tracker were acquired through the HoloLens-tracker registration, and the transformation matrixes between the virtual model and the patient phantom through the image-phantom registration. In addition, a surgical drill calibration algorithm provided the transformation matrixes between the surgical drill and the patient phantom. These algorithms allow real-time tracking of the surgical drill's location and orientation relative to the patient phantom under the navigation device. With the aid of the HoloLens 2, virtual 3D images and actual patient phantoms can be aligned accurately, providing surgeons with a clear visualization of the implant path. RESULTS Phantom experiments were conducted using 30 patient phantoms, with a total of 102 dental implants inserted. Comparisons between the actual implant paths and the pre-operatively planned implant paths showed that our system achieved a coronal deviation of 1.507 ± 0.155 mm, an apical deviation of 1.542 ± 0.143 mm, and an angular deviation of 3.468 ± 0.339°. The deviation was not significantly different from that of navigation-guided dental implant placement and better than that of freehand dental implant placement. CONCLUSION Our proposed system integrates the pre-operatively planned dental implant paths with the patient phantom, helping surgeons achieve adequate accuracy in traditional dental implant surgery. Furthermore, this system is expected to be applicable to animal and cadaveric experiments in further studies.
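The matrix chaining described in the METHODS section, composing HoloLens-tracker, image-phantom and drill-calibration transforms so that the drill pose can be expressed in the phantom frame, reduces to multiplying 4×4 homogeneous transforms. A minimal sketch with invented identity-rotation poses; the variable names are not the paper's notation:

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative poses (not measured values): the optical tracker reports
# both the drill pose and the phantom reference-frame pose.
T_tracker_drill = hom(np.eye(3), np.array([10.0, 0.0, 0.0]))
T_tracker_phantom = hom(np.eye(3), np.array([4.0, 0.0, 0.0]))

# Drill pose expressed in the phantom frame: invert one pose and chain.
T_phantom_drill = np.linalg.inv(T_tracker_phantom) @ T_tracker_drill
```

The same pattern extends to the HoloLens-tracker and image-phantom transforms: any frame-to-frame pose is obtained by inverting and multiplying the available 4×4 matrices along the chain.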
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
9
Tu P, Wang H, Joskowicz L, Chen X. A multi-view interactive virtual-physical registration method for mixed reality based surgical navigation in pelvic and acetabular fracture fixation. Int J Comput Assist Radiol Surg 2023;18:1715-1724. PMID: 37031310. DOI: 10.1007/s11548-023-02884-4.
Abstract
PURPOSE The treatment of pelvic and acetabular fractures remains technically demanding, and traditional surgical navigation systems suffer from hand-eye miscoordination. This paper describes a multi-view interactive virtual-physical registration method to enhance the surgeon's depth perception and a mixed reality (MR)-based surgical navigation system for pelvic and acetabular fracture fixation. METHODS First, the pelvic structure is reconstructed by segmentation in a preoperative CT scan, and an insertion path for the percutaneous LC-II screw is computed. A custom hand-held registration cube is used for virtual-physical registration. Three strategies are proposed to improve the surgeon's depth perception: vertices alignment, tremble compensation and multi-view averaging. During navigation, distance and angular deviation visual cues are updated to help the surgeon with the guide wire insertion. The methods have been integrated into an MR module in a surgical navigation system. RESULTS Phantom experiments were conducted. Ablation experiments demonstrated the effectiveness of each strategy in the virtual-physical registration method. The proposed method achieved the best accuracy in comparison with related works. For percutaneous guide wire placement, our system achieved a mean bony entry point error of 2.76 ± 1.31 mm, a mean bony exit point error of 4.13 ± 1.74 mm, and a mean angular deviation of 3.04 ± 1.22°. CONCLUSIONS The proposed method improves the virtual-physical fusion accuracy, and the developed MR-based surgical navigation system has clinical application potential. Cadaver and clinical experiments will be conducted in the future.
Affiliation(s)
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Huixiang Wang
- Department of Orthopedics, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
10
Winter P, Rother S, Orth P, Fritsch E. [Innovative image-based planning in musculoskeletal surgery]. Orthopadie (Heidelb) 2023. PMID: 37286621. DOI: 10.1007/s00132-023-04393-3.
Abstract
BACKGROUND For the preparation of surgical procedures in orthopedics and trauma surgery, precise knowledge of imaging and the three-dimensional imagination of the surgeon are of outstanding importance. Image-based, preoperative two-dimensional planning is the gold standard in arthroplasty today. In complex cases, further imaging such as computed tomography (CT) or magnetic resonance imaging is performed, generating a three-dimensional model of the body region and helping the surgeon plan the surgical treatment. Four-dimensional, dynamic CT studies have also been reported and are available as a complementary tool. DIGITAL AIDS Digital aids should further improve the representation of the pathology to be treated and sharpen the surgeon's three-dimensional imagination. The finite element method allows patient-specific and implant-specific parameters to be taken into account in preoperative surgical planning. Intraoperatively, augmented reality can provide relevant information without significantly disrupting the surgical workflow.
Affiliation(s)
- Philipp Winter
- Klinik für Orthopädie und Orthopädische Chirurgie, Universität des Saarlandes, Kirrberger Str. 100, 66421, Homburg, Deutschland
- Stephan Rother
- Klinik für Orthopädie und Orthopädische Chirurgie, Universität des Saarlandes, Kirrberger Str. 100, 66421, Homburg, Deutschland
- Patrick Orth
- Klinik für Orthopädie und Orthopädische Chirurgie, Universität des Saarlandes, Kirrberger Str. 100, 66421, Homburg, Deutschland
- Ekkehard Fritsch
- Klinik für Orthopädie und Orthopädische Chirurgie, Universität des Saarlandes, Kirrberger Str. 100, 66421, Homburg, Deutschland
11
Chiou SY, Liu LS, Lee CW, Kim DH, Al-masni MA, Liu HL, Wei KC, Yan JL, Chen PY. Augmented Reality Surgical Navigation System Integrated with Deep Learning. Bioengineering (Basel) 2023;10:617. PMID: 37237687. PMCID: PMC10215407. DOI: 10.3390/bioengineering10050617.
Abstract
Most current surgical navigation methods rely on optical navigators with images displayed on an external screen. However, minimizing distractions during surgery is critical and the spatial information displayed in this arrangement is non-intuitive. Previous studies have proposed combining optical navigation systems with augmented reality (AR) to provide surgeons with intuitive imaging during surgery, through the use of planar and three-dimensional imagery. However, these studies have mainly focused on visual aids and have paid relatively little attention to real surgical guidance aids. Moreover, the use of augmented reality reduces system stability and accuracy, and optical navigation systems are costly. Therefore, this paper proposed an augmented reality surgical navigation system based on image positioning that achieves the desired system advantages with low cost, high stability, and high accuracy. This system also provides intuitive guidance for the surgical target point, entry point, and trajectory. Once the surgeon uses the navigation stick to indicate the position of the surgical entry point, the connection between the surgical target and the surgical entry point is immediately displayed on the AR device (tablet or HoloLens glasses), and a dynamic auxiliary line is shown to assist with incision angle and depth. Clinical trials were conducted for EVD (extra-ventricular drainage) surgery, and surgeons confirmed the system's overall benefit. A "virtual object automatic scanning" method is proposed to achieve a high accuracy of 1 ± 0.1 mm for the AR-based system. Furthermore, a deep learning-based U-Net segmentation network is incorporated to enable automatic identification of the hydrocephalus location by the system. The system achieves improved recognition accuracy, sensitivity, and specificity of 99.93%, 93.85%, and 95.73%, respectively, representing a significant improvement from previous studies.
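The recognition accuracy, sensitivity, and specificity quoted at the end of this abstract are standard confusion-matrix quantities. For reference, a minimal helper; the example counts are invented, not the study's data:

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)      # true-positive rate (recall)
    specificity = tn / (tn + fp)      # true-negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = classification_metrics(tp=90, fp=5, tn=95, fn=10)
```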
Affiliation(s)
- Shin-Yan Chiou
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Nuclear Medicine, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Li-Sheng Liu
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Chia-Wei Lee
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Dong-Hyun Kim
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Mohammed A. Al-masni
- Department of Artificial Intelligence, College of Software & Convergence Technology, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
- Hao-Li Liu
- Department of Electrical Engineering, National Taiwan University, Taipei 106, Taiwan
- Kuo-Chen Wei
- New Taipei City Tucheng Hospital, Tao-Yuan, Tucheng, New Taipei City 236, Taiwan
- Jiun-Lin Yan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Pin-Yuan Chen
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
12
Fan X, Zhu Q, Tu P, Joskowicz L, Chen X. A review of advances in image-guided orthopedic surgery. Phys Med Biol 2023; 68. [PMID: 36595258 DOI: 10.1088/1361-6560/acaae9] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2022] [Accepted: 12/12/2022] [Indexed: 12/15/2022]
Abstract
Orthopedic surgery remains technically demanding due to the complex anatomical structures and cumbersome surgical procedures. The introduction of image-guided orthopedic surgery (IGOS) has significantly decreased the surgical risk and improved the operation results. This review focuses on the application of recent advances in artificial intelligence (AI), deep learning (DL), augmented reality (AR) and robotics in image-guided spine surgery, joint arthroplasty, fracture reduction and bone tumor resection. For the pre-operative stage, key technologies of AI and DL based medical image segmentation, 3D visualization and surgical planning procedures are systematically reviewed. For the intra-operative stage, the development of novel image registration, surgical tool calibration and real-time navigation are reviewed. Furthermore, the combination of the surgical navigation system with AR and robotic technology is also discussed. Finally, the current issues and prospects of the IGOS system are discussed, with the goal of establishing a reference and providing guidance for surgeons, engineers, and researchers involved in the research and development of this area.
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiyang Zhu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
13
Usevitch DE, Bronheim RS, Reyes MC, Babilonia C, Margalit A, Jain A, Armand M. Review of Enhanced Handheld Surgical Drills. Crit Rev Biomed Eng 2023; 51:29-50. [PMID: 37824333 PMCID: PMC10874117 DOI: 10.1615/critrevbiomedeng.2023049106] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/14/2023]
Abstract
The handheld drill has been used as a conventional surgical tool for centuries. Alongside the recent successes of surgical robots, the development of new and enhanced medical drills has improved surgeon ability without requiring the high cost and consuming setup times that plague medical robot systems. This work provides an overview of enhanced handheld surgical drill research focusing on systems that include some form of image guidance and do not require additional hardware that physically supports or guides drilling. Drilling is reviewed by main contribution divided into audio-, visual-, or hardware-enhanced drills. A vision for future work to enhance handheld drilling systems is also discussed.
Affiliation(s)
- David E. Usevitch
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD, United States
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Rachel S. Bronheim
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Miguel C. Reyes
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Carlos Babilonia
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Adam Margalit
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Amit Jain
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Mehran Armand
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD, United States
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
14
Rong K, Wu X, Xia Q, Chen J, Fei T, Li X, Jiang W. A Systematic Study to Compare the Precise Implantation of Hololens 2 Assisted with Acetabular Prosthesis for Total Hip Replacement. J BIOMATER TISS ENG 2022. [DOI: 10.1166/jbt.2022.3212] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
Abstract
This study evaluates the accuracy of HoloLens 2-assisted acetabular prosthesis implantation for total hip replacement. A total of 80 orthopaedic doctors from our hospital were enrolled in this systematic study and divided into four groups according to their experience in treating orthopaedic patients and whether HoloLens 2 assistance was used: an experienced group with HoloLens 2, an experienced group without HoloLens 2, an inexperienced group with HoloLens 2, and an inexperienced group without HoloLens 2. The abduction angle, the anteversion angle, and the offsets in the abduction and anteversion angles were recorded for the four groups, and these results were used to evaluate the accuracy of HoloLens 2-assisted acetabular prosthesis implantation. All data were then collected and analyzed. The results show a significant difference between the experienced groups with and without HoloLens 2, as well as between the inexperienced groups with and without HoloLens 2; no significant difference was found between any other pair of groups. HoloLens 2 assistance can therefore improve the accuracy of acetabular prosthesis implantation in total hip replacement.
Affiliation(s)
- Ke Rong
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
- Xuhua Wu
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Qingquan Xia
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Jie Chen
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
- Teng Fei
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Xujun Li
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Weimin Jiang
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
15
Merle G, Miclau T, Parent-Harvey A, Harvey EJ. Sensor technology usage in orthopedic trauma. Injury 2022; 53 Suppl 3:S59-S63. [PMID: 36182592 DOI: 10.1016/j.injury.2022.09.036] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/09/2022] [Revised: 08/25/2022] [Accepted: 09/08/2022] [Indexed: 02/02/2023]
Abstract
Medicine in general is quickly transitioning to a digital presence. Orthopaedic surgery is also being shaped by the tenets of digital health, with direct efforts underway in trauma surgery. Sensors are the pen and paper of the next wave of data acquisition, and orthopaedic trauma can and will be part of this new wave of medicine. Early sensor products now coming to market, or in early development, will directly change the way we think about surgical diagnosis and outcomes. Sensor development for biometrics is already here: wellness devices and pressure, temperature, and other parameters are already being measured. Data acquisition and analysis will be a fruitful addition to our research armamentarium given the volume of information now available. A combination of broadband internet, microelectromechanical systems (MEMS), and new wireless communication standards is driving this new wave of medicine. The Internet of Things (IoT) [1] now has a subset, the Internet of Medical Devices [2-5], permitting a much more in-depth dive into patient procedures and outcomes. IoT devices are now being used to enable remote health monitoring and in-hospital treatment, and to guide therapies. This article reviews current sensor technology that looks to impact trauma care.
Affiliation(s)
- Géraldine Merle
- École Polytechnique de Montréal, Université de Montréal, Montréal, Canada
- Theodore Miclau
- Orthopaedic Trauma Institute, University of California, School of Medicine, Department of Orthopaedics, San Francisco, USA
16
Palumbo A. Microsoft HoloLens 2 in Medical and Healthcare Context: State of the Art and Future Prospects. SENSORS (BASEL, SWITZERLAND) 2022; 22:s22207709. [PMID: 36298059 PMCID: PMC9611914 DOI: 10.3390/s22207709] [Citation(s) in RCA: 33] [Impact Index Per Article: 16.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/07/2022] [Revised: 09/29/2022] [Accepted: 10/07/2022] [Indexed: 05/08/2023]
Abstract
Although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only recent technological and scientific advances have made them capable of revolutionizing clinical care and medical contexts through the provision of enhanced functionalities and improved health services. This systematic review provides the state-of-the-art applications of the Microsoft® HoloLens 2 in a medical and healthcare context. Focusing on the potential that this technology has in providing digitally supported clinical care, also but not only in relation to the COVID-19 pandemic, studies that proved the applicability and feasibility of HoloLens 2 in a medical and healthcare scenario were considered. The review presents a thorough examination of the different studies conducted since 2019, focusing on HoloLens 2 medical sub-field applications, device functionalities provided to users, software/platform/framework used, as well as the study validation. The results provided in this paper could highlight the potential and limitations of HoloLens 2-based innovative solutions and bring focus to emerging research topics, such as telemedicine, remote control, and motor rehabilitation.
Affiliation(s)
- Arrigo Palumbo
- Department of Medical and Surgical Sciences, Magna Græcia University, 88100 Catanzaro, Italy
17
Chiou SY, Zhang ZY, Liu HL, Yan JL, Wei KC, Chen PY. Augmented Reality Surgical Navigation System for External Ventricular Drain. Healthcare (Basel) 2022; 10:healthcare10101815. [PMID: 36292263 PMCID: PMC9601392 DOI: 10.3390/healthcare10101815] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2022] [Revised: 09/14/2022] [Accepted: 09/19/2022] [Indexed: 12/02/2022] Open
Abstract
Augmented reality surgery systems are playing an increasing role in the operating room, but applying such systems to neurosurgery presents particular challenges. In addition to using augmented reality technology to display the position of the surgical target position in 3D in real time, the application must also display the scalpel entry point and scalpel orientation, with accurate superposition on the patient. To improve the intuitiveness, efficiency, and accuracy of extra-ventricular drain surgery, this paper proposes an augmented reality surgical navigation system which accurately superimposes the surgical target position, scalpel entry point, and scalpel direction on a patient’s head and displays this data on a tablet. The accuracy of the optical measurement system (NDI Polaris Vicra) was first independently tested, and then complemented by the design of functions to help the surgeon quickly identify the surgical target position and determine the preferred entry point. A tablet PC was used to display the superimposed images of the surgical target, entry point, and scalpel on top of the patient, allowing for correct scalpel orientation. Digital imaging and communications in medicine (DICOM) results for the patient’s computed tomography were used to create a phantom and its associated AR model. This model was then imported into the application, which was then executed on the tablet. In the preoperative phase, the technician first spent 5–7 min to superimpose the virtual image of the head and the scalpel. The surgeon then took 2 min to identify the intended target position and entry point position on the tablet, which then dynamically displayed the superimposed image of the head, target position, entry point position, and scalpel (including the scalpel tip and scalpel orientation). Multiple experiments were successfully conducted on the phantom, along with six practical trials of clinical neurosurgical EVD. 
In the 2D-plane-superposition model, the optical measurement system (NDI Polaris Vicra) provided highly accurate visualization (2.01 ± 1.12 mm). In hospital-based clinical trials, the average technician preparation time was 6 min, while the surgeon required an average of 3.5 min to set the target and entry-point positions and accurately overlay the orientation with an NDI surgical stick. In the preparation phase, the average time required for the DICOM-formatted image processing and program import was 120 ± 30 min. The accuracy of the designed augmented reality optical surgical navigation system met clinical requirements, and can provide a visual and intuitive guide for neurosurgeons. The surgeon can use the tablet application to obtain real-time DICOM-formatted images of the patient, change the position of the surgical entry point, and instantly obtain an updated surgical path and surgical angle. The proposed design can be used as the basis for various augmented reality brain surgery navigation systems in the future.
Affiliation(s)
- Shin-Yan Chiou
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Nuclear Medicine, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Zhi-Yue Zhang
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Hao-Li Liu
- Department of Electrical Engineering, National Taiwan University, Taipei 106, Taiwan
- Jiun-Lin Yan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Kuo-Chen Wei
- Department of Neurosurgery, New Taipei City TuCheng Hospital, New Taipei City 236, Taiwan
- Pin-Yuan Chen
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- School of Medicine, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Correspondence: ; Tel.: +886-2-2431-3131
18
Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022; 8:jimaging8070203. [PMID: 35877647 PMCID: PMC9318659 DOI: 10.3390/jimaging8070203] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2022] [Revised: 07/15/2022] [Accepted: 07/18/2022] [Indexed: 02/01/2023] Open
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy that described the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial surgeries (n=8). For preoperative input data, computed tomography (CT) (n=34), and surface rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly directly superimposed with the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy in the order of 2–5 mm has been demonstrated in phantom models, several human factors and technical challenges—perception, ease of use, context, interaction, and occlusion—remain to be addressed prior to widespread adoption of OST-HMD led surgical navigation.
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
- Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
19
Abstract
Augmented reality (AR) is an innovative system that enhances the real world by superimposing virtual objects on reality. The aim of this study was to analyze the application of AR in medicine and which of its technical solutions are the most used. We carried out a scoping review of the articles published between 2019 and February 2022. The initial search yielded a total of 2649 articles. After applying filters, removing duplicates and screening, we included 34 articles in our analysis. The analysis of the articles highlighted that AR has been traditionally and mainly used in orthopedics in addition to maxillofacial surgery and oncology. Regarding the display application in AR, the Microsoft HoloLens Optical Viewer is the most used method. Moreover, for the tracking and registration phases, the marker-based method with a rigid registration remains the most used system. Overall, the results of this study suggested that AR is an innovative technology with numerous advantages, finding applications in several new surgery domains. Considering the available data, it is not possible to clearly identify all the fields of application and the best technologies regarding AR.
20
Arpaia P, De Benedetto E, De Paolis L, D’Errico G, Donato N, Duraccio L. Performance and Usability Evaluation of an Extended Reality Platform to Monitor Patient’s Health during Surgical Procedures. SENSORS 2022; 22:s22103908. [PMID: 35632317 PMCID: PMC9143436 DOI: 10.3390/s22103908] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/26/2022] [Revised: 05/14/2022] [Accepted: 05/18/2022] [Indexed: 02/01/2023]
Abstract
An extended-reality (XR) platform for real-time monitoring of patients’ health during surgical procedures is proposed. The proposed system provides real-time access to a comprehensive set of patients’ information, which are made promptly available to the surgical team in the operating room (OR). In particular, the XR platform supports the medical staff by automatically acquiring the patient’s vitals from the operating room instrumentation and displaying them in real-time directly on an XR headset. Furthermore, information regarding the patient clinical record is also shown upon request. Finally, the XR-based monitoring platform also allows displaying in XR the video stream coming directly from the endoscope. The innovative aspect of the proposed XR-based monitoring platform lies in the comprehensiveness of the available information, in its modularity and flexibility (in terms of adaption to different sources of data), ease of use, and most importantly, in a reliable communication, which are critical requirements for the healthcare field. To validate the proposed system, experimental tests were conducted using instrumentation typically available in the operating room (i.e., a respiratory ventilator, a patient monitor for intensive care, and an endoscope). The overall results showed (i) an accuracy of the data communication greater than 99 %, along with (ii) an average time response below ms, and (iii) satisfying feedback from the SUS questionnaires filled out by the physicians after intensive use.
Affiliation(s)
- Pasquale Arpaia
- Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, 80138 Naples, Italy
- Augmented Reality for Health Monitoring Laboratory (ARHeMLAB), Department of Information Technology and Electrical Engineering, University of Naples Federico II, 80138 Naples, Italy
- Egidio De Benedetto
- Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS), University of Naples Federico II, 80138 Naples, Italy
- Augmented Reality for Health Monitoring Laboratory (ARHeMLAB), Department of Information Technology and Electrical Engineering, University of Naples Federico II, 80138 Naples, Italy
- Lucio De Paolis
- Department of Engineering for Innovation, University of Salento, 73100 Lecce, Italy
- Giovanni D’Errico
- Department of Applied Science and Technology, Polytechnic University of Turin, 10129 Turin, Italy
- Nicola Donato
- Department of Engineering, University of Messina, 98122 Messina, Italy
- Luigi Duraccio
- Department of Electronics and Telecommunications, Polytechnic University of Turin, 10129 Turin, Italy
21
Wang L, Zhao Z, Wang G, Zhou J, Zhu H, Guo H, Huang H, Yu M, Zhu G, Li N, Na Y. Application of a three-dimensional visualization model in intraoperative guidance of percutaneous nephrolithotomy. Int J Urol 2022; 29:838-844. [PMID: 35545290 DOI: 10.1111/iju.14907] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/28/2021] [Accepted: 04/05/2022] [Indexed: 01/12/2023]
Abstract
OBJECTIVES To establish a three-dimensional visualization model for percutaneous nephrolithotomy, apply it to guiding intraoperative puncture in a mixed reality environment, and evaluate its accuracy and clinical value. METHODS Patients with percutaneous nephrolithotomy indications were prospectively divided into a three-dimensional group and a control group at a ratio of 1:2. For patients in the three-dimensional group, positioning markers were pasted on the skin and enhanced computed tomography scanning was performed in the prone position. Holographic three-dimensional models were made and puncture routes were planned before the operation. During the operation, the three-dimensional model was displayed through HoloLens glasses and visually registered with the patient's body. Puncture of the target renal calyx was performed under three-dimensional image guidance and ultrasonic monitoring. Patients in the control group underwent routine percutaneous nephrolithotomy in the prone position under B-ultrasound monitoring. Deviation distance of the kidney, puncture time, puncture attempts, channel coincidence rate, stone clearance rate, and postoperative complications were assessed. RESULTS Twenty-one and 40 patients were enrolled in the three-dimensional and control groups, respectively. For the three-dimensional group, the average deviation between the virtual and real kidney was 3.1 ± 2.9 mm. All punctures were performed according to preoperative planning. Compared with the control group, the three-dimensional group had a shorter puncture time (8.9 ± 3.3 vs 14.5 ± 6.1 min, P < 0.001) and fewer puncture attempts (1.4 ± 0.6 vs 2.2 ± 1.5, P = 0.009), and may also have performed better in stone clearance rate (90.5% vs 72.5%, P = 0.19) and postoperative complications (P = 0.074). CONCLUSIONS The three-dimensional percutaneous nephrolithotomy model demonstrated acceptable accuracy and good value for guiding puncture in a mixed reality environment.
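The abstract does not state which statistical test produced the quoted P-values, but for two independent groups a Welch's t statistic is one conventional choice. As a hedged sketch (an assumption, not the authors' analysis) using only the summary statistics reported above:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic for two independent groups, computed from
    each group's mean, standard deviation, and sample size."""
    return (m1 - m2) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

# Puncture-time summaries from the abstract:
# 8.9 ± 3.3 min (n=21) vs 14.5 ± 6.1 min (n=40).
t = welch_t(8.9, 3.3, 21, 14.5, 6.1, 40)
print(round(t, 2))  # a large negative t, consistent with P < 0.001
```

The magnitude of t (about 4.7) on these sample sizes is indeed in the range where P < 0.001 for a two-sided test.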
Affiliation(s)
- Lei Wang
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
- Zichen Zhao
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
- Gang Wang
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
- Jianfang Zhou
- Department of Urology, Shougang Shuigang General Hospital, Liupanshui City, Guizhou
- He Zhu
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
- Hongfeng Guo
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
- Huagang Huang
- Department of Urology, Shougang Shuigang General Hospital, Liupanshui City, Guizhou
- Mingchuan Yu
- Department of Medical Imaging, Peking University Shougang Hospital, Beijing, China
- Gang Zhu
- Department of Urology, Beijing United Family Hospital, Beijing, China
- Ningchen Li
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
- Yanqun Na
- Department of Urology, Peking University Shougang Hospital, Beijing, China
- Peking University Wujieping Urology Center, Peking University Health Science Center, Beijing, China
22
Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI. APPLIED SCIENCES-BASEL 2022. [DOI: 10.3390/app12094295] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen the human perception. Among Human–Machine Interaction (HMI), medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on orthopedic surgery techniques based on AR was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support the research and the development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search on PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. In order to improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: Authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and according to the surgical procedure used. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. Further research can be addressed by this review to cover problems related to hardware limitations, lack of accurate registration and tracking systems, and absence of security protocols.
23
Tu P, Qin C, Guo Y, Li D, Lungu AJ, Wang H, Chen X. Ultrasound image guided and mixed reality-based surgical system with real-time soft tissue deformation computing for robotic cervical pedicle screw placement. IEEE Trans Biomed Eng 2022; 69:2593-2603. [PMID: 35157575 DOI: 10.1109/tbme.2022.3150952] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Cervical pedicle screw (CPS) placement surgery remains technically demanding due to the complicated anatomy and adjacent neurovascular structures. State-of-the-art surgical navigation and robotic systems still suffer from problems of hand-eye coordination and soft tissue deformation. In this study, we aim to track intraoperative soft tissue deformation, construct a virtual-physical fusion surgical scene, and integrate both into a robotic system for CPS placement surgery. Firstly, we propose a real-time deformation computation method based on a prior shape model and intraoperative partial information acquired from ultrasound images. According to the generated posterior shape, the structure representation of the deformed target tissue is updated continuously. Secondly, a hand-tremor compensation method is proposed to improve the accuracy and robustness of the virtual-physical calibration procedure, and a mixed reality-based surgical scene is then constructed for CPS placement surgery. Thirdly, we integrate the soft tissue deformation method and the virtual-physical fusion method into our previously proposed surgical robotic system, and the surgical workflow for CPS placement surgery is introduced. We conducted phantom and animal experiments to evaluate the feasibility and accuracy of the proposed system. Our system yielded a mean surface distance error of 1.52 ± 0.43 mm for soft tissue deformation computing and an average distance deviation of 1.04 ± 0.27 mm for CPS placement. The results demonstrate the system's strong potential for clinical application, improving the efficiency and safety of CPS placement surgery.
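A "mean surface distance error" such as the 1.52 ± 0.43 mm quoted above is commonly computed as the average nearest-neighbor distance between sampled surface points. A minimal stdlib sketch (not the authors' implementation, with toy coordinates standing in for the deformed model and ground-truth surfaces):

```python
import math

def mean_surface_distance(pts_a, pts_b):
    """Mean, over points in pts_a, of the Euclidean distance to the
    nearest point in pts_b (a simple point-sampled surface metric)."""
    def nearest(p, cloud):
        return min(math.dist(p, q) for q in cloud)
    return sum(nearest(p, pts_b) for p in pts_a) / len(pts_a)

# Toy point sets (illustrative values only): every point of surface A
# lies 1.5 units from its nearest neighbor on surface B.
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.0, 1.5), (1.0, 0.0, 1.5)]
print(mean_surface_distance(a, b))  # 1.5
```

Note the metric is asymmetric (A-to-B need not equal B-to-A); evaluation protocols often average both directions, and production code would use a k-d tree rather than the brute-force nearest-neighbor search shown here.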
24
Zhu T, Jiang S, Yang Z, Zhou Z, Li Y, Ma S, Zhuo J. A neuroendoscopic navigation system based on dual-mode augmented reality for minimally invasive surgical treatment of hypertensive intracerebral hemorrhage. Comput Biol Med 2022; 140:105091. [PMID: 34872012 DOI: 10.1016/j.compbiomed.2021.105091] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Received: 10/13/2021] [Revised: 11/23/2021] [Accepted: 11/26/2021] [Indexed: 01/01/2023]
Abstract
BACKGROUND AND OBJECTIVE Hypertensive intracerebral hemorrhage is characterized by high rates of morbidity, mortality, disability, and recurrence. Neuroendoscopy is an advanced technique used for its treatment. However, traditional neuroendoscopy shows only tissue surfaces within a limited field of view and cannot provide spatial guidance. In this study, an AR-based neuroendoscopic navigation system is proposed to assist surgeons in locating and clearing hematomas. METHODS The neuroendoscope is registered using a vector closed-loop algorithm. A single-shot method is designed to register medical images with patients precisely. Real-time AR is realized based on video stream fusion. Dual-mode AR navigation is proposed to provide comprehensive guidance from catheter implantation to hematoma removal. A series of experiments is designed to validate the accuracy and utility of this system. RESULTS The average root mean square error of the registration between medical images and patients is 0.784 mm, with a variance of 0.1426 mm. Pixel mismatch is less than 1% in the different AR modes. In catheter implantation experiments, the average distance error is 1.28 mm (variance 0.43 mm), and the average angular error is 1.34° (variance 0.45°). Comparative experiments were also conducted to evaluate the feasibility of the system. CONCLUSION The system provides stereo images with depth information fused with the patient, guiding surgeons to locate targets and remove hematomas. It has been validated to have high accuracy and feasibility.
Affiliation(s)
- Tao Zhu, School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Shan Jiang, School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Zhiyong Yang, School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Zeyang Zhou, School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Yuhua Li, School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Shixing Ma, School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Jie Zhuo, Department of Neurosurgery, Tianjin Huanhu Hospital, Tianjin, 300200, China
25
Matthews JH, Shields JS. The Clinical Application of Augmented Reality in Orthopaedics: Where Do We Stand? Curr Rev Musculoskelet Med 2021; 14:316-319. [PMID: 34581989 DOI: 10.1007/s12178-021-09713-8] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Accepted: 08/09/2021] [Indexed: 12/13/2022]
Abstract
PURPOSE OF REVIEW The surgical community is constantly working to improve accuracy and reproducibility in patient care, with the goal of improving patient outcomes and efficiency. One area of growing interest with the potential to meet these goals is the use of augmented reality (AR) in surgery. There is still a paucity of published research on the clinical benefits of AR over traditional techniques, but this article presents an update on the state of AR within orthopaedics over the past 5 years. RECENT FINDINGS AR systems are being developed and studied for use in all areas of orthopaedics. Most recently published research has focused on fracture care, adult reconstruction, orthopaedic oncology, spine, and resident education. These studies have shown promising results, particularly in surgical accuracy, decreased surgical time, and reduced radiation exposure. However, the majority of recently published research remains pre-clinical, with very few studies involving living patients. AR supplementation in orthopaedic surgery has shown promising results in pre-clinical settings, with improvements in surgical accuracy and reproducibility, decreased operating times, and reduced radiation exposure. Most AR systems, however, are still not approved for clinical use. Further research is needed to validate the benefits of AR use in orthopaedic surgery before it is widely adopted into practice.
Affiliation(s)
- J Hunter Matthews, WFBMC Department of Orthopaedic Surgery, Watlington 4th Floor, 1 Medical Center Blvd, Winston-Salem, NC, 27157, USA
- John S Shields, WFBMC Department of Orthopaedic Surgery, 329 NC-801 N, Bermuda Run, NC, 27006, USA