1. Nagai K, Sugimoto M, Itoi T, Minami H, Tsuchiya T, Araki Y, Tonozuka R, Kojima H, Hirakawa N, Orihara S. The efficacy of 3D hologram support with mixed-reality technique in pancreatobiliary endoscopy. J Hepatobiliary Pancreat Sci 2025. [PMID: 40269401] [DOI: 10.1002/jhbp.12136]
Abstract
BACKGROUND AND AIMS: Mixed-reality (MR) technology is an advanced holographic imaging system that uses wearable devices to display three-dimensional (3D) images in clinical environments. In pancreatobiliary endoscopy, MR technology can facilitate 3D visualization of the bile and pancreatic ducts, enhancing spatial awareness and providing a more intuitive understanding of the anatomy. We evaluated the safety and efficacy of 3D hologram support (3D-HS) using MR in pancreatobiliary endoscopy. METHODS: This study included 30 patients who underwent pancreatobiliary endoscopy using 3D-HS between June 2023 and July 2024. The procedures included ERCP (n = 13) and interventional EUS (iEUS) (n = 17). The primary outcome was technical success; secondary outcomes included total procedure-related time and adverse events. A conventional treatment group, matched for age and disease type, was compared with the 3D-HS group. A questionnaire was used to evaluate the MR device, the 3D hologram images, and the overall evaluation of 3D-HS. RESULTS: The overall technical success rate was 96.7% (ERCP, 100%; iEUS, 93.8%). Adverse events occurred in 16.7% of cases; all were managed conservatively. Comparison with conventional treatment showed no significant differences in outcomes, although conventional methods had shorter procedural times (p < 0.05). The questionnaire results indicated positive feedback on the operability and image quality of 3D-HS, with enhanced anatomical understanding relative to 2D imaging, although some ERCP cases received lower ratings, particularly for intrahepatic bile duct stones. CONCLUSIONS: Pancreatobiliary endoscopy using 3D-HS can be safe and effective in enhancing anatomical understanding. Further studies are needed to optimize its use and reduce the total procedure time.
Affiliation(s)
- Kazumasa Nagai: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Maki Sugimoto: Okinaga Research Institute, Teikyo University, Tokyo, Japan
- Takao Itoi: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Hirohito Minami: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Takayoshi Tsuchiya: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Yoichi Araki: Department of Radiology, Tokyo Medical University, Tokyo, Japan
- Ryosuke Tonozuka: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Hiroyuki Kojima: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Noriyuki Hirakawa: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Shunichiro Orihara: Department of Health Data Science, Tokyo Medical University, Tokyo, Japan

2. Rodrigues PG, Leandro DMK, de Azevedo SS, Mimica MJ, Rodrigues RF, Magalhães M, dos Anjos BF, Variane GFT. Transforming neonatal care: a position paper on the potential of augmented and mixed reality. Front Digit Health 2025; 7:1571521. [PMID: 40290870] [PMCID: PMC12021927] [DOI: 10.3389/fdgth.2025.1571521]
Abstract
Mixed reality (MR) and augmented reality (AR) technologies bridge elements of the real and virtual worlds, emerging as tools that allow users to engage with digital cues to aid with tasks encountered in the physical environment. Thus, these holographic-based innovations are potential tools to support real-time patient care. The applications of MR and AR in neonatal care remain significantly underexplored. In the present article, we highlight the applications of MR and AR across medical procedures, physical examinations, medical diagnoses, and telemedicine, further underscoring their transformative potential within neonatal care. The use of MR and AR can be relevant across diverse economic and clinical landscapes, and in-depth research is required to evaluate the advantages of these tools in caring for neonates.
Affiliation(s)
- Danieli Mayumi Kimura Leandro: Protecting Brains & Saving Futures, Clinical Research Department, São Paulo, Brazil; Division of Neonatology, Department of Pediatrics, Irmandade da Santa Casa de Misericórdia de São Paulo, São Paulo, Brazil
- Marcelo Jenné Mimica: Protecting Brains & Saving Futures, Clinical Research Department, São Paulo, Brazil; Santa Casa de São Paulo School of Medicine, São Paulo, Brazil
- Rafaela Fabri Rodrigues: Protecting Brains & Saving Futures, Clinical Research Department, São Paulo, Brazil; Division of Neonatology, Department of Pediatrics, Irmandade da Santa Casa de Misericórdia de São Paulo, São Paulo, Brazil
- Mauricio Magalhães: Protecting Brains & Saving Futures, Clinical Research Department, São Paulo, Brazil; Division of Neonatology, Department of Pediatrics, Irmandade da Santa Casa de Misericórdia de São Paulo, São Paulo, Brazil; Santa Casa de São Paulo School of Medicine, São Paulo, Brazil
- Gabriel Fernando Todeschi Variane: Protecting Brains & Saving Futures, Clinical Research Department, São Paulo, Brazil; Division of Neonatology, Department of Pediatrics, Irmandade da Santa Casa de Misericórdia de São Paulo, São Paulo, Brazil

3. Minami H, Nagai K, Sugimoto M, Tsuchiya T, Tanaka R, Tonozuka R, Mukai S, Yamamoto K, Kojima H, Itoi T. Intraprocedural mixed-reality hologram support in endoscopic retrograde cholangiography (ERC) for bile leaks. J Hepatobiliary Pancreat Sci 2025; 32:e7-e8. [PMID: 39943791] [PMCID: PMC12038379] [DOI: 10.1002/jhbp.12098]
Affiliation(s)
- Hirohito Minami: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Kazumasa Nagai: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Maki Sugimoto: Innovation Laboratory, Okinaga Research Institute, Teikyo University, Tokyo, Japan
- Takayoshi Tsuchiya: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Reina Tanaka: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Ryosuke Tonozuka: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Shuntaro Mukai: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Kenjiro Yamamoto: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Hiroyuki Kojima: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Takao Itoi: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan

4. Orione M, Rubegni G, Tartaro R, Alberghina A, Fallico M, Orione C, Russo A, Tosi GM, Avitabile T. Utilization of Apple Vision Pro in ophthalmic surgery: A pilot study. Eur J Ophthalmol 2025; 35:715-721. [PMID: 39140319] [DOI: 10.1177/11206721241273574]
Abstract
PURPOSE: This paper explores the application of the Apple Vision Pro in ophthalmic surgery, assessing its potential benefits in providing real-time imaging overlay, surgical guidance, and collaborative opportunities. MATERIALS AND METHODS: The device was worn by 10 ophthalmic surgeons during eyelid malposition surgery. All surgeons performed the entire surgery while wearing the visor. At the end of the procedure, all operators rated the Apple Vision Pro visor on 10 specific items and completed the System Usability Scale (SUS) questionnaire. RESULTS: The surgeons used the Apple Vision Pro during the entire procedure, and the results were positive, with high ratings for practicality, freedom of movement, integration into workflow, and learning. All surgeons rated the Apple Vision Pro above 85/100 on the SUS. CONCLUSION: The incorporation of the Apple Vision Pro in oculoplastic surgery offers several advantages, including improved visualization, enhanced precision, and streamlined communication among surgical teams. According to our preliminary results, the Apple Vision Pro could represent a valuable tool in ophthalmic surgery, with implications for enhancing surgical techniques and advancing XR research in the surgical field.
Affiliation(s)
- M Orione: Department of Ophthalmology, University of Catania, Catania, Italy
- G Rubegni: Ophthalmology Unit, Department of Medicine, Surgery and Neurosciences, University of Siena, Siena, Italy
- R Tartaro: Ophthalmology Unit, Ospedale Santo Spirito, Casale Monferrato, Italy
- A Alberghina: Department of Ophthalmology, University of Catania, Catania, Italy
- M Fallico: Department of Ophthalmology, University of Catania, Catania, Italy
- C Orione: Ophthalmology Unit, Ospedale Santo Spirito, Casale Monferrato, Italy
- A Russo: Department of Ophthalmology, University of Catania, Catania, Italy
- G M Tosi: Ophthalmology Unit, Department of Medicine, Surgery and Neurosciences, University of Siena, Siena, Italy
- T Avitabile: Department of Ophthalmology, University of Catania, Catania, Italy

5. Amano T, Yoneoka Y, Tanaka Y, Takahashi A, Tsuji S. Enhancing Sentinel Lymph Node Biopsy in Endometrial Cancer Using Augmented and Mixed Reality. Cureus 2025; 17:e77599. [PMID: 39963631] [PMCID: PMC11830575] [DOI: 10.7759/cureus.77599]
Abstract
Sentinel lymph node biopsy offers a less invasive alternative to full lymphadenectomy in endometrial cancer, reducing complications while maintaining diagnostic accuracy. This case report highlights the integration of holography and augmented reality (AR) in a 34-year-old woman undergoing robotic-assisted surgery for endometrial cancer. Preoperative imaging combined with single-photon emission computed tomography and computed tomography identified sentinel lymph nodes, which were visualized using mixed reality (MR) technology. This approach enabled the surgical team to accurately understand the three-dimensional spatial relationships between lymph nodes and surrounding critical structures. Holographic projections provided precise guidance during surgery, improving lymph node identification and minimizing invasiveness. No lymph node metastasis was detected, but a diagnosis of the International Federation of Gynecology and Obstetrics (FIGO) stage IIIA was confirmed due to tumor seeding in the peritoneum. The patient underwent successful adjuvant chemotherapy with no recurrence observed. This report demonstrates the significant potential of holography and AR to enhance spatial awareness and surgical precision. These technologies represent a promising advancement in sentinel lymph node biopsy for patients with gynecologic cancers, contributing to reduced surgical invasiveness and alleviating stress for surgeons.
Affiliation(s)
- Tsukuru Amano: Obstetrics and Gynecology, Shiga University of Medical Science, Shiga, Japan
- Yutaka Yoneoka: Obstetrics and Gynecology, Shiga University of Medical Science, Shiga, Japan
- Yuji Tanaka: Obstetrics and Gynecology, Shiga University of Medical Science, Shiga, Japan
- Akimasa Takahashi: Obstetrics and Gynecology, Shiga University of Medical Science, Shiga, Japan
- Shunichiro Tsuji: Obstetrics and Gynecology, Shiga University of Medical Science, Shiga, Japan

6. Parekh P, Oyeleke R, Vishwanath T. The Depth Estimation and Visualization of Dermatological Lesions: Development and Usability Study. JMIR Dermatol 2024; 7:e59839. [PMID: 39693616] [DOI: 10.2196/59839]
Abstract
BACKGROUND: Thus far, considerable research has focused on classifying a lesion as benign or malignant. However, quick depth estimation of a lesion is required for accurate clinical staging, since a malignant lesion can quickly grow beneath the skin. While biopsy slides provide clear information on lesion depth, finding quick and noninvasive ways to estimate depth, particularly from 2D images, is an emerging domain. OBJECTIVE: This study proposes a novel methodology for the depth estimation and visualization of skin lesions. Current diagnostic methods are approximate in determining how much a lesion may have proliferated within the skin. Using color gradients and depth maps, this method provides a definite estimate and visualization procedure for lesions and other skin issues. We aim to generate 3D holograms of the lesion depth so that dermatologists can better diagnose melanoma. METHODS: We started by performing classification using a convolutional neural network (CNN), followed by explainable artificial intelligence to localize the image features responsible for the CNN output. We used the gradient class activation map approach to localize the lesion within the rest of the image. We applied computer graphics for depth estimation and for developing the 3D structure of the lesion. We used the depth-from-defocus method for depth estimation from single images and Gabor filters for volumetric representation of the depth map. Our novel method, called red spot analysis, measures the degree of infection based on how a conical hologram is constructed. We collaborated with a dermatologist to analyze the 3D hologram output and received feedback on how this method can be introduced into clinical practice. RESULTS: The neural model plus the explainable artificial intelligence algorithm achieved an accuracy of 86% in correctly classifying the lesions as benign or malignant. For the entire pipeline, we mapped the benign and malignant cases to their conical representations. We received exceedingly positive feedback while pitching this idea at the King Edward Memorial Institute in India. Dermatologists considered this a potentially useful tool for the depth estimation of lesions. We received a number of ideas for evaluating the technique before it can be introduced to the clinical scene. CONCLUSIONS: When we map the CNN outputs (benign or malignant) to the corresponding hologram, we observe that a malignant lesion has a higher concentration of red spots (infection) in the upper and deeper portions of the skin, and that the malignant cases have deeper conical sections compared with the benign cases. This shows that the qualitative results align with the initial classification performed by the neural model. The positive feedback provided by the dermatologist suggests that the qualitative conclusion of the method is sufficient.
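
The pipeline above chains a CNN classifier, gradient-based class activation mapping, and a single-image depth-from-defocus cue. As a rough illustration of that last step only, the sketch below (not the authors' code; the file path and window size are placeholder assumptions) computes a local-sharpness map from one photograph, the kind of defocus cue that single-image depth estimation methods use as a crude proxy for relief.

```python
# Illustrative sketch, not the authors' implementation: a single-image
# defocus/sharpness map of the kind used as a crude depth proxy.
# "lesion.jpg" is a placeholder path, not study data.
import cv2
import numpy as np

def sharpness_map(image_path: str, window: int = 15) -> np.ndarray:
    """Return a map in [0, 1]; higher values mean locally sharper (more in-focus) pixels."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    gray = img.astype(np.float32) / 255.0
    # Local focus measure: Laplacian energy averaged over a sliding window.
    lap = cv2.Laplacian(gray, cv2.CV_32F, ksize=3)
    focus = cv2.boxFilter(lap * lap, -1, (window, window))
    # Normalize to [0, 1]; under a depth-from-defocus assumption, blurrier
    # (lower) regions lie farther from the focal plane.
    focus -= focus.min()
    return focus / focus.max() if focus.max() > 0 else focus

# Example: depth_proxy = 1.0 - sharpness_map("lesion.jpg")
```
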
Affiliation(s)
- Pranav Parekh: Stevens Institute of Technology, Hoboken, NJ, United States

7. Nagai K, Sugimoto M, Tsuchiya T, Tonozuka R, Mukai S, Yamamoto K, Itoi T. Intraprocedural hologram support with mixed-reality technique in endoscopic ultrasound-guided biliary drainage. Endoscopy 2024; 56:E550-E551. [PMID: 38917978] [PMCID: PMC11199081] [DOI: 10.1055/a-2335-6642]
Affiliation(s)
- Kazumasa Nagai: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Maki Sugimoto: Okinaga Research Institute, Teikyo University, Tokyo, Japan
- Takayoshi Tsuchiya: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Ryosuke Tonozuka: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Shuntaro Mukai: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Kenjiro Yamamoto: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan
- Takao Itoi: Department of Gastroenterology and Hepatology, Tokyo Medical University, Tokyo, Japan

8. Ito T, Fujikawa T, Takeda T, Mizoguchi Y, Okubo K, Onogi S, Nakajima Y, Tsutsumi T. Integration of Augmented Reality in Temporal Bone and Skull Base Surgeries. Sensors (Basel) 2024; 24:7063. [PMID: 39517960] [PMCID: PMC11548182] [DOI: 10.3390/s24217063]
Abstract
Augmented reality technologies provide transformative solutions in various surgical fields. Our research focuses on the use of an advanced augmented reality system that projects 3D holographic images directly into surgical footage, potentially improving the surgeon's orientation to the surgical field and lowering the cognitive load. We created a novel system that combines exoscopic surgical footage from the "ORBEYE" with 3D holograms and displays both on a single screen. This setup enables surgeons to use the system without wearing head-mounted displays, instead viewing the integrated images on a 3D monitor. Thirteen surgeons and surgical assistants completed tasks with 2D and 3D graphical surgical guides. The NASA Task Load Index was used to assess mental, physical, and temporal demands. The use of 3D graphical surgical guides significantly improved performance metrics in cochlear implant surgeries by lowering mental, physical, and temporal demands as well as frustration. However, for Bonebridge implantation, the 2D graphical surgical guide performed better overall (p = 0.045). Participants found the augmented reality system's video latency to be imperceptible, measuring 0.13 ± 0.01 s. This advanced augmented reality system significantly improves the efficiency and precision of cochlear implant surgeries by lowering cognitive load and improving spatial orientation.
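
For context on the workload measure used above: NASA-TLX scores six subscales (mental demand, physical demand, temporal demand, performance, effort, frustration), each on a 0-100 scale. The snippet below is a minimal sketch of the raw, unweighted TLX score only; the abstract does not state whether the raw or the pairwise-weighted variant was used, and the example ratings are invented.

```python
# Minimal sketch of a raw (unweighted) NASA-TLX score: the mean of the six
# subscale ratings, each on a 0-100 scale. Example values below are invented.
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings: dict[str, float]) -> float:
    """Average the six subscale ratings; raises KeyError if a subscale is missing."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

example = {"mental": 55, "physical": 30, "temporal": 45,
           "performance": 25, "effort": 50, "frustration": 20}
print(raw_tlx(example))  # 37.5
```
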
Affiliation(s)
- Taku Ito: Department of Otorhinolaryngology, Institute of Science Tokyo, 1-5-45 Yushima, Bunkyo-ku, Tokyo 113-8510, Japan
- Taro Fujikawa: Department of Otorhinolaryngology, Institute of Science Tokyo, 1-5-45 Yushima, Bunkyo-ku, Tokyo 113-8510, Japan
- Takamori Takeda: Department of Otorhinolaryngology, Institute of Science Tokyo, 1-5-45 Yushima, Bunkyo-ku, Tokyo 113-8510, Japan
- Yoshimaru Mizoguchi: Department of Otorhinolaryngology, Institute of Science Tokyo, 1-5-45 Yushima, Bunkyo-ku, Tokyo 113-8510, Japan
- Kouta Okubo: Department of Otorhinolaryngology, Institute of Science Tokyo, 1-5-45 Yushima, Bunkyo-ku, Tokyo 113-8510, Japan
- Shinya Onogi: Department of Biomedical Information, Institute of Biomaterials and Bioengineering, Institute of Science Tokyo, Tokyo 101-0062, Japan
- Yoshikazu Nakajima: Department of Biomedical Information, Institute of Biomaterials and Bioengineering, Institute of Science Tokyo, Tokyo 101-0062, Japan
- Takeshi Tsutsumi: Department of Otorhinolaryngology, Institute of Science Tokyo, 1-5-45 Yushima, Bunkyo-ku, Tokyo 113-8510, Japan

9. Levschuk A, Whittal J, Trejos AL, Sirek A. Leveraging Space-Flown Technologies to Deliver Healthcare with Holographic Physical Examinations. Aerosp Med Hum Perform 2024; 95:214-218. [PMID: 38486313] [DOI: 10.3357/amhp.6397.2024]
Abstract
INTRODUCTION: Musculoskeletal injuries are among the more common injuries in spaceflight. Physical assessment of an injury is essential for diagnosis and treatment. Unfortunately, when musculoskeletal injuries occur in space, the flight surgeon is limited to two-dimensional videoconferencing and, potentially, observations made by the crew medical officer. To address these limitations, we investigated the feasibility of performing physical examinations on a three-dimensional augmented reality projection using a mixed-reality headset, specifically evaluating a standard shoulder examination. METHODS: A simulated patient interaction was set up between Western University in London, Ontario, Canada, and Huntsville, AL, United States. The exam was performed by a medical student, and a healthy adult man volunteered to enable the physical exam. RESULTS: All parts of the standard shoulder physical examination according to the Bates Guide to the Physical Exam were performed with holoportation. Adaptation was required for palpation and some special tests. DISCUSSION: All parts of the physical exam were able to be completed. The true-to-anatomical size of the holograms permitted improved inspection of the anatomy compared with traditional videoconferencing. Palpation was completed by instructing the patient to palpate themselves and comment on relevant findings as directed by the examiner. Range of motion and special tests for specific pathologies were also completed, with some modifications because the examiner was not present to provide resistance. Future work should aim to improve the graphics, physician communication, and haptic feedback during holoportation. Levschuk A, Whittal J, Trejos AL, Sirek A. Leveraging space-flown technologies to deliver healthcare with holographic physical examinations. Aerosp Med Hum Perform. 2024; 95(4):214-218.

10. Hatzl J, Henning D, Böckler D, Hartmann N, Meisenbacher K, Uhl C. Comparing Different Registration and Visualization Methods for Navigated Common Femoral Arterial Access - A Phantom Model Study Using Mixed Reality. J Imaging 2024; 10:76. [PMID: 38667974] [PMCID: PMC11051344] [DOI: 10.3390/jimaging10040076]
Abstract
Mixed reality (MxR) enables the projection of virtual three-dimensional objects into the user's field of view via a head-mounted display (HMD). This phantom model study investigated three different workflows for navigated common femoral arterial (CFA) access and compared them with a conventional sonography-guided technique as a control. A total of 160 punctures were performed by 10 operators (5 experts and 5 non-experts). A successful CFA puncture was defined as a puncture at the mid-level of the femoral head with the needle tip at the central lumen line, at a 0° coronal insertion angle and a 45° sagittal insertion angle. Positional errors were quantified using cone-beam computed tomography following each attempt. Mixed effect modeling revealed that the distance from the needle entry site to the mid-level of the femoral head is significantly shorter for the navigated techniques than for the control group. This highlights that three-dimensional visualization could increase the safety of CFA access. However, the navigated workflows are infrastructurally complex, have limited usability, and are associated with considerable cost. While navigated techniques appear to be a potentially beneficial adjunct for safe CFA access, future developments should aim to reduce workflow complexity, avoid optical tracking systems, and offer more pragmatic methods of registration and instrument tracking.
Affiliation(s)
- Johannes Hatzl: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany
- Daniel Henning: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany
- Dittmar Böckler: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany
- Niklas Hartmann: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany
- Katrin Meisenbacher: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany
- Christian Uhl: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany; Department of Vascular Surgery, University Hospital RWTH Aachen, 52074 Aachen, Germany

11. Naito S, Kajiwara M, Nakashima R, Sasaki T, Hasegawa S. Application of Extended Reality (Virtual Reality and Mixed Reality) Technology in Laparoscopic Liver Resections. Cureus 2023; 15:e44520. [PMID: 37790042] [PMCID: PMC10544840] [DOI: 10.7759/cureus.44520]
Abstract
Background and purpose: Laparoscopic liver resection (LLR) has recently gained popularity owing to advances in surgical techniques. Difficulties in LLR may be influenced by anatomical factors. This study presents a comprehensive overview of LLR performed using extended reality (XR) technology. Methods: Six patients underwent LLR performed with the use of HoloLens2® (Microsoft Corporation, Redmond, Washington, United States) XR technology. We performed dynamic contrast-enhanced CT scans before surgery and used the data to construct three-dimensional images. Results: Of the six patients, two were diagnosed with colorectal liver metastases, two with hepatocellular carcinoma, and one with intrahepatic cholangiocarcinoma. The median maximum tumor diameter was 31 mm (range, 23-80 mm). One patient had liver cirrhosis, with Child-Pugh classification grade B. Anatomical resection was performed in three patients (60%), with a median difficulty score of 7 (intermediate). No conversions to open surgery were necessary. The median operative time and estimated blood loss were 444 minutes (range, 337-597 minutes) and 200 mL (range, 100-1000 mL), respectively. Postoperative complications (Clavien-Dindo classification grade II) were observed in one patient. All six cases achieved negative surgical margins. Conclusions: LLR using XR technology enhances surgical visualization and anatomical recognition. The incorporation of XR technology into LLR offers advantages over traditional two-dimensional imaging.
Affiliation(s)
- Shigetoshi Naito: Gastroenterological Surgery, Fukuoka University Hospital, Fukuoka, Japan
- Masatoshi Kajiwara: Gastroenterological Surgery, Fukuoka University Faculty of Medicine, Fukuoka, Japan
- Ryo Nakashima: Gastroenterological Surgery, Fukuoka University Faculty of Medicine, Fukuoka, Japan
- Takahide Sasaki: Gastroenterological Surgery, Fukuoka University Faculty of Medicine, Fukuoka, Japan
- Suguru Hasegawa: Gastroenterological Surgery, Fukuoka University Hospital, Fukuoka, Japan

12. Zary N, Eysenbach G, Van Doormaal TPC, Ruurda JP, Van der Kaaij NP, De Heer LM. Mixed Reality in Modern Surgical and Interventional Practice: Narrative Review of the Literature. JMIR Serious Games 2023; 11:e41297. [PMID: 36607711] [PMCID: PMC9947976] [DOI: 10.2196/41297]
Abstract
BACKGROUND: Mixed reality (MR) and its potential applications have gained increasing interest within the medical community over recent years. The ability to integrate virtual objects into a real-world environment within a single video-see-through display is a topic that sparks the imagination. Given these characteristics, MR could facilitate preoperative and preinterventional planning, provide intraoperative and intrainterventional guidance, and aid in education and training, thereby improving the skills and merits of surgeons and residents alike. OBJECTIVE: In this narrative review, we provide a broad overview of the different applications of MR within the entire spectrum of surgical and interventional practice and elucidate potential future directions. METHODS: A targeted literature search within the PubMed, Embase, and Cochrane databases was performed regarding the application of MR within surgical and interventional practice. Studies were included if they met the criteria for technological readiness level 5 and, as such, had to be validated in a relevant environment. RESULTS: A total of 57 studies were included and divided into studies regarding preoperative and interventional planning, intraoperative and interventional guidance, and training and education. CONCLUSIONS: The overall experience with MR is positive. The main benefits of MR seem to be related to improved efficiency. Limitations primarily seem to be related to constraints associated with the head-mounted display. Future directions should be aimed at improving head-mounted display technology, incorporating MR within surgical microscopes and robots, and designing trials to prove superiority.
Affiliation(s)
- Tristan P C Van Doormaal: University Medical Center Utrecht, Utrecht, Netherlands; University Hospital Zurich, Zurich, Switzerland

13. Palumbo A. Microsoft HoloLens 2 in Medical and Healthcare Context: State of the Art and Future Prospects. Sensors (Basel) 2022; 22:7709. [PMID: 36298059] [PMCID: PMC9611914] [DOI: 10.3390/s22207709]
Abstract
Although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only recent technological and scientific advances have made them suitable for revolutionizing clinical care and medical contexts through the provision of enhanced functionalities and improved health services. This systematic review provides an overview of state-of-the-art applications of the Microsoft® HoloLens 2 in medical and healthcare contexts. Focusing on the potential of this technology to provide digitally supported clinical care, including but not limited to the context of the COVID-19 pandemic, studies that demonstrated the applicability and feasibility of HoloLens 2 in medical and healthcare scenarios were considered. The review presents a thorough examination of the different studies conducted since 2019, focusing on HoloLens 2 medical subfield applications, device functionalities provided to users, the software/platform/framework used, and study validation. The results provided in this paper highlight the potential and limitations of HoloLens 2-based innovative solutions and bring focus to emerging research topics, such as telemedicine, remote control, and motor rehabilitation.
Affiliation(s)
- Arrigo Palumbo: Department of Medical and Surgical Sciences, Magna Græcia University, 88100 Catanzaro, Italy

14. The intraoperative use of augmented and mixed reality technology to improve surgical outcomes: A systematic review. Int J Med Robot 2022; 18:e2450. [DOI: 10.1002/rcs.2450]

15. Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022; 8:203. [PMID: 35877647] [PMCID: PMC9318659] [DOI: 10.3390/jimaging8070203]
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy that described the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial surgeries (n=8). For preoperative input data, computed tomography (CT) (n=34), and surface rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly directly superimposed with the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy in the order of 2–5 mm has been demonstrated in phantom models, several human factors and technical challenges—perception, ease of use, context, interaction, and occlusion—remain to be addressed prior to widespread adoption of OST-HMD led surgical navigation.
Affiliation(s)
- Mitchell Doughty: Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre: Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada; Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
- Graham A. Wright: Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada; Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada

16. Digital Transformation Will Change Medical Education and Rehabilitation in Spine Surgery. Medicina (Kaunas) 2022; 58:508. [PMID: 35454347] [PMCID: PMC9030988] [DOI: 10.3390/medicina58040508]
Abstract
The concept of minimally invasive spine therapy (MIST) has been proposed as a treatment strategy to reduce the need for overall patient care, including not only minimally invasive spine surgery (MISS) but also conservative treatment and rehabilitation. To maximize the effectiveness of patient care in spine surgery, the education of medical students and residents, as well as patient rehabilitation, can be enhanced by digital transformation (DX), including virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR); three-dimensional (3D) medical images and holograms; wearable sensors; high-performance video cameras; the fifth-generation wireless system (5G) and wireless fidelity (Wi-Fi); artificial intelligence; and head-mounted displays (HMDs). Furthermore, to comply with guidelines for social distancing during the unexpected COVID-19 pandemic, the use of DX to maintain healthcare and education has become more innovative than ever before. In medical education, with the evolution of science and technology, it has become mandatory to provide a highly interactive educational environment and experience using DX technology for residents and medical students, known as digital natives. This study describes an approach to pre- and intraoperative medical education and postoperative rehabilitation using DX in the field of spine surgery that was implemented during the COVID-19 pandemic and will be utilized thereafter.

17. Uhl C, Hatzl J, Meisenbacher K, Zimmer L, Hartmann N, Böckler D. Mixed-Reality-Assisted Puncture of the Common Femoral Artery in a Phantom Model. J Imaging 2022; 8:47. [PMID: 35200749] [PMCID: PMC8874567] [DOI: 10.3390/jimaging8020047]
Abstract
Percutaneous femoral arterial access is daily practice in a variety of medical specialties and enables physicians worldwide to perform endovascular interventions. The reported incidence of percutaneous femoral arterial access complications is 3–18% and often results from suboptimal puncture location due to insufficient visualization of the target vessel. The purpose of this proof-of-concept study was to evaluate the feasibility and the positional error of a mixed-reality (MR)-assisted puncture of the common femoral artery in a phantom model using a commercially available navigation system. In total, 15 MR-assisted punctures were performed. Cone-beam computed tomography angiography (CTA) was used following each puncture to allow quantification of positional error of needle placements in the axial and sagittal planes. Technical success was achieved in 14/15 cases (93.3%) with a median axial positional error of 1.0 mm (IQR 1.3) and a median sagittal positional error of 1.1 mm (IQR 1.6). The median duration of the registration process and needle insertion was 2 min (IQR 1.0). MR-assisted puncture of the common femoral artery is feasible with acceptable positional errors in a phantom model. Future studies should aim to measure and reduce the positional error resulting from MR registration.
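
For orientation only, the sketch below shows one way needle-placement errors of this kind could be expressed from coordinates picked on post-puncture imaging: it is an illustration under stated assumptions, not the study's analysis code, and the axis convention and coordinate values are invented.

```python
# Illustrative sketch (not the study's analysis code): decompose the offset
# between a planned target point and the needle tip localized on post-puncture
# imaging into per-axis components and a 3D Euclidean distance (all in mm).
# How "axial" and "sagittal" errors map onto these axes depends on the scanner
# convention and is an assumption here; the coordinates below are invented.
import math

def puncture_error(target: tuple[float, float, float],
                   tip: tuple[float, float, float]) -> dict[str, float]:
    dx, dy, dz = (t - p for t, p in zip(tip, target))
    return {
        "dx_mm": abs(dx), "dy_mm": abs(dy), "dz_mm": abs(dz),
        "euclidean_mm": math.hypot(dx, dy, dz),
    }

print(puncture_error(target=(0.0, 0.0, 0.0), tip=(1.0, 1.1, 0.5)))
```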