1
Hernandez Torres SI, Ruiz A, Holland L, Ortiz R, Snider EJ. Evaluation of Deep Learning Model Architectures for Point-of-Care Ultrasound Diagnostics. Bioengineering (Basel) 2024; 11:392. [PMID: 38671813 PMCID: PMC11048259 DOI: 10.3390/bioengineering11040392]
Abstract
Point-of-care ultrasound imaging is a critical tool for patient triage during trauma, both for diagnosing injuries and for prioritizing limited medical evacuation resources. Specifically, an eFAST exam evaluates whether free fluid is present in the chest or abdomen, but this is only possible if ultrasound scans can be accurately interpreted, a challenge in the pre-hospital setting. In this effort, we evaluated the use of artificial intelligence (AI) eFAST image interpretation models. Widely used deep learning model architectures were evaluated, as well as Bayesian-optimized models, for six different diagnostic tasks: pneumothorax (i) B- or (ii) M-mode, hemothorax (iii) B- or (iv) M-mode, (v) pelvic or bladder abdominal hemorrhage, and (vi) right upper quadrant abdominal hemorrhage. Models were trained using images captured in 27 swine. Using a leave-one-subject-out training approach, the MobileNetV2 and DarkNet53 models surpassed 85% accuracy for each M-mode scan site. The B-mode models performed worse, with accuracies between 68% and 74%, except for the pelvic hemorrhage model, which only reached 62% accuracy across all model architectures. These results highlight which eFAST scan sites can be readily automated with image interpretation models, while other scan sites, such as bladder hemorrhage, will require more robust model development or data augmentation to improve performance. With these additional improvements, the skill threshold for ultrasound-based triage can be reduced, expanding its utility in the pre-hospital setting.
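The leave-one-subject-out scheme mentioned in this abstract is a generic evaluation technique: every image from one swine is held out as the test set while all other subjects train the model, preventing images from the same animal from leaking between splits. A minimal sketch of the split logic (function and variable names are illustrative, not from the authors' code):

```python
from collections import defaultdict

def leave_one_subject_out_splits(subject_ids):
    """Yield (train_indices, test_indices) pairs, holding out one subject
    at a time. subject_ids is a per-image list of subject labels."""
    by_subject = defaultdict(list)
    for idx, sid in enumerate(subject_ids):
        by_subject[sid].append(idx)
    for held_out in sorted(by_subject):
        test = by_subject[held_out]
        train = [i for i, sid in enumerate(subject_ids) if sid != held_out]
        yield train, test

# Toy example: 6 images from 3 subjects
ids = ["s1", "s1", "s2", "s2", "s3", "s3"]
splits = list(leave_one_subject_out_splits(ids))
print(len(splits))   # 3 folds, one per subject
print(splits[0])     # ([2, 3, 4, 5], [0, 1])
```

With 27 swine this yields 27 folds; reported accuracy is then an average over subjects the model has never seen, a stricter test than a random image-level split.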
Affiliation(s)
- Eric J. Snider
- Organ Support and Automation Technologies Group, U.S. Army Institute of Surgical Research, Joint Base San Antonio, Fort Sam Houston, San Antonio, TX 78234, USA; (S.I.H.T.); (A.R.); (L.H.); (R.O.)
2
Holland L, Hernandez Torres SI, Snider EJ. Using AI Segmentation Models to Improve Foreign Body Detection and Triage from Ultrasound Images. Bioengineering (Basel) 2024; 11:128. [PMID: 38391614 PMCID: PMC10886314 DOI: 10.3390/bioengineering11020128]
Abstract
Medical imaging can be a critical tool for triaging casualties in trauma situations. In remote or military medicine scenarios, triage is essential for identifying how to use limited resources or prioritize evacuation for the most serious cases. Ultrasound imaging, while portable and often available near the point of injury, can only be used for triage if images are properly acquired, interpreted, and objectively scored for triage. Here, we detail how AI segmentation models can be used to improve image interpretation and objective triage evaluation for a medical application focused on foreign bodies embedded in tissues at variable distances from critical neurovascular features. Ultrasound images previously collected in a tissue phantom with or without neurovascular features were labeled with ground truth masks. These image sets were used to train two different segmentation AI frameworks: YOLOv7 and U-Net segmentation models. Overall, both approaches were successful in identifying shrapnel in the image set, with U-Net outperforming YOLOv7 for single-class segmentation. Both segmentation models were also evaluated on a more complex image set containing shrapnel, artery, vein, and nerve features. YOLOv7 obtained higher precision scores across multiple classes, whereas U-Net achieved higher recall scores. Using each AI model, a triage distance metric was adapted to measure the proximity of shrapnel to the nearest neurovascular feature, with U-Net more closely mirroring the triage distances measured from ground truth labels. Overall, the segmentation AI models were successful in detecting shrapnel in ultrasound images and could allow for improved injury triage in emergency medicine scenarios.
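The triage distance metric described above reduces, in its simplest form, to the minimum distance between two segmentation masks: shrapnel pixels versus the nearest neurovascular-feature pixels. A hedged sketch under assumed inputs (pixel-coordinate lists and an assumed mm-per-pixel scale; the names are invented for illustration, not taken from the paper):

```python
import math

def triage_distance(shrapnel_pixels, feature_pixels, mm_per_pixel=1.0):
    """Minimum distance (in mm) from any shrapnel pixel to the nearest
    neurovascular-feature pixel. Pixels are (row, col) tuples; brute
    force is adequate for small masks."""
    best = min(
        math.dist(s, f)
        for s in shrapnel_pixels
        for f in feature_pixels
    )
    return best * mm_per_pixel

shrapnel = [(10, 10), (11, 10)]
artery = [(10, 14), (20, 20)]
print(triage_distance(shrapnel, artery))  # 4.0
```

In practice the pixel lists would come from the predicted masks of each segmentation model, which is why mask quality (precision for YOLOv7, recall for U-Net) directly affects how closely the metric tracks ground truth.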
Affiliation(s)
- Lawrence Holland
- Organ Support and Automation Technologies Group, U.S. Army Institute of Surgical Research, JBSA Fort Sam Houston, San Antonio, TX 78234, USA
- Sofia I Hernandez Torres
- Organ Support and Automation Technologies Group, U.S. Army Institute of Surgical Research, JBSA Fort Sam Houston, San Antonio, TX 78234, USA
- Eric J Snider
- Organ Support and Automation Technologies Group, U.S. Army Institute of Surgical Research, JBSA Fort Sam Houston, San Antonio, TX 78234, USA
3
Hernandez-Torres SI, Bedolla C, Berard D, Snider EJ. An extended focused assessment with sonography in trauma ultrasound tissue-mimicking phantom for developing automated diagnostic technologies. Front Bioeng Biotechnol 2023; 11:1244616. [PMID: 38033814 PMCID: PMC10682760 DOI: 10.3389/fbioe.2023.1244616]
Abstract
Introduction: Medical imaging-based triage is critical for ensuring medical treatment is timely and prioritized. However, without proper image collection and interpretation, triage decisions can be hard to make. While automation approaches can enhance these triage applications, tissue phantoms must be developed to train and mature these novel technologies. Here, we have developed a tissue phantom modeling the ultrasound views imaged during the extended focused assessment with sonography in trauma (eFAST) exam. Methods: The tissue phantom utilized synthetic clear ballistic gel with carveouts in the abdomen and rib cage corresponding to the various eFAST scan points. Various approaches were taken to simulate proper physiology without injuries present or to mimic pneumothorax, hemothorax, or abdominal hemorrhage at multiple locations in the torso. Multiple ultrasound imaging systems were used to acquire scans with or without injury present, and these were used to train deep learning image classification predictive models. Results: The artificial intelligence (AI) models trained in this study achieved over 97% accuracy for each eFAST scan site. A previously trained AI model for pneumothorax achieved 74% accuracy in blind predictions for images collected with the novel eFAST tissue phantom. Grad-CAM heat map overlays for the predictions confirmed that the AI models were tracking the area of interest for each scan point in the tissue phantom. Discussion: Overall, the eFAST tissue phantom ultrasound scans resembled human images and were successful in training AI models. Tissue phantoms are a critical first step in troubleshooting and developing medical imaging automation technologies for this application and can accelerate the widespread use of ultrasound imaging for emergency triage.
Affiliation(s)
- Eric J. Snider
- Organ Support and Automation Technologies Group, U.S. Army Institute of Surgical Research, JBSA Fort Sam Houston, San Antonio, TX, United States
4
Jiang B, Wang L, Xu K, Hossbach M, Demir A, Rajan P, Taylor RH, Moghekar A, Foroughi P, Kazanzides P, Boctor EM. Wearable Mechatronic Ultrasound-integrated AR Navigation System for Lumbar Puncture Guidance. IEEE Trans Med Robot Bionics 2023; 5:966-977. [PMID: 38779126 PMCID: PMC11107797 DOI: 10.1109/tmrb.2023.3319963]
Abstract
As one of the most commonly performed spinal interventions in routine clinical practice, lumbar punctures are usually done with only hand palpation and trial-and-error. Failures can prolong procedure time and introduce complications such as cerebrospinal fluid leaks and headaches. Therefore, an effective needle insertion guidance method is desired. In this work, we present a complete lumbar puncture guidance system with the integration of (1) a wearable mechatronic ultrasound imaging device, (2) volume-reconstruction and bone surface estimation algorithms and (3) two alternative augmented reality user interfaces for needle guidance, including a HoloLens-based and a tablet-based solution. We conducted a quantitative evaluation of the end-to-end navigation accuracy, which shows that our system can achieve an overall needle navigation accuracy of 2.83 mm and 2.76 mm for the Tablet-based and the HoloLens-based solutions, respectively. In addition, we conducted a preliminary user study to qualitatively evaluate the effectiveness and ergonomics of our system on lumbar phantoms. The results show that users were able to successfully reach the target in an average of 1.12 and 1.14 needle insertion attempts for Tablet-based and HoloLens-based systems, respectively, exhibiting the potential to reduce the failure rates of lumbar puncture procedures with the proposed lumbar-puncture guidance.
Affiliation(s)
- Baichuan Jiang
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Liam Wang
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Keshuai Xu
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Alican Demir
- Clear Guide Medical Inc., Baltimore, MD 21211, USA
- Russell H. Taylor
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Abhay Moghekar
- Department of Neurology, Johns Hopkins Medical Institute, Baltimore, MD 21205, USA
- Peter Kazanzides
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Emad M. Boctor
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
5
Saruwatari MS, Nguyen TN, Talari HF, Matisoff AJ, Sharma KV, Donoho KG, Basu S, Dwivedi P, Bost JE, Shekhar R. Assessing the Effect of Augmented Reality on Procedural Outcomes During Ultrasound-Guided Vascular Access. Ultrasound Med Biol 2023; 49:2346-2353. [PMID: 37573178 PMCID: PMC10658651 DOI: 10.1016/j.ultrasmedbio.2023.07.011]
Abstract
OBJECTIVE Augmented reality devices are increasingly accepted in health care, though most applications involve education and pre-operative planning. A novel augmented reality ultrasound application, HoloUS, was developed for the Microsoft HoloLens 2 to project real-time ultrasound images directly into the user's field of view. In this work, we assessed the effect of using HoloUS on vascular access procedural outcomes. METHODS A single-center user study was completed with participants with (N = 22) and without (N = 12) experience performing ultrasound-guided vascular access. Users completed a venipuncture and aspiration task a total of four times: three times on study day 1, and once on study day 2 between 2 and 4 weeks later. Users were randomized to use conventional ultrasound during either their first or second task and the HoloUS application at all other times. Task completion time, numbers of needle re-directions, head adjustments and needle visualization rates were recorded. RESULTS For expert users, task completion time was significantly faster using HoloUS (11.5 s, interquartile range [IQR] = 6.5-23.5 s vs. 18.5 s, IQR = 11.0-36.5 s; p = 0.04). The number of head adjustments was significantly lower using the HoloUS app (1.0, IQR = 0.0-1.0 vs. 3.0, IQR = 1.0-5.0; p < 0.0001). No significant differences were identified in other measured outcomes. CONCLUSION This is the first investigation of augmented reality-based ultrasound-guided vascular access using the second-generation HoloLens. It demonstrates equivalent procedural efficiency and accuracy, with favorable usability, ergonomics and user independence when compared with traditional ultrasound techniques.
Affiliation(s)
- Michele S Saruwatari
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; Department of Surgery, MedStar Georgetown University Hospital and Washington Hospital Center, Washington, DC, USA
- Hadi Fooladi Talari
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA
- Andrew J Matisoff
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Karun V Sharma
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Kelsey G Donoho
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Sonali Basu
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Pallavi Dwivedi
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA
- James E Bost
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Raj Shekhar
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA; IGI Technologies, Silver Spring, MD, USA; George Washington University School of Medicine and Health Sciences, Washington, DC, USA.
6
Marhofer P, Eichenberger U. Augmented reality in ultrasound-guided regional anaesthesia: useful tool or expensive toy? Br J Anaesth 2023; 131:442-445. [PMID: 37353469 DOI: 10.1016/j.bja.2023.05.022]
Abstract
Augmented reality is increasingly applied in medical education and practice. The main advantage of this technology is the display of relevant information in the visual field of multiple operators. Here we provide a critical analysis of the potential application of augmented reality in regional anaesthesia.
Affiliation(s)
- Peter Marhofer
- Department of Anaesthesia, Intensive Care Medicine and Pain Medicine, Medical University of Vienna, Vienna, Austria.
- Urs Eichenberger
- Department of Anaesthesiology, Intensive Care and Pain Medicine, Balgrist University Hospital and University of Zurich, Zurich, Switzerland
7
Sohn J, Se Cha M. Evidence and Practicality of Real-Time Ultrasound-Guided Procedures in the Intensive Care Unit: A New Skill Set for the Intensivist. Tex Heart Inst J 2023; 50:e238166. [PMID: 37432768 PMCID: PMC10660895 DOI: 10.14503/thij-23-8166]
Affiliation(s)
- Jacqueline Sohn
- Department of Anesthesiology, Division of Cardiovascular Anesthesia and Critical Care Medicine, Baylor College of Medicine, Houston, Texas
- Min Se Cha
- Department of Cardiovascular Anesthesiology, The Texas Heart Institute, Houston, Texas
8
Farshad-Amacker NA, Kubik-Huch RA, Kolling C, Leo C, Goldhahn J. Learning how to perform ultrasound-guided interventions with and without augmented reality visualization: a randomized study. Eur Radiol 2023; 33:2927-2934. [PMID: 36350392 PMCID: PMC10017581 DOI: 10.1007/s00330-022-09220-5]
Abstract
OBJECTIVES Augmented reality (AR), which entails overlay of in situ images onto the anatomy, may be a promising technique for assisting image-guided interventions. The purpose of this study was to investigate and compare the learning experience and performance of untrained operators in puncture of soft tissue lesions when using AR ultrasound (AR US) compared with standard US (sUS). METHODS Forty-four medical students (28 women, 16 men) who had completed a basic US course, but had no experience with AR US, were asked to perform US-guided biopsies with both sUS and AR US, with a randomized selection of the initial modality. The experimental setup aimed to simulate biopsies of superficial soft tissue lesions, such as breast masses in clinical practice, by use of a turkey breast containing olives. Time to puncture (in seconds) and success (yes/no) of the biopsies were documented. All participants completed questionnaires about their coordinative skills and their experience during the training. RESULTS Despite having no experience with the AR technique, operators' time to puncture did not differ significantly between AR US and sUS (median [range]: 17.0 s [6-60] and 14.5 s [5-41], p = 0.16), nor were there any gender-related differences (p = 0.22 and p = 0.50). AR US was considered by 79.5% of the operators to be the more enjoyable means of learning and performing US-guided biopsies. Further, a more favorable learning curve was achieved using AR US. CONCLUSIONS Students considered AR US to be the preferable and more enjoyable modality for learning how to obtain soft tissue biopsies; however, they did not perform the biopsies faster than when using sUS. KEY POINTS • Performance of standard and augmented reality US-guided biopsies was comparable. • A more favorable learning curve was achieved using augmented reality US. • Augmented reality US was the preferred technique and was considered more enjoyable.
Affiliation(s)
- Nadja A Farshad-Amacker
- Radiology, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland.
- Rahel A Kubik-Huch
- Institute of Radiology, Department of Medical Services, Kantonsspital Baden, Baden, Switzerland
- Christoph Kolling
- Institute of Translational Medicine, Department of Health Sciences and Technology, Eidgenössische Technische Hochschule (ETH), Zurich, Switzerland
- Cornelia Leo
- Department of Gynaecology and Obstetrics, Kantonsspital Baden, Baden, Switzerland
- Jörg Goldhahn
- Institute of Translational Medicine, Department of Health Sciences and Technology, Eidgenössische Technische Hochschule (ETH), Zurich, Switzerland
9
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637 DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, paving the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria.
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
10
von Haxthausen F, Rüger C, Sieren MM, Kloeckner R, Ernst F. Augmenting Image-Guided Procedures through In Situ Visualization of 3D Ultrasound via a Head-Mounted Display. Sensors (Basel) 2023; 23:2168. [PMID: 36850766 PMCID: PMC9961663 DOI: 10.3390/s23042168]
Abstract
Medical ultrasound (US) is a commonly used modality for image-guided procedures. Recent research systems providing in situ visualization of 2D US images via an augmented reality (AR) head-mounted display (HMD) were shown to be advantageous over conventional imaging through reduced task completion times and improved accuracy. In this work, we continue in the direction of recent developments by describing the first AR HMD application visualizing real-time volumetric (3D) US in situ for guiding vascular punctures. We evaluated the application on a technical level as well as in a mixed-methods user study with a qualitative prestudy and a quantitative main study, simulating a vascular puncture. Participants completed the puncture task significantly faster when using the 3D US AR mode compared to 2D US AR, with a 28.4% decrease in time. However, no significant differences were observed in the success rate of vascular puncture (2D US AR: 50% vs. 3D US AR: 72%). On the technical side, the system offers a low latency of 49.90 ± 12.92 ms and a satisfactory frame rate of 60 Hz. Our work shows the feasibility of a system that visualizes real-time 3D US data via an AR HMD, and our experiments show that this may offer additional benefits in US-guided tasks (i.e., reduced task completion time) over 2D US images viewed in AR by offering a vivid volumetric visualization.
Affiliation(s)
- Felix von Haxthausen
- Institute for Robotics and Cognitive Systems, University of Lübeck, 23562 Lübeck, Germany
- Christoph Rüger
- Department of Surgery, Campus Charité Mitte, Campus Virchow-Klinikum, Experimental Surgery, Charité–Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, Berlin Institute of Health, 10117 Berlin, Germany
- Malte Maria Sieren
- Department of Radiology and Nuclear Medicine, University Hospital Schleswig-Holstein Campus Lübeck, 23569 Lübeck, Germany
- Institute of Interventional Radiology, University Hospital Schleswig-Holstein Campus Lübeck, 23569 Lübeck, Germany
- Roman Kloeckner
- Institute of Interventional Radiology, University Hospital Schleswig-Holstein Campus Lübeck, 23569 Lübeck, Germany
- Floris Ernst
- Institute for Robotics and Cognitive Systems, University of Lübeck, 23562 Lübeck, Germany
11
Isaak A, Wolff T, Zdoroveac A, Taher F, Gürke L, Richarz S, Akifi S. Ultrasound-Guided Percutaneous Arteriovenous Fistula Creation Simulation Training in a Lifelike Flow Model. Bioengineering (Basel) 2022; 9:659. [PMID: 36354570 PMCID: PMC9687548 DOI: 10.3390/bioengineering9110659]
Abstract
Objectives: To assess the feasibility and training effect of simulation training for ultrasound-guided percutaneous arteriovenous fistula (pAVF) creation in a lifelike flow model. Methods: Twenty vascular trainees and specialists were shown an instructional video on creating a pAVF in a dedicated flow model and then randomized to a study or control group. The procedure was divided into five clearly defined steps. Two observers rated the performance on each step, and the time to perform the exercise was recorded. The study group participants underwent supervised hands-on training on the model before performing a second rated pAVF creation. All participants subsequently completed a feedback questionnaire. Results: After supervised simulation training, the study group participants increased their mean performance rating from 2.2 ± 0.9 to 3.2 ± 0.7. A mean of 3.8 ± 0.8 procedure steps was accomplished independently (control group 2.1 ± 1.4; p < 0.05). The time taken to perform the procedure was 15.6 ± 3.8 min in the study group (control group 27.2 ± 7.3 min; p < 0.05). The participants with previous experience in ultrasound-guided vascular procedures (n = 5) achieved higher overall mean scores (3.0 ± 0.8) and accomplished more steps without assistance (2.0 ± 1.0) during the simulation training compared to their inexperienced peers (1.5 ± 0.3 and 0.8 ± 0.4, respectively). The feedback questionnaire revealed that the study group participants strongly agreed (n = 7) or agreed (n = 3) that training on the simulation model improved their skills regarding catheter handling. Conclusions: The study group participants increased their overall performance after training on the simulator. More experienced attendees performed better from the beginning, indicating the model to be lifelike and a potential skill assessment tool. Simulation training for pAVF creation using a lifelike model may be an intermediate step between acquiring ultrasound and theoretical pAVF skills and procedure guidance in theatre. However, this type of training is limited by its reliance on simulator quality, demonstration devices and costs.
Affiliation(s)
- Andrej Isaak
- Department of Vascular and Endovascular Surgery, University Hospital Basel, Spitalstrasse 21, 4031 Basel, Switzerland
- Vascular and Endovascular Surgery, Cantonal Hospital Aarau, 5001 Aarau, Switzerland
- Correspondence: ; Tel.: +41-62-838-45-13
- Thomas Wolff
- Department of Vascular and Endovascular Surgery, University Hospital Basel, Spitalstrasse 21, 4031 Basel, Switzerland
- Andrei Zdoroveac
- Vascular and Endovascular Surgery, Cantonal Hospital Aarau, 5001 Aarau, Switzerland
- Fadi Taher
- Vascular and Endovascular Surgery, Klinik Ottakring, Montlearstrasse 37, 1160 Wien, Austria
- Lorenz Gürke
- Department of Vascular and Endovascular Surgery, University Hospital Basel, Spitalstrasse 21, 4031 Basel, Switzerland
- Vascular and Endovascular Surgery, Cantonal Hospital Aarau, 5001 Aarau, Switzerland
- Sabine Richarz
- Department of Vascular and Endovascular Surgery, University Hospital Basel, Spitalstrasse 21, 4031 Basel, Switzerland
- Shuaib Akifi
- Vascular and Endovascular Surgery, Cantonal Hospital Aarau, 5001 Aarau, Switzerland
12
von Haxthausen F, Moreta-Martinez R, Pose Díez de la Lastra A, Pascau J, Ernst F. UltrARsound: in situ visualization of live ultrasound images using HoloLens 2. Int J Comput Assist Radiol Surg 2022; 17:2081-2091. [PMID: 35776399 PMCID: PMC9515035 DOI: 10.1007/s11548-022-02695-z]
Abstract
Purpose Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images in the 3D anatomy of the patient. This work describes and evaluates a novel approach to track retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy. Methods The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy, frequency and latency of displayed images. Results Tracking is performed with a median accuracy of 1.98 mm/1.81° for the static setting when using the Kalman filter. In a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms. Conclusions In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. For tracking, no additional hardware nor modifications to HoloLens 2 are required, making it a cheap and easy-to-use approach. Moreover, a minimal latency of displayed images enables real-time perception for the sonographer.
Affiliation(s)
- Felix von Haxthausen
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain; Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Schleswig-Holstein, Germany
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain
- Alicia Pose Díez de la Lastra
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain
- Floris Ernst
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Schleswig-Holstein, Germany
13
Bloom IW. The Potential for Augmented Reality to Boost the Utility of Sonography. Journal of Diagnostic Medical Sonography 2022. [DOI: 10.1177/87564793221096745]
14
HoloUS: Augmented reality visualization of live ultrasound images using HoloLens for ultrasound-guided procedures. Int J Comput Assist Radiol Surg 2021; 17:385-391. [PMID: 34817764] [DOI: 10.1007/s11548-021-02526-7]
Abstract
PURPOSE Microsoft HoloLens is a pair of augmented reality (AR) smart glasses that could improve the intraprocedural visualization of ultrasound-guided procedures. With the wearable HoloLens headset, an ultrasound image can be virtually rendered and registered with the ultrasound transducer and placed directly in the practitioner's field of view. METHODS A custom application, called HoloUS, was developed using the HoloLens and a portable ultrasound machine connected through a wireless network. A custom 3D-printed case with an AR-pattern for the ultrasound transducer permitted ultrasound image tracking and registration. Voice controls on the HoloLens supported the scaling and movement of the ultrasound image as desired. The ultrasound images were streamed and displayed in real-time. A user study was performed to assess the effectiveness of the HoloLens as an alternative display platform. Novices and experts were timed on tasks involving targeting simulated vessels using a needle in a custom phantom. RESULTS Technical characterization of the HoloUS app was conducted using frame rate, tracking accuracy, and latency as performance metrics. The app ran at 25 frames/s, had an 80-ms latency, and could track the transducer with an average reprojection error of 0.0435 pixels. With AR visualization, the novices' times improved by 17% but the experts' times decreased slightly by 5%, which may reflect the experts' training and experience bias. CONCLUSION The HoloUS application was found to enhance user experience and simplify hand-eye coordination. By eliminating the need to alternately observe the patient and the ultrasound images presented on a separate monitor, the proposed AR application has the potential to improve efficiency and effectiveness of ultrasound-guided procedures.
15
In Situ Visualization for 3D Ultrasound-Guided Interventions with Augmented Reality Headset. Bioengineering (Basel) 2021; 8:131. [PMID: 34677204] [PMCID: PMC8533537] [DOI: 10.3390/bioengineering8100131]
Abstract
Augmented Reality (AR) headsets have become the most ergonomic and efficient visualization devices to support complex manual tasks performed under direct vision. Their ability to provide hands-free interaction with the augmented scene makes them perfect for manual procedures such as surgery. This study demonstrates the reliability of an AR head-mounted display (HMD), conceived for surgical guidance, in navigating in-depth high-precision manual tasks guided by a 3D ultrasound imaging system. The integration between the AR visualization system and the ultrasound imaging system provides the surgeon with real-time intra-operative information on unexposed soft tissues that are spatially registered with the surrounding anatomic structures. The efficacy of the AR guiding system was quantitatively assessed with an in vitro study simulating a biopsy intervention aimed at determining the level of accuracy achievable. In the experiments, 10 subjects were asked to perform the biopsy on four spherical lesions of decreasing sizes (10, 7, 5, and 3 mm). The experimental results showed that 80% of the subjects were able to successfully perform the biopsy on the 5 mm lesion, with a 2.5 mm system accuracy. The results confirmed that the proposed integrated system can be used for navigation during in-depth high-precision manual tasks.
16
Baashar Y, Alkawsi G, Ahmad WNW, Alhussian H, Alwadain A, Capretz LF, Babiker A, Alghail A. The Effectiveness of Using Augmented Reality for Training in the Medical Professions: A Meta-Analysis. JMIR Serious Games 2022; 10:e32715. [PMID: 35787488] [PMCID: PMC9297143] [DOI: 10.2196/32715]
Abstract
Background Augmented reality (AR) is an interactive technology that uses persuasive digital data and real-world surroundings to expand the user's reality, wherein objects are produced by various computer applications. It constitutes a novel advancement in medical care, education, and training. Objective The aim of this work was to assess how effective AR is in training medical students when compared to other educational methods in terms of skills, knowledge, confidence, performance time, and satisfaction. Methods We performed a meta-analysis on the effectiveness of AR in medical training, following the Cochrane methodology. A web-based literature search was performed by using the Cochrane Library, Web of Science, PubMed, and Embase databases to find studies that recorded the effect of AR in medical training up to April 2021. The quality of the selected studies was assessed by following the Cochrane criteria for risk of bias evaluations. Results In total, 13 studies with a total of 654 participants were included in the meta-analysis. The findings showed that using AR in training can improve participants' performance time (I2=99.9%; P<.001), confidence (I2=97.7%; P=.02), and satisfaction (I2=99.8%; P=.006) more than what occurs under control conditions. Further, AR did not have any effect on the participants' knowledge (I2=99.4%; P=.90) and skills (I2=97.5%; P=.10). The meta-regression plot shows that there has been an increase in the number of articles discussing AR over the years and that there is no publication bias in the studies used for the meta-analysis. Conclusions The findings of this work suggest that AR can effectively improve performance time, satisfaction, and confidence in medical training but is not very effective in areas such as knowledge and skill. Therefore, more AR technologies should be implemented in the field of medical training and education. However, to confirm these findings, more meticulous research with more participants is needed.
Affiliation(s)
- Yahia Baashar
- Faculty of Computing and Informatics, Universiti Malaysia Sabah, Labuan, Malaysia
- Gamal Alkawsi
- Institute of Sustainable Energy, Universiti Tenaga Nasional, Kajang, Malaysia
- Hitham Alhussian
- Department of Computer and Information Sciences, Universiti Teknologi Petronas, Seri Iskandar, Malaysia
- Ayed Alwadain
- Department of Computer Science, King Saud University, Riyadh, Saudi Arabia
- Luiz Fernando Capretz
- Department of Electrical & Computer Engineering, Western University, London, ON, Canada
- Areej Babiker
- Department of Computer Engineering, Future University, Khartoum, Sudan
- Adnan Alghail
- Department of World Languages, Greece Central School District, New York, NY, United States
17
Velazco-Garcia JD, Navkar NV, Balakrishnan S, Younes G, Abi-Nahed J, Al-Rumaihi K, Darweesh A, Elakkad MSM, Al-Ansari A, Christoforou EG, Karkoub M, Leiss EL, Tsiamyrtzis P, Tsekos NV. Evaluation of how users interface with holographic augmented reality surgical scenes: Interactive planning MR-guided prostate biopsies. Int J Med Robot 2021; 17:e2290. [PMID: 34060214] [DOI: 10.1002/rcs.2290]
Abstract
BACKGROUND User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot-assisted MR-guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment. METHOD End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. The information from the system to the operator was rendered on HoloLens as an output interface. Joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system. RESULTS The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan MRgPBx in less time. It efficiently captures the operator's commands to manipulate the augmented environment representing the state of the MRgPBx system. CONCLUSIONS The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment within the context of MRgPBx planning.
Affiliation(s)
- Nikhil V Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Georges Younes
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Adham Darweesh
- Department of Clinical Imaging, Hamad Medical Corporation, Doha, Qatar
- Mansour Karkoub
- Department of Mechanical Engineering, Texas A&M University-Qatar, Doha, Qatar
- Ernst L Leiss
- Department of Computer Science, University of Houston, Houston, Texas, USA
- Nikolaos V Tsekos
- Department of Computer Science, University of Houston, Houston, Texas, USA
18
Dalili D, Isaac A, Rashidi A, Åström G, Fritz J. Image-guided Sports Medicine and Musculoskeletal Tumor Interventions: A Patient-Centered Model. Semin Musculoskelet Radiol 2020; 24:290-309. [PMID: 32987427] [DOI: 10.1055/s-0040-1710065]
Abstract
The spectrum of effective musculoskeletal (MSK) interventions is broadening and rapidly evolving. Increasing demands incite a perpetual need to optimize services and interventions by maximizing the diagnostic and therapeutic yield, reducing exposure to ionizing radiation, increasing cost efficiency, as well as identifying and promoting effective procedures to excel in patient satisfaction ratings and outcomes. MSK interventions for the treatment of oncological conditions, and conditions related to sports injury can be performed with different imaging modalities; however, there is usually one optimal image guidance modality for each procedure and individual patient. We describe our patient-centered workflow as a model of care that incorporates state-of-the-art imaging techniques, up-to-date evidence, and value-based practices with the intent of optimizing procedural success and outcomes at a patient-specific level. This model contrasts interventionalist- and imaging modality-centered practices, where procedures are performed based on local preference and selective availability of imaging modality or interventionalists. We discuss rationales, benefits, and limitations of fluoroscopy, ultrasound, computed tomography, and magnetic resonance imaging procedure guidance for a broad range of image-guided MSK interventions to diagnose and treat sports and tumor-related conditions.
Affiliation(s)
- Danoob Dalili
- Russell H. Morgan Department of Radiology and Radiological Sciences, Johns Hopkins School of Medicine, Baltimore, Maryland; Nuffield Orthopaedic Centre, Oxford University Hospitals NHS Foundation Trust, Oxford, United Kingdom
- Amanda Isaac
- School of Biomedical Engineering & Imaging Sciences, King's College London, London, United Kingdom
- Ali Rashidi
- Russell H. Morgan Department of Radiology and Radiological Sciences, Johns Hopkins School of Medicine, Baltimore, Maryland
- Gunnar Åström
- Department of Immunology, Genetics and Pathology (Oncology) and Department of Surgical Sciences (Radiology), Uppsala University, Uppsala, Sweden
- Jan Fritz
- Russell H. Morgan Department of Radiology and Radiological Sciences, Johns Hopkins School of Medicine, Baltimore, Maryland; Department of Radiology, Division of Musculoskeletal Imaging, New York University Grossman School of Medicine, New York, New York