1. Tsumura R, Gao S, Tang Y, Zhang HK. Concentric-ring arrays for forward-viewing ultrasound imaging. J Med Imaging (Bellingham) 2022;9:065002. [PMID: 36444284] [PMCID: PMC9683378] [DOI: 10.1117/1.jmi.9.6.065002]
Abstract
Purpose: Current ultrasound (US)-guided needle insertions often demand considerable expertise because performing a task in three-dimensional space using two-dimensional images requires operators to cognitively maintain the spatial relationships between the US probe, the needle, and the lesion. This work presents forward-viewing US imaging with a ring-array configuration to enable needle interventions without requiring registration between tools and targets. Approach: The center-open ring-array configuration allows the needle to be inserted through the center of the visualized US image, providing simple and intuitive guidance. To establish the feasibility of the ring-array configuration, the design parameters affecting image quality, including the radius of the center hole and the numbers of ring layers and transducer elements, were investigated. Results: Experimental results showed successful visualization even with a hole in the transducer array, and target visibility improved as the number of ring layers and the number of transducer elements in each layer increased. Reducing the hole radius improved image quality in the shallow-depth region. Conclusions: Forward-viewing US imaging with a ring-array configuration has the potential to be a viable alternative to conventional US-guided needle insertion methods.
Affiliation(s)
- Ryosuke Tsumura
- Worcester Polytechnic Institute, Department of Biomedical Engineering, Worcester, Massachusetts, United States
- National Institute of Advanced Industrial Science and Technology, Health and Medical Research Institute, Tsukuba, Japan
- Shang Gao
- Worcester Polytechnic Institute, Department of Robotics Engineering, Worcester, Massachusetts, United States
- Yichuan Tang
- Worcester Polytechnic Institute, Department of Robotics Engineering, Worcester, Massachusetts, United States
- Haichong K. Zhang
- Worcester Polytechnic Institute, Department of Biomedical Engineering, Worcester, Massachusetts, United States
- Worcester Polytechnic Institute, Department of Robotics Engineering, Worcester, Massachusetts, United States
2. Augmented Reality in Supporting Healthcare and Nursing Independent Learning: Narrative Review. Comput Inform Nurs 2022;41:281-291. [PMID: 35470310] [DOI: 10.1097/cin.0000000000000910]
Abstract
New advances in technology have brought challenges and opportunities for education and instructional methods. Compared with traditional education, the increased use of technology-enhanced blended learning in healthcare and nursing education requires students to take more responsibility for their learning. The use of advanced technology has resulted in independent learning skills becoming increasingly important. Many studies have reported a positive correlation between independent learning and success rates in an e-learning environment. This paper focuses on the potential contribution of augmented reality, which superimposes layers of virtual content on real physical objects. The paper initially presents a narrative literature review to identify augmented reality's strengths and challenges in facilitating independent learning and highlights several potential approaches for utilizing augmented reality in nursing education. However, it also reveals a lack of studies integrating augmented reality and independent learning theories such as self-regulated learning. The paper then addresses this gap by proposing a new learning approach to support independent learning.
3. Headman ZC, Matson MC, Schneider RP, Potter JL, Loguda-Summers DL, Bhatia S, Kondrashova T. Developing Neuraxial and Regional Pain Procedural Skills Through Innovative 3-Dimensional Printing Technology. J Osteopath Med 2020;120:273-282. [DOI: 10.7556/jaoa.2020.044]
Abstract
Context
Various forms of simulation-based training, including training models, increase training opportunities and help assess performance of a task. However, commercial training models for lumbar puncture and epidural procedures are costly.
Objective
To assess medical students’ and residents’ perception of 3-dimensional (3D)-printed lumbar, cervical, and pelvic models for mastering joint injection techniques and to determine the utility of ultrasonography-guided needle procedure training.
Methods
Osteopathic medical students and residents used in-house 3D-printed gel joint models during an injection ultrasonography laboratory for mastering lumbar epidural, caudal epidural, sacroiliac, and facet joint injection techniques. After the laboratory, they answered a 17-item survey about their perception of the importance of the models in medical education and future practice. The survey also evaluated comfort levels with performing joint injections after using the models, overall satisfaction with the models, and likelihood of using models in the future.
Results
Thirty-six medical students and residents participated. Both students and residents agreed that 3D-printed models were easy to use, aided understanding of corresponding procedures, and increased comfort with performing joint injections (all P<.001). Most participants (35 [97.2%]) believed that the models were reasonable alternatives to commercial models. Over half felt capable of successfully performing cervical or pelvic (22 [61.1%]) and lumbar epidural (23 [63.9%]) injections. The majority of participants (34 [94.4%]) would like to use the models in the future for personal training purposes. Overall, 100% believed that the 3D-printed models were a useful tool for injection training.
Conclusions
Results suggest that 3D-printed models provided realistic training experience for injection procedures and seemed to allow participants to quickly master new injection techniques. These models offer a visual representation of human anatomy and could be a cost-saving alternative to commercial trainers.
4. Tang KS, Cheng DL, Mi E, Greenberg PB. Augmented reality in medical education: a systematic review. Can Med Educ J 2020;11:e81-e96. [PMID: 32215146] [PMCID: PMC7082471] [DOI: 10.36834/cmej.61705]
Abstract
INTRODUCTION The field of augmented reality (AR) is rapidly growing, with many new potential applications in medical education. This systematic review investigated the current state of augmented reality applications (ARAs) and developed an analytical model to guide future research in assessing ARAs as teaching tools in medical education. METHODS A literature search was conducted using PubMed, Embase, Web of Science, Cochrane Library, and Google Scholar. The review followed PRISMA guidelines and included publications from January 1, 2000 to June 18, 2018. Inclusion criteria were experimental studies, published in English, evaluating ARAs implemented in healthcare education. We evaluated study quality and determined whether studies assessed ARA validity using criteria established by the GRADE Working Group and Gallagher et al., respectively. These findings were used to formulate an analytical model to assess the readiness of ARAs for implementation in medical education. RESULTS We identified 100,807 articles in the initial literature search; 36 met the inclusion criteria for final review and were grouped into three categories: Surgery (23), Anatomy (9), and Other (4). The overall quality of the studies was poor, and no ARA was tested for all five stages of validity. Our analytical model weighs research quality, application content, outcomes, and feasibility to gauge an ARA's readiness for implementation. CONCLUSION While AR technology is growing at a rapid rate, the current quality and breadth of AR research in medical training are insufficient to recommend its adoption into educational curricula. We hope our analytical model will help standardize AR assessment methods and define the role of AR technology in medical education.
Affiliation(s)
- Kevin S. Tang
- The Program in Liberal Medical Education of Brown University, Rhode Island, USA
- The Warren Alpert Medical School of Brown University, Rhode Island, USA
- Division of Ophthalmology, Warren Alpert Medical School, Rhode Island, USA
- Section of Ophthalmology, Providence VA Medical Center, Rhode Island, USA
- Derrick L. Cheng
- The Program in Liberal Medical Education of Brown University, Rhode Island, USA
- The Warren Alpert Medical School of Brown University, Rhode Island, USA
- Lifespan Clinical Research Center, Rhode Island, USA
- Eric Mi
- The Program in Liberal Medical Education of Brown University, Rhode Island, USA
- Paul B. Greenberg
- Division of Ophthalmology, Warren Alpert Medical School, Rhode Island, USA
- Section of Ophthalmology, Providence VA Medical Center, Rhode Island, USA
5. Mashar M, Nanapragasam A, Haslam P. Interventional radiology training: where will technology take us? BJR Open 2019;1:20190002. [PMID: 33178937] [PMCID: PMC7592432] [DOI: 10.1259/bjro.20190002]
Abstract
Interventional radiology is a relatively young specialty undergoing a period of considerable growth. The benefits of a minimally invasive approach are clear: smaller incisions, less pain, and faster recovery times compared with surgical alternatives. Trainees need to acquire both the technical skills and the clinical acumen to deliver targeted treatment accurately and follow up patients safely after the procedure. Maintaining an efficient interventional radiology service while also allowing sufficient time for trainee education is a challenge. To compensate, novel technologies such as virtual reality (VR), augmented reality (AR), cadaveric simulation, and three-dimensional (3D) printing have been proposed as means of supplementing training. In this article, we outline the main features of these innovative strategies and discuss the evidence base behind them. Benefits of these techniques beyond pure clinical training include the standardization of educational cases, access to training at any time, and less risk to patients. The main disadvantage is the large financial outlay required. Therefore, before widespread uptake can be recommended, further research is needed to confirm the educational benefit of these novel techniques, both in themselves and in comparison with existing clinical-based education.
Affiliation(s)
- Meghavi Mashar
- Oxford University Hospitals NHS Foundation Trust, Oxford, United Kingdom
- Philip Haslam
- The Newcastle Hospitals NHS Foundation Trust, Newcastle-upon-Tyne, United Kingdom
6. Munzer BW, Khan MM, Shipman B, Mahajan P. Augmented Reality in Emergency Medicine: A Scoping Review. J Med Internet Res 2019;21:e12368. [PMID: 30994463] [PMCID: PMC6492064] [DOI: 10.2196/12368]
Abstract
BACKGROUND Augmented reality is increasingly being investigated for its applications to medical specialties as well as to medical training. Currently, there is little information about its applicability to training and care delivery in the context of emergency medicine. OBJECTIVE The objective of this article is to review the current literature on augmented reality applicable to emergency medicine and its training. METHODS Through a scoping review using the Scopus, MEDLINE, and Embase databases, we identified articles involving augmented reality that directly involved emergency medicine or that addressed an area of education or clinical care potentially applicable to emergency medicine. RESULTS A total of 24 articles were reviewed in detail and categorized into three groups: user-environment interface, telemedicine and prehospital care, and education and training. CONCLUSIONS Our analysis of the current literature across fields demonstrates that augmented reality is useful and feasible for clinical care delivery in patient care settings, operating rooms, and inpatient settings, and for the education and training of emergency care providers. Additionally, we found that the use of augmented reality for care delivery over distances is feasible, suggesting a role in telehealth. Our review of the literature in emergency medicine and other specialties suggests that further research into the uses of augmented reality will play a substantial role in changing how emergency medicine as a specialty delivers care and provides education and training.
Affiliation(s)
- Mohammad Mairaj Khan
- Department of Emergency Medicine, University of Michigan, Ann Arbor, MI, United States
- Barbara Shipman
- Medical School, University of Michigan, Ann Arbor, MI, United States
- Prashant Mahajan
- Department of Emergency Medicine, University of Michigan, Ann Arbor, MI, United States
7. Computer mediated reality technologies: A conceptual framework and survey of the state of the art in healthcare intervention systems. J Biomed Inform 2019;90:103102. [DOI: 10.1016/j.jbi.2019.103102]
8. Suthakorn J, Tanaiutchawoot N, Wiratkapan C, Ongwattanakul S. Breast biopsy navigation system with an assisted needle holder tool and 2D graphical user interface. Eur J Radiol Open 2018;5:93-101. [PMID: 30109245] [PMCID: PMC6090089] [DOI: 10.1016/j.ejro.2018.07.001]
Abstract
Objective This paper describes the development of a breast biopsy navigation system comprising an assisted needle holder tool for a coaxial needle and a graphical user interface, using an optical tracking device to localize the needle position relative to the ultrasound image, with the aim of improving performance, especially for new or inexperienced radiologists. Materials and methods The system consists of an assisted needle holder tool, which attaches to the 2D ultrasound transducer, and a graphical user interface (GUI) that shows the needle pathway, needle line, and warning signs. An optical tracking system tracks the needle motion and the ultrasound image and transforms all information into a common reference frame. The system was evaluated using a phantom made from gel candle. Nine experienced and eight inexperienced participants performed the breast biopsy intervention using three methods: the freehand method, the needle holder tool guidance alone, and the whole navigation guidance (GUI + assisted needle holder). Results The results demonstrate a success rate of over 90% for both the experienced and inexperienced groups when using either the assisted needle holder alone or the whole system, whereas the inexperienced group achieved a success rate of 57.5% with the freehand method. Using the assisted needle holder alone reduced the procedure time in the inexperienced group by 6 s compared with the freehand method. Conclusion The authors believe that this navigation system can be applied in a clinical setting and will benefit inexperienced radiologists who must perform clinical breast biopsy successfully.
Affiliation(s)
- Jackrit Suthakorn
- Center for Biomedical and Robotics Technology (BART LAB) Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, Salaya, Thailand
- Narucha Tanaiutchawoot
- Center for Biomedical and Robotics Technology (BART LAB) Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, Salaya, Thailand
- Cholatip Wiratkapan
- Breast Diagnostic Center, Division of Diagnostic Radiology, Department of Radiology, Faculty of Medicine, Ramathibodi Hospital, Mahidol University, Bangkok, Thailand
- Songpol Ongwattanakul
- Center for Biomedical and Robotics Technology (BART LAB) Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, Salaya, Thailand
9. Tuzer M, Yazıcı A, Türkay R, Boyman M, Acar B. Multi-ray medical ultrasound simulation without explicit speckle modelling. Int J Comput Assist Radiol Surg 2018;13:1009-1017. [PMID: 29728901] [DOI: 10.1007/s11548-018-1760-4]
Abstract
PURPOSE To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. METHODS The proposed method uses a novel multi-ray image formation approach with a virtual phased-array transducer probe. A domain model is built from the input MR images. Multiple virtual acoustic rays emanate from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusing the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to the transducers are taken into account. A preliminary implementation using GPUs is presented. RESULTS Preliminary results show that the multi-ray approach is capable of automatically generating viewpoint-dependent realistic US images with an inherent Rician-distributed speckle pattern. The proposed simulator can reproduce shadowing artefacts and demonstrates a frequency dependence apt for practical training purposes. We also present preliminary results towards the use of the method for real-time simulation. CONCLUSIONS The proposed method offers a low-cost, near-real-time, wave-like simulation of realistic US images from input MR data. It can be further improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or the manual embedding of virtual pathologies for training purposes.
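The phase-delayed fusion of reflected energy across transducer elements, which this abstract credits with producing speckle without an explicit speckle model, can be illustrated with a minimal delay-and-sum sketch. This is not the authors' code; the function name and the unit-amplitude point-scatterer model are assumptions made purely for illustration.

```python
import numpy as np

def delay_and_sum(point, element_xs, wavelength):
    """Coherently sum unit-amplitude echoes scattered from `point`
    back to each transducer element located at (x, 0).

    point: (x, z) scatterer position; element_xs: 1-D array of element
    x positions; wavelength: acoustic wavelength (same length units).
    Returns the magnitude of the coherent (phase-delayed) sum."""
    x, z = point
    distances = np.hypot(element_xs - x, z)                # one-way path length
    phases = 2.0 * np.pi * (2.0 * distances) / wavelength  # round-trip phase delay
    return abs(np.sum(np.exp(1j * phases)))
```

Because round-trip phases vary with scatterer position, coherent sums over many scatterers interfere constructively or destructively, which is what yields a speckle-like pattern without modelling speckle explicitly.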
Affiliation(s)
- Mert Tuzer
- VAVlab, EE Department, Boğaziçi University, İstanbul, Turkey
- Burak Acar
- VAVlab, EE Department, Boğaziçi University, İstanbul, Turkey.
10. Lehmann T, Sloboda R, Usmani N, Tavakoli M. Human–Machine Collaboration Modalities for Semi-Automated Needle Insertion Into Soft Tissue. IEEE Robot Autom Lett 2018. [DOI: 10.1109/lra.2017.2768123]
11. Yovanoff M, Pepley D, Mirkin K, Moore J, Han D, Miller S. Improving Medical Education: Simulating Changes in Patient Anatomy Using Dynamic Haptic Feedback. Proc Hum Factors Ergon Soc Annu Meet 2016;60:603-607. [PMID: 29151778] [PMCID: PMC5693425] [DOI: 10.1177/1541931213601138]
Abstract
Virtual simulation is an emerging field in medical education. Research suggests that simulation reduces complication rates and improves learning gains for medical residents. One benefit of simulators is that they allow for more realistic and dynamic patient anatomies. While potentially useful throughout medical education, few studies have explored the impact of dynamic haptic simulators on medical training. In light of this research void, this study examined how a Dynamic Haptic Robotic Trainer (DHRT) affects medical students' self-efficacy and skill gains compared with traditional simulators developed to train students in internal jugular central venous catheter (IJ CVC) placement. The study was conducted with 18 third-year medical students with no prior CVC insertion experience, who underwent a pre-test, simulator training (manikin, robotic, or mixed), and a post-test. The results revealed the DHRT to be a useful method for training CVC skills and support further research on dynamic haptic trainers in medical education.
Affiliation(s)
- Mary Yovanoff
- Industrial Engineering Penn State, University Park, PA
- David Pepley
- Mechanical and Nuclear Engineering, Penn State, University Park, PA
- Jason Moore
- Mechanical and Nuclear Engineering, Penn State, University Park, PA
- David Han
- Penn State Hershey Medical Center, Hershey, PA
- Scarlett Miller
- Engineering Design and Industrial Engineering, Penn State, University Park, PA
12. Chetlen AL, Mendiratta-Lala M, Probyn L, Auffermann WF, DeBenedectis CM, Marko J, Pua BB, Sato TS, Little BP, Dell CM, Sarkany D, Gettle LM. Conventional Medical Education and the History of Simulation in Radiology. Acad Radiol 2015;22:1252-67. [PMID: 26276167] [DOI: 10.1016/j.acra.2015.07.003]
Abstract
Simulation is a promising method for improving clinician performance, enhancing team training, increasing patient safety, and preventing errors. Training scenarios that enrich medical student and resident education and that apply toward competency assessment, recertification, and credentialing are important applications of simulation in radiology. This review describes simulation training for procedural skills, interpretive and noninterpretive skills, team-based training and crisis management, and professionalism and communication skills, as well as hybrid and in situ applications of simulation training. A brief overview of current simulation equipment and software and the barriers and strategies for implementation is provided. Finally, methods of measuring competency and assessment are described so that the interested reader can successfully implement simulation training into their practice.
13. Maas S, Ingler M, Overhoff HM. Using smart glasses for ultrasound diagnostics. Curr Dir Biomed Eng 2015. [DOI: 10.1515/cdbme-2015-0049]
Abstract
Ultrasound has been established as a diagnostic tool in a wide range of applications. Especially for beginners, aligning sectional images with the patient's spatial anatomy can be cumbersome; a direct view onto the patient's anatomy while viewing ultrasound images may make the examination more ergonomic. To address these issues, an affordable augmented reality system using smart glasses was created that displays a (virtual) ultrasound image beneath a (real) ultrasound transducer.
Affiliation(s)
- Stefan Maas
- Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
- Marvin Ingler
- Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
- Heinrich Martin Overhoff
- Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
14. Comparison of the Development of Performance Skills in Ultrasound-Guided Regional Anesthesia Simulations With Different Phantom Models. Simul Healthc 2013;8:368-75. [DOI: 10.1097/sih.0b013e318299dae2]
15.
Abstract
SUMMARY STATEMENT Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.
16. Sutherland C, Hashtrudi-Zaad K, Sellens R, Abolmaesumi P, Mousavi P. An Augmented Reality Haptic Training Simulator for Spinal Needle Procedures. IEEE Trans Biomed Eng 2013;60:3009-18. [DOI: 10.1109/tbme.2012.2236091]
17. Interventional radiology virtual simulator for liver biopsy. Int J Comput Assist Radiol Surg 2013;9:255-67. [PMID: 23881251] [DOI: 10.1007/s11548-013-0929-0]
Abstract
PURPOSE Training in Interventional Radiology currently uses the apprenticeship model, where clinical and technical skills of invasive procedures are learnt during practice in patients. This apprenticeship training method is increasingly limited by regulatory restrictions on working hours, concerns over patient risk through trainees' inexperience and the variable exposure to case mix and emergencies during training. To address this, we have developed a computer-based simulation of visceral needle puncture procedures. METHODS A real-time framework has been built that includes: segmentation, physically based modelling, haptics rendering, pseudo-ultrasound generation and the concept of a physical mannequin. It is the result of a close collaboration between different universities, involving computer scientists, clinicians, clinical engineers and occupational psychologists. RESULTS The technical implementation of the framework is a robust and real-time simulation environment combining a physical platform and an immersive computerized virtual environment. The face, content and construct validation have been previously assessed, showing the reliability and effectiveness of this framework, as well as its potential for teaching visceral needle puncture. CONCLUSION A simulator for ultrasound-guided liver biopsy has been developed. It includes functionalities and metrics extracted from cognitive task analysis. This framework can be useful during training, particularly given the known difficulties in gaining significant practice of core skills in patients.
18. Open-source surface mesh-based ultrasound-guided spinal intervention simulator. Int J Comput Assist Radiol Surg 2013;8:1043-51. [PMID: 23729333] [DOI: 10.1007/s11548-013-0901-z]
Abstract
PURPOSE Ultrasound is prevalent in image-guided therapy as a safe, inexpensive, and widely available imaging modality. However, extensive training in interpreting ultrasound images is essential for successful procedures. An open-source ultrasound image simulator was developed to facilitate the training of ultrasound-guided spinal intervention procedures, thereby eliminating the need for an ultrasound machine in the phantom-based training environment. METHODS Anatomical structures and surgical tools are converted to surface meshes for data compression. Anatomical data are converted from segmented volumetric images, while the geometry of surgical tools is available as a surface mesh. The poses of the objects are either constant or come from a pose-tracking device. Intersection points between the surface models and the ultrasound scan lines are determined with a binary space partitioning tree. The scan lines are divided into segments and filled with gray values determined by an intensity calculation accounting for material properties, reflection, and attenuation parameters defined in a configuration file. The scan lines are finally converted to a regular brightness-mode ultrasound image. RESULTS The simulator was tested in a tracked ultrasound imaging system, with a mock transducer tracked by an Ascension trakSTAR electromagnetic tracker, on a spine phantom. A mesh model of the spine was created from CT data. The simulated ultrasound images were generated at 50 frames per second, at a resolution of [Formula: see text] pixels with 256 scan lines per frame, on a PC with a 3.4 GHz processor. A human subject trial was conducted to compare the learning performance of novice trainees, with real and simulated ultrasound, in the localization of facet joints of a spine phantom. With 22 participants split into two equal groups, each localizing 6 facet joints, there was no statistical difference in the performance of the two groups, indicating that simulated ultrasound could indeed replace real ultrasound in phantom-based ultrasonography training for spinal interventions. CONCLUSIONS The ultrasound simulator was implemented and integrated into the open-source Public Library for Ultrasound (PLUS) toolkit.
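The scan-line intensity calculation described in the methods above (reflection at interfaces from material properties, plus attenuation along each segment) can be sketched as follows. This is a hedged illustration, not code from the PLUS toolkit; the energy-based reflection formula, the display gain of 50, and all function names are assumptions for the sake of the example.

```python
def reflection_coefficient(z1, z2):
    """Fraction of incident acoustic energy reflected at an interface
    between media with acoustic impedances z1 and z2."""
    return ((z2 - z1) / (z2 + z1)) ** 2

def shade_scan_line(segments):
    """Fill one scan line with gray values from a simple energy model.

    segments: list of (length_cm, impedance, attenuation_dB_per_cm),
    ordered from the transducer outward. Returns one 0-255 brightness
    value per segment (the echo at its entry interface)."""
    brightness = []
    energy = 1.0                      # transmitted energy still travelling
    prev_impedance = None
    for length, impedance, attenuation in segments:
        if prev_impedance is None:
            echo = 0.0                # no interface before the first segment
        else:
            r = reflection_coefficient(prev_impedance, impedance)
            echo = energy * r         # energy bounced back at the interface
            energy *= (1.0 - r)       # the remainder continues forward
        # two-way attenuation across this segment, dB -> linear factor
        energy *= 10.0 ** (-attenuation * length * 2.0 / 10.0)
        brightness.append(min(255, int(round(255.0 * echo * 50.0))))
        prev_impedance = impedance
    return brightness
```

A strong impedance mismatch (e.g. soft tissue to bone) saturates the gray value and leaves little energy for deeper segments, which is how a model of this kind reproduces the bright bone surface and acoustic shadow seen in real B-mode images.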
|
19
|
Goksel O, Sapchuk K, Morris WJ, Salcudean SE. Prostate Brachytherapy Training With Simulated Ultrasound and Fluoroscopy Images. IEEE Trans Biomed Eng 2013; 60:1002-12. [DOI: 10.1109/tbme.2012.2222642] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
|
20
|
Linte CA, Davenport KP, Cleary K, Peters C, Vosburgh KG, Navab N, Edwards PE, Jannin P, Peters TM, Holmes DR, Robb RA. On mixed reality environments for minimally invasive therapy guidance: systems architecture, successes and challenges in their implementation from laboratory to clinic. Comput Med Imaging Graph 2013; 37:83-97. [PMID: 23632059 PMCID: PMC3796657 DOI: 10.1016/j.compmedimag.2012.12.002] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2012] [Revised: 11/16/2012] [Accepted: 12/24/2012] [Indexed: 11/21/2022]
Abstract
Mixed reality environments for medical applications have been explored and developed over the past three decades in an effort to enhance the clinician's view of anatomy and facilitate the performance of minimally invasive procedures. These environments must faithfully represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical instrument tracking, and display technology into a common framework centered around and registered to the patient. However, in spite of their reported benefits, few mixed reality environments have been successfully translated into clinical use. Several challenges that contribute to the difficulty in integrating such environments into clinical practice are presented here and discussed in terms of both technical and clinical limitations. This article should raise awareness among both developers and end-users toward facilitating a greater application of such environments in the surgical practice of the future.
|
21
|
Ultrasound-guided facet joint injection training using Perk Tutor. Int J Comput Assist Radiol Surg 2013; 8:831-6. [DOI: 10.1007/s11548-012-0811-5] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2012] [Accepted: 12/29/2012] [Indexed: 10/27/2022]
|
22
|
Chan WY, Qin J, Chui YP, Heng PA. A Serious Game for Learning Ultrasound-Guided Needle Placement Skills. IEEE Trans Inf Technol Biomed 2012; 16:1032-42. [DOI: 10.1109/titb.2012.2204406] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
23
|
Lee S, Lee J, Lee A, Park N, Lee S, Song S, Seo A, Lee H, Kim JI, Eom K. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine. Vet J 2012; 196:197-202. [PMID: 23103217 DOI: 10.1016/j.tvjl.2012.09.015] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2012] [Revised: 07/08/2012] [Accepted: 09/19/2012] [Indexed: 10/27/2022]
Abstract
Augmented reality (AR) is a technology that enables users to see the real world with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were acquired using a 64-channel multidetector scanner. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group), and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire administered after using the simulator. The group trained using the AR simulator was more proficient at the IV injection technique in live dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education.
Affiliation(s)
- S Lee
- Department of Veterinary Radiology and Diagnostic Imaging, College of Veterinary Medicine, Konkuk University, Seoul 143-701, Republic of Korea
|
24
|
Sutherland C, Hashtrudi-Zaad K, Abolmaesumi P, Mousavi P. Towards an augmented ultrasound guided spinal needle insertion system. Annu Int Conf IEEE Eng Med Biol Soc 2011:3459-62. [PMID: 22255084 DOI: 10.1109/iembs.2011.6090935] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
We propose a haptic-based simulator for ultrasound-guided percutaneous spinal interventions. The system is composed of a haptic device to provide force feedback, a camera system to display video and augmented computed tomography (CT) overlay, a finite element model for tissue deformation and US simulation from a CT volume. The proposed system is able to run a large finite element model at the required haptic rate for smooth force feedback, and uses haptic device position measurements for a steady response. The simulated US images from CT closely resemble the vertebrae images captured in vivo. This is the first report of a system that provides a training environment to couple haptic feedback with a tracked mannequin, and a CT volume overlaid on a visual feed of the mannequin.
|
25
|
Nakata N, Suzuki N, Hattori A, Hirai N, Miyamoto Y, Fukuda K. Informatics in radiology: Intuitive user interface for 3D image manipulation using augmented reality and a smartphone as a remote control. Radiographics 2012; 32:E169-74. [PMID: 22556316 DOI: 10.1148/rg.324115086] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Although widely used as a pointing device on personal computers (PCs), the mouse was originally designed for control of two-dimensional (2D) cursor movement and is not suited to complex three-dimensional (3D) image manipulation. Augmented reality (AR) is a field of computer science that involves combining the physical world and an interactive 3D virtual world; it represents a new 3D user interface (UI) paradigm. A system for 3D and four-dimensional (4D) image manipulation has been developed that uses optical tracking AR integrated with a smartphone remote control. The smartphone is placed in a hard case (jacket) with a 2D printed fiducial marker for AR on the back. It is connected to a conventional PC with an embedded Web camera by means of WiFi. The touch screen UI of the smartphone is then used as a remote control for 3D and 4D image manipulation. Using this system, the radiologist can easily manipulate 3D and 4D images from computed tomography and magnetic resonance imaging in an AR environment with high-quality image resolution. Pilot assessment of this system suggests that radiologists will be able to manipulate 3D and 4D images in the reading room in the near future. Supplemental material available at http://radiographics.rsna.org/lookup/suppl/doi:10.1148/rg.324115086/-/DC1.
Affiliation(s)
- Norio Nakata
- Department of Radiology and Institute for High Dimensional Medical Imaging, Research Center for Medical Sciences, Jikei University School of Medicine, 3-25-8 Nishi-Shinbashi, Minato-ku, Tokyo 1058461, Japan.
|
26
|
Gjerald SU, Brekken R, Hergum T, D'hooge J. Real-time ultrasound simulation using the GPU. IEEE Trans Ultrason Ferroelectr Freq Control 2012; 59:885-892. [PMID: 22622973 DOI: 10.1109/tuffc.2012.2273] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/01/2023]
Abstract
Ultrasound simulators can be used for training ultrasound image acquisition and interpretation. In such simulators, synthetic ultrasound images must be generated in real time. Anatomy can be modeled by computed tomography (CT). Shadows can be calculated by combining reflection coefficients with depth-dependent exponential attenuation. To include speckle, a pre-calculated texture map is typically added. Dynamic objects must be simulated separately. We propose to increase the speckle realism and allow for dynamic objects by using a physical model of the underlying scattering process. The model is based on convolution of the point spread function (PSF) of the ultrasound scanner with a scatterer distribution. The challenge is that the typical field of view contains millions of scatterers, which must be selected by a virtual probe from an even larger body of scatterers. The main idea of this paper is to select and sample scatterers in parallel on the graphics processing unit (GPU). The method was used to image a cyst phantom and a movable needle. Speckle images were produced in real time (more than 10 frames per second) on a standard GPU. The ultrasound images were visually similar to images calculated by a reference method.
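The PSF-convolution speckle model described above can be illustrated in one dimension. The NumPy sketch below is our own simplified CPU version, not the paper's parallel GPU implementation: it convolves a line of scatterer amplitudes with a Gaussian-modulated cosine PSF and envelope-detects the resulting RF signal.

```python
import numpy as np

def simulate_speckle_line(scatterer_amps, f0=5e6, fs=50e6, sigma=0.2e-6):
    """Simulate one RF scan line as PSF * scatterers, then envelope-detect.

    scatterer_amps: 1-D array of scatterer reflectivities along the beam
    f0: transducer center frequency (Hz), fs: sampling rate (Hz)
    sigma: Gaussian envelope width of the axial PSF (s)
    """
    t = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
    psf = np.exp(-t**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f0 * t)
    rf = np.convolve(scatterer_amps, psf, mode="same")
    # crude envelope detection: magnitude of the analytic signal via FFT
    spec = np.fft.fft(rf)
    spec[len(spec) // 2 + 1:] = 0          # zero the negative frequencies
    return 2 * np.abs(np.fft.ifft(spec))   # non-negative envelope
```

With Gaussian-distributed scatterer amplitudes, the envelope exhibits the characteristic Rayleigh-like speckle statistics that a pre-calculated texture map cannot reproduce for moving objects.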
Affiliation(s)
- Sjur Urdson Gjerald
- Medical Imaging Lab and Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway.
|
27
|
Gjerald SU, Brekken R, Bø LE, Hergum T, Nagelhus Hernes TA. Interactive development of a CT-based tissue model for ultrasound simulation. Comput Biol Med 2012; 42:607-13. [PMID: 22424668 DOI: 10.1016/j.compbiomed.2012.02.006] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/07/2011] [Revised: 02/10/2012] [Accepted: 02/23/2012] [Indexed: 10/28/2022]
Abstract
The objective of this study was to make an interactive method for development of a tissue model, based on anatomical information in computed tomography (CT) images, for use in an ultrasound simulator for training or surgical pre-planning. The method consisted of (1) comparison of true ultrasound B-mode images with corresponding ultrasound-like images, and (2) modification of tissue properties to decrease the difference between these images. Ultrasound-like images that reproduced many, but not all the properties of corresponding true ultrasound images were generated. The tissue model could be used for real-time simulation of ultrasound-like B-mode images on a moderately priced computer.
Affiliation(s)
- Sjur Urdson Gjerald
- Norwegian University of Science and Technology (NTNU), Department of Circulation and Medical Imaging, Trondheim, Norway.
|
28
|
Zhu M, Salcudean SE. Real-time image-based B-mode ultrasound image simulation of needles using tensor-product interpolation. IEEE Trans Med Imaging 2011; 30:1391-1400. [PMID: 21356613 DOI: 10.1109/tmi.2011.2121091] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/30/2023]
Abstract
In this paper, we propose an interpolation-based method for simulating rigid needles in B-mode ultrasound images in real time. We parameterize the needle B-mode image as a function of needle position and orientation. We collect needle images under various spatial configurations in a water-tank using a needle guidance robot. Then we use multidimensional tensor-product interpolation to simulate images of needles with arbitrary poses and positions using collected images. After further processing, the interpolated needle and seed images are superimposed on top of phantom or tissue image backgrounds. The similarity between the simulated and the real images is measured using a correlation metric. A comparison is also performed with in vivo images obtained during prostate brachytherapy. Our results, carried out for both the convex (transverse plane) and linear (sagittal/para-sagittal plane) arrays of a trans-rectal transducer indicate that our interpolation method produces good results while requiring modest computing resources. The needle simulation method we present can be extended to the simulation of ultrasound images of other wire-like objects. In particular, we have shown that the proposed approach can be used to simulate brachytherapy seeds.
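The tensor-product interpolation at the heart of this approach can be sketched for a two-parameter pose grid; the actual system parameterizes over more pose dimensions, so the function name, grid layout, and two-parameter restriction here are illustrative assumptions:

```python
import numpy as np

def interp_needle_image(images, angles, depths, angle, depth):
    """Bilinear (2-D tensor-product) interpolation of pre-acquired needle
    images on a regular (angle, depth) pose grid.

    images: array of shape (n_angles, n_depths, H, W)
    angles, depths: sorted 1-D arrays of grid coordinates
    """
    ia = np.clip(np.searchsorted(angles, angle) - 1, 0, len(angles) - 2)
    idp = np.clip(np.searchsorted(depths, depth) - 1, 0, len(depths) - 2)
    ta = (angle - angles[ia]) / (angles[ia + 1] - angles[ia])
    td = (depth - depths[idp]) / (depths[idp + 1] - depths[idp])
    # tensor-product weights: outer product of the 1-D linear weights
    return ((1 - ta) * (1 - td) * images[ia, idp]
            + ta * (1 - td) * images[ia + 1, idp]
            + (1 - ta) * td * images[ia, idp + 1]
            + ta * td * images[ia + 1, idp + 1])
```

Each additional pose parameter adds one more linear-weight factor, so the scheme extends directly to the higher-dimensional grids of collected water-tank images.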
Affiliation(s)
- Mengchen Zhu
- Department of Electrical and Computer Engineering,University of British Columbia, Vancouver, BC, Canada.
|
29
|
Abstract
In this paper, we propose an interpolation-based method for simulating needle images in B-mode ultrasound. We parametrize the needle image as a function of needle position and orientation. We collect needle images under various spatial configurations in a water-tank using a guidance robot. Then we use multi-dimensional tensor-product interpolation to simulate images of needles with arbitrary poses and positions using the collected images. Interpolated needle images are superimposed on top of phantom image backgrounds. The similarity between the simulated and the real images is measured using a correlation metric. A comparison with in-vivo images is also performed. The simulation procedure is demonstrated using transverse needle images and extended to sagittal needle images and brachytherapy seed images. The proposed method could be used in clinical procedure training simulators.
|
30
|
Abstract
Debate on the existence of innate skills has all but evaporated in the light of evidence that it is only the hours spent in deliberate practice that correlate with even the most elite levels of expertise. A range of simple to advanced technologies stands to address some of the many challenges to effective training of 21st century, procedural medicine. Simulation could train and assess behaviours remotely from patients, in complete safety, reducing the risks of inexperienced trainees learning critical tasks in patients while contributing to certification and revalidation. Understanding the strengths and limitations of these devices, determining and improving their effectiveness and identifying their roles, as well as those of individuals and teams, represents a cornerstone of successful adoption into the interventional radiology curriculum. This requires a simulation strategy that includes standards for simulator documentation.
Affiliation(s)
- D Gould
- Department of Radiology, Royal Liverpool University, Liverpool L7 8XP, UK.
|
31
|
Bø LE, Gjerald SU, Brekken R, Tangen GA, Hernes TAN. Efficiency of ultrasound training simulators: method for assessing image realism. Minim Invasive Ther Allied Technol 2010; 19:69-74. [PMID: 20337541 DOI: 10.3109/13645701003642826] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
Although ultrasound has become an important imaging modality within several medical professions, the benefit of ultrasound depends to some degree on the skills of the person operating the probe and interpreting the image. For some applications, the possibility to educate operators in a clinical setting is limited, and the use of training simulators is considered an alternative approach for learning basic skills. To ensure the quality of simulator-based training, it is important to produce simulated ultrasound images that resemble true images to a sufficient degree. This article describes a method that allows corresponding true and simulated ultrasound images to be generated and displayed side by side in real time, thus facilitating an interactive evaluation of ultrasound simulators in terms of image resemblance, real-time characteristics and man-machine interaction. The proposed method could be used to study the realism of ultrasound simulators and how this realism affects the quality of training, as well as being a valuable tool in the development of simulation algorithms.
Affiliation(s)
- Lars Eirik Bø
- SINTEF Technology and Society, Department of Medical Technology, Trondheim, Norway.
|
32
|
Mendiratta-Lala M, Williams T, de Quadros N, Bonnett J, Mendiratta V. The use of a simulation center to improve resident proficiency in performing ultrasound-guided procedures. Acad Radiol 2010; 17:535-40. [PMID: 20097583 DOI: 10.1016/j.acra.2009.11.010] [Citation(s) in RCA: 51] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2009] [Revised: 11/12/2009] [Accepted: 11/16/2009] [Indexed: 01/22/2023]
Abstract
RATIONALE AND OBJECTIVES With advancements in technology and the push for health care reform and reduced costs, minimally invasive procedures, such as those that are ultrasound-guided, have become an essential part of radiology and are used in many of its divisions. By incorporating standardized training methodologies in a risk-free environment, through utilization of a simulation center with phantom training, we hope to improve proficiency and confidence in procedural performance. MATERIALS AND METHODS Twenty-nine radiology residents from four levels of training were enrolled in this prospective study. The residents were given written, video, and live interactive training on the basics of ultrasound-guided procedures in our simulation center on a phantom mannequin. All of the teaching materials were created by residents and staff radiologists at the institution. RESULTS Residents demonstrated statistically significant improvement (P < .05) between their pre- and posttest scores on both the written and practical examinations. They also showed a trend toward improved dexterity in the technical aspects of ultrasound-guided procedures (P = .07) after training. On the survey questionnaire, residents confirmed improved knowledge level, technical ability, and confidence levels pertaining to ultrasound-guided procedures. CONCLUSIONS The use of controlled simulation-based training can be an invaluable tool to improve the knowledge level, dexterity, and confidence of residents performing ultrasound-guided procedures. Additionally, a simulation model allows standardization of education.
|
33
|
Liu Y, Glass NL, Power RW. Technical communication: new teaching model for practicing ultrasound-guided regional anesthesia techniques: no perishable food products! Anesth Analg 2010; 110:1233-5. [PMID: 20142350 DOI: 10.1213/ane.0b013e3181cc558b] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
BACKGROUND There is a pronounced learning curve for the technique of ultrasound-guided regional anesthesia. Practicing with a simulator model has been shown to speed the acquisition of these skills for various ultrasound-guided procedures. However, commercial models for ultrasound-guided regional anesthesia may be too costly or not readily available. Models using turkey breasts or tofu blocks have the disadvantage of containing perishable food products that can be a source for infection. We describe an alternative inexpensive model that is made from nonperishable components readily available in the operating room. METHODS The materials required include 1 clean used 500-mL bag of IV fluids, a bottle of Premisorb (TYCO Healthcare Group, Mansfield, MA), and a piece of foam material approximately 0.3 cm in diameter and 5 cm in length trimmed from operating room foam pads. After filling the IV bag with tap water and inserting the foam into the IV bag from the outlet port of the IV bag, one-third of a bottle of Premisorb (approximately 15 g) is poured into the IV bag. The outlet port of the bag is then sealed by taping the rubber stopper that originally came with the bag. RESULTS Premisorb, a solidifying agent frequently used to absorb irrigating fluids or blood in operating room suction canisters, produces a gel-like material in the IV bag. The foam inserted into the bag creates a relatively hyperechoic target. This gel-like substance in the bag will seal the holes created after multiple practice needle insertions, resulting in minimal leakage. The semitransparent nature of the gel allows the trainee to visualize the target directly and on the ultrasound screen. CONCLUSION The model we describe is inexpensive and easy to make from materials readily available in the operating room with the advantages of being nonperishable, easy to carry, and reusable.
Affiliation(s)
- Yang Liu
- Department of Pediatric Anesthesiology, Texas Children's Hospital, Houston, TX 77030, USA.
|
34
|
Linte CA, White J, Eagleson R, Guiraudon GM, Peters TM. Virtual and Augmented Medical Imaging Environments: Enabling Technology for Minimally Invasive Cardiac Interventional Guidance. IEEE Rev Biomed Eng 2010; 3:25-47. [DOI: 10.1109/rbme.2010.2082522] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
|
35
|
|
36
|
Goksel O, Salcudean SE. B-mode ultrasound image simulation in deformable 3-D medium. IEEE Trans Med Imaging 2009; 28:1657-1669. [PMID: 19278928 DOI: 10.1109/tmi.2009.2016561] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
This paper presents an algorithm for fast image synthesis inside deformed volumes. Given the node displacements of a mesh and a reference 3-D image dataset of a predeformed volume, the method first maps the image pixels that need to be synthesized from the deformed configuration to the nominal predeformed configuration, where the pixel intensities are obtained easily through interpolation in the regular-grid structure of the reference voxel volume. This mapping requires the identification of the mesh element enclosing each pixel for every image frame. To accelerate this point location operation, a fast method of projecting the deformed mesh on image pixels is introduced in this paper. The method presented was implemented for ultrasound B-mode image simulation of a synthetic tissue phantom. The phantom deformation as a result of ultrasound probe motion was modeled using the finite element method. Experimental images of the phantom under deformation were then compared with the corresponding synthesized images using sum of squared differences and mutual information metrics. Both this quantitative comparison and a qualitative assessment show that realistic images can be synthesized using the proposed technique. An ultrasound examination system was also implemented to demonstrate that real-time image synthesis with the proposed technique can be successfully integrated into a haptic simulation.
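The deformed-to-nominal pixel mapping at the core of this algorithm can be sketched in 2-D with triangular elements. The paper works with 3-D mesh elements and a fast projection scheme for point location; this barycentric version, with our own function names, is a simplified illustration of the mapping step only:

```python
import numpy as np

def barycentric(p, tri):
    """Barycentric coordinates of 2-D point p in triangle tri (3x2 array)."""
    a, b, c = tri
    m = np.column_stack([b - a, c - a])   # edge vectors as matrix columns
    l1, l2 = np.linalg.solve(m, p - a)    # solve p - a = l1*(b-a) + l2*(c-a)
    return np.array([1 - l1 - l2, l1, l2])

def map_to_nominal(p_deformed, tri_deformed, tri_nominal):
    """Map a pixel in the deformed configuration back to the predeformed
    configuration, where the reference image can be sampled on its
    regular grid. Uses the enclosing element's barycentric weights."""
    w = barycentric(p_deformed, tri_deformed)
    return w @ tri_nominal                # same weights, nominal node positions
```

Because barycentric weights are invariant under the element's deformation, applying them to the nominal node positions recovers the predeformed location, where pixel intensities are then obtained by interpolation in the reference voxel volume.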
Affiliation(s)
- Orcun Goksel
- Department of Electrical and Computer Engineering, University of British Columbia, V6T1Z4 Vancouver, BC, Canada.
|
37
|
Ni D, Chan WY, Qin J, Qu Y, Chui YP, Ho SSM, Heng PA. An ultrasound-guided organ biopsy simulation with 6DOF haptic feedback. Med Image Comput Comput Assist Interv 2008; 11:551-9. [PMID: 18982648 DOI: 10.1007/978-3-540-85990-1_66] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/09/2023]
Abstract
Ultrasound-guided biopsy is one of the most fundamental, but difficult, skills to acquire in interventional radiology. Intensive training, especially in needle insertion, is required for trainee radiologists to perform safe procedures. In this paper, we propose a virtual reality simulation system to facilitate the training of radiologists and physicians in this procedure. Key issues addressed include 3D anatomical model reconstruction, data fusion of multiple ultrasound volumes and computed tomography (CT), realistic rendering, interactive navigation, and haptic feedback in six degrees of freedom (DOF). Simulated ultrasound imagery based on real ultrasound data is presented to users in real time while they perform needle placement into a virtual anatomical model. Our system delivers a realistic haptic feel for trainees throughout the simulated needle insertion procedure, permitting repeated practice with no danger to patients.
Affiliation(s)
- Dong Ni
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China
|