1
Tsumura R, Gao S, Tang Y, Zhang HK. Concentric-ring arrays for forward-viewing ultrasound imaging. J Med Imaging (Bellingham) 2022; 9:065002. [PMID: 36444284] [PMCID: PMC9683378] [DOI: 10.1117/1.jmi.9.6.065002]
Abstract
Purpose Current ultrasound (US)-image-guided needle insertions often demand considerable expertise from clinicians because performing tasks in three-dimensional space using two-dimensional images requires operators to cognitively maintain the spatial relationships between the US probe, the needle, and the lesion. This work presents forward-viewing US imaging with a ring array configuration to enable needle interventions without requiring registration between tools and targets. Approach The center-open ring array configuration allows the needle to be inserted through the center of the visualized US image, providing simple and intuitive guidance. To establish the feasibility of the ring array configuration, the design parameters affecting image quality, including the radius of the center hole and the numbers of ring layers and transducer elements, were investigated. Results Experimental results showed successful visualization even with a hole in the transducer array, and target visibility improved with increasing numbers of ring layers and of transducer elements in each ring layer. Reducing the hole radius improved image quality in the shallow-depth region. Conclusions Forward-viewing US imaging with a ring array configuration has the potential to be a viable alternative to conventional US-image-guided needle insertion methods.
Affiliation(s)
- Ryosuke Tsumura
- Worcester Polytechnic Institute, Department of Biomedical Engineering, Worcester, Massachusetts, United States
- National Institute of Advanced Industrial Science and Technology, Health and Medical Research Institute, Tsukuba, Japan
- Shang Gao
- Worcester Polytechnic Institute, Department of Robotics Engineering, Worcester, Massachusetts, United States
- Yichuan Tang
- Worcester Polytechnic Institute, Department of Robotics Engineering, Worcester, Massachusetts, United States
- Haichong K. Zhang
- Worcester Polytechnic Institute, Department of Biomedical Engineering, Worcester, Massachusetts, United States
- Worcester Polytechnic Institute, Department of Robotics Engineering, Worcester, Massachusetts, United States
2
Riabtsev M, Petuya V, Urízar M, Altuzarra O. Design and Testing of Two Haptic Devices Based on Reconfigurable 2R Joints. Applied Sciences 2022; 12:339. [DOI: 10.3390/app12010339]
Abstract
This paper presents the design and testing of two haptic devices based on reconfigurable 2R joints: an active 2R spherical-mechanism-based joint and a differential-gear-based joint. Building on our previous works, in which the design and kinematic analysis of both reconfigurable joints were developed, the experimental setup and the tasks intended to test reconfigurability, precision, the force feedback system, and general performance are presented herein. Two control modes for haptic device operation are proposed and studied. The statistical analysis tools and the principles for their selection are described. The mechanical design of the two experimental setups and their main elements are considered in detail. The Robot Operating System nodes and topics used in the software component of the experimental setup are presented and explained. The experimental testing was carried out with a number of participants, and the corresponding results were analyzed with the selected statistical tools. A detailed interpretation and discussion of the results is provided.
3
Tai Y, Gao B, Li Q, Yu Z, Zhu C, Chang V. Trustworthy and Intelligent COVID-19 Diagnostic IoMT Through XR and Deep-Learning-Based Clinic Data Access. IEEE Internet Things J 2021; 8:15965-15976. [PMID: 35782175] [PMCID: PMC8769002] [DOI: 10.1109/jiot.2021.3055804]
Abstract
This article presents a novel extended reality (XR) and deep-learning-based Internet-of-Medical-Things (IoMT) solution for COVID-19 telemedicine diagnostics, which systematically combines virtual reality/augmented reality (AR) remote surgical plan/rehearse hardware, customized 5G cloud computing, and deep learning algorithms to provide real-time COVID-19 treatment clues. Compared with existing perception therapy techniques, the new technique can significantly improve performance and security. The system collected 25 clinical variables from 347 COVID-19-positive and 2270 negative patients in the Red Zone via 5G transmission. A novel auxiliary classifier generative adversarial network-based intelligent prediction algorithm was then used to train the new COVID-19 prediction model. Furthermore, the Copycat network was employed to simulate model-stealing attacks on the IoMT and thereby assess its security. To simplify the user interface and achieve a good user experience, the Red Zone's guiding images were combined with the Green Zone's view through AR navigation cues over 5G. The XR surgical plan/rehearse framework was designed to include all requisite COVID-19 surgical details and was developed with a guaranteed real-time response. The accuracy, recall, F1-score, and area under the ROC curve (AUC) of the new IoMT were 0.92, 0.98, 0.95, and 0.98, respectively, outperforming existing perception techniques with significantly higher accuracy. The model-stealing experiment also behaved as expected, with the Copycat AUC of 0.90 only slightly lower than that of the original model. This study suggests a new framework for COVID-19 diagnostic integration and opens new research on integrating XR and deep learning for IoMT implementation.
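As a quick consistency check on the metrics reported above (a sketch, not code from the paper): the F1-score is the harmonic mean of precision and recall. The abstract reports accuracy rather than precision, so treating the 0.92 figure as precision is an assumption made purely for illustration.

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported values: 0.92, recall 0.98, F1 0.95. Reading the 0.92 as
# precision (an assumption) reproduces the reported F1-score:
print(round(f1_score(0.92, 0.98), 2))  # → 0.95
```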
Affiliation(s)
- Yonghang Tai
- Yunnan Key Laboratory of Opto-Electronic Information Technology, Yunnan Normal University, Kunming 650500, China
- Bixuan Gao
- Yunnan Key Laboratory of Opto-Electronic Information Technology, Yunnan Normal University, Kunming 650500, China
- Qiong Li
- Yunnan Key Laboratory of Opto-Electronic Information Technology, Yunnan Normal University, Kunming 650500, China
- Zhengtao Yu
- Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650093, China
- Chunsheng Zhu
- Southern University of Science and Technology, Shenzhen 518055, China
4
de Melo RHC, Conci A. Modeling the basic behaviors of Anesthesia Training in Relation to Puncture and Penetration Feedback. Annu Int Conf IEEE Eng Med Biol Soc 2021; 2021:4128-4133. [PMID: 34892135] [DOI: 10.1109/embc46164.2021.9630874]
Abstract
Failure rates in spinal anesthesia are generally low in experienced hands. However, studies report failure rates varying from 1% to 17% for this procedure. The aim of this study is to bring the main characteristics of the in vivo procedure to a virtual-reality simulated environment. The first step is to model the behavior of tissue layers being punctured by a needle, so that this behavior can then be included in medical training. The simulation proposed here is implemented using a Phantom Omni haptic device. Each crucial sensation of the method was assessed by a dozen volunteers who participated in two experiments designed to validate the modeled response. Each user answered six questions (three for each experiment). Good results were achieved in certain essential aspects of the process, such as identifying the number of layers, the most rigid layer to puncture, and the most resistant layers to pass through. These results indicate that, with the correct use of haptic properties, many behaviors typical of needle insertion in spinal anesthesia can be represented virtually. Clinical relevance: The idea is to create a spinal anesthesia simulator that could serve as a complementary step in training new anesthetists. Using a simulator avoids introducing the first puncture haptic sensation directly on patients.
5
Tai Y, Qian K, Huang X, Zhang J, Jan MA, Yu Z. Intelligent Intraoperative Haptic-AR Navigation for COVID-19 Lung Biopsy Using Deep Hybrid Model. IEEE Trans Industr Inform 2021; 17:6519-6527. [PMID: 37981912] [PMCID: PMC8545008] [DOI: 10.1109/tii.2021.3052788]
Abstract
A novel intelligent navigation technique for accurate image-guided COVID-19 lung biopsy is presented, which systematically combines augmented reality (AR), customized haptic-enabled surgical tools, and a deep neural network to achieve customized surgical navigation. Clinical data from 341 COVID-19-positive patients and a negative control group of 1598 were collected for model development and evaluation. Biomechanical force data from the experiment are fed into a WPD-CNN-LSTM (WCL) network to learn a new patient-specific COVID-19 surgical model, and ResNet was employed for intraoperative force classification. To boost user immersion and improve the user experience, intraoperative guiding images were combined with the haptic-AR navigational view. Furthermore, a 3-D user interface (3DUI) including all requisite surgical details was developed with a guaranteed real-time response. Twenty-four thoracic surgeons were invited to objective and subjective experiments for performance evaluation. The root-mean-square error of the proposed WCL model is 0.0128, and the classification accuracy is 97%, demonstrating that the innovative AR and deep learning (DL) intelligent model outperforms existing perception navigation techniques with significantly higher performance. This article shows a novel framework for interventional surgical integration for COVID-19 and opens new research on integrating AR, haptic rendering, and deep learning for surgical navigation.
Affiliation(s)
- Yonghang Tai
- Yunnan Key Laboratory of Opto-Electronic Information Technology, Yunnan Normal University, Kunming 650500, China
- Kai Qian
- Department of Thoracic Surgery, Yunnan First People's Hospital, Kunming 650000, China
- Xiaoqiao Huang
- Yunnan Key Laboratory of Opto-Electronic Information Technology, Yunnan Normal University, Kunming 650500, China
- Jun Zhang
- Yunnan Key Laboratory of Opto-Electronic Information Technology, Yunnan Normal University, Kunming 650500, China
- Zhengtao Yu
- Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650093, China
6
Guo Z, Tai Y, Du J, Chen Z, Li Q, Shi J. Automatically Addressing System for Ultrasound-Guided Renal Biopsy Training Based on Augmented Reality. IEEE J Biomed Health Inform 2021; 25:1495-1507. [PMID: 33684049] [DOI: 10.1109/jbhi.2021.3064308]
Abstract
Chronic kidney disease has become one of the kidney diseases with the highest morbidity and mortality, and its surgical management still presents problems. During the operation, the surgeon can only work from two-dimensional ultrasound images and cannot determine the real-time spatial relationship between the lesion and the medical puncture needle. The average number of punctures per patient can reach 3 to 4, increasing the incidence of post-puncture complications. This article addresses ultrasound-guided renal biopsy navigation training, optimizing puncture path planning and puncture training assistance, and studies augmented reality technology combined with renal puncture surgery training. A prototype ultrasound-guided renal biopsy surgery training system is developed, improving the accuracy and reliability of training. The system is compared with a VR training system; the results show that the augmented reality platform is more suitable as a surgical training platform because it requires less time and yields a good training effect.
7
Guo Z, Tai Y, Qin Z, Huang X, Li Q, Peng J, Shi J. Development and assessment of a haptic-enabled holographic surgical simulator for renal biopsy training. Soft Comput 2020; 24:5783-94. [DOI: 10.1007/s00500-019-04341-4]
8
Ibrahim Z, Money AG. Computer mediated reality technologies: A conceptual framework and survey of the state of the art in healthcare intervention systems. J Biomed Inform 2019; 90:103102. [DOI: 10.1016/j.jbi.2019.103102]
9
Corrêa CG, Nunes FL, Ranzini E, Nakamura R, Tori R. Haptic interaction for needle insertion training in medical applications: The state-of-the-art. Med Eng Phys 2019; 63:6-25. [DOI: 10.1016/j.medengphy.2018.11.002]
10
Corrêa CG, Machado MADAM, Ranzini E, Tori R, Nunes FDLS. Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block. J Appl Oral Sci 2017; 25:357-366. [PMID: 28877273] [PMCID: PMC5595107] [DOI: 10.1590/1678-7757-2016-0386]
Abstract
Objectives This study shows the development and validation of a dental anesthesia training simulator, specifically for the inferior alveolar nerve block (IANB). The system developed provides the tactile sensation of inserting a real needle into a human patient, using Virtual Reality (VR) techniques and a haptic device that provides perceived force feedback during the needle insertion task of the anesthesia procedure. Material and Methods To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to a haptic device. The Volere method was used to elicit requirements from users in the Dentistry area; repeated-measures two-way ANOVA (Analysis of Variance), the Tukey post-hoc test, and averages were used for the analysis of results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire covered profile, preferences (number of viewpoints, texture of the objects, and haptic device handle), as well as visual aspects (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). Results The visual aspect was considered appropriate, while the haptic feedback must be improved, which users can do by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, with p<0.01). The users preferred the simulator with two viewpoints, objects with image-based textures, and the device with a syringe coupled to it. Conclusion The simulation was considered thoroughly satisfactory for anesthesia training of the needle insertion task, which includes the correct insertion point and depth as well as the perception of tissue resistance during insertion.
Affiliation(s)
- Cléber Gimenez Corrêa
- Universidade de São Paulo, Escola de Artes, Ciências e Humanidades, Laboratório de Aplicações de Informática em Saúde (LApIS), São Paulo, Brasil; Universidade de São Paulo, Escola Politécnica, Laboratório de Tecnologias Interativas (Interlab), São Paulo, Brasil
- Edith Ranzini
- Pontifícia Universidade Católica de São Paulo, São Paulo, Brasil
- Romero Tori
- Universidade de São Paulo, Escola Politécnica, Laboratório de Tecnologias Interativas (Interlab), São Paulo, Brasil
- Fátima de Lourdes Santos Nunes
- Universidade de São Paulo, Escola de Artes, Ciências e Humanidades, Laboratório de Aplicações de Informática em Saúde (LApIS), São Paulo, Brasil; Universidade de São Paulo, Escola Politécnica, Laboratório de Tecnologias Interativas (Interlab), São Paulo, Brasil
11
Mastmeyer A, Fortmeier D, Handels H. Evaluation of Direct Haptic 4D Volume Rendering of Partially Segmented Data for Liver Puncture Simulation. Sci Rep 2017; 7:671. [PMID: 28386067] [PMCID: PMC5429645] [DOI: 10.1038/s41598-017-00746-z]
Abstract
This work presents an evaluation study, using a force-feedback evaluation framework, of a novel direct needle-force volume rendering concept in the context of liver puncture simulation. PTC/PTCD puncture interventions targeting the bile ducts were selected to illustrate this concept. The haptic algorithms of the simulator system are based on (1) partially segmented patient image data and (2) a non-linear spring model effective at organ borders. The primary aim is to quantitatively evaluate the force errors caused by this patient modeling approach, in comparison to the haptic force output obtained from gold-standard, completely manually segmented data. The evaluation yields a low mean root-mean-squared force error of 0.12 N and systematic maximum absolute errors of up to 1.6 N. Force errors were evaluated on 31,222 preplanned test paths from 10 patients; only twelve percent of the emitted forces along these paths were affected by errors. This is the first study to evaluate haptic algorithms with deformable virtual patients in silico, and it demonstrates the plausibility of the haptic rendering on a very high number of test paths. The important errors are below just-noticeable differences for the hand-arm system.
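The root-mean-squared force error reported above is a standard comparison of two equally sampled force profiles. A minimal sketch of the metric (toy data, not the paper's; the function name and sample values are illustrative assumptions):

```python
import math

def rms_error(forces_a, forces_b):
    """Root-mean-squared difference between two equally sampled
    force profiles, in newtons."""
    assert len(forces_a) == len(forces_b)
    return math.sqrt(
        sum((a - b) ** 2 for a, b in zip(forces_a, forces_b)) / len(forces_a)
    )

# Hypothetical forces along one needle path (N): gold-standard vs. test model.
gold = [0.0, 0.5, 1.2, 2.0, 1.1]
test = [0.0, 0.6, 1.1, 2.1, 1.0]
err = rms_error(gold, test)  # ≈ 0.089 N for this toy pair
```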
Affiliation(s)
- Andre Mastmeyer
- Institute of Medical Informatics, University of Luebeck, Luebeck, 23552, Germany.
- Dirk Fortmeier
- Institute of Medical Informatics, University of Luebeck, Luebeck, 23552, Germany
- Heinz Handels
- Institute of Medical Informatics, University of Luebeck, Luebeck, 23552, Germany
12
Escobar-Castillejos D, Noguez J, Neri L, Magana A, Benes B. A Review of Simulators with Haptic Devices for Medical Training. J Med Syst 2016; 40:104. [DOI: 10.1007/s10916-016-0459-8]
13
Fortmeier D, Wilms M, Mastmeyer A, Handels H. Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models. IEEE Trans Haptics 2015; 8:371-383. [PMID: 26087498] [DOI: 10.1109/toh.2015.2445768]
Abstract
This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed from patient-specific 4D CT image sequences. Virtual patient models are visualized in real time by ray-casting-based rendering of a reference CT image warped by a time-variant displacement field, which is computed from the motion models at run time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements, computed at high rendering rates, to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and to the haptic simulation of inserting a virtual bendable needle. To this end, different motion models applicable in real time are presented, and the methods are integrated into a needle puncture training simulation framework that can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. The presented methods achieve mean update rates around 2,000 Hz for the haptic simulation and interactive frame rates for volume rendering, and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
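The core idea above, mapping the haptic device position back into the static reference CT via the current displacement, can be sketched as follows. This is a toy illustration under strong assumptions (a uniform sinusoidal displacement and a small-deformation inverse), not the authors' implementation; real displacement fields come from 4D CT registration and vary spatially.

```python
import numpy as np

def breathing_displacement(pos, phase, amplitude=np.array([0.0, 5.0, 0.0])):
    """Toy breathing model: spatially uniform sinusoidal displacement (mm)
    over one respiratory cycle parameterized by phase in [0, 1)."""
    return amplitude * np.sin(2 * np.pi * phase)

def to_reference_space(device_pos, phase):
    """Translate the haptic device position into reference-CT coordinates by
    subtracting the current displacement (small-deformation approximation)."""
    return device_pos - breathing_displacement(device_pos, phase)

# At end-exhale (phase 0) the mapping is the identity; at peak inhale
# (phase 0.25) positions are shifted back by the full amplitude.
print(to_reference_space(np.array([1.0, 2.0, 3.0]), 0.25))  # → [ 1. -3.  3.]
```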
14
Abstract
The increasing use of point-of-care (POC) ultrasound presents a challenge in providing efficient training to POC ultrasound users for whom formal training is not readily available. In response to this need, we developed an affordable, compact, laptop-based obstetric ultrasound training simulator that offers a realistic scanning experience, task-based training, and performance assessment. The position and orientation of the sham transducer are tracked with 5 DoF on an abdomen-sized scan surface shaped as a cylindrical segment. The simulator user interface renders a virtual torso whose body surface models the abdomen of the pregnant scan subject; a virtual transducer scans the virtual torso by following the sham transducer's movements on the scan surface. A given 3-D training image volume is generated by combining several overlapping 3-D ultrasound sweeps acquired from the pregnant scan subject using a Markov random field-based approach. Obstetric ultrasound training is completed through a series of tasks, guided by the simulator and focused on three aspects: basic medical ultrasound, orientation in obstetric space, and fetal biometry. Scanning performance is automatically evaluated by comparing user-identified anatomical landmarks with reference landmarks preinserted by sonographers. The simulator renders 2-D ultrasound images in real time at 30 frames/s or higher with good image quality, and the training procedure follows the standard obstetric ultrasound protocol. Thus, for learners without access to formal sonography programs, the simulator is intended to provide structured training in basic obstetric ultrasound.
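The landmark-based performance assessment described above amounts to measuring the distance between each user-identified landmark and its preinserted reference. A minimal sketch (the coordinates and function name are hypothetical, for illustration only):

```python
import math

def landmark_error(user_pt, ref_pt):
    """Euclidean distance between a user-identified landmark and the
    sonographer-preinserted reference landmark, in volume units (e.g., mm)."""
    return math.dist(user_pt, ref_pt)

# Hypothetical landmark pair in 3-D volume coordinates (mm):
print(round(landmark_error((10.0, 22.0, 5.0), (12.0, 20.0, 5.0)), 2))  # → 2.83
```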
15
Fortmeier D, Mastmeyer A, Schröder J, Handels H. A Virtual Reality System for PTCD Simulation Using Direct Visuo-Haptic Rendering of Partially Segmented Image Data. IEEE J Biomed Health Inform 2014; 20:355-66. [PMID: 25532197] [DOI: 10.1109/jbhi.2014.2381772]
Abstract
This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.
16
Freschi C, Parrini S, Dinelli N, Ferrari M, Ferrari V. Hybrid simulation using mixed reality for interventional ultrasound imaging training. Int J Comput Assist Radiol Surg 2014; 10:1109-15. [PMID: 25213270] [DOI: 10.1007/s11548-014-1113-x]
Abstract
PURPOSE Ultrasound (US) imaging offers advantages over other imaging modalities and has become the most widespread modality for many diagnostic and interventional procedures. However, traditional 2D US requires a long training period, especially to learn how to manipulate the probe. A hybrid interactive system based on mixed reality was designed, implemented, and tested for hand-eye coordination training in diagnostic and interventional US. METHODS A hybrid simulator was developed integrating a physical US phantom and a software application with a 3D virtual scene. In this scene, a 3D model of the probe with its scan plane is coherently displayed with a 3D representation of the phantom's internal structures. An evaluation study of the diagnostic module was performed by recruiting thirty-six novices and four experts. The performance of the hybrid (HG) and physical (PG) simulators was compared. After the training session, each novice was required to visualize a particular target structure. The four experts completed a 5-point Likert scale questionnaire. RESULTS Seventy-eight percent of the HG novices successfully visualized the target structure, whereas only 45% of the PG reached this goal. The mean scores from the questionnaires were 5.00 for usefulness, 4.25 for ease of use, 4.75 for 3D perception, and 3.25 for phantom realism. CONCLUSIONS The hybrid US training simulator provides ease of use and is effective as a hand-eye coordination teaching tool. Mixed reality can improve US probe manipulation training.
Affiliation(s)
- C Freschi
- EndoCAS Center, Università di Pisa, Pisa, Italy