1. Daher M, Ghanimeh J, Otayek J, Ghoul A, Bizdikian AJ, EL Abiad R. Augmented reality and shoulder replacement: a state-of-the-art review article. JSES Reviews, Reports, and Techniques 2023; 3:274-278. [PMID: 37588507] [PMCID: PMC10426657] [DOI: 10.1016/j.xrrt.2023.01.008]
Abstract
Since the introduction of total shoulder arthroplasty, failure rates, which may be due to component malpositioning, have driven efforts to improve this surgery through new perioperative techniques and tools. Augmented reality, a tool newly adopted in orthopedic surgery, can help address this problem and reduce the failure rates seen in shoulder replacement surgery. Although this technology has revolutionized orthopedic surgery and helped improve the accuracy of shoulder prosthesis component positioning, it still has limitations, such as inaccurate superimposition of the virtual overlay, that should be addressed before it becomes standard practice.
Affiliation(s)
- Mohammad Daher: Hotel Dieu de France, Saint Joseph University, Beirut, Lebanon
- Joe Ghanimeh: Lebanese American University Medical Center Rizk Hospital, Beirut, Lebanon
- Joeffroy Otayek: Lebanese American University Medical Center Rizk Hospital, Beirut, Lebanon
- Ali Ghoul: Hotel Dieu de France, Saint Joseph University, Beirut, Lebanon
- Rami EL Abiad: Hotel Dieu de France, Saint Joseph University, Beirut, Lebanon
2. Suresh D, Aydin A, James S, Ahmed K, Dasgupta P. The Role of Augmented Reality in Surgical Training: A Systematic Review. Surg Innov 2023; 30:366-382. [PMID: 36412148] [PMCID: PMC10331622] [DOI: 10.1177/15533506221140506]
Abstract
This review aims to provide an update on the role of augmented reality (AR) in surgical training and to investigate whether the use of AR improves performance measures in surgical trainees compared to traditional approaches. PubMed, EMBASE, Google Scholar, the Cochrane Library, the British Library and ScienceDirect were searched following PRISMA guidelines. All English-language original studies pertaining to AR in surgical training were eligible for inclusion. Qualitative analysis was performed and results were categorised according to simulator models, which were then evaluated using Messick's framework for validity and McGaghie's translational outcomes for simulation-based learning. Of the 1132 results retrieved, 45 were included in the study. Twenty-nine platforms were identified, with the highest 'level of effectiveness' recorded as 3. In terms of validity parameters, 10 AR models received a strong 'content validity' score of 2, and 15 models had a 'response processes' score ≥ 1. 'Internal structure' and 'consequences' were largely not discussed. 'Relations to other variables' was the best assessed criterion, with 9 platforms achieving a high score of 2. Overall, the Microsoft HoloLens received the highest level of recommendation for both validity and level of effectiveness. Augmented reality in surgical education is feasible and effective as an adjunct to traditional training. The Microsoft HoloLens has shown the most promising results across all parameters and produced improved performance measures in surgical trainees. For the other simulator models, further research with stronger study designs is required to validate the use of AR in surgical training.
Affiliation(s)
- Dhivya Suresh: Guy’s, King’s and St Thomas’ School of Medical Education, King’s College London, London, UK
- Abdullatif Aydin: MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
- Stuart James: Department of General Surgery, Princess Royal University Hospital, London, UK
- Kamran Ahmed: MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
- Prokar Dasgupta: MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
3. Olexa J, Cohen J, Alexander T, Brown C, Schwartzbauer G, Woodworth GF. Expanding Educational Frontiers in Neurosurgery: Current and Future Uses of Augmented Reality. Neurosurgery 2023; 92:241-250. [PMID: 36637263] [DOI: 10.1227/neu.0000000000002199]
Abstract
BACKGROUND Augmented reality (AR) technology is a new and promising option to advance and expand neurosurgical training because of recent advances in computer vision, improved AR software and hardware, and growing acceptance of this technology in clinical practice. OBJECTIVE To analyze the current status of AR use cases with the goal of envisioning future uses of AR in neurosurgical education. METHODS Articles on the use of AR technology in neurosurgical education were identified using the PubMed, Google Scholar, and Web of Science databases following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Articles were included for review based on applicable content related to neurosurgical or neuroanatomy training. Literature quality was assessed using standardized MERSQI scoring. RESULTS The systematic search identified 2648 unique articles. Of these, 12 studies met inclusion criteria after extensive review. The average MERSQI score was 10.2 (SD: 1.7). The most common AR platform identified in this study was the Microsoft HoloLens. The primary goals of the studies were to improve technical skills and approaches to surgical planning or to improve understanding of neuroanatomy. CONCLUSION Augmented reality has emerged as a promising training tool in neurosurgery, as demonstrated by the wide range of use cases in technical training and anatomic education. It remains unclear how AR-based training compares directly with traditional training methods; however, AR shows great promise in its ability to further enhance and innovate neurosurgical education and training.
Affiliation(s)
- Joshua Olexa: Department of Neurosurgery, University of Maryland School of Medicine, Baltimore, Maryland, USA
- Cole Brown: Department of Neurosurgery, University of Maryland School of Medicine, Baltimore, Maryland, USA
- Gary Schwartzbauer: Department of Neurosurgery, University of Maryland School of Medicine, Baltimore, Maryland, USA
- Graeme F Woodworth: Department of Neurosurgery, University of Maryland School of Medicine, Baltimore, Maryland, USA
4. Jiang J, Zhang J, Sun J, Wu D, Xu S. User's image perception improved strategy and application of augmented reality systems in smart medical care: A review. Int J Med Robot 2023; 19:e2497. [PMID: 36629798] [DOI: 10.1002/rcs.2497]
Abstract
BACKGROUND Augmented reality (AR) is a new human-computer interaction technology that combines virtual reality, computer vision, and computer networks. With the rapid advancement of the medical field towards intelligence and data visualisation, AR systems are becoming increasingly popular in medicine because they can provide doctors with sufficiently clear medical images and accurate image navigation in practical applications. However, different display types of AR systems have different effects on doctors' perception of the fused virtual-real image during actual medical use. If doctors cannot correctly perceive the image, they may be unable to correctly match the virtual information with the real world, which significantly impairs their ability to recognise complex structures. METHODS This paper uses CiteSpace, a literature analysis tool, to visualise and analyse research hotspots in the application of AR systems to the medical field. RESULTS A visual analysis of the 1163 articles retrieved from the Web of Science Core Collection database reveals that display technology and visualisation technology are currently the key research directions for AR systems. CONCLUSION This paper categorises AR systems by their display principles, reviews current image perception optimisation schemes for each type of system, and compares the different display types based on their practical applications in smart medical care, so that doctors can select the appropriate display type for a given application scenario. Finally, future directions for AR display technology are anticipated so that AR can be applied more effectively in smart medical care; the advantages and disadvantages of the various display types should be weighed in each application scenario to select the best AR system.
Affiliation(s)
- Jingang Jiang: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China; Robotics & Its Engineering Research Center, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jiawei Zhang: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Jianpeng Sun: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Dianhao Wu: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
- Shuainan Xu: Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology, Harbin, Heilongjiang, China
5. Durrani S, Onyedimma C, Jarrah R, Bhatti A, Nathani KR, Bhandarkar AR, Mualem W, Ghaith AK, Zamanian C, Michalopoulos GD, Alexander AY, Jean W, Bydon M. The Virtual Vision of Neurosurgery: How Augmented Reality and Virtual Reality are Transforming the Neurosurgical Operating Room. World Neurosurg 2022; 168:190-201. [DOI: 10.1016/j.wneu.2022.10.002]
6. Paro MR, Hersh DS, Bulsara KR. History of Virtual Reality and Augmented Reality in Neurosurgical Training. World Neurosurg 2022; 167:37-43. [PMID: 35977681] [DOI: 10.1016/j.wneu.2022.08.042]
Abstract
Virtual reality (VR) and augmented reality (AR) are rapidly growing technologies. Both have been applied within neurosurgery for presurgical planning and intraoperative navigation, but VR and AR technology is particularly promising for the education of neurosurgical trainees. With the increasing demand for high-impact yet efficient educational strategies, VR- and AR-based simulators allow neurosurgical residents to practice technical skills in a low-risk setting. Initial studies have confirmed that such simulators increase trainees' confidence, improve their understanding of operative anatomy, and enhance surgical techniques. Knowledge of the history and conceptual underpinnings of these technologies is useful for understanding their current and future applications in neurosurgical training. The technological precursors of VR and AR were introduced as early as the 1800s and draw from the fields of entertainment, flight simulation, and education. However, the computer software and processing speeds needed for widespread VR- and AR-based surgical simulators have only become available within the last 15 years. During that time, several devices have been rapidly adopted by neurosurgeons, and some programs have begun to incorporate them into the residency curriculum. With ever-improving technology, VR and AR are promising additions to a multimodal training program, enabling neurosurgical residents to maximize their efforts in preparation for the operating room. In this review, we outline the historical development of the VR and AR systems used in neurosurgical training and discuss representative examples of the current technology.
Affiliation(s)
- Mitch R Paro: UConn School of Medicine, Farmington, Connecticut, USA
- David S Hersh: Division of Neurosurgery, Connecticut Children's, Hartford, Connecticut, USA; Department of Surgery, UConn School of Medicine, Farmington, Connecticut, USA
- Ketan R Bulsara: Department of Surgery, UConn School of Medicine, Farmington, Connecticut, USA; Division of Neurosurgery, UConn School of Medicine, Farmington, Connecticut, USA
7. XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J Clin Med 2022; 11:470. [PMID: 35054164] [PMCID: PMC8779726] [DOI: 10.3390/jcm11020470]
Abstract
In recent years, with the rapid advancement and consumerization of virtual reality, augmented reality, mixed reality, and extended reality (XR) technology, the use of XR technology in spine medicine has become increasingly popular. This rising use has been accelerated by the recent wave of digital transformation (i.e., case-specific three-dimensional medical images and holograms, wearable sensors, video cameras, fifth-generation (5G) networks, artificial intelligence, and head-mounted displays), and further accelerated by the COVID-19 pandemic and the increase in minimally invasive spine surgery. The COVID-19 pandemic has had a negative impact on society, but positive impacts can also be expected, including the continued spread and adoption of telemedicine services (i.e., tele-education, tele-surgery, tele-rehabilitation) that promote digital transformation. The purpose of this narrative review is to describe the accelerators of XR (VR, AR, MR) technology in spine medicine and then to provide a comprehensive review of the use of XR technology in spine medicine, including surgery, consultation, education, and rehabilitation, as well as to identify its limitations and future perspectives (status quo and quo vadis).
8. Muhlestein WE, Strong MJ, Yee TJ, Saadeh YS, Park P. Commentary: Augmented Reality Assisted Endoscopic Transforaminal Lumbar Interbody Fusion: 2-Dimensional Operative Video. Oper Neurosurg (Hagerstown) 2022; 22:e66-e67. [PMID: 34982927] [DOI: 10.1227/ons.0000000000000034]
9. Augmented Reality (AR) in Orthopedics: Current Applications and Future Directions. Curr Rev Musculoskelet Med 2021; 14:397-405. [PMID: 34751894] [DOI: 10.1007/s12178-021-09728-1]
Abstract
PURPOSE OF REVIEW Imaging technologies (X-ray, CT, MRI, and ultrasound) have revolutionized orthopedic surgery, allowing for more efficient diagnosis, monitoring, and treatment of musculoskeletal ailments. The current review investigates recent literature on the impact of augmented reality (AR) imaging technologies on orthopedic surgery. In particular, it examines the impact that AR technologies may have on provider cognitive burden, operative times, occupational radiation exposure, and surgical precision and outcomes. RECENT FINDINGS Many AR technologies have been shown to lower provider cognitive burden and reduce operative time and radiation exposure while improving surgical precision in preclinical cadaveric and sawbones models. So far, only a few platforms focusing on pedicle screw placement have been approved by the FDA. These technologies have been implemented clinically with mixed results when compared to traditional freehand approaches. It remains to be seen whether current AR technologies can deliver on their many promises, and the ability to do so seems contingent on continued technological progress. Additionally, the impact of these platforms will likely be highly conditional on clinical indication and provider type. It remains unclear whether AR will be broadly accepted and utilized or reserved for niche indications where it adds significant value. One thing is clear: orthopedics' high utilization of pre- and intraoperative imaging, combined with the relative ease of tracking rigid structures such as bone compared with soft tissues, has made it the clear beachhead market for AR technologies in medicine.
10. Weeks JK, Pakpoor J, Park BJ, Robinson NJ, Rubinstein NA, Prouty SM, Nachiappan AC. Harnessing Augmented Reality and CT to Teach First-Year Medical Students Head and Neck Anatomy. Acad Radiol 2021; 28:871-876. [PMID: 32828663] [DOI: 10.1016/j.acra.2020.07.008]
Abstract
RATIONALE AND OBJECTIVES Three-dimensional (3D) visualization has been shown to benefit new generations of medical students and physicians-in-training in a variety of contexts. However, there is limited research directly comparing student performance after using 3D tools with performance after using two-dimensional (2D) screens. MATERIALS AND METHODS A CT was performed on a donated cadaver and a 3D CT hologram was created. A total of 30 first-year medical students were randomly assigned to two groups to review head and neck anatomy in a teaching session that incorporated CT. The first group used an augmented reality headset, while the second group used a laptop screen. The students were administered a five-question anatomy test before and after the session. Two-tailed t-tests were used for statistical comparison of pretest and posttest performance within and between groups. A feedback survey was distributed for qualitative data. RESULTS Comparison of the average percentage of questions answered correctly on the pretest versus the posttest demonstrated significant within-group improvement in both groups (p < 0.05), from 59% to 95% in the augmented reality group and from 57% to 80% in the screen group. Between-group analysis indicated that posttest performance was significantly better in the augmented reality group (p = 0.022, effect size = 0.73). CONCLUSION Immersive 3D visualization has the potential to improve short-term anatomic recall in the head and neck compared with traditional 2D screen-based review, as well as to better engage millennial learners in the anatomy laboratory. Our findings may reflect additional benefit gained from the stereoscopic depth cues present in augmented reality-based visualization.
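To make the kind of between-group comparison described above concrete, the sketch below computes a two-tailed independent-samples t-test and a pooled-SD effect size (Cohen's d) with SciPy and NumPy. The posttest scores are hypothetical stand-ins; this is only an illustrative sketch, not the authors' data or analysis code.

```python
import numpy as np
from scipy import stats

# Hypothetical posttest scores (% correct) for two randomized groups of 15
# first-year students each; values are illustrative, not the study's data.
ar_group = np.array([100, 100, 80, 100, 100, 80, 100, 100, 100, 100, 80, 100, 100, 100, 85])
screen_group = np.array([80, 80, 60, 100, 80, 80, 60, 100, 80, 80, 80, 80, 80, 80, 80])

# Two-tailed independent-samples t-test for the between-group comparison.
t_stat, p_value = stats.ttest_ind(ar_group, screen_group)

# Cohen's d with a pooled standard deviation (one common effect-size definition).
n1, n2 = len(ar_group), len(screen_group)
pooled_sd = np.sqrt(((n1 - 1) * ar_group.var(ddof=1) +
                     (n2 - 1) * screen_group.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (ar_group.mean() - screen_group.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```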
Affiliation(s)
- Joanna K Weeks: Department of Radiology, University of Pennsylvania, 3400 Spruce Street, 1 Silverstein, Suite 130, Philadelphia, PA
- Jina Pakpoor: Department of Radiology, University of Pennsylvania, 3400 Spruce Street, 1 Silverstein, Suite 130, Philadelphia, PA
- Brian J Park: Department of Radiology, University of Pennsylvania, 3400 Spruce Street, 1 Silverstein, Suite 130, Philadelphia, PA
- Nicole J Robinson: Department of Cell and Developmental Biology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA
- Neal A Rubinstein: Department of Cell and Developmental Biology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA
- Stephen M Prouty: Department of Cell and Developmental Biology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA
- Arun C Nachiappan: Department of Radiology, University of Pennsylvania, 3400 Spruce Street, 1 Silverstein, Suite 130, Philadelphia, PA
11. Is Experience in Hemodialysis Cannulation Related to Expertise? A Metrics-based Investigation for Skills Assessment. Ann Biomed Eng 2021; 49:1688-1700. [PMID: 33417054] [DOI: 10.1007/s10439-020-02708-5]
Abstract
Cannulation is not only one of the most common medical procedures but is also fraught with complications. The skill of the clinician performing cannulation directly impacts cannulation outcomes. However, current methods of teaching this skill are deficient, relying on subjective demonstrations and unrealistic manikins that have limited utility for skills training. Furthermore, one of the factors that hinders effective continuing medical education is the assumption that clinical experience results in expertise. In this work, we examine whether objective metrics acquired from a novel cannulation simulator are able to distinguish between experienced clinicians and established experts, enabling the measurement of true expertise. Twenty-two healthcare professionals with varying levels of cannulation experience performed a simulated arteriovenous fistula cannulation task on the simulator. Four clinicians were peer-identified as experts, while the others were designated to the experienced group. The simulator tracked the motion of the needle (via an electromagnetic sensor), rendered a blood flashback function (via an infrared light sensor), and recorded pinch forces exerted on the needle (via force-sensing elements). Metrics were computed based on motion, force, and other sensor data. Results indicated that, with nearly 80% accuracy using both logistic regression and linear discriminant analysis, the objective metrics differentiated between the experts and the experienced, and identified needle motion and finger force as two prominent features distinguishing the groups. Furthermore, results indicated that expertise was not correlated with years of experience, validating the central hypothesis of the study. These insights contribute to structured and standardized medical skills training by enabling a meaningful definition of expertise and could potentially lead to more effective skills training methods.
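As a rough illustration of how simulator-derived metrics can be used to separate experts from experienced clinicians, the sketch below fits logistic regression and linear discriminant analysis classifiers with leave-one-out cross-validation in scikit-learn. The feature matrix, feature names and group sizes are hypothetical stand-ins; this is not the authors' analysis pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical feature matrix: 22 clinicians x 4 simulator metrics
# (e.g., needle path length, motion smoothness, pinch force, flashback delay).
X = rng.normal(size=(22, 4))
# Labels: 1 = peer-identified expert (4 clinicians), 0 = experienced (18 clinicians).
y = np.array([1] * 4 + [0] * 18)

# Leave-one-out cross-validation is a reasonable choice for a sample this small.
loo = LeaveOneOut()
for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    model = make_pipeline(StandardScaler(), clf)
    acc = cross_val_score(model, X, y, cv=loo).mean()
    print(f"{name}: leave-one-out accuracy = {acc:.2f}")
```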
12. Reymus M, Liebermann A, Diegritz C. Virtual reality: an effective tool for teaching root canal anatomy to undergraduate dental students – a preliminary study. Int Endod J 2020; 53:1581-1587. [DOI: 10.1111/iej.13380]
Affiliation(s)
- M. Reymus: Department of Conservative Dentistry and Periodontology, Klinikum der Universität München, Munich, Germany
- A. Liebermann: Department of Prosthetic Dentistry, Klinikum der Universität München, Munich, Germany
- C. Diegritz: Department of Conservative Dentistry and Periodontology, Klinikum der Universität München, Munich, Germany
13. Fiador F, Poyade M, Bennett L. The Use of Augmented Reality to Raise Awareness of the Differences Between Osteoarthritis and Rheumatoid Arthritis. Advances in Experimental Medicine and Biology 2020; 1262:115-147. [PMID: 32613582] [DOI: 10.1007/978-3-030-43961-3_6]
Abstract
Arthritis is one of the most common disease states worldwide but is still publicly misunderstood and lacks engaging public awareness materials. Within the UK, the most prevalent types of arthritis are osteoarthritis (OA) and rheumatoid arthritis (RA). The two are commonly mistaken for the same disease but in fact have very different pathogenesis, symptoms and treatments. This chapter describes a study which aimed to assess whether an augmented reality (AR) application could be used to raise awareness of the differences between OA and RA. An application was created for Android tablets that included labelled 3D models, animations and AR scenes triggered from a poster. In total, 11 adult participants tested the application, taking part in a pretest and posttest which aimed to measure the usability of the application and the acquisition of knowledge on OA and RA. A t-test was performed on the pretest and posttest questionnaire outcomes to assess the effectiveness of the application. Overall, the results were encouraging, showing a significant acquisition of knowledge and a highly satisfactory user experience.
Affiliation(s)
- Florina Fiador: School of Life Sciences, College of Medical, Veterinary and Life Sciences, University of Glasgow, Glasgow, UK
- Matthieu Poyade: School of Simulation and Visualisation, The Glasgow School of Art, Glasgow, UK
- Louise Bennett: Institute of Infection, Immunity and Inflammation, College of Medical, Veterinary and Life Sciences, University of Glasgow, Glasgow, UK
14. Mendes HCM, Costa CIAB, da Silva NA, Leite FP, Esteves A, Lopes DS. PIÑATA: Pinpoint insertion of intravenous needles via augmented reality training assistance. Comput Med Imaging Graph 2020; 82:101731. [PMID: 32361555] [DOI: 10.1016/j.compmedimag.2020.101731]
Abstract
Conventional needle insertion training relies on medical dummies that simulate surface anatomy and internal structures such as veins or arteries. These dummies offer an interesting space to augment with useful information to assist training practices, namely internal anatomical structures (subclavian artery and vein, internal jugular vein and carotid artery) along with the target point and the desired inclination, position and orientation of the needle. However, limited research has been conducted on Optical See-Through Augmented Reality (OST-AR) interfaces for training needle insertion, especially for central venous catheterization (CVC). In this work we introduce PIÑATA, an interactive tool to explore the benefits of OST-AR in CVC training using a dummy of the upper torso and neck, and we explore whether PIÑATA complements conventional training practices. Our design contribution also describes the observation and co-design sessions used to collect user requirements, usability aspects and user preferences. This was followed by a comparative study with 18 participants, attending specialists and medical residents, who performed needle insertion tasks for CVC with PIÑATA and with the conventional training system. Performance was objectively measured by task completion time and number of needle insertion errors. A correlation was found between task completion times under the two training methods, suggesting the concurrent validity of our OST-AR tool. An inherent difference in task completion time (p = 0.040) and in the number of errors (p = 0.036) between novices and experts supported the construct validity of the new tool. The qualitative answers of the participants also suggest its face and content validity, a high acceptability rate and a medium perceived workload. Finally, semi-structured interviews with these 18 participants revealed that 14 of them considered that PIÑATA can complement the conventional training system, especially due to the visibility of the vessels inside the simulator, and 13 agreed that OST-AR adoption in these scenarios is likely, particularly during early stages of training. Integration with ultrasound information was highlighted as necessary future work. In sum, the overall results show that the proposed OST-AR tool can complement conventional CVC training.
Affiliation(s)
- Augusto Esteves: Instituto Superior Técnico, Universidade de Lisboa, Portugal; ITI / LARSyS, Portugal
- Daniel Simões Lopes: Instituto Superior Técnico, Universidade de Lisboa, Portugal; INESC-ID Lisboa, Portugal
15. Williams MA, McVeigh J, Handa AI, Lee R. Augmented reality in surgical training: a systematic review. Postgrad Med J 2020; 96:537-542. [DOI: 10.1136/postgradmedj-2020-137600]
Abstract
The aim of this systematic review is to provide an update on the current state of augmented reality (AR) in surgical training and to report on any described benefits compared with traditional techniques. A PICO (Population, Intervention, Comparison, Outcome) strategy was adopted to formulate an appropriate research question and define strict search terms to be entered into MEDLINE, CENTRAL and Google Scholar. The search was run on 12/09/2019. All returned results were screened first by title and then by abstract. The systematic search returned a total of 236 results, of which 18 were selected for final inclusion. Studies covered the full range of surgical disciplines and reported on outcomes including operative duration, accuracy and postoperative complication rates. Due to the heterogeneity of the collected data, no meta-analysis was possible. Outcome measures of competency, surgical opinion and postoperative complication rate were in favour of AR technology, while operative duration appeared to increase.
16. Jud L, Fotouhi J, Andronic O, Aichmair A, Osgood G, Navab N, Farshad M. Applicability of augmented reality in orthopedic surgery - A systematic review. BMC Musculoskelet Disord 2020; 21:103. [PMID: 32061248] [PMCID: PMC7023780] [DOI: 10.1186/s12891-020-3110-2]
Abstract
BACKGROUND Computer-assisted solutions are continuously changing surgical practice. One of the most disruptive technologies among computer-integrated surgical techniques is Augmented Reality (AR). While Augmented Reality is increasingly used in several medical specialties, its potential benefit in orthopedic surgery is not yet clear. The purpose of this article is to provide a systematic review of the current state of knowledge and the applicability of AR in orthopedic surgery. METHODS A systematic review of the current literature was performed to establish the state of knowledge and applicability of AR in orthopedic surgery. A systematic search of the following three databases was performed: "PubMed", "Cochrane Library" and "Web of Science". The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and has been registered in the international prospective register of systematic reviews (PROSPERO). RESULTS Thirty-one studies and reports were included and classified into the following categories: Instrument / Implant Placement, Osteotomies, Tumor Surgery, Trauma, and Surgical Training and Education. Quality assessment could be performed in 18 studies. Among the clinical studies, there were six case series with an average score of 90% and one case report, which scored 81% according to the Joanna Briggs Institute Critical Appraisal Checklist (JBI CAC). The 11 cadaveric studies scored 81% according to the QUACS scale (Quality Appraisal for Cadaveric Studies). CONCLUSION This manuscript provides 1) a summary of the current state of knowledge and research on Augmented Reality in orthopedic surgery as presented in the literature, and 2) a discussion by the authors of the key points required for seamless integration of Augmented Reality into future surgical practice. TRIAL REGISTRATION PROSPERO registration number: CRD42019128569.
Affiliation(s)
- Lukas Jud: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Javad Fotouhi: Computer Aided Medical Procedure, Johns Hopkins University, 3400 N Charles Street, Baltimore, 21210 USA
- Octavian Andronic: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Alexander Aichmair: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Greg Osgood: Johns Hopkins Hospital, Department of Orthopedics Surgery, 1800 Orleans Street, Baltimore, 21287 USA
- Nassir Navab: Computer Aided Medical Procedure, Johns Hopkins University, 3400 N Charles Street, Baltimore, 21210 USA; Computer Aided Medical Procedure, Technical University of Munich, Boltzmannstrasse 3, 85748 Munich, Germany
- Mazda Farshad: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
17. Farshad-Amacker NA, Bay T, Rosskopf AB, Spirig JM, Wanivenhaus F, Pfirrmann CWA, Farshad M. Ultrasound-guided interventions with augmented reality in situ visualisation: a proof-of-mechanism phantom study. Eur Radiol Exp 2020; 4:7. [PMID: 32020366] [PMCID: PMC7000569] [DOI: 10.1186/s41747-019-0129-y]
Abstract
Background Ultrasound (US) images are currently displayed on monitors, and interpreting them requires good spatial orientation skills. Direct overlay of US images onto the corresponding anatomy is possible with augmented reality (AR) technologies. Our purpose was to explore the performance of US-guided needle placement with and without AR in situ US viewing. Methods Three untrained operators and two experienced radiologists performed 200 US-guided punctures: 100 with and 100 without AR in situ US. The punctures were performed in two different phantoms, a leg phantom with soft tissue lesions and a vessel phantom. Time to puncture and number of needle passes were recorded for each puncture. Data are reported as median [range] according to their non-normal distribution. Results AR in situ US resulted in reduced time (median [range], 13 s [3–101] versus 14 s [3–220]) and number of needle passes (median [range], 1 [1–4] versus 1 [1–8]) compared with the conventional technique. The initial gap in performance between untrained and experienced operators with conventional US (time, 21.5 s [3–220] versus 10.5 s [3–94]; needle passes, 1 [1–8] versus 1 [1–2]) was reduced to 12.5 s [3–101] versus 13 s [3–100] and 1 [1–4] versus 1 [1–4], respectively, when using AR in situ US. Conclusion AR in situ US could be a potential breakthrough in US applications by simplifying the operator's spatial orientation and reducing experience-based differences in performance of US-guided interventions. Further studies are needed to confirm these preliminary phantom results.
Affiliation(s)
- Till Bay: Incremed AG, Lenghalde 5, 8008, Zurich, Switzerland
- Andrea B Rosskopf: Radiology, Balgrist University Hospital, Forchstrasse 340, 8008, Zurich, Switzerland
- José M Spirig: Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Florian Wanivenhaus: Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
- Mazda Farshad: Department of Orthopaedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008, Zurich, Switzerland
18. Stefan P, Pfandler M, Lazarovici M, Weigl M, Navab N, Euler E, Fürmetz J, Weidert S. Three-dimensional–Printed Computed Tomography–Based Bone Models for Spine Surgery Simulation. Simul Healthc 2020; 15:61-66. [DOI: 10.1097/sih.0000000000000417]
19. Lohre R, Wang JC, Lewandrowski KU, Goel DP. Virtual reality in spinal endoscopy: a paradigm shift in education to support spine surgeons. Journal of Spine Surgery 2020; 6:S208-S223. [PMID: 32195429] [DOI: 10.21037/jss.2019.11.16]
Abstract
Background Minimally invasive spine surgery (MISS) and endoscopic spine surgery have continually evolving indications in the cervical, thoracic, and lumbar spine. Endoscopic spine surgery entails treatment of disc disease, stenosis, spondylolisthesis, radiculopathy, and deformity. MISS involves complex motor skills in regions of variable anatomy. Simulator use has been proposed to aid in training and skill retention, preoperative planning, and intraoperative use. Methods A systematic review of five databases was performed for publications pertaining to the use of virtual (VR), augmented (AR), and mixed (MR) reality in MISS and spinal endoscopic surgery. Qualitative data analysis was undertaken with a focus on study design, quality, and reported outcomes. Study quality was assessed using the Medical Education Research Study Quality Instrument (MERSQI) score and level of evidence (LoE) by a modified Oxford Centre for Evidence-Based Medicine (OCEBM) level for simulation in medicine. Results Thirty-eight studies were retained for data collection. Studies were of intervention-control, clinical application, and pilot or cross-sectional design. Identified articles illustrated use of VR, AR, and MR in all study designs. Procedures included pedicle cannulation and screw insertion, vertebroplasty, kyphoplasty, percutaneous transforaminal endoscopic discectomy (PTED), lumbar puncture and facet injection, transvertebral anterior cervical foraminotomy (TVACF) and posterior cervical laminoforaminotomy. The overall MERSQI score was low to medium [M = 9.71 (SD = 2.60); range, 4.5-13.5], and the LoE was predominantly low given the number of purely descriptive articles or low-quality randomized studies. Conclusions The current scope of VR, AR, and MR surgical simulators in MISS and spinal endoscopic surgery was described. Studies demonstrate improvement in technical skill and patient outcomes in short-term follow-up. Despite this, overall study quality and levels of evidence remain low. Cohesive study design and reporting, with a focus on transfer validity in training scenarios and patient-derived outcome measures in clinical studies, are required to further advance the field.
Affiliation(s)
- Ryan Lohre: Department of Orthopaedics, University of British Columbia, Vancouver, BC, Canada
- Jeffrey C Wang: USC Spine Center, Keck School of Medicine at University of Southern California, Los Angeles, USA
- Kai-Uwe Lewandrowski: Center for Advanced Spine Care of Southern Arizona and Surgical Institute of Tucson, Tucson, AZ, USA; Department of Neurosurgery, UNIRIO, Rio de Janeiro, Brazil
- Danny P Goel: Department of Orthopaedics, University of British Columbia, Vancouver, BC, Canada
20. Laverdière C, Corban J, Khoury J, Ge SM, Schupbach J, Harvey EJ, Reindl R, Martineau PA. Augmented reality in orthopaedics. Bone Joint J 2019; 101-B:1479-1488. [DOI: 10.1302/0301-620x.101b12.bjj-2019-0315.r1]
Abstract
Aims Computer-based applications are increasingly being used by orthopaedic surgeons in their clinical practice. With the integration of technology in surgery, augmented reality (AR) may become an important tool for surgeons in the future. By superimposing a digital image on a user’s view of the physical world, this technology shows great promise in orthopaedics. The aim of this review is to investigate the current and potential uses of AR in orthopaedics. Materials and Methods A systematic review of the PubMed, MEDLINE, and Embase databases up to January 2019 using the keywords ‘orthopaedic’ OR ‘orthopedic AND augmented reality’ was performed by two independent reviewers. Results A total of 41 publications were included after screening. Applications were divided by subspecialty: spine (n = 15), trauma (n = 16), arthroplasty (n = 3), oncology (n = 3), and sports (n = 4). Out of these, 12 were clinical in nature. AR-based technologies have a wide variety of applications, including direct visualization of radiological images by overlaying them on the patient and intraoperative guidance using preoperative plans projected onto real anatomy, enabling hands-free real-time access to operating room resources, and promoting telemedicine and education. Conclusion There is an increasing interest in AR among orthopaedic surgeons. Although studies show similar or better outcomes with AR compared with traditional techniques, many challenges need to be addressed before this technology is ready for widespread use. Cite this article: Bone Joint J 2019;101-B:1479–1488
Affiliation(s)
- Carl Laverdière: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Jason Corban: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Jason Khoury: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Susan Mengxiao Ge: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Justin Schupbach: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Edward J. Harvey: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Rudy Reindl: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
- Paul A. Martineau: Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
21. Zhang Z, Liu Z, Singapogu R. Extracting Subtask-specific Metrics Toward Objective Assessment of Needle Insertion Skill for Hemodialysis Cannulation. Journal of Medical Robotics Research 2019; 4:1942006. [PMID: 33681506] [PMCID: PMC7932179] [DOI: 10.1142/s2424905x19420066]
Abstract
About 80% of all in-hospital patients require vascular access cannulation for treatment. However, there is a high rate of failure for vascular access cannulation, with several studies estimating up to a 50% failure rate for these procedures. Hemodialysis cannulation (HDC) is arguably one of the most difficult of these procedures, with a steep learning curve and an extremely high failure rate. In light of this, there is a critical need to ensure that clinicians performing HDC have the requisite skills. In this work, we present a method that combines the strengths of simulator-based objective skill quantification and task segmentation for needle insertion skill assessment at the subtask level. The results from our experimental study with seven novice nursing students on the cannulation simulator demonstrate that the simulator was able to segment needle insertion into subtask phases. In addition, most metrics were significantly different between the two phases, indicating that there may be value in evaluating participants' behavior at the subtask level. Further, the outcome metric (risk of infiltrating the simulated blood vessel) was successfully predicted by the process metrics in both phases. The implications of these results for skill assessment and training are discussed; these could potentially lead to improved patient outcomes if more extensive validation is pursued.
Affiliation(s)
- Ziyang Zhang: Department of Bioengineering, Clemson University, 301 Rhodes Research Center, Clemson, SC 29634, USA
- Zhanhe Liu: Department of Bioengineering, Clemson University, 301 Rhodes Research Center, Clemson, SC 29634, USA
- Ravikiran Singapogu: Department of Bioengineering, Clemson University, 301 Rhodes Research Center, Clemson, SC 29634, USA
22. da Silva MMO, Teixeira JMXN, Cavalcante PS, Teichrieb V. Perspectives on how to evaluate augmented reality technology tools for education: a systematic review. Journal of the Brazilian Computer Society 2019. [DOI: 10.1186/s13173-019-0084-8]
23. Miura S, Kawamura K, Kobayashi Y, Fujie MG. Using Brain Activation to Evaluate Arrangements Aiding Hand-Eye Coordination in Surgical Robot Systems. IEEE Trans Biomed Eng 2018; 66:2352-2361. [PMID: 30582521] [DOI: 10.1109/tbme.2018.2889316]
Abstract
GOAL To realize intuitive, minimally invasive surgery, surgical robots are often controlled using master-slave systems. However, the surgical robot's structure often differs from that of the human body, so the arrangement between the monitor and master must reflect this physical difference. In this study, we validate the feasibility of an embodiment evaluation method that determines the arrangement between the monitor and master. In our constructed cognitive model, the brain's intraparietal sulcus activates significantly when somatic and visual feedback match. Using this model, we validate a cognitively appropriate arrangement between the monitor and master. METHODS In experiments, we measure participants' brain activation using an imaging device as they control the virtual surgical simulator. Two experiments are carried out that vary the monitor and hand positions. CONCLUSION There are two common arrangements of the monitor and master at the brain activation's peak: One is placing the monitor behind the master, so the user feels that the system is an extension of his arms into the monitor; the other arranges the monitor in front of the master, so the user feels the correspondence between his own arm and the virtual arm in the monitor. SIGNIFICANCE From these results, we conclude that the arrangement between the monitor and master impacts embodiment, enabling the participant to feel apparent posture matches in master-slave surgical robot systems.
24. Automatic Global Level Set Approach for Lumbar Vertebrae CT Image Segmentation. BioMed Research International 2018; 2018:6319879. [PMID: 30402488] [PMCID: PMC6196995] [DOI: 10.1155/2018/6319879]
Abstract
Automatic segmentation of vertebrae in computed tomography (CT) images is an essential step for image-guided minimally invasive spine surgery. However, most state-of-the-art methods still require human intervention due to the inherent limitations of vertebral CT images, such as topological variation, irregular boundaries (double and weak boundaries), and image noise. This paper therefore presents an automatic global level set approach (AGLSA), which is capable of dealing with these issues for lumbar vertebrae CT image segmentation. Unlike traditional level set methods, we first propose an automatically initialized level set function (AILSF) that combines a hybrid morphological filter (HMF) and a Gaussian mixture model (GMM) to automatically generate a smooth initial contour that lies precisely adjacent to the object boundary. Secondly, a regularized level set formulation is introduced to overcome the weak-boundary leaking problem, utilizing the region correlation of histograms inside and outside the level set contour as a global term. Ultimately, a gradient vector flow (GVF) based edge-stopping function is employed to guarantee a fast convergence rate of the level set evolution while avoiding oversegmentation. Our proposed approach has been tested on 115 vertebral CT volumes from various patients. Quantitative comparisons validate that our proposed AGLSA is more accurate in segmenting lumbar vertebrae CT images with irregular boundaries and more robust to various levels of salt-and-pepper noise.
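For readers unfamiliar with this family of methods, the sketch below illustrates one way a GMM-based automatic initialization step could look for a CT slice, using scikit-learn and SciPy. It is a simplified stand-in under our own assumptions, not the authors' AILSF, which additionally uses a hybrid morphological filter and feeds a regularized, GVF-driven level set evolution with a histogram-based global term.

```python
import numpy as np
from scipy import ndimage
from sklearn.mixture import GaussianMixture

def initial_contour_mask(ct_slice: np.ndarray, n_components: int = 3) -> np.ndarray:
    """Rough automatic initialization for a level set on a lumbar CT slice:
    cluster voxel intensities with a GMM, keep the brightest component
    (bone is hyperdense in CT), then clean the mask morphologically."""
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    labels = gmm.fit_predict(ct_slice.reshape(-1, 1)).reshape(ct_slice.shape)
    bone_label = int(np.argmax(gmm.means_.ravel()))    # brightest intensity cluster
    mask = labels == bone_label
    mask = ndimage.binary_opening(mask, iterations=2)   # remove speckle
    mask = ndimage.binary_closing(mask, iterations=2)   # bridge small gaps
    return ndimage.binary_fill_holes(mask)              # solid region to seed the contour

# Example with a synthetic slice (a bright disc on a darker, noisy background):
yy, xx = np.mgrid[0:128, 0:128]
slice_hu = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2, 400.0, 40.0)
slice_hu += np.random.default_rng(0).normal(0, 20, slice_hu.shape)
print(initial_contour_mask(slice_hu).sum(), "voxels in the initial mask")
```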
25. Pfandler M, Lazarovici M, Stefan P, Wucherer P, Weigl M. Virtual reality-based simulators for spine surgery: a systematic review. Spine J 2017; 17:1352-1363. [PMID: 28571789] [DOI: 10.1016/j.spinee.2017.05.016]
Abstract
BACKGROUND CONTEXT Virtual reality (VR)-based simulators offer numerous benefits and are very useful in assessing and training surgical skills. Virtual reality-based simulators are standard in some surgical subspecialties, but their actual use in spinal surgery remains unclear. Currently, only technical reviews of VR-based simulators are available for spinal surgery. PURPOSE Thus, we performed a systematic review that examined the existing research on VR-based simulators in spinal procedures. We also assessed the quality of current studies evaluating VR-based training in spinal surgery. Moreover, we wanted to provide a guide for future studies evaluating VR-based simulators in this field. STUDY DESIGN AND SETTING This is a systematic review of the current scientific literature regarding VR-based simulation in spinal surgery. METHODS Five data sources were systematically searched to identify relevant peer-reviewed articles regarding virtual, mixed, or augmented reality-based simulators in spinal surgery. A qualitative data synthesis was performed with particular attention to evaluation approaches and outcomes. Additionally, all included studies were appraised for their quality using the Medical Education Research Study Quality Instrument (MERSQI) tool. RESULTS The initial review identified 476 abstracts and 63 full texts were then assessed by two reviewers. Finally, 19 studies that examined simulators for the following procedures were selected: pedicle screw placement, vertebroplasty, posterior cervical laminectomy and foraminotomy, lumbar puncture, facet joint injection, and spinal needle insertion and placement. These studies had a low-to-medium methodological quality with a MERSQI mean score of 11.47 out of 18 (standard deviation=1.81). CONCLUSIONS This review described the current state and applications of VR-based simulator training and assessment approaches in spinal procedures. Limitations, strengths, and future advancements of VR-based simulators for training and assessment in spinal surgery were explored. Higher-quality studies with patient-related outcome measures are needed. To establish further adaptation of VR-based simulators in spinal surgery, future evaluations need to improve the study quality, apply long-term study designs, and examine non-technical skills, as well as multidisciplinary team training.
Affiliation(s)
- Michael Pfandler: Institute and Outpatient Clinic for Occupational, Social, and Environmental Medicine, Ludwig-Maximilians-University Munich, Ziemssenstrasse 1, Munich D-80336, Germany
- Marc Lazarovici: Institute for Emergency Medicine and Management in Medicine (INM), Ludwig-Maximilians-University Munich, Schillerstraße 53, Munich D-80336, Germany
- Philipp Stefan: Computer Aided Medical Procedures (CAMP), Computer Science Department (I-16), Technical University of Munich, Boltzmannstraße 3, Garching bei München D-85748, Germany
- Patrick Wucherer: Computer Aided Medical Procedures (CAMP), Computer Science Department (I-16), Technical University of Munich, Boltzmannstraße 3, Garching bei München D-85748, Germany
- Matthias Weigl: Institute and Outpatient Clinic for Occupational, Social, and Environmental Medicine, Ludwig-Maximilians-University Munich, Ziemssenstrasse 1, Munich D-80336, Germany
26. Morgan M, Aydin A, Salih A, Robati S, Ahmed K. Current Status of Simulation-based Training Tools in Orthopedic Surgery: A Systematic Review. Journal of Surgical Education 2017; 74:698-716. [PMID: 28188003] [DOI: 10.1016/j.jsurg.2017.01.005]
Abstract
OBJECTIVE To conduct a systematic review of orthopedic training and assessment simulators with reference to their level of evidence (LoE) and level of recommendation. DESIGN Medline and EMBASE library databases were searched for English language articles published between 1980 and 2016, describing orthopedic simulators or validation studies of these models. All studies were assessed for LoE, and each model was subsequently awarded a level of recommendation using a modified Oxford Centre for Evidence-Based Medicine classification, adapted for education. RESULTS A total of 76 articles describing orthopedic simulators met the inclusion criteria, 47 of which described at least 1 validation study. The most commonly identified models (n = 34) and validation studies (n = 26) were for knee arthroscopy. Construct validation was the most frequent validation study attempted by authors. In all, 62% (47 of 76) of the simulator studies described arthroscopy simulators, which also contained validation studies with the highest LoE. CONCLUSIONS Orthopedic simulators are increasingly being subjected to validation studies, although the LoE of such studies generally remain low. There remains a lack of focus on nontechnical skills and on cost analyses of orthopedic simulators.
Affiliation(s)
- Michael Morgan: School of Medicine, King's College London, London, United Kingdom
- Abdullatif Aydin: MRC Centre for Transplantation, Guy's Hospital, King's College London, London, United Kingdom
- Alan Salih: Department of Orthopedic Surgery, East Sussex Healthcare NHS Trust, Eastbourne, United Kingdom
- Shibby Robati: Department of Orthopedic Surgery, East Sussex Healthcare NHS Trust, Eastbourne, United Kingdom
- Kamran Ahmed: MRC Centre for Transplantation, Guy's Hospital, King's College London, London, United Kingdom
27
|
Scholten HJ, Pourtaherian A, Mihajlovic N, Korsten HHM, Bouwman RA. Improving needle tip identification during ultrasound-guided procedures in anaesthetic practice. Anaesthesia 2017; 72:889-904. [DOI: 10.1111/anae.13921] [Citation(s) in RCA: 45] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/23/2017] [Indexed: 12/16/2022]
Affiliation(s)
- H. J. Scholten
- Department of Anaesthesiology; Intensive Care and Pain Medicine; Catharina Hospital; Eindhoven the Netherlands
| | - A. Pourtaherian
- Department of Electrical Engineering; Eindhoven University of Technology; Eindhoven the Netherlands
| | | | - H. H. M. Korsten
- Department of Anaesthesiology; Intensive Care and Pain Medicine; Catharina Hospital; Eindhoven the Netherlands
- Department of Electrical Engineering; Eindhoven University of Technology; Eindhoven the Netherlands
| | - R. A. Bouwman
- Department of Anaesthesiology; Intensive Care and Pain Medicine; Catharina Hospital; Eindhoven the Netherlands
- Department of Electrical Engineering; Eindhoven University of Technology; Eindhoven the Netherlands
| |
Collapse
|
28
|
Fischer-Cartlidge E, Romanoff S, Thom B, Burrows Walters C. Comparing Self-Injection Teaching Strategies for Patients With Breast Cancer and Their Caregivers: A Pilot Study. Clin J Oncol Nurs 2016; 20:515-21. [DOI: 10.1188/16.cjon.515-521] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
|
29
|
Barsom EZ, Graafland M, Schijven MP. Systematic review on the effectiveness of augmented reality applications in medical training. Surg Endosc 2016; 30:4174-83. [PMID: 26905573 PMCID: PMC5009168 DOI: 10.1007/s00464-016-4800-6] [Citation(s) in RCA: 214] [Impact Index Per Article: 23.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/03/2016] [Accepted: 02/03/2016] [Indexed: 12/16/2022]
Abstract
Background Computer-based applications are increasingly used to support the training of medical professionals. Augmented reality applications (ARAs) render an interactive virtual layer on top of reality. The use of ARAs is of real interest to medical education because such applications blend digital elements with the physical learning environment, creating new educational opportunities. The aim of this systematic review is to investigate to what extent augmented reality applications are currently used to validly support the training of medical professionals. Methods PubMed, Embase, INSPEC and PsychInfo were searched using predefined inclusion criteria for relevant articles up to August 2015. All study types were considered eligible. Articles concerning AR applications used to train or educate medical professionals were evaluated. Results Twenty-seven studies were found relevant, describing a total of seven augmented reality applications. Applications were assigned to three different categories: the first directed toward laparoscopic surgical training, the second toward mixed reality training of neurosurgical procedures and the third toward training echocardiography. Statistical pooling of data could not be performed due to heterogeneity of study designs. Face-, construct- and concurrent validity were demonstrated for two applications directed at laparoscopic training, face- and construct validity for neurosurgical procedures and face-, content- and construct validity for echocardiography training. In the literature, none of the ARAs completed a full validation process for their intended purpose of use. Conclusion Augmented reality applications that support blended learning in medical training have gained public and scientific interest. In order to be of value, applications must be able to transfer information to the user. Although promising, the literature to date lacks the evidence to support this.
Collapse
Affiliation(s)
- E Z Barsom
- Department of Surgery, Academic Medical Centre, PO Box 22660, 1100 DD, Amsterdam, The Netherlands
| | - M Graafland
- Department of Surgery, Academic Medical Centre, PO Box 22660, 1100 DD, Amsterdam, The Netherlands.,Department of Surgery, Flevo Hospital, Almere, The Netherlands
| | - M P Schijven
- Department of Surgery, Academic Medical Centre, PO Box 22660, 1100 DD, Amsterdam, The Netherlands.
| |
Collapse
|
30
|
Stockinger H. Consumers’ Perception of Augmented Reality as an Emerging End-User Technology: Social Media Monitoring Applied. KÜNSTLICHE INTELLIGENZ 2015. [DOI: 10.1007/s13218-015-0389-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
|
31
|
Computerized training system for ultrasound-guided lumbar puncture on abnormal spine models: a randomized controlled trial. Can J Anaesth 2015; 62:777-84. [DOI: 10.1007/s12630-015-0367-2] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/08/2014] [Accepted: 03/18/2015] [Indexed: 01/22/2023] Open
|
32
|
Clinkard D, Moult E, Holden M, Davison C, Ungi T, Fichtinger G, McGraw R. Assessment of lumbar puncture skill in experts and nonexperts using checklists and quantitative tracking of needle trajectories: implications for competency-based medical education. TEACHING AND LEARNING IN MEDICINE 2015; 27:51-56. [PMID: 25584471 DOI: 10.1080/10401334.2014.979184] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/04/2023]
Abstract
CONSTRUCT: With the current shift toward competency-based education, rigorous assessment tools are needed for procedurally based tasks. BACKGROUND Multiple tools exist to evaluate procedural skills, each with specific weaknesses. APPROACH We sought to determine whether quantitative needle tracking could be used as a measure of lumbar puncture (LP) performance and whether it added discriminatory value to a dichotomous checklist. Thirty-two medical students were divided into 2 groups. One group was asked to practice an LP once (single practice [SP]) and the other 5 times (multiple practice [MP]). Experts (attending ER physicians, senior ER residents, and a junior anesthesia resident) were used as comparators. Medical students were reassessed at 1 month to evaluate skill retention. Groups were assessed while performing an LP using both a dichotomous checklist and an electromagnetic tracking device that captures the needle's 3-dimensional movements for analysis. RESULTS Quantitative needle metrics as assessed by electromagnetic tracking showed a decreasing trend in needle movement distance with practice and with experience. The SP group made significantly more checklist mistakes initially as compared to the MP group (1.2 vs. 0.3, p <.05). At 1 month, there was a significant increase in both groups' mistakes (SP 3.4 vs. MP 1.3, p =.01). No correlation existed between individuals' needle motion and checklist mistakes. CONCLUSIONS These findings suggest that quantitative needle tracking identifies students who struggle with needle insertion but are successful at completing the dichotomous checklist.
Collapse
Affiliation(s)
- David Clinkard
- a Department of Emergency Medicine , Queen's University , Kingston , Ontario , Canada
| | | | | | | | | | | | | |
Collapse
|
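As a rough illustration of the kind of quantitative needle metric described in entry 32 above (total needle movement distance derived from tracked 3-D positions), here is a minimal Python sketch. The trajectory values and sampling are hypothetical; the study's electromagnetic tracker output and analysis pipeline are not reproduced.

```python
# Minimal sketch: total needle-tip path length from tracked 3-D positions,
# one simple quantitative metric of needle motion. Positions are hypothetical
# samples in millimetres; no real tracker output is parsed here.
import numpy as np

def needle_path_length(positions: np.ndarray) -> float:
    """Sum of Euclidean distances between consecutive tracked tip positions."""
    steps = np.diff(positions, axis=0)            # (N-1, 3) displacement vectors
    return float(np.linalg.norm(steps, axis=1).sum())

# Illustrative trajectory: advance, small corrective wobble, further advance.
trajectory = np.array([
    [0.0, 0.0, 0.0],
    [0.0, 0.0, 10.0],
    [1.0, 0.5, 12.0],
    [0.0, 0.0, 20.0],
])
print(f"Needle path length: {needle_path_length(trajectory):.1f} mm")
```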
33
|
Ungi T, Lasso A, Fichtinger G. Tracked Ultrasound in Navigated Spine Interventions. SPINAL IMAGING AND IMAGE ANALYSIS 2015. [DOI: 10.1007/978-3-319-12508-4_15] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/24/2022]
|
34
|
Cornelis F, Takaki H, Laskhmanan M, Durack JC, Erinjeri JP, Getrajdman GI, Maybody M, Sofocleous CT, Solomon SB, Srimathveeravalli G. Comparison of CT Fluoroscopy-Guided Manual and CT-Guided Robotic Positioning System for In Vivo Needle Placements in Swine Liver. Cardiovasc Intervent Radiol 2014; 38:1252-60. [PMID: 25376924 DOI: 10.1007/s00270-014-1016-9] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/25/2014] [Accepted: 09/08/2014] [Indexed: 01/08/2023]
Abstract
PURPOSE To compare CT fluoroscopy-guided manual and CT-guided robotic positioning system (RPS)-assisted needle placement by experienced IR physicians to targets in swine liver. MATERIALS AND METHODS Manual and RPS-assisted needle placement was performed by six experienced IR physicians to four 5 mm fiducial seeds placed in swine liver (n = 6). Placement performance was assessed for accuracy, procedure time, number of confirmatory scans, needle manipulations, and procedure radiation dose. Intra-modality difference in performance for each physician was assessed using the paired t test. Inter-physician performance variation for each modality was analyzed using the Kruskal-Wallis test. RESULTS Paired comparison of manual and RPS-assisted placements to a target by the same physician indicated that accuracy outcomes were not statistically different (manual: 4.53 mm; RPS: 4.66 mm; p = 0.41), but manual placement resulted in higher total radiation dose (manual: 1075.77 mGy/cm; RPS: 636.4 mGy/cm; p = 0.03) and required more confirmation scans (manual: 6.6; RPS: 1.6; p < 0.0001) and needle manipulations (manual: 4.6; RPS: 0.4; p < 0.0001). Procedure time for RPS was longer than for manual placement (manual: 6.12 min; RPS: 9.7 min; p = 0.0003). Comparison of inter-physician performance during manual placement indicated significant differences in the time taken to complete placements (p = 0.008) and number of repositions (p = 0.04) but not in other study measures (p > 0.05). Comparison of inter-physician performance during RPS-assisted placement suggested statistically significant differences in procedure time (p = 0.02) but not in other study measures (p > 0.05). CONCLUSIONS CT-guided RPS-assisted needle placement reduced radiation dose, number of confirmatory scans, and needle manipulations when compared to manual needle placement by experienced IR physicians, with equivalent accuracy.
Collapse
Affiliation(s)
- F Cornelis
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA.,Department of Radiology, Pellegrin Hospital, Place Amélie Raba Léon, 33076, Bordeaux, France
| | - H Takaki
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - M Laskhmanan
- Perfint Healthcare Inc, Chennai, Tamil Nadu, India
| | - J C Durack
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - J P Erinjeri
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - G I Getrajdman
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - M Maybody
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - C T Sofocleous
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - S B Solomon
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA
| | - G Srimathveeravalli
- Interventional Radiology Service, Department of Radiology, Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY, 10065, USA.
| |
Collapse
|
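Entry 34 compares modalities with a paired t test (within physician) and the Kruskal-Wallis test (across physicians). The sketch below shows how such comparisons are typically run with SciPy on hypothetical accuracy and procedure-time values; it illustrates the named tests only and is not the study's analysis or data.

```python
# Minimal sketch of the two tests named in the abstract, run on hypothetical
# data: a paired t test for the within-physician manual vs. RPS comparison,
# and a Kruskal-Wallis test for variation across physicians in one modality.
import numpy as np
from scipy import stats

# Hypothetical per-physician placement accuracy (mm), paired by physician.
manual_mm = np.array([4.1, 4.8, 4.3, 5.0, 4.6, 4.4])
rps_mm    = np.array([4.5, 4.9, 4.2, 5.1, 4.8, 4.5])
t_stat, p_paired = stats.ttest_rel(manual_mm, rps_mm)
print(f"Paired t test: t = {t_stat:.2f}, p = {p_paired:.3f}")

# Hypothetical procedure times (min) grouped by physician, one modality.
times_by_physician = [
    [5.5, 6.0, 6.4, 5.8],
    [7.1, 6.8, 7.3, 7.0],
    [5.9, 6.2, 6.0, 6.3],
]
h_stat, p_kw = stats.kruskal(*times_by_physician)
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")
```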
35
|
Rasoulian A, Rohling R, Abolmaesumi P. Lumbar spine segmentation using a statistical multi-vertebrae anatomical shape+pose model. IEEE TRANSACTIONS ON MEDICAL IMAGING 2013; 32:1890-1900. [PMID: 23771318 DOI: 10.1109/tmi.2013.2268424] [Citation(s) in RCA: 75] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
Segmentation of the spinal column from computed tomography (CT) images is a preprocessing step for a range of image-guided interventions. One intervention that would benefit from accurate segmentation is spinal needle injection. Previous spinal segmentation techniques have primarily focused on identification and separate segmentation of each vertebra. Recently, statistical multi-object shape models have been introduced to extract common statistical characteristics across several anatomies. These models can be used for segmentation purposes because they are robust, accurate, and computationally tractable. In this paper, we develop a statistical multi-vertebrae shape+pose model and propose a novel registration-based technique to segment CT images of the spine. The multi-vertebrae statistical model captures the variations in shape and pose simultaneously, which reduces the number of registration parameters. We validate our technique in terms of accuracy and robustness of multi-vertebrae segmentation of CT images acquired from the lumbar vertebrae of 32 subjects. The mean error of the proposed technique is below 2 mm, which is sufficient for many spinal needle injection procedures, such as facet joint injections.
Collapse
|
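Entry 35 builds on the idea of a statistical shape model: principal modes of variation learned from aligned training shapes, with a new instance expressed as the mean shape plus a small number of weighted modes. The Python sketch below illustrates only that core PCA step on synthetic landmark data; the paper's combined shape+pose model and its registration to CT are substantially more involved.

```python
# Minimal sketch of a PCA-based statistical shape model on synthetic data:
# learn principal modes of variation from aligned training shapes, then
# reconstruct a shape as mean + weighted modes. This is only the core idea;
# the shape+pose model and CT registration in the paper are far richer.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 20 aligned shapes, 100 landmarks each in 3-D,
# flattened to vectors of length 300.
shapes = rng.normal(size=(20, 300))

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape

# Principal modes of variation via SVD of the centered data matrix.
_, singular_values, modes = np.linalg.svd(centered, full_matrices=False)

k = 5                                            # number of retained modes
weights = centered[0] @ modes[:k].T              # project one shape onto the modes
reconstruction = mean_shape + weights @ modes[:k]

rel_err = np.linalg.norm(reconstruction - shapes[0]) / np.linalg.norm(shapes[0])
print(f"Relative reconstruction error with {k} modes: {rel_err:.2f}")
```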
36
|
Fedorov A, Beichel R, Kalpathy-Cramer J, Finet J, Fillion-Robin JC, Pujol S, Bauer C, Jennings D, Fennessy F, Sonka M, Buatti J, Aylward S, Miller JV, Pieper S, Kikinis R. 3D Slicer as an image computing platform for the Quantitative Imaging Network. Magn Reson Imaging 2012. [DOI: 10.1016/j.mri.2012.05.001] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 09/30/2022]
|