1. Uchida M, Matsumoto Y, Morioka S, Hori R, Mizutari K. Efficacy of Image-Guided Percutaneous Endoscopic Ear Surgery: A Novel Augmented Reality-Assisted Minimally Invasive Surgery. Otol Neurotol 2025. PMID: 40075251. DOI: 10.1097/mao.0000000000004488.
Abstract
OBJECTIVE Although transcanal endoscopic ear surgery (TEES) offers the benefit of minimal invasiveness, it is difficult to access certain regions of the temporal bone, often necessitating a switch to more invasive methods, such as mastoidectomy. To overcome these challenges, we developed "image-guided percutaneous endoscopic ear surgery (IGPEES)," a novel technique designed to enhance the precision and safety of ear operations by integrating augmented reality (AR) and advanced navigation systems, allowing precise, minimally invasive access to the mastoid antrum and other difficult-to-reach areas. This study aimed to evaluate the efficacy and safety of IGPEES through a retrospective analysis of 11 cases. STUDY DESIGN Retrospective analysis. SETTING Tertiary referral center. MAIN OUTCOME MEASURES We analyzed navigation setup time, navigation accuracy, and complication rates for IGPEES. RESULTS IGPEES reduced setup time and enhanced navigation accuracy compared with conventional surgical navigation systems, with no postoperative complications observed, thereby representing a promising advancement in otologic surgery. CONCLUSION The integration of AR into IGPEES facilitates better surgical visualization and precision, potentially transforming standard practices for treating complex ear conditions.
Affiliation(s)
- Masaya Uchida
- Department of Otorhinolaryngology Tracheoesophageal Surgery, Japanese Red Cross Kyoto Daini Hospital, Kyoto, Japan
- Yu Matsumoto
- Department of Otorhinolaryngology, Tokyo Metropolitan Police Hospital, Tokyo, Japan
- Shigefumi Morioka
- Department of Otorhinolaryngology Tracheoesophageal Surgery, Japanese Red Cross Kyoto Daini Hospital, Kyoto, Japan
- Ryusuke Hori
- Department of Otolaryngology-Head and Neck Surgery, University of Occupational and Environmental Health, Japan, Fukuoka, Japan
- Kunio Mizutari
- Department of Otolaryngology, Tokyo Women's Medical University Adachi Medical Center, Tokyo, Japan
2. Lin H, Huang X, Sheng Y, Tang N, Lian H, Zhang W, Zhao L, Zhu H, Chang P, Guo Y. Intelligent Verification Tool for Surgical Information of Ophthalmic Patients: A Study Based on Artificial Intelligence Technology. J Patient Saf 2025;21:62-68. PMID: 39432546. PMCID: PMC11832179. DOI: 10.1097/pts.0000000000001295.
Abstract
OBJECTIVE With the development of day surgery, the "short, frequent, and fast" character of ophthalmic surgery has become more pronounced. However, nurses are not efficient at verifying patients' surgical information, and problems such as patient privacy leakage are becoming more common. To improve this situation, we developed a new augmented reality (AR)-based tool for visual recognition and artificial intelligence (AI) interpretation of the pattern and location of patient surgical skin markings, used to verify the correct surgical site and procedure. The tool can also display a variety of other verbally requested patient information. The purpose of this study was to evaluate its feasibility for use by surgical nurses in a real clinical setting. METHODS We developed a tool that uses image recognition technology to interpret patient surgical skin markings and match the information obtained with the patient's surgical records, thereby verifying the patient's surgical information. Verification includes the proper surgical site and the type of procedure to be performed. Nurses can interact with the device through its speech recognition capabilities, and the device provides them with a variety of other requested patient information via a heads-up display. Three hundred patients in an outpatient ophthalmology clinic were divided into an AR intelligent verification experimental group and a manual verification control group. The accuracy of information verification, work time consumption, and economic cost were compared between the 2 groups to evaluate the effectiveness of the AR Surgical Information Intelligent Verification Tool in clinical patient surgical information verification. RESULTS There was no statistically significant difference in the correct rates of patient surgical information review between the experimental group (95.33%) and the control group (98.67%) (χ2 = 2.934, P = 0.087). The median time for information verification was 10.00 (10.00, 11.00) seconds in the experimental group and 21.00 (20.00, 24.00) seconds in the control group, a statistically significant difference (Z = 0.000, P < 0.001); the experimental group saved 11 seconds per patient per review compared with the control group. Based on 10,531 surgeries in 2023, with 1 page of surgical information printed per 9 patients and 4 copies required, 4680 pages of printing paper could be saved. CONCLUSIONS The AR Surgical Information Intelligent Verification Tool has advantages in assisting medical staff with patient surgical information verification, improving nursing efficiency, preventing surgical marking errors or nonstandard markings, protecting patient privacy, and saving costs. It has research and application value for patient surgical information verification in the ophthalmic day ward.
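A note on the verification logic described above: the essential step is matching the recognized marking (site laterality and procedure label) against the scheduled surgical record. The following minimal Python sketch illustrates that matching step only; it is not the authors' implementation, and the record fields, the label format, and the assumption that a vision module supplies the recognized values are all hypothetical.

from dataclasses import dataclass

@dataclass
class SurgicalRecord:
    patient_id: str
    eye: str          # scheduled side, e.g. "OD" (right) or "OS" (left)
    procedure: str    # scheduled procedure description

def verify(recognized_side: str, recognized_label: str, record: SurgicalRecord) -> bool:
    """Return True only if the detected skin marking matches the scheduled site and procedure."""
    side_ok = recognized_side.strip().upper() == record.eye.strip().upper()
    procedure_ok = recognized_label.strip().lower() in record.procedure.lower()
    return side_ok and procedure_ok

# Hypothetical usage: the two recognized values would come from the AR device's image recognition.
record = SurgicalRecord(patient_id="P001", eye="OD", procedure="phacoemulsification with IOL")
print(verify("OD", "phacoemulsification", record))  # True: proceed; False: alert the nurse

In a real deployment a mismatch would trigger an on-display alert rather than a printed value; the point here is only that, once the marking is recognized, verification reduces to a deterministic comparison against the record.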
Affiliation(s)
- Hui Lin, Xiaofang Huang, Yaying Sheng, Ning Tang, Hengli Lian, Wenjie Zhang, Lvjun Zhao, Hanqing Zhu, Pingjun Chang, Yingxuan Guo
- National Engineering Research Center of Ophthalmology and Optometry, Eye Hospital, Wenzhou Medical University, Wenzhou, China
- National Clinical Research Center for Ocular Diseases, Eye Hospital, Wenzhou Medical University, Wenzhou, China
3. Shepard L, Im C, Li O, Schuler N, Holler T, Saxton A, Ghazi A. Comparison of Remote Mixed Reality Versus In-person Training of Ultrasound-guided Percutaneous Nephrolithotomy With Urological Residents. Urology 2025. PMID: 39892573. DOI: 10.1016/j.urology.2025.01.059.
Abstract
OBJECTIVE To compare remote with in-person (IP) training for ultrasound-guided percutaneous nephrolithotomy (US-PCNL), we used mixed reality (MR) smart glasses and a validated hydrogel simulation model to assess surgical education uptake among urology residents. METHODS Twelve urology residents were randomized into 2 learning groups (MR-first or in-person-first) and completed a pre-test, MR- or IP-guided training sessions, a mid-test, cross-over into MR- or IP-guided training, a post-test, and a retention test 2 months post-training to evaluate upper and lower pole access and balloon dilatation. MR sessions utilized Vuzix smart glasses and HelpLightning MR software to allow the instructor to remotely instruct the student and share their first-person perspective. Performance was assessed against a checklist of metrics including attempts for access, procedure time, and a model-specific checklist. Learner perspectives were assessed after each training session using 5-point Likert scales and open-form comments. RESULTS Overall attempt scores improved significantly for lower pole (LP) and upper pole (UP) procedures from pre- to mid-test and pre- to post-test (LP: 50% vs 74.5%, P=.0019; 50% vs 82.8%, P=.001; UP: 43.6% vs 63.75%, P=.0002; 43.6% vs 70.5%, P=3.6e-05) in both MR and IP cohorts. Learner evaluations suggested that the majority still prefer IP instruction, citing technological difficulties. CONCLUSION MR-based remote learning is equally effective compared with in-person instruction for US-PCNL.
Affiliation(s)
- Lauren Shepard
- Johns Hopkins Brady Institute of Urologic Surgery, Baltimore, MD
- Carolyn Im
- Johns Hopkins Brady Institute of Urologic Surgery, Baltimore, MD
- Oscar Li
- Johns Hopkins Brady Institute of Urologic Surgery, Baltimore, MD
- Nathan Schuler
- Johns Hopkins Brady Institute of Urologic Surgery, Baltimore, MD
- Tyler Holler
- Department of Urology, University of Rochester Medical Center, Rochester, NY
- Aaron Saxton
- Department of Urology, University of Rochester Medical Center, Rochester, NY
- Ahmed Ghazi
- Johns Hopkins Brady Institute of Urologic Surgery, Baltimore, MD
4. Hamza H, Al-Ansari A, Navkar NV. Technologies Used for Telementoring in Open Surgery: A Scoping Review. Telemed J E Health 2024;30:1810-1824. PMID: 38546446. DOI: 10.1089/tmj.2023.0669.
Abstract
Background: Telementoring technologies enable a remote mentor to guide a mentee in real time during surgical procedures. This addresses challenges such as lack of expertise and limited surgical training/education opportunities in remote locations. This review aims to provide a comprehensive account of these technologies tailored for open surgery. Methods: A comprehensive scoping review of the scientific literature was conducted using the PubMed, ScienceDirect, ACM Digital Library, and IEEE Xplore databases. Broad and inclusive searches were done to identify articles reporting telementoring or teleguidance technologies in open surgery. Results: Screening of the search results yielded 43 articles describing surgical telementoring for the open approach. The studies were categorized based on the type of open surgery (surgical specialty, surgical procedure, and stage of clinical trial), the telementoring technology used (information transferred between mentor and mentee, devices used for rendering the information), and assessment of the technology (experience level of mentor and mentee, study design, and assessment criteria). The majority of the telementoring technologies focused on trauma-related surgeries, and mixed reality headsets were commonly used for rendering information (telestrations, surgical tools, or hand gestures) to the mentee. These technologies were primarily assessed on high-fidelity synthetic phantoms. Conclusions: Despite longer operative times, these telementoring technologies demonstrated clinical viability during open surgeries through improved performance and confidence of the mentee. In general, the use of immersive devices and annotations appears promising, although further clinical trials will be required to thoroughly assess their benefits.
Affiliation(s)
- Hawa Hamza
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Nikhil V Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
5. Huang C, Sheng Y, Lian H, Zhang W, Lin H, Huang X, Tang N, Zhao L, Guo Y. AR-AI assisted ophthalmic nursing: Preliminary usability study in clinical settings. Digit Health 2024;10:20552076241269470. PMID: 39257872. PMCID: PMC11384517. DOI: 10.1177/20552076241269470.
Abstract
Objective Ophthalmic ward nursing work is onerous and busy, and many researchers have tried to introduce artificial intelligence (AI) technology to assist nurses in performing nursing tasks. This study aims to use augmented reality (AR) and AI technology to develop an intelligent assistant system for ophthalmic ward nurses and to evaluate the usability and acceptability of the system in assisting nurses' clinical work. Methods Building on AR technology within a deep learning framework, the system's management, functions, and interfaces were implemented using acoustic recognition, voice interaction, and image recognition technologies. The result was an intelligent assistance system with functions such as patient face recognition, automatic information matching, and nursing work management. Ophthalmic day ward nurses were invited to complete the System Usability Scale (SUS), with the AR-based intelligent assistance system (AR-IAS) as the experimental group and the existing personal digital assistant (PDA) system as the control group. The scores on the three subscales of the usability scale (learnability, efficiency, and satisfaction) were compared, and the clinical usability score of the AR-IAS system was calculated. Results This study showed that the AR-IAS and the PDA systems had learnability subscale scores of 22.50/30.00 and 21.00/30.00, respectively; efficiency subscale scores of 29.67/40.00 and 28.67/40.00, respectively; and satisfaction subscale scores of 23.67/30.00 and 23.17/30.00, respectively. The overall usability score of the AR-IAS system was 75.83/100.00. Conclusion Based on the System Usability Scale results, the AR-IAS system developed using AR and AI technology has good overall usability and can be accepted by clinical nurses. It is suitable for use in ophthalmic nursing tasks and merits clinical promotion and further research.
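For context on the scores reported above: a standard System Usability Scale administration has 10 five-point items and yields an overall score from 0 to 100, which is how the 75.83/100.00 figure should be read; the learnability/efficiency/satisfaction split out of 30/40/30 appears to be this study's own adaptation rather than the standard SUS subscales. The conventional overall score, with s_i denoting the 1-5 response to item i (odd items positively worded, even items negatively worded), is

\mathrm{SUS} = 2.5\left[\sum_{i\ \mathrm{odd}}(s_i - 1) + \sum_{i\ \mathrm{even}}(5 - s_i)\right]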
Affiliation(s)
- Changke Huang, Yaying Sheng, Hengli Lian, Wenjie Zhang, Hui Lin, Xiaofang Huang, Ning Tang, Lvjun Zhao, Yingxuan Guo
- National Engineering Research Center of Ophthalmology and Optometry, Eye Hospital, Wenzhou Medical University, Wenzhou, China
6. Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023;23:6202. PMID: 37448050. DOI: 10.3390/s23136202.
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the focus of most devices remains on improving end-effector dexterity and precision, as well as on improving access for minimally invasive surgery. This paper aims to provide a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions for increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We attempt to outline the shortcomings in current optimization algorithms for surgical robots (such as YOLO and LSTM) whilst providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis
- School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
- Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
7. Khoong YM, Luo S, Huang X, Li M, Gu S, Jiang T, Liang H, Liu Y, Zan T. The application of augmented reality in plastic surgery training and education: A narrative review. J Plast Reconstr Aesthet Surg 2023;82:255-263. PMID: 37207439. DOI: 10.1016/j.bjps.2023.04.033.
Abstract
Continuing problems with fewer training opportunities and a greater awareness of patient safety have led to a constant search for an alternative technique to bridge the existing theory-practice gap in plastic surgery training and education. The current COVID-19 epidemic has aggravated the situation, making it urgent to implement breakthrough technological initiatives currently underway to improve surgical education. The cutting edge of technological development, augmented reality (AR), has already been applied in numerous facets of plastic surgery training, and it is capable of realizing the aims of education and training in this field. In this article, we will take a look at some of the most important ways that AR is now being used in plastic surgery education and training, as well as offer an exciting glimpse into the potential future of this field thanks to technological advancements.
Affiliation(s)
- Yi Min Khoong, Shenying Luo, Xin Huang, Minxiong Li, Shuchen Gu, Taoran Jiang, Hsin Liang, Yunhan Liu, Tao Zan
- Department of Plastic and Reconstructive Surgery, Shanghai Ninth People's Hospital, Shanghai JiaoTong University School of Medicine, Shanghai, PR China
8. Suresh D, Aydin A, James S, Ahmed K, Dasgupta P. The Role of Augmented Reality in Surgical Training: A Systematic Review. Surg Innov 2023;30:366-382. PMID: 36412148. PMCID: PMC10331622. DOI: 10.1177/15533506221140506.
Abstract
This review aims to provide an update on the role of augmented reality (AR) in surgical training and to investigate whether the use of AR improves performance measures compared with traditional approaches in surgical trainees. PubMed, EMBASE, Google Scholar, Cochrane Library, British Library and Science Direct were searched following PRISMA guidelines. All English-language original studies pertaining to AR in surgical training were eligible for inclusion. Qualitative analysis was performed and results were categorised according to simulator models, subsequently being evaluated using Messick's framework for validity and McGaghie's translational outcomes for simulation-based learning. Of the 1132 results retrieved, 45 were included in the study. 29 platforms were identified, with the highest 'level of effectiveness' recorded as 3. In terms of validity parameters, 10 AR models received a strong 'content validity' score of 2, and 15 models had a 'response processes' score ≥ 1. 'Internal structure' and 'consequences' were largely not discussed. 'Relations to other variables' was the best assessed criterion, with 9 platforms achieving a high score of 2. Overall, the Microsoft HoloLens received the highest level of recommendation for both validity and level of effectiveness. Augmented reality in surgical education is feasible and effective as an adjunct to traditional training. The Microsoft HoloLens has shown the most promising results across all parameters and produced improved performance measures in surgical trainees. For the other simulator models, further research with stronger study designs is required in order to validate the use of AR in surgical training.
Affiliation(s)
- Dhivya Suresh
- Guy’s, King’s and St Thomas’ School of Medical Education, King’s College London, London, UK
- Abdullatif Aydin
- MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
- Stuart James
- Department of General Surgery, Princess Royal University Hospital, London, UK
- Kamran Ahmed
- MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
- Prokar Dasgupta
- MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
9. Curran VR, Xu X, Aydin MY, Meruvia-Pastor O. Use of Extended Reality in Medical Education: An Integrative Review. Med Sci Educ 2023;33:275-286. PMID: 36569366. PMCID: PMC9761044. DOI: 10.1007/s40670-022-01698-4.
Abstract
Extended reality (XR) has emerged as an innovative simulation-based learning modality. An integrative review was undertaken to explore the nature of evidence, usage, and effectiveness of XR modalities in medical education. One hundred and thirty-three (N = 133) studies and articles were reviewed. XR technologies are commonly reported in surgical and anatomical education, and the evidence suggests XR may be as effective as traditional medical education teaching methods and, potentially, a more cost-effective means of curriculum delivery. Further research comparing different variations of XR technologies and their best applications in medical education and training is required to advance the field.
Affiliation(s)
- Vernon R. Curran
- Office of Professional and Educational Development, Faculty of Medicine, Health Sciences Centre, Memorial University of Newfoundland, Room H2982, St. John’s, NL A1B 3V6, Canada
- Xiaolin Xu
- Faculty of Health Sciences, Queen’s University, Kingston, ON, Canada
- Mustafa Yalin Aydin
- Department of Computer Sciences, Memorial University of Newfoundland, St. John’s, NL, Canada
- Oscar Meruvia-Pastor
- Department of Computer Sciences, Memorial University of Newfoundland, St. John’s, NL, Canada
10. Quesada-Olarte J, Carrion RE, Fernandez-Crespo R, Henry GD, Simhan J, Shridharani A, Carrion RE, Hakky TS. Extended Reality-Assisted Surgery as a Surgical Training Tool: Pilot Study Presenting First HoloLens-Assisted Complex Penile Revision Surgery. J Sex Med 2022;19:1580-1586. PMID: 36088277. DOI: 10.1016/j.jsxm.2022.07.010.
Abstract
BACKGROUND Extended reality-assisted urologic surgery (XRAS) is a novel technology that superimposes a computer-generated image on the physician's field to integrate common elements of the surgical process in more advanced detail. An extended reality (XR) interface is generated using optical head-mounted display (OHMD) devices. AIM To present the first case of HoloLens-assisted complex penile revision surgery. METHODS We describe our pilot study of HoloLens-assisted penile revision surgery and present a thorough review of the literature regarding XRAS technology and innovative OHMD devices. OUTCOMES The ability of XRAS technology to superimpose a computer-generated image of the patient and integrate common elements of the surgical planning process with long-distance experts. RESULTS XRAS is a feasible technology for application in complex penile surgical planning processes. CLINICAL TRANSLATION XRAS and OHMD devices are novel technologies applicable to urological surgical training and planning. STRENGTHS AND LIMITATIONS Evidence suggests that the potential use of OHMD devices is safe and beneficial for surgeons. We intend to pioneer HoloLens technology in the surgical planning process of a malfunctioning penile implant due to herniation of the cylinder. This novel technology has not been used in prosthetic surgery, and current data about XRAS are limited. CONCLUSION OHMD devices are effective in the operative setting. Herein, we successfully demonstrated the integration of Microsoft HoloLens 2 into a penile surgical planning process for the first time. Further development and studies for this technology are necessary to better characterize the XRAS as a training and surgical planning tool. Quesada-Olarte J, Carrion RE, Fernandez-Crespo R, et al. Extended Reality-Assisted Surgery as a Surgical Training Tool: Pilot Study Presenting First HoloLens-Assisted Complex Penile Revision Surgery. J Sex Med 2022;19:1580-1586.
11. Ivanov VM, Krivtsov AM, Strelkov SV, Smirnov AY, Shipov RY, Grebenkov VG, Rumyantsev VN, Gheleznyak IS, Surov DA, Korzhuk MS, Koskin VS. Practical Application of Augmented/Mixed Reality Technologies in Surgery of Abdominal Cancer Patients. J Imaging 2022;8:183. PMID: 35877627. PMCID: PMC9319177. DOI: 10.3390/jimaging8070183.
Abstract
The technology of augmented and mixed reality (AR/MR) is useful in various areas of modern surgery. We considered the use of augmented and mixed reality technologies as a method of preoperative planning and intraoperative navigation in abdominal cancer patients. Practical use of AR/MR raises a range of questions, which demand suitable solutions. The difficulties and obstacles we encountered in the practical use of AR/MR are presented, along with the ways we chose to overcome them. The most demonstrative case is covered in detail. The three-dimensional anatomical model obtained from the CT scan needed to be rigidly attached to the patient’s body, and therefore an invasive approach was developed, using an orthopedic pin fixed to the pelvic bones. The pin is used both as an X-ray contrast marker and as a marker for augmented reality. This solution made it possible not only to visualize the anatomical structures of the patient and the border zone of the tumor, but also to change the position of the patient during the operation. In addition, a noninvasive (skin-based) marking method was developed that allows the application of mixed and augmented reality during the operation. Both techniques were used (8 clinical cases) for preoperative planning and intraoperative navigation, which allowed surgeons to verify the radicality of the operation, to have visual control of all anatomical structures near the zone of interest, and to reduce the time of surgical intervention, thereby reducing the complication rate and improving the rehabilitation period.
Affiliation(s)
- Vladimir M. Ivanov (corresponding author), Anton M. Krivtsov, Sergey V. Strelkov, Anton Yu. Smirnov, Roman Yu. Shipov
- Higher School of Theoretical Mechanics and Mathematical Physics, Peter the Great Saint Petersburg Polytechnic University, 195251 St. Petersburg, Russia
- Vladimir G. Grebenkov, Valery N. Rumyantsev, Dmitry A. Surov, Michail S. Korzhuk
- Department & Clinic of Naval Surgery, Military Medical Academy Named after S. M. Kirov, Academic Lebedev Street 6, 194044 St. Petersburg, Russia
- Michail S. Korzhuk (second affiliation)
- Department of General Surgery, Omsk State Medical University, ul. Lenina, 12, 644099 Omsk, Russia
- Igor S. Gheleznyak
- Department & Clinic of Roentgenology & Radiology, Military Medical Academy Named after S. M. Kirov, Academic Lebedev Street 6, 194044 St. Petersburg, Russia
- Valery S. Koskin
- Department & Clinic of Military Field Surgery, Military Medical Academy Named after S. M. Kirov, Academic Lebedev Street 6, 194044 St. Petersburg, Russia
12. Aksoy C, Reimold P, Borgmann H, Kölker M, Cebulla A, Struck JP, Zehe V, Nestler T, von Landenberg N, Uhlig A, Boehm K, Leitsmann M. [Impact of the COVID-19 pandemic on urology residency training programs in Germany]. Aktuelle Urol 2022;53:317-324. PMID: 35580617. DOI: 10.1055/a-1824-4288.
Abstract
BACKGROUND Several international medical societies have reported a negative impact of the COVID-19 pandemic on urology residency training programs. OBJECTIVES The aim of this study was to investigate the impact of the pandemic on urological residency training in Germany. MATERIALS AND METHODS From 20 May 2020 until 20 June 2020, a Germany-wide online survey on continuing residency training was distributed via the members of the working group, social media (Facebook, Twitter, Instagram) and the German Society of Residents in Urology (GeSRU e.V.) newsletter. The survey covered 3 topics: 1) basic characteristics of the participants, 2) the general and 3) the subjective influence of the COVID-19 pandemic on clinics and further residency training. RESULTS A total of 50 residents took part in the survey; 54% were women. The median age was 31 years. Most of the participants were in their 2nd (22%) or 5th (26%) year of training and worked in a university hospital (44%) or a maximum-care hospital (30%). 38% of the respondents stated that they treated only urological emergencies during the COVID-19 pandemic. For 28% this meant a very large delay (80-100%) in their specialty training, while 28% reported only a minor impact. Training impairments were attributed to fewer operations (66%), low patient numbers in the outpatient department (50%), and cancellations of congresses (50%) and workshops (44%). 46% of residents reported direct contact with COVID-19 patients, while 10% were deployed on interdisciplinary IMC units. Numerous physical distancing and hygiene measures were implemented by the clinics. CONCLUSION On average, around 50% of urology residents indicated significant restrictions in training due to the COVID-19 pandemic in Germany. The delay in training cannot currently be measured in units of time, but it can be assumed that training for residents during the pandemic is likely to be of lower quality compared with previous generations.
Affiliation(s)
- Cem Aksoy
- Klinik und Poliklinik für Urologie, Medizinische Fakultät Carl Gustav Carus, TU Dresden, Dresden, Germany
- Philipp Reimold
- Urologische Universitätsklinik, Universitätsklinikum Heidelberg, Heidelberg, Germany
- Hendrik Borgmann
- Klinik und Poliklinik für Urologie, Universitätsmedizin der Johannes Gutenberg-Universität Mainz, Mainz, Germany
- Mara Kölker
- Klinik und Poliklinik für Urologie, Universitätsklinikum Hamburg-Eppendorf, Hamburg, Germany
- Angelika Cebulla
- Klinik für Urologie und Kinderurologie, Universitätsklinikum Ulm, Ulm, Germany
- Julian Peter Struck
- Klinik für Urologie, Universitätsklinikum Schleswig-Holstein, Campus Lübeck, Lübeck, Germany
- Viktor Zehe
- Klinik für Urologie, Universitätsklinikum Ulm, Ulm, Germany
- Tim Nestler
- Klinik für Urologie, Bundeswehrzentralkrankenhaus Koblenz, Koblenz, Germany
- Annemarie Uhlig
- Klinik für Urologie, Universitätsmedizin Göttingen, Göttingen, Germany
- Katharina Boehm
- Klinik und Poliklinik für Urologie und Kinderurologie, Universitätsmedizin der Johannes Gutenberg-Universität Mainz, Mainz, Germany
13. Roberts S, Desai A, Checcucci E, Puliatti S, Taratkin M, Kowalewski KF, Gomez Rivas J, Rivero I, Veneziano D, Autorino R, Porpiglia F, Gill IS, Cacciamani GE. "Augmented reality" applications in urology: a systematic review. Minerva Urol Nephrol 2022;74:528-537. PMID: 35383432. DOI: 10.23736/s2724-6051.22.04726-7.
Abstract
INTRODUCTION Augmented reality (AR) applied to surgical procedures refers to the superimposition of preoperative or intraoperative images onto the operative field. Augmented reality has been increasingly used in myriad surgical specialties, including urology. The following study reviews advances in the use of AR for improving urologic outcomes. EVIDENCE ACQUISITION We identified all descriptive, validity, prospective randomized/nonrandomized trials and retrospective comparative/noncomparative studies on the use of AR in urology up until March 2021. The MEDLINE, Scopus, and Web of Science databases were used for the literature search. We conducted the study selection according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. We limited included studies to only those using AR, excluding all that used virtual reality technology. EVIDENCE SYNTHESIS A total of 60 studies were identified and included in the present analysis. Overall, 19 studies were descriptive/validity/phantom studies of specific AR methodologies, 4 studies were case reports, and 37 studies were clinical prospective/retrospective comparative studies. CONCLUSIONS Advances in AR have led to increasing registration accuracy as well as an increased ability to identify anatomic landmarks and improve outcomes during urologic procedures such as robot-assisted radical prostatectomy (RARP) and robot-assisted partial nephrectomy.
Affiliation(s)
- Sidney Roberts
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Aditya Desai
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Enrico Checcucci
- School of Medicine, Division of Urology, Department of Oncology, San Luigi Hospital, University of Turin, Orbassano, Turin, Italy
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Stefano Puliatti
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology, University of Modena and Reggio Emilia, Modena, Italy
- Department of Urology, OLV, Aalst, Belgium
- ORSI Academy, Melle, Belgium
- Mark Taratkin
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russia
- Karl-Friedrich Kowalewski
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Virgen Macarena University Hospital, Seville, Spain
- Department of Urology and Urosurgery, University Hospital of Mannheim, Mannheim, Germany
- Juan Gomez Rivas
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology, Clinico San Carlos University Hospital, Madrid, Spain
- Ines Rivero
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology and Nephrology, Virgen del Rocío University Hospital, Seville, Spain
- Domenico Veneziano
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Department of Urology, Riuniti Hospital, Reggio Calabria, Italy
- Francesco Porpiglia
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Inderbir S Gill
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Giovanni E Cacciamani
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Keck School of Medicine, Department of Radiology, University of Southern California, Los Angeles, CA, USA
14. Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022;77:102361. PMID: 35168103. PMCID: PMC10466024. DOI: 10.1016/j.media.2022.102361.
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Affiliation(s)
- Manuel Birlo, P J Eddie Edwards, Matthew Clarkson, Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
15. Grebenkov VG, Rumyantsev VN, Ivanov VM, Strelkov SV, Balyura OV, Dymnikov DA, Markevich VY, Kushnarev SV, Zheleznyak IS, Pugacheva VS, Korzhuk MS, Demko AE, Surov DA. [Perioperative augmented reality technology in surgical treatment of locally advanced recurrent rectal cancer]. Khirurgiia (Mosk) 2022:44-53. PMID: 36562672. DOI: 10.17116/hirurgia202212244.
Abstract
Rectal cancer occupies a leading position among cancers, and the incidence of locally advanced recurrence remains high despite comprehensive treatment. Combined resections are usually associated with high perioperative risks. These procedures are technically complex interventions requiring further improvement. Virtual reality technology in the surgical treatment of locally advanced rectal cancer recurrence has not been widely discussed. The authors present the multidisciplinary construction of a matched topographic-anatomical virtual model and virtual planning of the combined surgical intervention. Intraoperative use of augmented reality allowed the surgeons to specify the topographic and anatomical features of the surgical area, the level of vascular ligation, the localization of tumor fixation points, and the resection borders. These data ensured the safety and quality of resection. Further research on augmented reality technology and improvement of its technical aspects will improve the results of surgical treatment of patients with locally advanced pelvic tumors and recurrences.
Affiliation(s)
- V G Grebenkov, O V Balyura, D A Dymnikov, S V Kushnarev, V S Pugacheva, M S Korzhuk, D A Surov
- Kirov Military Medical Academy, St. Petersburg, Russia
- V M Ivanov, S V Strelkov
- Peter the Great St. Petersburg Polytechnic University, St. Petersburg, Russia
- A E Demko
- Dzhanelidze St. Petersburg Research Institute for Emergency Care, St. Petersburg, Russia
16. Wake N, Rosenkrantz AB, Huang WC, Wysock JS, Taneja SS, Sodickson DK, Chandarana H. A workflow to generate patient-specific three-dimensional augmented reality models from medical imaging data and example applications in urologic oncology. 3D Print Med 2021;7:34. PMID: 34709482. PMCID: PMC8554989. DOI: 10.1186/s41205-021-00125-5.
Abstract
Augmented reality (AR) and virtual reality (VR) are burgeoning technologies that have the potential to greatly enhance patient care. Visualizing patient-specific three-dimensional (3D) imaging data in these enhanced virtual environments may improve surgeons' understanding of anatomy and surgical pathology, thereby allowing for improved surgical planning, superior intra-operative guidance, and ultimately improved patient care. It is important that radiologists are familiar with these technologies, especially since the number of institutions utilizing VR and AR is increasing. This article gives an overview of AR and VR and describes the workflow required to create anatomical 3D models for use in AR using the Microsoft HoloLens device. Case examples in urologic oncology (prostate cancer and renal cancer) are provided which depict how AR has been used to guide surgery at our institution.
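The general shape of such a workflow (segment a structure from volumetric imaging, convert the label map to a surface mesh, and export it in a format an AR pipeline can load) can be sketched in a few lines. This is a generic illustration rather than the authors' pipeline; the file names are placeholders, and it assumes the nibabel, scikit-image, and trimesh Python packages plus a binary segmentation already saved as NIfTI.

import nibabel as nib
import numpy as np
from skimage import measure
import trimesh

# Load a binary segmentation mask (e.g., a kidney or prostate label map); placeholder path.
mask = nib.load("segmentation.nii.gz").get_fdata()

# Extract a triangulated surface from the voxel mask with marching cubes.
verts, faces, normals, _ = measure.marching_cubes(mask.astype(np.uint8), level=0.5)

# Build a mesh object and export it; OBJ/FBX can then be imported into an AR
# authoring environment (e.g., Unity for HoloLens deployment).
mesh = trimesh.Trimesh(vertices=verts, faces=faces)
mesh.export("model.obj")
print(f"Exported surface with {len(mesh.faces)} faces")

In practice the published workflow also involves the segmentation itself, mesh smoothing and decimation, color assignment, and registration to the patient, all of which are omitted here for brevity.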
Affiliation(s)
- Nicole Wake
- Department of Radiology, Montefiore Medical Center, Albert Einstein College of Medicine, 111 East 210th Street, Bronx, NY, 10467, USA
- Center for Advanced Imaging Innovation and Research (CAI2R) and Bernard and Irene Schwartz Center for Biomedical Imaging, Department of Radiology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
- Andrew B Rosenkrantz
- Center for Advanced Imaging Innovation and Research (CAI2R) and Bernard and Irene Schwartz Center for Biomedical Imaging, Department of Radiology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
- William C Huang
- Department of Urology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
- James S Wysock
- Department of Urology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
- Samir S Taneja
- Department of Urology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
- Daniel K Sodickson
- Center for Advanced Imaging Innovation and Research (CAI2R) and Bernard and Irene Schwartz Center for Biomedical Imaging, Department of Radiology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
- Hersh Chandarana
- Center for Advanced Imaging Innovation and Research (CAI2R) and Bernard and Irene Schwartz Center for Biomedical Imaging, Department of Radiology, NYU Langone Health, NYU Grossman School of Medicine, New York, NY, USA
17. Bassyouni Z, Elhajj IH. Augmented Reality Meets Artificial Intelligence in Robotics: A Systematic Review. Front Robot AI 2021;8:724798. PMID: 34631805. PMCID: PMC8493292. DOI: 10.3389/frobt.2021.724798.
Abstract
Recently, advancements in computational machinery have facilitated the integration of artificial intelligence (AI) into almost every field and industry. This fast-paced development in AI and sensing technologies has stirred an evolution in the realm of robotics. Concurrently, augmented reality (AR) applications are providing solutions to a myriad of robotics applications, such as demystifying robot motion intent and supporting intuitive control and feedback. In this paper, research papers combining the potentials of AI and AR in robotics over the last decade are presented and systematically reviewed. Four sources for data collection were utilized: Google Scholar, the Scopus database, the International Conference on Robotics and Automation 2020 proceedings, and the references and citations of all identified papers. A total of 29 papers were analyzed from two perspectives: a theme-based perspective showcasing the relation between AR and AI, and an application-based analysis highlighting how the robotics application was affected. These two sections are further categorized based on the type of robotics platform and the type of robotics application, respectively. We analyze the work done and highlight some of the prevailing limitations hindering the field. Results also explain how AR and AI can be combined to solve the model-mismatch paradigm by creating a closed feedback loop between the user and the robot. This forms a solid base for increasing the efficiency of the robotic application and enhancing the user’s situational awareness, safety, and acceptance of AI robots. Our findings affirm the promising future for robust integration of AR and AI in numerous robotic applications.
Affiliation(s)
- Zahraa Bassyouni, Imad H Elhajj
- Vision and Robotics Lab, Department of Electrical and Computer Engineering, American University of Beirut, Beirut, Lebanon
18. Is the use of augmented reality-assisted surgery beneficial in urological education? A systematic review. Curr Urol 2021;15:148-152. PMID: 34552454. PMCID: PMC8451320. DOI: 10.1097/cu9.0000000000000036.
Abstract
Background: Google Glass is an optical head-mounted display that has been used in multiple medical and surgical settings to enhance the delivery of education and training. This systematic review focuses solely on the use of this technology in urology operating theaters for the purpose of surgical education. Materials and methods: A systematic search strategy was employed using EMBASE (1996–2019), Medline (1946–2019) and PubMed. Search terms included optical head-mounted displays, Google Glass and urological surgical training. Studies describing use of this technology in a nonurological setting or in nonteaching sessions, as well as case reports, reviews, editorials, abstracts, and articles not in English, were excluded. Three studies were identified after applying these criteria. Results: In all 3 studies, trainees gave positive feedback regarding this technology in relation to enhanced surgical education. In addition, in all studies the trainees felt the technology had a place in educational training in the future. All studies also described disadvantages of the technology, including battery life, comfort, and cost. Conclusions: The studies describe great potential for Google Glass and similar head-mounted devices in urological surgical training; however, larger studies examining more varied operations are needed to reinforce this viewpoint.
19. Baashar Y, Alkawsi G, Ahmad WNW, Alhussian H, Alwadain A, Capretz LF, Babiker A, Alghail A. The Effectiveness of Using Augmented Reality for Training in the Medical Professions: A Meta Analysis (Preprint). JMIR Serious Games 2021;10:e32715. PMID: 35787488. PMCID: PMC9297143. DOI: 10.2196/32715.
Abstract
Background Augmented reality (AR) is an interactive technology that uses persuasive digital data and real-world surroundings to expand the user's reality, wherein objects are produced by various computer applications. It constitutes a novel advancement in medical care, education, and training. Objective The aim of this work was to assess how effective AR is in training medical students when compared to other educational methods in terms of skills, knowledge, confidence, performance time, and satisfaction. Methods We performed a meta-analysis on the effectiveness of AR in medical training that was constructed by using the Cochrane methodology. A web-based literature search was performed by using the Cochrane Library, Web of Science, PubMed, and Embase databases to find studies that recorded the effect of AR in medical training up to April 2021. The quality of the selected studies was assessed by following the Cochrane criteria for risk of bias evaluations. Results In total, 13 studies with a total of 654 participants were included in the meta-analysis. The findings showed that using AR in training can improve participants' performance time (I2=99.9%; P<.001), confidence (I2=97.7%; P=.02), and satisfaction (I2=99.8%; P=.006) compared with control conditions. Further, AR did not have any effect on the participants' knowledge (I2=99.4%; P=.90) or skills (I2=97.5%; P=.10). The meta-regression plot shows that the number of articles discussing AR has increased over the years and that there is no publication bias in the studies used for the meta-analysis. Conclusions The findings of this work suggest that AR can effectively improve performance time, satisfaction, and confidence in medical training but is less effective in areas such as knowledge and skill. Therefore, more AR technologies should be implemented in the field of medical training and education. However, to confirm these findings, more meticulous research with more participants is needed.
Collapse
Affiliation(s)
- Yahia Baashar
- Faculty of Computing and Informatics, Universiti Malaysia Sabah, Labuan, Malaysia
| | - Gamal Alkawsi
- Institute of Sustainable Energy, Universiti Tenaga Nasional, Kajang, Malaysia
| | | | - Hitham Alhussian
- Department of Computer and Information Sciences, Universiti Teknologi Petronas, Seri Iskandar, Malaysia
| | - Ayed Alwadain
- Department of Computer Science, King Saud University, Riyadh, Saudi Arabia
| | - Luiz Fernando Capretz
- Department of Electrical & Computer Engineering, Western University, Ontario, ON, Canada
| | - Areej Babiker
- Department of Computer Engineering, Future University, Khartoum, Sudan
| | - Adnan Alghail
- Department of World Languages, Greece Central School District, New York, NY, United States
| |
Collapse
|
20
|
Kovoor JG, Gupta AK, Gladman MA. Validity and effectiveness of augmented reality in surgical education: A systematic review. Surgery 2021; 170:88-98. [PMID: 33744003 DOI: 10.1016/j.surg.2021.01.051] [Citation(s) in RCA: 20] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2020] [Revised: 01/27/2021] [Accepted: 01/28/2021] [Indexed: 01/22/2023]
Abstract
BACKGROUND Current challenges in surgical training have led to the investigation of augmented reality as a potential method of supplementary education. However, its value for this purpose remains uncertain. The aim of this study was to perform a systematic review of the published literature to evaluate the validity and effectiveness of augmented reality in surgical education, and to compare it with other simulation modalities. METHODS Electronic literature searches were performed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines. Two authors independently extracted pertinent data and assessed study quality. The primary outcome measures of interest were the validity and effectiveness of augmented reality as an educational tool. RESULTS Of 6,500 articles, 24 studies met eligibility criteria for inclusion, of which 2 were randomized. Ten studies investigated validity, with 7 establishing both face and content validity and 1 establishing content validity alone. Construct validity was demonstrated in 9 of 11 studies. Of the 11 studies that examined the effectiveness of augmented reality in skills acquisition, 9 demonstrated enhanced learning. Of the 5 studies in which the effectiveness of augmented reality as an educational tool was compared with other modes of simulation, augmented reality was found to be superior in 2 and equivalent in the others. CONCLUSION Overall, the majority of studies, including 2 high-quality randomized controlled trials, demonstrated the validity and effectiveness of augmented reality in surgical education. However, the quality of published studies was poor, with marked heterogeneity. Although these results are encouraging, additional high-quality studies, preferably in the real-life environment, are required before the widespread implementation of augmented reality within surgical curricula can be recommended.
Collapse
Affiliation(s)
- Joshua G Kovoor
- Adelaide Medical School, Faculty of Health & Medical Sciences, The University of Adelaide, South Australia
| | - Aashray K Gupta
- Adelaide Medical School, Faculty of Health & Medical Sciences, The University of Adelaide, South Australia
| | - Marc A Gladman
- Adelaide Medical School, Faculty of Health & Medical Sciences, The University of Adelaide, South Australia.
| |
Collapse
|
21
|
Lu S, Sanchez Perdomo YP, Jiang X, Zheng B. Integrating Eye-Tracking to Augmented Reality System for Surgical Training. J Med Syst 2020; 44:192. [PMID: 32990801 DOI: 10.1007/s10916-020-01656-w] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2019] [Accepted: 09/16/2020] [Indexed: 11/29/2022]
Abstract
Augmented reality has been utilized for surgical training. During implementation, displaying instructional information at the right moment is critical for skill acquisition. We built a new surgical training platform combining an augmented reality system (HoloLens, Microsoft) with an eye tracker (Pupil Labs, Germany). Our goal is to detect moments of performance difficulty using the integrated eye tracker so that the system can display instructions at the precise moment when the user is seeking instructional information during simulated surgical skill practice. In this paper, we describe the system design, system calibration, and data transfer between these devices.
Collapse
Affiliation(s)
- Shang Lu
- Multimedia Research Center, Department of Computing Science, University of Alberta, Edmonton, AB, Canada
| | | | - Xianta Jiang
- Surgical Simulation Research Lab, Department of Surgery, University of Alberta, Edmonton, AB, Canada
| | - Bin Zheng
- Surgical Simulation Research Lab, Department of Surgery, University of Alberta, Edmonton, AB, Canada.
| |
Collapse
|
22
|
Abstract
Augmented reality (AR) is used to enhance the perception of the real world by integrating virtual objects into an image sequence acquired from various camera technologies. Numerous AR applications in robotics have been developed in recent years. The aim of this paper is to provide an overview of AR research in robotics during the five-year period from 2015 to 2019. We classified these works in terms of application areas into four categories: (1) Medical robotics: robot-assisted surgery (RAS), prosthetics, rehabilitation, and training systems; (2) Motion planning and control: trajectory generation, robot programming, simulation, and manipulation; (3) Human-robot interaction (HRI): teleoperation, collaborative interfaces, wearable robots, haptic interfaces, brain-computer interfaces (BCIs), and gaming; (4) Multi-agent systems: use of visual feedback to remotely control drones, robot swarms, and robots with a shared workspace. Recent developments in AR technology are discussed, followed by the challenges AR faces in camera localization, environment mapping, and registration. We explore AR applications in terms of how AR was integrated and which improvements it introduced to the corresponding fields of robotics. In addition, we summarize the major limitations of the presented applications in each category. Finally, we conclude our review with future directions of AR research in robotics. The survey covers over 100 research works published over the last five years.
Collapse
|
23
|
Williams MA, McVeigh J, Handa AI, Lee R. Augmented reality in surgical training: a systematic review. Postgrad Med J 2020; 96:537-542. [DOI: 10.1136/postgradmedj-2020-137600] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2020] [Revised: 02/27/2020] [Accepted: 03/21/2020] [Indexed: 11/03/2022]
Abstract
The aim of this systematic review is to provide an update on the current state of augmented reality (AR) in surgical training and to report on any described benefits compared with traditional techniques. A PICO (Population, Intervention, Comparison, Outcome) strategy was adopted to formulate an appropriate research question and define strict search terms to be entered into MEDLINE, CENTRAL and Google Scholar. The search was run on 12/09/2019. All returned results were screened first by title and then by abstract. The systematic search returned a total of 236 results, of which 18 were selected for final inclusion. Studies covered the full range of surgical disciplines and reported on outcomes including operative duration, accuracy and postoperative complication rates. Due to the heterogeneity of the collected data, no meta-analysis was possible. Outcome measures of competency, surgical opinion and postoperative complication rate favoured AR technology, while operative duration appeared to increase.
Collapse
|
24
|
Tang KS, Cheng DL, Mi E, Greenberg PB. Augmented reality in medical education: a systematic review. CANADIAN MEDICAL EDUCATION JOURNAL 2020; 11:e81-e96. [PMID: 32215146 PMCID: PMC7082471 DOI: 10.36834/cmej.61705] [Citation(s) in RCA: 51] [Impact Index Per Article: 10.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/08/2023]
Abstract
INTRODUCTION The field of augmented reality (AR) is rapidly growing with many new potential applications in medical education. This systematic review investigated the current state of augmented reality applications (ARAs) and developed an analytical model to guide future research in assessing ARAs as teaching tools in medical education. METHODS A literature search was conducted using PubMed, Embase, Web of Science, Cochrane Library, and Google Scholar. This review followed PRISMA guidelines and included publications from January 1, 2000 to June 18, 2018. Inclusion criteria were experimental studies evaluating ARAs implemented in healthcare education published in English. Our review evaluated study quality and determined whether studies assessed ARA validity using criteria established by the GRADE Working Group and Gallagher et al., respectively. These findings were used to formulate an analytical model to assess the readiness of ARAs for implementation in medical education. RESULTS We identified 100,807 articles in the initial literature search; 36 met inclusion criteria for final review and were categorized into three categories: Surgery (23), Anatomy (9), and Other (4). The overall quality of the studies was poor and no ARA was tested for all five stages of validity. Our analytical model evaluates the importance of research quality, application content, outcomes, and feasibility of an ARA to gauge its readiness for implementation. CONCLUSION While AR technology is growing at a rapid rate, the current quality and breadth of AR research in medical training is insufficient to recommend the adoption into educational curricula. We hope our analytical model will help standardize AR assessment methods and define the role of AR technology in medical education.
Collapse
Affiliation(s)
- Kevin S. Tang
- The Program in Liberal Medical Education of Brown University, Rhode Island, USA
- The Warren Alpert Medical School of Brown University, Rhode Island, USA
- Division of Ophthalmology, Warren Alpert Medical School, Rhode Island, USA
- Section of Ophthalmology, Providence VA Medical Center, Rhode Island, USA
| | - Derrick L. Cheng
- The Program in Liberal Medical Education of Brown University, Rhode Island, USA
- The Warren Alpert Medical School of Brown University, Rhode Island, USA
- Lifespan Clinical Research Center, Rhode Island, USA
| | - Eric Mi
- The Program in Liberal Medical Education of Brown University, Rhode Island, USA
| | - Paul B. Greenberg
- Division of Ophthalmology, Warren Alpert Medical School, Rhode Island, USA
- Section of Ophthalmology, Providence VA Medical Center, Rhode Island, USA
| |
Collapse
|
25
|
Glick Y, Avital B, Oppenheimer J, Nahman D, Wagnert-Avraham L, Eisenkraft A, Dym L, Levi D, Agur A, Gustus B, Furer A. Augmenting prehospital care. BMJ Mil Health 2020; 167:158-162. [PMID: 32086268 DOI: 10.1136/jramc-2019-001320] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2019] [Revised: 10/02/2019] [Accepted: 10/04/2019] [Indexed: 11/03/2022]
Abstract
INTRODUCTION The challenging environment of prehospital casualty care requires providers to make prompt decisions and to engage in lifesaving interventions, occasionally without being adequately experienced. Telementoring based on augmented reality (AR) devices has the potential to decrease decision time and minimise the distance gap between an experienced consultant and the first responder. The purpose of this study was to determine whether telementoring with AR glasses would affect chest thoracotomy performance and the self-confidence of inexperienced trainees. METHODS Two groups of inexperienced medical students performed a chest thoracotomy in an ex vivo pig model. While one group was mentored remotely using HoloLens AR glasses, the second performed the procedure independently. An observer assessed the trainees' performance. In addition, trainees and mentors evaluated their own performance. RESULTS Quality of performance was found to be superior with remote guidance, without significant prolongation of the procedure (492 s vs 496 s, p=0.943). Moreover, the sense of self-confidence among participants was substantially improved in the telementoring group, in which 100% of the participants believed the procedure was successful compared with 40% in the control group (p=0.035). CONCLUSION AR devices may have a role in future prehospital telementoring systems, providing accessible consultation for first responders, and could thus positively affect the provider's confidence in decision-making, enhance procedure performance and ultimately improve patient prognosis. That being said, future studies are required to estimate the full potential of this technology, and additional adjustments are necessary for maximal optimisation and implementation in the field of prehospital care.
Collapse
Affiliation(s)
- Yuval Glick
- Medical Corps, Israel Defense Forces, Ramat-Gan, Israel; Orthopedic Department, Assuta Ashdod Hospital, Ashdod, Israel
| | - B Avital
- Institute for Research in Military Medicine, The Hebrew University of Jerusalem and Israel Defense Forces Medical Corps, Jerusalem, Israel
| | - J Oppenheimer
- Institute for Research in Military Medicine, The Hebrew University of Jerusalem and Israel Defense Forces Medical Corps, Jerusalem, Israel
| | - D Nahman
- Institute for Research in Military Medicine, The Hebrew University of Jerusalem and Israel Defense Forces Medical Corps, Jerusalem, Israel; Department of Internal Medicine 'A', Hadassah University Hospital, Jerusalem, Israel
| | - L Wagnert-Avraham
- Institute for Research in Military Medicine, The Hebrew University of Jerusalem and Israel Defense Forces Medical Corps, Jerusalem, Israel
| | - A Eisenkraft
- Institute for Research in Military Medicine, The Hebrew University of Jerusalem and Israel Defense Forces Medical Corps, Jerusalem, Israel
| | - L Dym
- Obstetrics and Gynaecology Division, Soroka Medical Centre, Beer Sheva, Israel
| | - D Levi
- Medical Corps, Israel Defense Forces, Ramat-Gan, Israel
| | - A Agur
- Medical Corps, Israel Defense Forces, Ramat-Gan, Israel; Neurosurgery Department, Tel Aviv Sourasky Medical Center, Tel Aviv, Israel
| | - B Gustus
- Medical Corps, Israel Defense Forces, Ramat-Gan, Israel; Pediatric Department, Asaf Harofe Hospital, Zerifin, Israel
| | - A Furer
- Medical Corps, Israel Defense Forces, Ramat-Gan, Israel; Faculty of Medicine, Hebrew University of Jerusalem, Jerusalem, Israel
| |
Collapse
|
26
|
Surgical Telementoring Without Encumbrance: A Comparative Study of See-through Augmented Reality-based Approaches. Ann Surg 2020; 270:384-389. [PMID: 29672404 DOI: 10.1097/sla.0000000000002764] [Citation(s) in RCA: 32] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
Abstract
OBJECTIVE This study investigates the benefits of a surgical telementoring system based on an augmented reality head-mounted display (ARHMD) that overlays surgical instructions directly onto the surgeon's view of the operating field, without workspace obstruction. SUMMARY BACKGROUND DATA In conventional telestrator-based telementoring, the surgeon views annotations of the surgical field by shifting focus to a nearby monitor, which substantially increases cognitive load. As an alternative, tablets have been used between the surgeon and the patient to display instructions; however, tablets impose additional obstructions of the surgeon's motions. METHODS Twenty medical students performed anatomical marking (Task1) and abdominal incision (Task2) on a patient simulator, in 1 of 2 telementoring conditions: ARHMD and telestrator. The dependent variables were placement error, number of focus shifts, and completion time. Furthermore, workspace efficiency was quantified as the number and duration of potential surgeon-tablet collisions avoided by the ARHMD. RESULTS The ARHMD condition yielded smaller placement errors (Task1: 45%, P < 0.001; Task2: 14%, P = 0.01), fewer focus shifts (Task1: 93%, P < 0.001; Task2: 88%, P = 0.0039), and longer completion times (Task1: 31%, P < 0.001; Task2: 24%, P = 0.013). Furthermore, the ARHMD avoided potential tablet collisions (4.8 for 3.2 seconds in Task1; 3.8 for 1.3 seconds in Task2). CONCLUSION The ARHMD system promises to improve accuracy and to eliminate focus shifts in surgical telementoring. Because ARHMD participants were able to refine their execution of instructions, task completion time increased. Unlike a tablet system, the ARHMD does not require modifying natural motions to avoid collisions.
Collapse
|
27
|
Tully J, Dameff C, Longhurst CA. Wave of Wearables: Clinical Management of Patients and the Future of Connected Medicine. Clin Lab Med 2020; 40:69-82. [PMID: 32008641 DOI: 10.1016/j.cll.2019.11.004] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/21/2022]
Abstract
The future of connected health care will involve the collection of patient data or the enhancement of clinician workflows through various biosensors and displays found on wearable electronic devices, many of which are marketed directly to consumers. The adoption of wearables in health care is being driven by efforts to reduce health care costs, improve care quality, and increase clinician efficiency. Wearables have significant potential to achieve these goals but are currently limited by the lack of widespread integration into electronic health records, the types of biosensor data that can be collected, and a lack of scientifically rigorous literature showing benefit.
Collapse
Affiliation(s)
- Jeffrey Tully
- Department of Anesthesiology and Pain Medicine, University of California Davis Medical Center, 2315 Stockton Boulevard, Sacramento, CA 95817, USA.
| | - Christian Dameff
- Department of Emergency Medicine, University of California San Diego, 200 West Arbor Drive #8676, San Diego, CA 92103, USA; Department of Biomedical Informatics, UC San Diego Health, University of California San Diego, 9500 Gilman Drive, MC 0728, La Jolla, California 92093-0728, USA; Department of Computer Science and Engineering, University of California San Diego, 9500 Gilman Drive, Mail Code 0404, La Jolla, CA 92093-0404, USA
| | - Christopher A Longhurst
- Department of Medicine, University of California San Diego, 9500 Gilman Drive, La Jolla, CA 92093, USA; Department of Pediatrics, University of California San Diego, 9500 Gilman Drive, La Jolla, CA 92093, USA
| |
Collapse
|
28
|
Carrera JF. A Systematic Review of the Use of Google Glass in Graduate Medical Education. J Grad Med Educ 2019; 11:637-648. [PMID: 31871562 PMCID: PMC6919184 DOI: 10.4300/jgme-d-19-00148.1] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/22/2019] [Revised: 06/13/2019] [Accepted: 08/21/2019] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Graduate medical education (GME) has emphasized the assessment of trainee competencies and milestones; however, sufficient in-person assessment is often constrained. Using mobile hands-free devices, such as Google Glass (GG) for telemedicine, allows for remote supervision, education, and assessment of residents. OBJECTIVE We reviewed available literature on the use of GG in GME in the clinical learning environment, its use for resident supervision and education, and its clinical utility and technical limitations. METHODS We conducted a systematic review in accordance with 2009 PRISMA guidelines. Applicable studies were identified through a review of PubMed, MEDLINE, and Web of Science databases for articles published from January 2013 to August 2018. Two reviewers independently screened titles, abstracts, and full-text articles that reported using GG in GME and assessed the quality of the studies. A systematic review of these studies appraised the literature for descriptions of its utility in GME. RESULTS Following our search and review process, 37 studies were included. The majority evaluated GG in surgical specialties (n = 23) for the purpose of surgical/procedural skills training or supervision. GG was predominantly used for video teleconferencing, and photo and video capture. Highlighted positive aspects of GG use included point-of-view broadcasting and capacity for 2-way communication. Most studies cited drawbacks that included suboptimal battery life and HIPAA concerns. CONCLUSIONS GG shows some promise as a device capable of enhancing GME. Studies evaluating GG in GME are limited by small sample sizes and few quantitative data. Overall experience with use of GG in GME is generally positive.
Collapse
|
29
|
Tatar İ, Huri E, Selçuk İ, Moon YL, Paoluzzi A, Skolarikos A. Review of the effect of 3D medical printing and virtual reality on urology training with ‘MedTRain3DModsim’ Erasmus + European Union Project. Turk J Med Sci 2019; 49:1257-1270. [PMID: 31648427 PMCID: PMC7018298 DOI: 10.3906/sag-1905-73] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2019] [Accepted: 08/07/2019] [Indexed: 12/28/2022] Open
Abstract
Background/aim It is necessary to incorporate novel training modalities in medical education, especially in surgical fields, because of the limitations of cadaveric training. Traditional medical education has many drawbacks, such as residency working hour restrictions, patient safety conflicts with the learning needs, and the lack of hands-on workshops. The MedTRain3DModsim Project aimed to produce 3-dimensional (3D) medical printed models, simulations, and innovative applications for every level of medical training using novel worldwide technologies. It was aimed herein to improve the interdisciplinary and transnational approaches, and accumulate existing experience for medical education, postgraduate studies, and specialty training. Materials and methods This project focused on models of solid organs and the urinary system, including the kidney, prostate, ureter, and liver. With 3D medical printing, it is possible to produce a body part from inert materials in just a few hours with the standardization of medical 3D modeling. Results The target groups of this project included medical students and residents, graduate students from engineering departments who needed medical education and surgical training, and medical researchers interested in health technology or clinical and surgical anatomy. Conclusion It was also intended to develop a novel imaging platform for education and training by reevaluating the existing data using new software and 3D modalities. Therefore, it was believed that our methodology could be implemented in all related medical fields.
Collapse
Affiliation(s)
- İlkan Tatar
- Department of Anatomy, Faculty of Medicine, Hacettepe University, Ankara, Turkey
| | - Emre Huri
- Department of Urology, Faculty of Medicine, Hacettepe University, Ankara, Turkey
| | - İlker Selçuk
- Department of Gynecologic-Oncology, Zekai Tahir Burak Research and Educational Hospital, Ankara, Turkey
| | - Young Lee Moon
- Department of Orthopedics, Chosun University, Chosun, South Korea
| | - Alberto Paoluzzi
- Department of Mathematics and Physics, Rome Tre University, Rome, Italy
| | | |
Collapse
|
30
|
Rahman R, Wood ME, Qian L, Price CL, Johnson AA, Osgood GM. Head-Mounted Display Use in Surgery: A Systematic Review. Surg Innov 2019; 27:88-100. [DOI: 10.1177/1553350619871787] [Citation(s) in RCA: 48] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Purpose. We analyzed the literature to determine (1) the surgically relevant applications for which head-mounted display (HMD) use is reported; (2) the types of HMD most commonly reported; and (3) the surgical specialties in which HMD use is reported. Methods. The PubMed, Embase, Cochrane Library, and Web of Science databases were searched through August 27, 2017, for publications describing HMD use during surgically relevant applications. We identified 120 relevant English-language, non-opinion publications for inclusion. HMD types were categorized as “heads-up” (nontransparent HMD display and direct visualization of the real environment), “see-through” (visualization of the HMD display overlaid on the real environment), or “non–see-through” (visualization of only the nontransparent HMD display). Results. HMDs were used for image guidance and augmented reality (70 publications), data display (63 publications), communication (34 publications), and education/training (18 publications). See-through HMDs were described in 55 publications, heads-up HMDs in 41 publications, and non–see-through HMDs in 27 publications. Google Glass, a see-through HMD, was the most frequently used model, reported in 32 publications. The specialties with the highest frequency of published HMD use were urology (20 publications), neurosurgery (17 publications), and unspecified surgical specialty (20 publications). Conclusion. Image guidance and augmented reality were the most commonly reported applications for which HMDs were used. See-through HMDs were the most commonly reported type used in surgically relevant applications. Urology and neurosurgery were the specialties with greatest published HMD use.
Collapse
Affiliation(s)
- Rafa Rahman
- The Johns Hopkins University, Baltimore, MD, USA
| | | | - Long Qian
- The Johns Hopkins University, Baltimore, MD, USA
| | | | | | | |
Collapse
|
31
|
Saun TJ, Zuo KJ, Grantcharov TP. Video Technologies for Recording Open Surgery: A Systematic Review. Surg Innov 2019; 26:599-612. [PMID: 31165687 DOI: 10.1177/1553350619853099] [Citation(s) in RCA: 30] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Video recording of surgical procedures is an important tool for surgical education, performance enhancement, and error analysis. Technology for video recording open surgery, however, is limited. The objective of this article is to provide an overview of the available literature regarding the various technologies used for intraoperative video recording of open surgery. A systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) guidelines using the MEDLINE, Cochrane Central, and EMBASE databases. Two authors independently screened the titles and abstracts of the retrieved articles, and those that satisfied the defined inclusion criteria were selected for a full-text review. A total of 2275 publications were initially identified, and 110 were included in the final review. The included articles were categorized based on type of article, surgical subspecialty, type and positioning of camera, and limitations identified with their use. The most common article type was primary-technical (29%), and the dominant specialties were general surgery (22%) and plastic surgery (18%). The most commonly cited camera used was the GoPro (30%) positioned in a head-mount configuration (60%). Commonly cited limitations included poor video quality, inadequate battery life, light overexposure, obstruction by surgical team members, and excessive motion. Open surgery remains the mainstay of many surgical specialties today, and technological innovation is absolutely critical to fulfill the unmet need for better video capture of open surgery. The findings of this article will be valuable for guiding future development of novel technology for this purpose.
Collapse
Affiliation(s)
- Tomas J Saun
- St Michael's Hospital, Toronto, ON, Canada; University of Toronto, ON, Canada
| | | | | |
Collapse
|
32
|
Munzer BW, Khan MM, Shipman B, Mahajan P. Augmented Reality in Emergency Medicine: A Scoping Review. J Med Internet Res 2019; 21:e12368. [PMID: 30994463 PMCID: PMC6492064 DOI: 10.2196/12368] [Citation(s) in RCA: 51] [Impact Index Per Article: 8.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2018] [Revised: 01/31/2019] [Accepted: 02/28/2019] [Indexed: 02/07/2023] Open
Abstract
BACKGROUND Augmented reality is increasingly being investigated for its applications to medical specialties as well as to medical training. Currently, there is little information about its applicability to training and care delivery in the context of emergency medicine. OBJECTIVE The objective of this article is to review current literature related to augmented reality applicable to emergency medicine and its training. METHODS Through a scoping review utilizing the Scopus, MEDLINE, and Embase databases for article searches, we identified articles involving augmented reality that directly involved emergency medicine or were in an area of education or clinical care that could potentially be applied to emergency medicine. RESULTS A total of 24 articles were reviewed in detail and were categorized into three groups: user-environment interface, telemedicine and prehospital care, and education and training. CONCLUSIONS Through analysis of the current literature across fields, we were able to demonstrate that augmented reality has utility and feasibility in clinical care delivery in patient care settings, in operating rooms and inpatient settings, and in the education and training of emergency care providers. Additionally, we found that the use of augmented reality for care delivery over distances is feasible, suggesting a role in telehealth. Our results from the review of the literature in emergency medicine and other specialties reveal that further research into the uses of augmented reality will have a substantial role in changing how emergency medicine as a specialty will deliver care and provide education and training.
Collapse
Affiliation(s)
| | - Mohammad Mairaj Khan
- Department of Emergency Medicine, University of Michigan, Ann Arbor, MI, United States
| | - Barbara Shipman
- Medical School, University of Michigan, Ann Arbor, MI, United States
| | - Prashant Mahajan
- Department of Emergency Medicine, University of Michigan, Ann Arbor, MI, United States
| |
Collapse
|
33
|
Bhushan S, Anandasabapathy S, Shukla R. Use of Augmented Reality and Virtual Reality Technologies in Endoscopic Training. Clin Gastroenterol Hepatol 2018; 16:1688-1691. [PMID: 30114487 DOI: 10.1016/j.cgh.2018.08.021] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Affiliation(s)
- Sheena Bhushan
- Section of Gastroenterology and Hepatology, Department of Medicine, Houston, Texas; Baylor Global Initiatives, Baylor College of Medicine, Houston, Texas
| | - Sharmila Anandasabapathy
- Section of Gastroenterology and Hepatology, Department of Medicine, Houston, Texas; Baylor Global Initiatives, Baylor College of Medicine, Houston, Texas
| | - Richa Shukla
- Section of Gastroenterology and Hepatology, Department of Medicine, Houston, Texas; Baylor Global Initiatives, Baylor College of Medicine, Houston, Texas.
| |
Collapse
|
34
|
García-Cruz E, Bretonnet A, Alcaraz A. Testing Smart Glasses in urology: Clinical and surgical potential applications. Actas Urol Esp 2018; 42:207-211. [PMID: 29037757 DOI: 10.1016/j.acuro.2017.06.007] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2017] [Revised: 06/14/2017] [Accepted: 06/14/2017] [Indexed: 11/18/2022]
Abstract
OBJECTIVES We aimed to explore the potential benefits of using smart glasses (wearable computer optical devices with touch-less command features) in the operating room and in outpatient care settings in urology. MATERIALS AND METHODS Between April and November 2015, 80 urologists were invited to use Google Glass in their daily surgical and clinical practice, and to share the devices with other urologists. Participants rated the usefulness of smart glasses on a 10-point scale and provided insights on their potential benefits in a telephone interview. RESULTS During the testing period, 240 urologists used smart glasses, and the 80 initially invited rated their usefulness. Mean scores for usefulness in the operating room and in outpatient clinics were 7.4 and 5.4, respectively. The interview revealed that the applications of smart glasses considered most promising in surgery were live video streaming and static image playback, augmented reality, laparoscopic navigation, and a digital checklist for safety verification. In outpatient settings, participants considered the glasses useful as a viewing platform for sharing test results, for browsing a digital vademecum, and for checking medical records in emergency situations. CONCLUSIONS Urologists engaged in our experience identified various uses of smart glasses with potential benefits for physicians' daily practice, particularly in the urological surgery setting. Further quantitative studies are needed to fully exploit the possibilities of smart glasses and to address the technical limitations to their safe use in clinical and surgical practice.
Collapse
Affiliation(s)
- E García-Cruz
- Departamento de Urología, Hospital Plató, Barcelona, España; Departamento de Urología, Hospital Clínic de Barcelona, Barcelona, España; EAU Young Academic Urologists Men's Health Group, Barcelona, España.
| | - A Bretonnet
- Healthcare Innovation, Soft for You, Barcelona, España
| | - A Alcaraz
- Departamento de Urología, Hospital Clínic de Barcelona, Barcelona, España
| |
Collapse
|
35
|
Wei NJ, Dougherty B, Myers A, Badawy SM. Using Google Glass in Surgical Settings: Systematic Review. JMIR Mhealth Uhealth 2018; 6:e54. [PMID: 29510969 PMCID: PMC5861300 DOI: 10.2196/mhealth.9409] [Citation(s) in RCA: 45] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2017] [Revised: 01/13/2018] [Accepted: 01/17/2018] [Indexed: 12/13/2022] Open
Abstract
Background In recent years, wearable devices have become increasingly attractive, and the health care industry has been especially drawn to Google Glass because of its ability to serve as a head-mounted wearable device. The use of Google Glass in surgical settings is of particular interest due to the hands-free device's potential to streamline workflow and maintain sterile conditions in an operating room environment. Objective The aim is to conduct a systematic evaluation of the literature on the feasibility and acceptability of using Google Glass in surgical settings and to assess the potential benefits and limitations of its application. Methods The literature was searched for articles published between January 2013 and May 2017. The search included the following databases: PubMed MEDLINE, Embase, Cumulative Index to Nursing and Allied Health Literature, PsycINFO (EBSCO), and IEEE Xplore. Two reviewers independently screened titles and abstracts and assessed full-text articles. Original research articles that evaluated the feasibility, usability, or acceptability of using Google Glass in surgical settings were included. This review was completed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Results Of the 520 records obtained, 31 met all predefined criteria and were included in this review. Google Glass was used in various surgical specialties. Most studies were in the United States (23/31, 74%) and all were conducted in hospital settings: 29 in adult hospitals (29/31, 94%) and two in children's hospitals (2/31, 7%). Sample sizes of participants who wore Google Glass ranged from 1 to 40. Of the 31 studies, 25 (81%) were conducted under real-time conditions or in actual clinical care settings, whereas the other six (19%) were conducted in a simulated environment. Twenty-six studies were pilot or feasibility studies (84%), three were case studies (10%), and two were randomized controlled trials (6%). The majority of studies examined the potential use of Google Glass as an intraoperative intervention (27/31, 87%), whereas others observed its potential use in preoperative (4/31, 13%) and postoperative settings (5/31, 16%). Google Glass was utilized as a videography and photography device (21/31, 68%), a vital sign monitor (6/31, 19%), a surgical navigation display (5/31, 16%), and as a videoconferencing tool to communicate with remote surgeons intraoperatively (5/31, 16%). Most studies reported moderate or high acceptability of using Google Glass in surgical settings. The main reported limitations of Google Glass use were short battery life (8/31, 26%) and difficulty with hands-free features (5/31, 16%). Conclusions There are promising feasibility and usability data on the use of Google Glass in surgical settings, with particular benefits for surgical education and training. Despite existing technical limitations, Google Glass was generally well received, and several studies in surgical settings acknowledged its potential for training, consultation, patient monitoring, and audiovisual recording.
Collapse
Affiliation(s)
- Nancy J Wei
- Weinberg College of Arts and Sciences, Northwestern University, Evanston, IL, United States
| | - Bryn Dougherty
- Weinberg College of Arts and Sciences, Northwestern University, Evanston, IL, United States
| | - Aundria Myers
- Weinberg College of Arts and Sciences, Northwestern University, Evanston, IL, United States
| | - Sherif M Badawy
- Division of Hematology, Oncology and Stem Cell Transplant, Ann & Robert H Lurie Children's Hospital of Chicago, Chicago, IL, United States; Department of Pediatrics, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States; Department of Pediatrics, Division of Hematology and Oncology, Faculty of Medicine, Zagazig University, Zagazig, Egypt
| |
Collapse
|
36
|
Hiranaka T, Fujishiro T, Hida Y, Shibata Y, Tsubosaka M, Nakanishi Y, Okimura K, Uemoto H. Augmented reality: The use of the PicoLinker smart glasses improves wire insertion under fluoroscopy. World J Orthop 2017; 8:891-894. [PMID: 29312847 PMCID: PMC5745431 DOI: 10.5312/wjo.v8.i12.891] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/15/2016] [Revised: 03/29/2017] [Accepted: 04/19/2017] [Indexed: 02/06/2023] Open
Abstract
AIM To demonstrate the feasibility of the PicoLinker wearable smart glasses for guide wire insertion under fluoroscopic guidance.
METHODS Under a fluoroscope, a surgeon inserted 3 mm guide wires into plastic femurs from the lateral cortex to the femoral head center, either while wearing the PicoLinker smart glasses, on which the fluoroscopic video was displayed, or without them (10 guide wires per condition).
RESULTS The tip-apex distance was significantly smaller, and the radiation exposure time and total insertion time were significantly shorter, while wearing the PicoLinker smart glasses.
CONCLUSION This study indicates that the PicoLinker smart glasses can improve accuracy and reduce both radiation exposure time and total insertion time, because they enable surgeons to keep their eyes on the operative field.
Collapse
Affiliation(s)
- Takafumi Hiranaka
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Takaaki Fujishiro
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Yuichi Hida
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Yosaku Shibata
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Masanori Tsubosaka
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Yuta Nakanishi
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Kenjiro Okimura
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| | - Harunobu Uemoto
- Department of Orthopaedic Surgery and Joint Surgery Center, Takatsuki General Hospital, Osaka 569-1115, Japan
| |
Collapse
|
37
|
Kobayashi L, Zhang XC, Collins SA, Karim N, Merck DL. Exploratory Application of Augmented Reality/Mixed Reality Devices for Acute Care Procedure Training. West J Emerg Med 2017; 19:158-164. [PMID: 29383074 PMCID: PMC5785186 DOI: 10.5811/westjem.2017.10.35026] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2017] [Revised: 10/17/2017] [Accepted: 10/29/2017] [Indexed: 11/11/2022] Open
Abstract
Introduction Augmented reality (AR), mixed reality (MR), and virtual reality devices are enabling technologies that may facilitate effective communication in healthcare between those with information and knowledge (clinician/specialist; expert; educator) and those seeking understanding and insight (patient/family; non-expert; learner). Investigators initiated an exploratory program to enable the study of AR/MR use-cases in acute care clinical and instructional settings. Methods Academic clinician educators, computer scientists, and diagnostic imaging specialists conducted a proof-of-concept project to 1) implement a core holoimaging pipeline infrastructure and open-access repository at the study institution, and 2) use novel AR/MR techniques on off-the-shelf devices with holoimages generated by the infrastructure to demonstrate their potential role in the instructive communication of complex medical information. Results The study team successfully developed a medical holoimaging infrastructure methodology to identify, retrieve, and manipulate real patients’ de-identified computed tomography and magnetic resonance imagesets for rendering, packaging, transfer, and display of modular holoimages onto AR/MR headset devices and connected displays. Holoimages containing key segmentations of cervical and thoracic anatomic structures and pathology were overlaid and registered onto physical task trainers for simulation-based “blind insertion” invasive procedural training. During the session, learners experienced and used task-relevant anatomic holoimages for central venous catheter and tube thoracostomy insertion training with enhanced visual cues and haptic feedback. Direct instructor access into the learner’s AR/MR headset view of the task trainer was achieved for visual-axis interactive instructional guidance. Conclusion Investigators implemented a core holoimaging pipeline infrastructure and modular open-access repository to generate and enable access to modular holoimages during exploratory pilot stage applications for invasive procedure training that featured innovative AR/MR techniques on off-the-shelf headset devices.
Collapse
Affiliation(s)
- Leo Kobayashi
- Alpert Medical School of Brown University, Department of Emergency Medicine, Providence, Rhode Island
| | - Xiao Chi Zhang
- Alpert Medical School of Brown University, Department of Emergency Medicine, Providence, Rhode Island
| | - Scott A Collins
- Rhode Island Hospital, CT Scan Department, Providence, Rhode Island
| | - Naz Karim
- Alpert Medical School of Brown University, Department of Emergency Medicine, Providence, Rhode Island
| | - Derek L Merck
- Alpert Medical School of Brown University, Department of Diagnostic Imaging, Providence, Rhode Island
| |
Collapse
|
38
|
Rodriguez KM, Kohn TP, Davis AB, Hakky TS. Penile implants: a look into the future. Transl Androl Urol 2017; 6:S860-S866. [PMID: 29238665 PMCID: PMC5715181 DOI: 10.21037/tau.2017.05.28] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022] Open
Abstract
Inflatable penile prosthesis (IPP) has been available since the 1970s as a durable, one-time cure for erectile dysfunction (ED). Over the past 40 years, many changes have been made to improve the device, and the IPP currently boasts a high rate of long-term patient satisfaction. The next paradigm shift in IPP treatment for ED is upon us. Funding for ED-related medications and devices has been a hot topic in health policy over the last 10 years. This suggests that the device must improve and that patient advocacy and education must increase for the IPP to remain a viable solution for ED. In this paper, we conduct a literature search for innovations in IPP and argue that the IPP must constantly improve to compete with oral, injectable, shockwave, and potentially gene therapies.
Collapse
|
39
|
Feasibility and safety of augmented reality-assisted urological surgery using smartglass. World J Urol 2016; 35:967-972. [PMID: 27761715 DOI: 10.1007/s00345-016-1956-6] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2016] [Accepted: 10/13/2016] [Indexed: 12/18/2022] Open
Abstract
PURPOSE To assess the feasibility, safety and usefulness of augmented reality-assisted urological surgery using smartglass (SG). METHODS Seven urological surgeons (3 board-certified urologists and 4 urology residents) performed augmented reality-assisted urological surgery using SG for 10 different types of operations and a total of 31 urological operations. Feasibility was assessed using technical metadata (number of photographs taken, number of videos recorded, and video time recorded) and structured interviews with the urologists on their use of SG. Safety was evaluated by recording complications and grading them according to the Clavien-Dindo classification. Usefulness of SG for urological surgery was queried in structured interviews and in a survey. RESULTS The implementation of SG use during urological surgery was feasible, with no intrinsic (technical defect) or extrinsic (inability to control the SG function) obstacles observed. SG use was safe, as no grade 3-5 complications occurred in the series of 31 urological surgeries of different complexities. Technical applications of SG included taking photographs/recording videos for teaching and documentation, hands-free teleconsultation, reviewing patients' medical records and images, and searching the internet for health information. Overall usefulness of SG for urological surgery was rated as very high by 43% and high by 29% of surgeons. CONCLUSIONS Augmented reality-assisted urological surgery using SG is both feasible and safe and also provides several useful functions for urological surgeons. Further developments and investigations are required in the near future to harness the great potential of this exciting technology for urological surgery.
Collapse
|
40
|
Peden RG, Mercer R, Tatham AJ. The use of head-mounted display eyeglasses for teaching surgical skills: A prospective randomised study. Int J Surg 2016; 34:169-173. [DOI: 10.1016/j.ijsu.2016.09.002] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/23/2016] [Revised: 08/31/2016] [Accepted: 09/04/2016] [Indexed: 11/26/2022]
|