1
Lewis KO, Popov V, Fatima SS. From static web to metaverse: reinventing medical education in the post-pandemic era. Ann Med 2024; 56:2305694. [PMID: 38261592] [PMCID: PMC10810636] [DOI: 10.1080/07853890.2024.2305694]
Abstract
The advancement of computer technology in the 1960s and the World Wide Web in the 1990s laid the groundwork for substantial, simultaneous change in many facets of our lives, including medicine, health care, and medical education. The traditional didactic approach has shifted towards more dynamic and interactive methods, leveraging technologies such as simulation tools, virtual reality, and online platforms. At the forefront is a remarkable evolution that has revolutionized how medical knowledge is accessed, disseminated, and integrated into pedagogical practices. The COVID-19 pandemic also drove rapid, large-scale adoption of e-learning and digital resources in medical education because of widespread lockdowns, social distancing measures, and the closure of medical schools and healthcare training programs. This review paper examines the evolution of medical education from the Flexnerian era to the modern digital age, closely examining the influence of the evolving WWW and the shift from Education 1.0 to Education 4.0. This evolution has been further accentuated by the transition from the static landscapes of Web 2D to the immersive realms of Web 3D, especially in light of the growing notion of the metaverse. The metaverse, an interconnected virtual shared space incorporating virtual reality (VR), augmented reality (AR), and mixed reality (MR), creates fertile ground for simulation-based training, collaborative learning, and experiential skill acquisition for competency development. This review covers the multifaceted applications of the metaverse in medical education, outlining both its benefits and challenges. Through insightful case studies and examples, it highlights the innovative potential of the metaverse as a platform for immersive learning experiences.
Moreover, the review addresses the role of emerging technologies in shaping the post-pandemic future of medical education, ultimately culminating in a series of recommendations tailored for medical institutions aiming to capitalize successfully on these revolutionary changes.
Affiliation(s)
- Kadriye O. Lewis
- Children’s Mercy Kansas City, Department of Pediatrics, UMKC School of Medicine, Kansas City, MO, USA
- Vitaliy Popov
- Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Syeda Sadia Fatima
- Department of Biological and Biomedical Sciences, The Aga Khan University, Karachi, Pakistan
2
Karnatz N, Schwerter M, Liu S, Parviz A, Wilkat M, Rana M. Mixed Reality as a Digital Visualisation Solution for the Head and Neck Tumour Board: Application Creation and Implementation Study. Cancers (Basel) 2024; 16:1392. [PMID: 38611070] [PMCID: PMC11011089] [DOI: 10.3390/cancers16071392]
Abstract
The preparation and implementation of interdisciplinary oncological case reviews are time-consuming and complex. The variety of clinical and radiological information must be presented in a clear and comprehensible manner, since well-founded treatment decisions can only be made if all relevant patient-specific information is presented within a short time frame. Mixed reality (MR) technology, as a multimodal interactive user interface, could enhance understanding in multidisciplinary collaboration by visualising radiological and clinical data. The aim of this work was to develop an MR-based software prototype for a head and neck tumour board (HNTB) to support clinical decision-making. The article describes the development phases and workflows required in the planning and creation of an MR-based software prototype that meets the multidisciplinary characteristics of an HNTB.
Affiliation(s)
- Nadia Karnatz
- Department of Oral and Plastic Maxillofacial Surgery, Heinrich Heine University Hospital Düsseldorf, Moorenstraße 5, 40225 Düsseldorf, Germany
- Shufang Liu
- Brainlab AG, Olof-Palme-Str. 9, 81829 München, Germany
- Aida Parviz
- Department of Oral and Plastic Maxillofacial Surgery, Heinrich Heine University Hospital Düsseldorf, Moorenstraße 5, 40225 Düsseldorf, Germany
- Max Wilkat
- Department of Oral and Plastic Maxillofacial Surgery, Heinrich Heine University Hospital Düsseldorf, Moorenstraße 5, 40225 Düsseldorf, Germany
- Majeed Rana
- Department of Oral and Plastic Maxillofacial Surgery, Heinrich Heine University Hospital Düsseldorf, Moorenstraße 5, 40225 Düsseldorf, Germany
3
Nakatani R, Patel K, Chowdhury T. Simulation in Anesthesia for Perioperative Neuroscience: Present and Future. J Neurosurg Anesthesiol 2024; 36:4-10. [PMID: 37903630] [DOI: 10.1097/ana.0000000000000939]
Abstract
The brain's sensitivity to fluctuations in physiological parameters demands precise control of anesthesia during neurosurgery, which, combined with the complex nature of neurosurgical procedures and potential for adverse outcomes, makes neuroanesthesia challenging. Neuroanesthesiologists, as perioperative physicians, work closely with neurosurgeons, neurologists, neurointensivists, and neuroradiologists to provide care for patients with complex neurological diseases, often dealing with life-threatening conditions such as traumatic brain injuries, brain tumors, cerebral aneurysms, and spinal cord injuries. The use of simulation to practice emergency scenarios may have potential for enhancing competency and skill acquisition amongst neuroanesthesiologists. Simulation models, including high-fidelity manikins, virtual reality, and computer-based simulations, can replicate physiological responses, anatomical structures, and complications associated with neurosurgical procedures. The use of high-fidelity simulation can act as a valuable complement to real-life clinical exposure and training in neuroanesthesia.
Affiliation(s)
- Krisha Patel
- Toronto Western Hospital, University of Toronto, Toronto
4
Fonseka T, Henry M, Ellerington C, Gowda A, Ellis R. Urology boot camp for medical students: Using virtual technology to enhance undergraduate education. BJUI Compass 2023; 4:523-532. [PMID: 37636208] [PMCID: PMC10447215] [DOI: 10.1002/bco2.213]
Abstract
Objectives This study describes the methodology of converting the urology boot camp for medical students into a virtual course, with key take-home points for a successful conversion, and presents quantitative and qualitative data demonstrating the boot camp's impact on delegates' knowledge and clinical acumen. Materials and methods The face-to-face boot camp was converted to a virtual format using a variety of techniques, including an online platform to deliver live screened lectures, online polling software to foster an interactive learning environment, and pre-recorded videos to teach practical skills. Validated multiple choice questionnaires (MCQs) administered before and after the course assessed delegates' knowledge of urology against the national undergraduate curriculum, and paired t tests were used to quantify the level of improvement. Thematic analysis of post-course delegate feedback identified highlights of the course and ways of improving future iterations. Results In total, 131 delegates took part in the pilot virtual course, of whom 105 completed the pre- and post-course MCQs. There was a statistically significant improvement in assessment scores following the course (p < 0.001), with the mean score increasing from 47.5% pre-course to 65.8% post-course. All delegates who attended the most recent implementation of the virtual course (n = 31) felt it improved their knowledge and confidence in urology. Twenty delegates (64.5%) felt that it prepared them both for final-year medical school examinations and for working as a foundation year doctor. Positive themes identified in feedback included the interactive nature of the course, the quality of teaching, the level and content of information provided, and the high-yield, concise organization of the teaching schedule.
Conclusion Using virtual technology and innovative educational frameworks, we have demonstrated the successful conversion of the urology boot camp for medical students to a virtual format. At a national level, with support from the British Association of Urological Surgeons, the face-to-face component of the course will continue to run in parallel with the virtual course with the aim of standardizing and improving UK undergraduate urological education. The virtual course has been implemented on an international scale, and this has already shown promising results.
Affiliation(s)
- Thomas Fonseka
- Department of Urology, Royal Derby Hospital, University Hospitals of Derby and Burton, Derby, UK
- Mei‐Ling Henry
- Department of Urology, Royal Derby Hospital, University Hospitals of Derby and Burton, Derby, UK
- Clare Ellerington
- Department of Medical Education, Royal Derby Hospital, University Hospitals of Derby and Burton, Derby, UK
- Arjun Gowda
- Department of Urology, Royal Derby Hospital, University Hospitals of Derby and Burton, Derby, UK
- Ricky Ellis
- Department of Urology, Royal Derby Hospital, University Hospitals of Derby and Burton, Derby, UK
5
Avrumova F, Lebl DR. Augmented reality for minimally invasive spinal surgery. Front Surg 2023; 9:1086988. [PMID: 36776471] [PMCID: PMC9914175] [DOI: 10.3389/fsurg.2022.1086988]
Abstract
Background Augmented reality (AR) is an emerging technology that can overlay computer graphics onto the real world and enhance visual feedback from information systems. Within the past several decades, innovations related to AR have been integrated into our daily lives; however, its application in medicine, specifically in minimally invasive spine surgery (MISS), may be the most important to understand. AR navigation also provides auditory and haptic feedback, which can further enhance surgeons' capabilities and improve safety. Purpose The purpose of this article is to address previous and current applications of AR, AR in MISS, limitations of today's technology, and future areas of innovation. Methods A literature review of AR technology applications across previous and current generations was conducted. Results AR systems have been implemented in spinal surgery in recent years, and AR may be an alternative to current approaches such as traditional navigation, robotically assisted navigation, fluoroscopic guidance, and freehand technique. Because AR can project patient anatomy directly onto the surgical field, it can eliminate concerns about the surgeon's attention shifting from the surgical field to remote navigation screens, line-of-sight interruption, and cumulative radiation exposure as the demand for MISS increases. Conclusion AR is a novel technology that can improve spinal surgery, and addressing its current limitations will likely have a great impact on future technology.
6
Rong K, Wu X, Xia Q, Chen J, Fei T, Li X, Jiang W. A Systematic Study to Compare the Precise Implantation of Hololens 2 Assisted with Acetabular Prosthesis for Total Hip Replacement. J Biomater Tiss Eng 2022. [DOI: 10.1166/jbt.2022.3212]
Abstract
This study aims to evaluate the accuracy of HoloLens 2-assisted implantation of the acetabular prosthesis in total hip replacement. A total of 80 orthopaedic doctors from our hospital were enrolled in this systematic study and divided into four groups based on their experience in treating orthopaedic patients and on whether HoloLens 2 assistance was used: an experienced group with HoloLens 2, an experienced group without HoloLens 2, an inexperienced group with HoloLens 2, and an inexperienced group without HoloLens 2. The abduction angle, the anteversion angle, and the offset in each of these angles were recorded for the four groups, and these results were used to evaluate the accuracy of HoloLens 2-assisted acetabular prosthesis implantation. Finally, all data in this study were collected and analyzed. The results show a significant difference in outcomes between the experienced groups with and without HoloLens 2, and between the inexperienced groups with and without HoloLens 2; no significant difference was found between any other pair of groups. HoloLens 2 assistance can improve the accuracy of acetabular prosthesis implantation in total hip replacement.
Affiliation(s)
- Ke Rong
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
- Xuhua Wu
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Qingquan Xia
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Jie Chen
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
- Teng Fei
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Xujun Li
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Weimin Jiang
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
7
Advances and Innovations in Ablative Head and Neck Oncologic Surgery Using Mixed Reality Technologies in Personalized Medicine. J Clin Med 2022; 11:jcm11164767. [PMID: 36013006] [PMCID: PMC9410374] [DOI: 10.3390/jcm11164767]
Abstract
The benefit of computer-assisted planning in head and neck ablative and reconstructive surgery has been extensively documented over the last decade. This approach has been shown to make surgical procedures safer. In the treatment of cancer of the head and neck, computer-assisted surgery can be used to visualize and estimate the location and extent of the tumor mass. Nowadays, some software tools even allow the visualization of the structures of interest in a mixed reality environment. However, the precise integration of mixed reality systems into daily clinical routine remains a challenge. To date, this technology is not yet fully integrated into clinical settings such as the tumor board, surgical planning for head and neck tumors, or medical and surgical education. As a consequence, the handling of these systems is still of an experimental nature, and decision-making based on the presented data is not yet widely used. The aim of this paper is to present a novel, user-friendly 3D planning and mixed reality software and its potential application for ablative and reconstructive head and neck surgery.
8
Roberts S, Desai A, Checcucci E, Puliatti S, Taratkin M, Kowalewski KF, Gomez Rivas J, Rivero I, Veneziano D, Autorino R, Porpiglia F, Gill IS, Cacciamani GE. "Augmented reality" applications in urology: a systematic review. Minerva Urol Nephrol 2022; 74:528-537. [PMID: 35383432] [DOI: 10.23736/s2724-6051.22.04726-7]
Abstract
INTRODUCTION Augmented reality (AR) applied to surgical procedures refers to the superimposition of preoperative or intraoperative images onto the operative field. Augmented reality has been used increasingly in myriad surgical specialties, including urology. The following study reviews advances in the use of AR for improving urologic outcomes. EVIDENCE ACQUISITION We identified all descriptive, validity, prospective randomized/nonrandomized trials and retrospective comparative/noncomparative studies on the use of AR in urology up until March 2021. The MEDLINE, Scopus, and Web of Science databases were used for the literature search. Study selection was conducted according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Included studies were limited to those using AR, excluding all that used virtual reality technology. EVIDENCE SYNTHESIS A total of 60 studies were identified and included in the present analysis. Overall, 19 studies were descriptive/validity/phantom studies of specific AR methodologies, 4 studies were case reports, and 37 studies were clinical prospective/retrospective comparative studies. CONCLUSIONS Advances in AR have led to increasing registration accuracy as well as an increased ability to identify anatomic landmarks and improve outcomes during urologic procedures such as robot-assisted radical prostatectomy (RARP) and robot-assisted partial nephrectomy.
Affiliation(s)
- Sidney Roberts
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Aditya Desai
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA
- Enrico Checcucci
- School of Medicine, Division of Urology, Department of Oncology, San Luigi Hospital, University of Turin, Orbassano, Turin, Italy; European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Stefano Puliatti
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, University of Modena and Reggio Emilia, Modena, Italy; Department of Urology, OLV, Aalst, Belgium; ORSI Academy, Melle, Belgium
- Mark Taratkin
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Institute for Urology and Reproductive Health, Sechenov University, Moscow, Russia
- Karl-Friedrich Kowalewski
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Virgen Macarena University Hospital, Seville, Spain; Department of Urology and Urosurgery, University Hospital of Mannheim, Mannheim, Germany
- Juan Gomez Rivas
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, Clinico San Carlos University Hospital, Madrid, Spain
- Ines Rivero
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology and Nephrology, Virgen del Rocío University Hospital, Seville, Spain
- Domenico Veneziano
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Department of Urology, Riuniti Hospital, Reggio Calabria, Italy
- Francesco Porpiglia
- European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands
- Inderbir S Gill
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA; Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA
- Giovanni E Cacciamani
- Keck School of Medicine, Catherine and Joseph Aresty Department of Urology, USC Institute of Urology, Los Angeles, CA, USA; European Association of Urology (EAU) Young Academic Office (YAU) Uro-Technology Working Group, Arnhem, the Netherlands; Artificial Intelligence (AI) Center at USC Urology, USC Institute of Urology, Los Angeles, CA, USA; Keck School of Medicine, Department of Radiology, University of Southern California, Los Angeles, CA, USA
9
Cercenelli L, Babini F, Badiali G, Battaglia S, Tarsitano A, Marchetti C, Marcelli E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front Oncol 2022; 11:804748. [PMID: 35071009] [PMCID: PMC8770836] [DOI: 10.3389/fonc.2021.804748]
Abstract
Background Augmented reality (AR) represents an evolution of navigation-assisted surgery, providing surgeons with a virtual aid contextually merged with the real surgical field. We recently reported a case series of AR-assisted fibular flap harvesting for mandibular reconstruction. However, the registration accuracy between the real and the virtual content needs to be systematically evaluated before this tool is widely promoted in clinical practice. In this paper, after describing the AR-based protocol implemented for both a tablet and HoloLens 2 smart glasses, we evaluated in a first test session the achievable registration accuracy with the two display solutions, and in a second test session the success rate in executing the AR-guided skin paddle incision task on a 3D-printed leg phantom. Methods From a real computed tomography dataset, 3D virtual models of a human leg, including the fibula, arteries, and skin with the planned paddle profile for harvesting, were obtained. All virtual models were imported into Unity software to develop a marker-less AR application suitable for use both via tablet and via the HoloLens 2 headset. The registration accuracy of both solutions was verified on a 3D-printed leg phantom obtained from the virtual models, by repeatedly applying the tracking function and computing pose deviations between the AR-projected virtual skin paddle profile and the real one transferred to the phantom via a CAD/CAM cutting guide. The success rate in completing the AR-guided task of skin paddle harvesting was evaluated using CAD/CAM templates positioned on the phantom model surface. Results On average, the marker-less AR protocol showed comparable registration errors (ranging within 1-5 mm) for the tablet-based and HoloLens-based solutions. Registration accuracy appears to be quite sensitive to ambient light conditions. We found a good success rate in completing the AR-guided task within an error margin of 4 mm (97% and 100% for tablet and HoloLens, respectively).
All subjects reported greater usability and ergonomics for the HoloLens 2 solution. Conclusions Results revealed that the proposed marker-less AR protocol may guarantee a registration error within 1-5 mm for assisting skin paddle harvesting in the clinical setting. Optimal lighting conditions and further improvement of marker-less tracking technologies have the potential to increase the efficiency and precision of this AR-assisted reconstructive surgery.
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Federico Babini
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Giovanni Badiali
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Salvatore Battaglia
- Maxillofacial Surgery Unit, Policlinico San Marco University Hospital, University of Catania, Catania, Italy
- Achille Tarsitano
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Claudio Marchetti
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Emanuela Marcelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
10
Cercenelli L, De Stefano A, Billi AM, Ruggeri A, Marcelli E, Marchetti C, Manzoli L, Ratti S, Badiali G. AEducaAR, Anatomical Education in Augmented Reality: A Pilot Experience of an Innovative Educational Tool Combining AR Technology and 3D Printing. Int J Environ Res Public Health 2022; 19:ijerph19031024. [PMID: 35162049] [PMCID: PMC8834017] [DOI: 10.3390/ijerph19031024]
Abstract
Gross anatomy knowledge is an essential element of medical students' education, and nowadays cadaver-based instruction remains the main instructional tool able to provide three-dimensional (3D) and topographical comprehension. The aim of this study was to develop and test a prototype of an innovative tool for medical education in human anatomy based on the combination of augmented reality (AR) technology and a tangible 3D-printed model that can be explored and manipulated by trainees, thus favoring a three-dimensional, topographical learning approach. After development, the tool, called AEducaAR (Anatomical Education with Augmented Reality), was tested and evaluated by 62 second-year medical students attending the human anatomy course at the International School of Medicine and Surgery of the University of Bologna. Students were divided into two groups: AEducaAR-based learning ("AEducaAR group") was compared to standard learning using a human anatomy atlas ("Control group"). Both groups completed an objective test and an anonymous questionnaire. The objective test showed no significant difference between the two learning methods; in the questionnaire, however, students showed enthusiasm and interest in the new tool and highlighted its training potential in open-ended comments. Once implemented, the presented AEducaAR tool may therefore contribute to enhancing students' motivation for learning, increasing long-term memory retention, and improving 3D comprehension of anatomical structures. Moreover, this new tool might help medical students become familiar with innovative medical devices and technologies useful in their future careers.
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab-Laboratory of Bioengineering, Department of Experimental Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Alessia De Stefano
- Cellular Signalling Laboratory, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Anna Maria Billi
- Cellular Signalling Laboratory, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Alessandra Ruggeri
- Cellular Signalling Laboratory, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Emanuela Marcelli
- eDIMES Lab-Laboratory of Bioengineering, Department of Experimental Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Claudio Marchetti
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Department of Maxillo-Facial Surgery, IRCCS Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
- Lucia Manzoli
- Cellular Signalling Laboratory, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Stefano Ratti
- Cellular Signalling Laboratory, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Giovanni Badiali
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy
- Department of Maxillo-Facial Surgery, IRCCS Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
11
Sparwasser P, Haack M, Frey L, Haferkamp A, Borgmann H. [Virtual and augmented reality in urology]. Urologe A 2021; 61:133-141. [PMID: 34935997] [PMCID: PMC8693158] [DOI: 10.1007/s00120-021-01734-y]
Abstract
Although technological advances have always optimized medical care throughout its constant evolution, they have so far remained largely tangible to the user. Driven by immense financial investment, innovative products and technical solutions have emerged that are transforming everyday medical practice and will add a new dimension to it in the future: virtual and augmented reality. This review article summarizes current scientific projects and the future benefits of virtual and augmented reality in the field of urology.
Affiliation(s)
- P Sparwasser
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131 Mainz, Germany
- M Haack
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131 Mainz, Germany
- L Frey
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131 Mainz, Germany
- A Haferkamp
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131 Mainz, Germany
- H Borgmann
- Department of Urology, University Medical Center, Johannes Gutenberg University, Langenbeckstr. 1, 55131 Mainz, Germany
Collapse
12
Dhar P, Rocks T, Samarasinghe RM, Stephenson G, Smith C. Augmented reality in medical education: students' experiences and learning outcomes. Med Educ Online 2021; 26:1953953. [PMID: 34259122] [PMCID: PMC8281102] [DOI: 10.1080/10872981.2021.1953953] [Indexed: 05/14/2023]
Abstract
Augmented reality (AR) is a relatively new technology that allows for digitally generated three-dimensional representations to be integrated with real environmental stimuli. AR can make use of smart phones, tablets, or other devices to achieve a highly stimulating learning environment and hands-on immersive experience. The use of AR in industry is becoming widespread with applications being developed for use not just for entertainment and gaming but also healthcare, retail and marketing, education, military, travel and tourism, automotive industry, manufacturing, architecture, and engineering. Due to the distinct learning advantages that AR offers, such as remote learning and interactive simulations, AR-based teaching programs are also increasingly being adopted within medical schools across the world. These advantages are further highlighted by the current COVID-19 pandemic, which has caused an even greater shift towards online learning. In this review, we investigate the use of AR in medical training/education and its effect on students' experiences and learning outcomes. This includes the main goals of AR-based learning, such as to simplify the delivery and enhance the comprehension of complex information. We also describe how AR can enhance the experiences of medical students, by improving knowledge and understanding, practical skills and social skills. These concepts are discussed within the context of specific AR medical training programs, such as HoloHuman, OculAR SIM, and HoloPatient. Finally, we discuss the challenges of AR in learning and teaching and propose future directions for the use of this technology in medical education.
Affiliation(s)
- Poshmaal Dhar
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Tetyana Rocks
- Institute for Innovation in Mental and Physical Health and Clinical Translation, Food and Mood Centre, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Rasika M Samarasinghe
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Garth Stephenson
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Craig Smith (Corresponding author)
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
13
Forte MP, Gourishetti R, Javot B, Engler T, Gomez ED, Kuchenbecker KJ. Design of interactive augmented reality functions for robotic surgery and evaluation in dry-lab lymphadenectomy. Int J Med Robot 2021; 18:e2351. [PMID: 34781414] [DOI: 10.1002/rcs.2351] [Received: 06/18/2021] [Revised: 10/28/2021] [Accepted: 11/11/2021] [Indexed: 12/18/2022]
Abstract
BACKGROUND: Augmented reality (AR) has been widely researched for use in healthcare. Prior AR for robot-assisted minimally invasive surgery has mainly focussed on superimposing preoperative three-dimensional (3D) images onto patient anatomy. This article presents alternative interactive AR tools for robotic surgery.
METHODS: We designed, built and evaluated four voice-controlled functions: viewing a live video of the operating room, viewing two-dimensional preoperative images, measuring 3D distances and warning about out-of-view instruments. This low-cost system was developed on a da Vinci Si, and it can be integrated into surgical robots equipped with a stereo camera and a stereo viewer.
RESULTS: Eight experienced surgeons performed dry-lab lymphadenectomies and reported that the functions improved the procedure. They particularly appreciated the possibility of accessing the patient's medical records on demand, measuring distances intraoperatively and interacting with the functions using voice commands.
CONCLUSIONS: The positive evaluations garnered by these alternative AR functions and interaction methods provide support for further exploration.
Affiliation(s)
- Maria-Paola Forte
- Haptic Intelligence Department, Max Planck Institute for Intelligent Systems, Stuttgart, Germany
- Ravali Gourishetti
- Haptic Intelligence Department, Max Planck Institute for Intelligent Systems, Stuttgart, Germany
- Bernard Javot
- Haptic Intelligence Department, Max Planck Institute for Intelligent Systems, Stuttgart, Germany
- Ernest D Gomez
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts, USA
- Katherine J Kuchenbecker
- Haptic Intelligence Department, Max Planck Institute for Intelligent Systems, Stuttgart, Germany