1
Lewis KO, Popov V, Fatima SS. From static web to metaverse: reinventing medical education in the post-pandemic era. Ann Med 2024; 56:2305694. [PMID: 38261592] [PMCID: PMC10810636] [DOI: 10.1080/07853890.2024.2305694]
Abstract
The advancement of computer technology in the 1960s and the advent of the World Wide Web in the 1990s set the ground for substantial and simultaneous change in many facets of our lives, including medicine, health care, and medical education. The traditional didactic approach has shifted toward more dynamic and interactive methods, leveraging technologies such as simulation tools, virtual reality, and online platforms. At the forefront is a remarkable evolution that has revolutionized how medical knowledge is accessed, disseminated, and integrated into pedagogical practice. The COVID-19 pandemic also drove the rapid, large-scale adoption of e-learning and digital resources in medical education amid widespread lockdowns, social distancing measures, and the closure of medical schools and healthcare training programs. This review examines the evolution of medical education from the Flexnerian era to the modern digital age, closely tracing the influence of the evolving WWW and the shift from Education 1.0 to Education 4.0. This evolution has been further accentuated by the transition from the static landscapes of Web 2D to the immersive realms of Web 3D, especially in light of the growing notion of the metaverse. The metaverse, an interconnected, virtual shared space that incorporates virtual reality (VR), augmented reality (AR), and mixed reality (MR), creates fertile ground for simulation-based training, collaborative learning, and experiential skill acquisition for competency development. This review surveys the multifaceted applications of the metaverse in medical education, outlining both its benefits and its challenges, and through case studies and examples highlights the innovative potential of the metaverse as a platform for immersive learning experiences. It also addresses the role of emerging technologies in shaping the post-pandemic future of medical education, culminating in a series of recommendations for medical institutions aiming to capitalize on these revolutionary changes.
Affiliation(s)
- Kadriye O. Lewis
- Children’s Mercy Kansas City, Department of Pediatrics, UMKC School of Medicine, Kansas City, MO, USA
- Vitaliy Popov
- Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Syeda Sadia Fatima
- Department of Biological and Biomedical Sciences, The Aga Khan University, Karachi, Pakistan
2
Chang Z, Chen D, Peng J, Liu R, Li B, Kang J, Guo L, Hou R, Xu X, Lee M, Zhang X. Bone-Targeted Supramolecular Nanoagonist Assembled by Accurate Ratiometric Herbal-Derived Therapeutics for Osteoporosis Reversal. Nano Lett 2024; 24:5154-5164. [PMID: 38602357] [DOI: 10.1021/acs.nanolett.4c00029]
Abstract
Developing novel strategies for defeating osteoporosis has become a world-wide challenge with the aging of the population. In this work, novel supramolecular nanoagonists (NAs), constructed from alkaloids and phenolic acids, emerge as a carrier-free nanotherapy for efficacious osteoporosis treatment. These precision nanoagonists are formed through the self-assembly of berberine (BER) and chlorogenic acid (CGA), utilizing noncovalent electrostatic, π-π, and hydrophobic interactions. This assembly results in a 100% drug loading capacity and stable nanostructure. Furthermore, the resulting weights and proportions of CGA and BER within the NAs are meticulously controlled with strong consistency when the CGA/BER assembly feed ratio is altered from 1:1 to 1:4. As anticipated, our NAs themselves could passively target osteoporotic bone tissues following prolonged blood circulation, modulate Wnt signaling, regulate osteogenic differentiation, and ameliorate bone loss in ovariectomy-induced osteoporotic mice. We hope this work will open a new strategy to design efficient herbal-derived Wnt NAs for dealing with intractable osteoporosis.
Affiliation(s)
- Zhuangpeng Chang, Dengke Chen, Jiao Peng, Rongyan Liu, Beibei Li, Jianbang Kang, Li Guo, Ruigang Hou, Xiao Zhang
- School of Pharmacy and Shanxi Provincial Key Laboratory of Drug Synthesis and Novel Pharmaceutical Preparation Technology, Shanxi Medical University, Taiyuan 030001, P.R. China
- Xianghui Xu
- Department of Pharmacy, College of Biology, Hunan University, Changsha, Hunan 410082, P.R. China
- Min Lee
- Division of Advanced Prosthodontics, University of California at Los Angeles, Los Angeles, California 90095, United States
3
Wang J, Li W, Dun A, Zhong N, Ye Z. 3D visualization technology for learning human anatomy among medical students and residents: a meta- and regression analysis. BMC Med Educ 2024; 24:461. [PMID: 38671399] [PMCID: PMC11055294] [DOI: 10.1186/s12909-024-05403-4]
Abstract
BACKGROUND 3D visualization technology uses computers and other devices to create a realistic virtual world offering various sensory experiences, such as 3D vision, touch, and smell, to help learners grasp the relationships between real spatial structures and organizations more effectively. The purpose of this study was to comprehensively evaluate the effectiveness of 3D visualization technology in human anatomy teaching/training and to explore the factors that affect training outcomes, to better guide classroom and laboratory anatomy teaching. METHODS We conducted a meta-analysis of randomized controlled studies on teaching human anatomy using 3D visualization technology, extensively searching three authoritative databases: PubMed, Web of Science, and Embase. The main outcomes were participants' test scores and satisfaction; the secondary outcomes were time consumption and enjoyment. Because heterogeneity was substantial (I² > 50%), a random-effects model was employed. RevMan, Stata, and VOSviewer were used to process the data, applying standardized mean differences with 95% confidence intervals, subgroup analyses, sensitivity analyses, and meta-regression. RESULTS Thirty-nine randomized controlled trials (2,959 participants) were screened and included in this study. Across data from all regions, 3D visualization technology moderately improved test scores, satisfaction, and enjoyment compared with other methods; however, it did not significantly reduce the time students took to complete the test. Meta-regression also showed that regional factors affected test scores, whereas other factors had no significant impact. When the literature from China was excluded, the satisfaction and enjoyment of the 3D virtual-reality group remained significantly higher than those of the traditional group, but differences in test scores and time consumption were not statistically significant. CONCLUSION 3D visualization technology is an effective way to improve learners' satisfaction with and enjoyment of human anatomy learning, but it does not reduce the time required to complete tests and may struggle to improve test scores. Test results in the literature from China are more prone to positive findings and affected by regional bias.
Affiliation(s)
- Junming Wang, Wenjun Li
- Department of Health Management, The First Affiliated Hospital, Shandong Provincial Qianfoshan Hospital, Shandong First Medical University, Jinan 250013, Shandong, China
- School of Clinical and Basic Medicine, Shandong First Medical University, Jinan, China
- Aishe Dun
- School of Stomatology, Shandong First Medical University, Jinan, China
- Ning Zhong, Zhen Ye
- Department of Health Management, The First Affiliated Hospital, Shandong Provincial Qianfoshan Hospital, Shandong First Medical University, Jinan 250013, Shandong, China
4
Weiss H, Tang J, Williams C, Stirling L. Performance on a target acquisition task differs between augmented reality and touch screen displays. Appl Ergon 2024; 116:104185. [PMID: 38043456] [DOI: 10.1016/j.apergo.2023.104185]
Abstract
Target acquisition tasks quantify human motor and perceptual abilities during discrete tasks to support interface design and sensorimotor assessment. This study investigated the effects of display modality, touchscreen versus augmented reality (AR), on a standardized 2D multidirectional target acquisition task. Thirty-two participants performed the task with both modalities and at two indices of difficulty. The touchscreen modality yielded better performance than AR as measured by accuracy, precision, error rate, throughput, and movement time: throughput at the nominal index of difficulty was 10.12 bits/s for the touchscreen and 3.11 bits/s for AR. Designers can use these results to improve performance of AR interfaces by selecting larger buttons when accuracy and efficiency are required and by embedding perceptual cues, such as depth and proximity cues, into button target surfaces.
Affiliation(s)
- Hannah Weiss
- University of Michigan, Department of Industrial and Operations Engineering, USA
- Jianyang Tang, Connor Williams
- University of Michigan, Department of Robotics, Ann Arbor, MI, USA
- Leia Stirling
- University of Michigan, Department of Industrial and Operations Engineering, USA
- University of Michigan, Department of Robotics, Ann Arbor, MI, USA
5
Peng K, Moussavi Z, Karunakaran KD, Borsook D, Lesage F, Nguyen DK. iVR-fNIRS: studying brain functions in a fully immersive virtual environment. Neurophotonics 2024; 11:020601. [PMID: 38577629] [PMCID: PMC10993907] [DOI: 10.1117/1.nph.11.2.020601]
Abstract
Immersive virtual reality (iVR) employs head-mounted displays or cave-like environments to create a sensory-rich virtual experience that simulates the physical presence of a user in a digital space. The technology holds immense promise for neuroscience research and therapy. In particular, virtual reality (VR) technologies facilitate the development of diverse tasks and scenarios that closely mirror real-life situations, stimulating the brain within a controlled and secure setting. iVR also offers a cost-effective way to provide a comparable sense of interaction when conventional stimulation methods are limited or unfeasible. Although combining iVR with traditional brain imaging techniques can be difficult owing to signal interference or instrumentation issues, recent work has proposed pairing functional near-infrared spectroscopy (fNIRS) with iVR for versatile brain stimulation paradigms and flexible examination of brain responses. We present a comprehensive review of current research employing an iVR-fNIRS setup, covering device types, stimulation approaches, data analysis methods, and major scientific findings. The literature demonstrates the high potential of iVR-fNIRS to explore various cognitive, behavioral, and motor functions in a fully immersive environment. Such studies should lay the foundation for adaptive iVR programs for both training (e.g., in novel environments) and clinical therapeutics (e.g., pain, motor and sensory disorders, and psychiatric conditions).
Affiliation(s)
- Ke Peng, Zahra Moussavi
- University of Manitoba, Department of Electrical and Computer Engineering, Price Faculty of Engineering, Winnipeg, Manitoba, Canada
- Keerthana Deepti Karunakaran
- Massachusetts General Hospital, Harvard Medical School, Department of Psychiatry, Boston, Massachusetts, United States
- David Borsook
- Massachusetts General Hospital, Harvard Medical School, Department of Psychiatry, Boston, Massachusetts, United States
- Massachusetts General Hospital, Harvard Medical School, Department of Radiology, Boston, Massachusetts, United States
- Frédéric Lesage
- University of Montreal, Institute of Biomedical Engineering, Department of Electrical Engineering, Ecole Polytechnique, Montreal, Quebec, Canada
- Montreal Heart Institute, Montreal, Quebec, Canada
- Dang Khoa Nguyen
- University of Montreal, Department of Neurosciences, Montreal, Quebec, Canada
- Research Center of the Hospital Center of the University of Montreal, Department of Neurology, Montreal, Quebec, Canada
6
Neri I, Cercenelli L, Marcuccio M, Lodi S, Koufi FD, Fazio A, Marvi MV, Marcelli E, Billi AM, Ruggeri A, Tarsitano A, Manzoli L, Badiali G, Ratti S. Dissecting human anatomy learning process through anatomical education with augmented reality: AEducAR 2.0, an updated interdisciplinary study. Anat Sci Educ 2024. [PMID: 38520153] [DOI: 10.1002/ase.2389]
Abstract
Anatomical education is pivotal for medical students, and innovative technologies like augmented reality (AR) are transforming the field. This study aimed to enhance the interactive features of the AEducAR prototype, an AR tool developed by the University of Bologna, and to explore its impact on the human anatomy learning process in 130 second-year medical students at the International School of Medicine and Surgery of the University of Bologna. An interdisciplinary team of anatomists, maxillofacial surgeons, biomedical engineers, and educational scientists collaborated to ensure a comprehensive understanding of the study's objectives. Students used the updated version, AEducAR 2.0, to study three anatomical topics: the orbit zone, facial bones, and mimic muscles. AEducAR 2.0 offered two learning activities, one explorative and one interactive. After each activity, students took a test to assess learning outcomes and completed an anonymous questionnaire providing background information and their perceptions of the activity. Additionally, 10 students participated in interviews for further insights. The results demonstrated that AEducAR 2.0 effectively supported learning and student engagement: students achieved high scores on both quizzes and reported appreciating the newly implemented interactive features. The interviews also shed light on the topic of blended learning; in particular, the present study suggests that incorporating AR into medical education alongside traditional methods may benefit students' academic and future professional endeavors. In this light, this study contributes to the growing body of research emphasizing the potential role of AR in shaping the future of medical education.
Affiliation(s)
- Irene Neri, Simone Lodi, Foteini-Dionysia Koufi, Antonietta Fazio, Maria Vittoria Marvi, Anna Maria Billi, Alessandra Ruggeri, Lucia Manzoli, Stefano Ratti
- Cellular Signalling Laboratory, Anatomy Center, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Laura Cercenelli, Emanuela Marcelli
- eDIMES Lab-Laboratory of Bioengineering, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, Bologna, Italy
- Massimo Marcuccio
- Department of Educational Science "Giovanni Maria Bertin", University of Bologna, Bologna, Italy
- Achille Tarsitano, Giovanni Badiali
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Department of Maxillo-Facial Surgery, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Bologna, Italy
7
Tandon I, Maldonado V, Wilkerson M, Walls A, Rao RR, Elsaadany M. Immersive virtual reality-based learning as a supplement for biomedical engineering labs: challenges faced and lessons learned. Front Med Technol 2024; 6:1301004. [PMID: 38566843] [PMCID: PMC10985327] [DOI: 10.3389/fmedt.2024.1301004]
Abstract
Introduction Immersive virtual reality (VR)-based laboratory demonstrations have been gaining traction in STEM education because they provide virtual hands-on experience. VR can also facilitate experiential and visual learning and enhance retention. However, better utilization of VR-based labs requires optimized implementation, in-depth analysis of the technology's advantages and trade-offs, and assessment of the receptivity of modern techniques in STEM education. Methods In this study, we developed VR-based demonstrations for a biomolecular engineering laboratory and assessed their effectiveness using surveys containing free responses and 5-point Likert-scale questions. An Insta360 Pro2 camera and Meta Quest 2 headsets were used in combination with an in-person lab. A cohort of 53 students watched the experimental demonstration on VR headsets after a brief in-person lab overview and then performed the experiments in the lab. Results Only 28.29% of students reported some form of discomfort after using the advanced VR equipment, compared with 63.63% of students in the previous cohort. About 40% of the students reported that VR eliminated or reduced auditory and visual distractions from the environment, that the length of the videos was appropriate, and that they received enough information to understand the tasks. Discussion The traditional lab method was found more suitable for explaining background information and lab concepts, while VR was more suitable for demonstrating lab procedures and tasks. Analysis of the open-ended questions revealed several factors and recommendations for overcoming the potential challenges and pitfalls of integrating VR with traditional modes of learning. This study provides key insights for optimizing the implementation of immersive VR to effectively supplement in-person learning experiences.
Affiliation(s)
- Mostafa Elsaadany
- Department of Biomedical Engineering, University of Arkansas, Fayetteville, AR, United States
8
Ogata Y, Kolchiba M. Virtual reality images created on the back and front of a display. Opt Lett 2024; 49:1632-1635. [PMID: 38489469] [DOI: 10.1364/ol.515883]
Abstract
To better investigate the biological mechanisms of microorganisms, we developed a novel (to the best of our knowledge) virtual reality (VR) microscope that incorporates a head-mounted display (HMD) to create VR images with a digital microscope. This type of VR microscope can be used with any type of optical microscope. The fabricated microscope differs markedly from a common bifocal device because it can create VR images on both the back and the front of a display. If the VR images are displayed together with object (OBJ) images, they are observable in a [2 × 2] arrangement (back and front VR images and OBJ images; 2 × 2 = 4 images). This feature can provide important information about microscopic OBJs, which can be employed in 3D biological analysis. Furthermore, if a laser light source is added to the microscope, the images can be observed in a [3 × 2] arrangement (back and front laser VR images, VR images, and OBJ images; 3 × 2 = 6 images). The lasers would also enable optical trapping and tracking, leading to improved biological analysis.
9
Philippe AG, Goncalves A, Korchi K, Deshayes M. Exergaming in augmented reality is tailor-made for aerobic training and enjoyment among healthy young adults. Front Public Health 2024; 12:1307382. [PMID: 38469269] [PMCID: PMC10925726] [DOI: 10.3389/fpubh.2024.1307382]
Abstract
In recent years, exergaming has gained popularity as a form of physical activity practice, but little is known about the use of augmented reality for physical activity, particularly at moderate to vigorous intensities. The present study examined the use of an augmented reality exergame for aerobic training in healthy young adults. In a within-subject design, 18 participants (19.8 ± 1.4 years of age) performed two physical activity sessions based on dodgeball: a classical dodgeball session and an exergaming session using an augmented reality version of the game. Physical load and intensity were measured with accelerometers, ratings of perceived exertion (RPE), and heart rate sensors. Enjoyment during the sessions was measured with the short version of the Physical Activity Enjoyment Scale. Results revealed that both physical load and intensity were appropriate for aerobic training in the two conditions (classical and augmented reality exergame), although values were significantly higher in the classical condition. Enjoyment was high in both conditions, with significantly higher values in the classical condition than in the augmented reality exergame condition. Taken together, these results indicate that an aerobic training intensity can be attained through both physical gameplay and its augmented reality equivalent, and both were associated with high enjoyment among healthy young adults.
10
Yoo S, Son MH. Virtual, augmented, and mixed reality: potential clinical and training applications in pediatrics. Clin Exp Pediatr 2024; 67:92-103. [PMID: 37232076] [PMCID: PMC10839193] [DOI: 10.3345/cep.2022.00731]
Abstract
BACKGROUND The COVID-19 pandemic has significantly impacted medical training, necessitating innovative approaches to education and practice. During this period, novel technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) have become increasingly vital. These technologies transcend the limitations of time and space, enabling medical professionals to access various personalized programs for both education and service delivery. This shift is particularly relevant in pediatric medicine, where traditional training and clinical methods face unique challenges. PURPOSE The primary aim of this study was to explore the application of VR, AR, and MR technologies in pediatric medical settings, with a focus on both clinical applications and the training of pediatric medical professionals, by comprehensively searching and reviewing studies that have used these technologies in the treatment of pediatric patients and the education of healthcare providers in this field. METHODS Peer-reviewed articles published in PubMed, the Cochrane Library, ScienceDirect, Google Scholar, and Scopus from January 1, 2018, to March 1, 2023, were comprehensively searched. The review was conducted according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Among the 89 studies, 63 investigated the clinical applications of VR (n=60) or AR (n=3) in pediatric patients, and 25 investigated the applications of VR (n=19), AR (n=5), or MR (n=1) for training medical professionals. RESULTS A total of 36 randomized controlled trials (RCTs) were retrieved, covering clinical applications (n=31) and medical training (n=5). Among the RCTs, 21 reported significant improvements, in clinical applications (n=17) and in medical training (n=4). CONCLUSION Despite some limitations in conducting research on innovative technology, such research has expanded rapidly, indicating that an increasing number of researchers are applying these technologies in pediatric research.
Affiliation(s)
- Suyoung Yoo
- Department of Digital Health, Samsung Advanced Institute for Health Sciences and Technology (SAIHST), Sungkyunkwan University, Seoul, Korea
- Meong Hi Son
- Department of Digital Health, Samsung Advanced Institute for Health Sciences and Technology (SAIHST), Sungkyunkwan University, Seoul, Korea
- Department of Emergency Medicine, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea
11
Jacopin E, Sakamoto Y, Nishida K, Kaizu K, Takahashi K. An architecture for collaboration in systems biology at the age of the Metaverse. NPJ Syst Biol Appl 2024; 10:12. [PMID: 38280851] [PMCID: PMC10821884] [DOI: 10.1038/s41540-024-00334-8]
Abstract
As the current state of the Metaverse is largely driven by corporate interests, which may not align with scientific goals and values, academia should play a more active role in its development. Here, we present the challenges and solutions for building a Metaverse that supports systems biology research and collaboration. Our solution consists of two components: Kosmogora, a server ensuring biological data access, traceability, and integrity in the context of a highly collaborative environment such as a metaverse; and ECellDive, a virtual reality application to explore, interact, and build upon the data managed by Kosmogora. We illustrate the synergy between the two components by visualizing a metabolic network and its flux balance analysis. We also argue that the Metaverse of systems biology will foster closer communication and cooperation between experimentalists and modelers in the field.
Affiliation(s)
- Eliott Jacopin
- RIKEN, Center for Biosystems Dynamics Research, 6-2-3 Furuedai, Suita, Osaka, 565-0874, Japan
- Yuki Sakamoto
- RIKEN, Center for Biosystems Dynamics Research, 6-2-3 Furuedai, Suita, Osaka, 565-0874, Japan
- Kozo Nishida
- RIKEN, Center for Integrative Medical Sciences, 1-7-22 Suehiro-cho, Tsurumi-ku, Yokohama, Kanagawa, 230-0045, Japan
- Tokyo University of Agriculture and Technology, Department of Biotechnology and Life Science, 2-24-16 Nakamachi, Koganei, Tokyo, 184-8588, Japan
- Kazunari Kaizu
- RIKEN, Center for Biosystems Dynamics Research, 6-2-3 Furuedai, Suita, Osaka, 565-0874, Japan
- Koichi Takahashi
- RIKEN, Center for Biosystems Dynamics Research, 6-2-3 Furuedai, Suita, Osaka, 565-0874, Japan
12
Vanoli S, Grobet-Jeandin E, Windisch O, Valerio M, Benamran D. Evolution of anxiety management in prostate biopsy under local anesthesia: a narrative review. World J Urol 2024; 42:43. [PMID: 38244150 PMCID: PMC10799769 DOI: 10.1007/s00345-023-04723-2]
Abstract
INTRODUCTION AND METHODS Prostate biopsy (PB) is an essential step in the diagnosis and active surveillance of prostate cancer (PCa). Transperineal PB (TP-PB) is now the recommended approach and is mostly conducted under local anesthesia. However, this procedure can cause anxiety for patients, given the oncological context and the fear of peri-procedural pain and complications. The objective of this narrative review is to summarize the currently available tools for the management of peri-interventional anxiety during TP-PB, with a particular emphasis on the potential role of virtual reality (VR) in this setting. RESULTS In TP-PB, preoperative anxiety can lead to increased pain perception, longer procedure time, and decreased patient satisfaction. Pharmacological and non-pharmacological approaches have been explored to reduce anxiety, such as premedication, deep sedation, education, relaxation techniques, hypnosis, and music therapy, albeit with mixed results. VR has recently emerged in the technological armamentarium for managing pain and anxiety, and the efficacy of this technology has been evaluated in various medical fields, including pediatrics, gastroenterology, urology, gynecology, and psychiatry. CONCLUSION Despite the paucity of available data, VR appears to be a safe and effective technique for reducing anxiety in many procedures, even in frail patients. No studies have evaluated the role of VR in TP-PB. Future research should thus explore the optimal way to implement VR technology and any potential benefits for TP-PB patients.
Affiliation(s)
- Sylvain Vanoli
- Urology Department, Hôpitaux Universitaires de Genève, Rue Gabrielle-Perret-Gentil 4, 1205, Geneva, Switzerland
- Elisabeth Grobet-Jeandin
- Urology Department, Hôpitaux Universitaires de Genève, Rue Gabrielle-Perret-Gentil 4, 1205, Geneva, Switzerland
- Olivier Windisch
- Urology Department, Hôpitaux Universitaires de Genève, Rue Gabrielle-Perret-Gentil 4, 1205, Geneva, Switzerland
- Massimo Valerio
- Urology Department, Hôpitaux Universitaires de Genève, Rue Gabrielle-Perret-Gentil 4, 1205, Geneva, Switzerland
- Daniel Benamran
- Urology Department, Hôpitaux Universitaires de Genève, Rue Gabrielle-Perret-Gentil 4, 1205, Geneva, Switzerland
13
Pulumati A, Algarin YA, Jaalouk D, Hirsch M, Nouri K. Exploring the potential role for extended reality in Mohs micrographic surgery. Arch Dermatol Res 2024; 316:67. [PMID: 38194123 DOI: 10.1007/s00403-023-02804-1]
Abstract
Mohs micrographic surgery (MMS) is a cornerstone of dermatological practice. Virtual reality (VR) and augmented reality (AR) technology, initially used for entertainment, have entered healthcare, offering real-time data overlaying a surgeon's view. This paper explores potential applications of VR and AR in MMS, emphasizing their advantages and limitations. We aim to identify research gaps to facilitate innovation in dermatological surgery. We conducted a PubMed search using the following: "augmented reality" OR "virtual reality" AND "Mohs", or "augmented reality" OR "virtual reality" AND "surgery". Inclusion criteria were peer-reviewed articles in English discussing these technologies in medical settings. We excluded non-peer-reviewed sources, non-English articles, and those not addressing these technologies in a medical context. VR alleviates patient anxiety and enhances patient satisfaction while serving as an educational tool. It also aids physicians by providing realistic surgical simulations. AR, on the other hand, assists in real-time lesion analysis, optimizing incision planning, and refining margin control during surgery. Both technologies offer remote guidance for trainee residents, enabling real-time learning and oversight and facilitating synchronous teleconsultations. These technologies may transform dermatologic surgery, making it more accessible and efficient. However, further research is needed to validate their effectiveness, address potential challenges, and optimize seamless integration. Overall, AR and VR enhance real-world environments with digital data, offering real-time surgical guidance and medical insights. By exploring the potential integration of these technologies in MMS, our study identifies avenues for further research to thoroughly understand how these technologies can redefine dermatologic surgery, elevating precision, surgical outcomes, and patient experiences.
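A boolean PubMed search like the one described can be reproduced programmatically against NCBI's E-utilities `esearch` endpoint. A minimal stdlib-only sketch (the helper name is illustrative, and the boolean term is paraphrased from the abstract rather than copied from the authors' exact strategy):

```python
from urllib.parse import urlencode

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_search_url(term: str, retmax: int = 100) -> str:
    # Build an E-utilities esearch URL; fetching and parsing the JSON
    # response (retmode=json) is left to the caller.
    params = urlencode({"db": "pubmed", "term": term,
                        "retmax": retmax, "retmode": "json"})
    return f"{ESEARCH}?{params}"

# Boolean query paraphrased from the review's search strategy
query = '("augmented reality" OR "virtual reality") AND ("Mohs" OR "surgery")'
url = pubmed_search_url(query)
```

Fetching `url` returns a JSON document whose `esearchresult.idlist` field holds the matching PMIDs, which can then be screened against the inclusion criteria.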
Affiliation(s)
- Anika Pulumati
- University of Missouri-Kansas City School of Medicine, Kansas City, MO, USA
- Department of Dermatology and Cutaneous Surgery, University of Miami Leonard M. Miller School of Medicine, Miami, FL, USA
- Dana Jaalouk
- Florida State University College of Medicine, Tallahassee, FL, USA
- Melanie Hirsch
- University of Miami Leonard M. Miller School of Medicine, Miami, FL, USA
- Keyvan Nouri
- Department of Dermatology and Cutaneous Surgery, University of Miami Leonard M. Miller School of Medicine, Miami, FL, USA
14
Nayak K, Shinde RK, Gattani RG, Thakor T. Surgical Perspectives of Open vs. Laparoscopic Approaches to Lateral Pancreaticojejunostomy: A Comprehensive Review. Cureus 2024; 16:e51769. [PMID: 38322062 PMCID: PMC10844796 DOI: 10.7759/cureus.51769]
Abstract
Pancreaticojejunostomy, a critical step in pancreatic surgery, has seen its surgical approaches evolve significantly, now including open, laparoscopic, and robotic techniques. This comprehensive review explores open surgery's historical success, advantages, and disadvantages, emphasizing surgeons' accrued experience and familiarity with this approach. However, the heightened morbidity and prolonged recovery associated with open pancreaticojejunostomy underscore the need for a nuanced evaluation of alternatives. The advent of robotic-assisted surgery introduces a paradigm shift in pancreatic procedures. Enhanced dexterity, facilitated by wristed instruments, allows the intricate suturing and precise tissue manipulation crucial in pancreatic surgery. Three-dimensional visualization augments the surgeon's perception, improving spatial orientation and anastomotic alignment. Moreover, the potential for a reduced learning curve may enhance accessibility, especially for surgeons transitioning from open techniques. Emerging technologies, including advanced imaging modalities and artificial intelligence, present promising avenues for refining both open and minimally invasive approaches. The ongoing pursuit of optimal outcomes mandates a judicious consideration of surgical techniques, incorporating technological advancements to navigate challenges and enhance patient care in pancreaticojejunostomy.
Affiliation(s)
- Krushank Nayak
- General Surgery, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Raju K Shinde
- General Surgery, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Rajesh G Gattani
- General Surgery, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Tosha Thakor
- Pathology, American International Institute of Medical Sciences, Udaipur, IND
15
Dho YS, Lee BC, Moon HC, Kim KM, Kang H, Lee EJ, Kim MS, Kim JW, Kim YH, Park SJ, Park CK. Validation of real-time inside-out tracking and depth realization technologies for augmented reality-based neuronavigation. Int J Comput Assist Radiol Surg 2024; 19:15-25. [PMID: 37442869 DOI: 10.1007/s11548-023-02993-0]
Abstract
PURPOSE Concomitant with the significant advances in computing technology, the utilization of augmented reality-based navigation in clinical applications is being actively researched. In this light, we developed novel object tracking and depth realization technologies to apply augmented reality-based neuronavigation to brain surgery. METHODS We developed real-time inside-out tracking based on visual inertial odometry and a visual inertial simultaneous localization and mapping algorithm. A cube quick response (QR) marker and depth data obtained from light detection and ranging (LiDAR) sensors are used for continuous tracking. For depth realization, order-independent transparency, clipping, and annotation and measurement functions were developed. In this study, the augmented reality model of a brain tumor patient was applied to its life-size three-dimensional (3D) printed model. RESULTS Using real-time inside-out tracking, we confirmed that the augmented reality model remained consistent with the 3D printed patient model without flutter, regardless of the movement of the visualization device. The coordination accuracy during real-time inside-out tracking was also validated. The average movement error of the X and Y axes was 0.34 ± 0.21 and 0.04 ± 0.08 mm, respectively. Further, the application of order-independent transparency with multilayer alpha blending and filtered alpha compositing improved the perception of overlapping internal brain structures. Clipping, annotation, and measurement functions were also developed to aid depth perception and worked perfectly during real-time coordination. We named this system METAMEDIP navigation. CONCLUSIONS The results validate the efficacy of the real-time inside-out tracking and depth realization technology. With these novel technologies developed for continuous tracking and depth perception in augmented reality environments, we are able to overcome the critical obstacles to the development of clinically applicable augmented reality neuronavigation.
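Order-independent transparency of the kind used here to make overlapping brain structures legible can be sketched with the weighted-blended formulation: each fragment contributes to a weighted color accumulator and a coverage product, so the composite no longer depends on draw order. A minimal single-channel sketch (the weights and values are illustrative; the paper's multilayer alpha blending implementation may differ):

```python
def composite_oit(fragments, background=0.0):
    """Weighted-blended order-independent transparency, one color channel.

    fragments: iterable of (color, alpha, weight) tuples; the result is
    invariant under reordering, unlike back-to-front "over" compositing.
    """
    accum_color = sum(w * a * c for c, a, w in fragments)
    accum_weight = sum(w * a for _, a, w in fragments)
    transmittance = 1.0
    for _, a, _ in fragments:
        transmittance *= 1.0 - a  # fraction of the background still visible
    avg = accum_color / accum_weight if accum_weight > 0 else 0.0
    return avg * (1.0 - transmittance) + background * transmittance

# Two half-transparent layers: white over black, in either order
frags = [(1.0, 0.5, 1.0), (0.0, 0.5, 1.0)]
front = composite_oit(frags)
back = composite_oit(list(reversed(frags)))
```

Because the accumulators are plain sums and products, `front` and `back` are identical, which is exactly the property that lets overlapping anatomy be rendered without per-pixel depth sorting.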
Affiliation(s)
- Yun-Sik Dho
- Neuro-Oncology Clinic, National Cancer Center, Goyang, Republic of Korea
- Byeong Cheol Lee
- Research and Science Division, Research and Development Center, MEDICALIP Co. Ltd., Seoul, Republic of Korea
- Hyeong Cheol Moon
- Department of Neurosurgery, Chungbuk National University Hospital, Cheongju, Republic of Korea
- Kyung Min Kim
- Department of Neurosurgery, Inha University Hospital, Inha University College of Medicine, Incheon, Korea
- Ho Kang
- Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Eun Jung Lee
- Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Min-Sung Kim
- Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Jin Wook Kim
- Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Yong Hwy Kim
- Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Sang Joon Park
- Research and Science Division, Research and Development Center, MEDICALIP Co. Ltd., Seoul, Republic of Korea
- Department of Radiology, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Chul-Kee Park
- Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
16
Zhu H, Xu J, Wang P, Liu H, Chen T, Zhao Z, Ji L. The status of virtual simulation experiments in medical education in China: based on the national virtual simulation experiment teaching center (iLAB-X). Med Educ Online 2023; 28:2272387. [PMID: 37883485 PMCID: PMC10984652 DOI: 10.1080/10872981.2023.2272387]
Abstract
BACKGROUND Virtual simulation experiments have been rapidly applied to medical education curricula in recent years. China constructed a national virtual simulation experimental teaching center (iLAB-X), and this platform covers almost all of the virtual simulation experiment curricula of domestic colleges and universities. We aimed to comprehensively assess the characteristics and usage of virtual simulation experiments in medical education based on iLAB-X. METHODS A total of 480 virtual simulation experiment courses had been constructed on iLAB-X (https://www.ilab-x.com/) by December 20, 2022, and the curriculum level, type and design were all searched on this platform. We also conducted an evaluation of curriculum usage and online tests, including page views, frequency of participation, number of participants, duration of experimental learning and passing rate of the experimental test. RESULTS The national and provincial high-quality virtual simulation experiment curricula accounted for 33.5% (161/480) and 35.8% (172/480), respectively. The curricula were mainly set as basic practice experiments (46.5%) and synthetic designing experiments (48.8%). Notably, forensic medicine (100%), public health and preventive medicine (83%) and basic medical sciences (66%) focused on synthetic design experiments. In terms of usage, the average duration of experimental learning was 25 minutes per course, and the average number of participants was just 1257. The average passing (score ≥60) rate of online tests was 80.6%, but the average rate of scores ≥85 was only 58.5%. In particular, the average page views, number of participants, duration of learning and test passing rate of clinical medicine were relatively low. CONCLUSIONS The curriculum design features, construction level and utilization rate varied across medical majors. Virtual simulation experiments are particularly underutilized in clinical medicine. They still have a long way to go before they can supplement or replace traditional medical education.
Affiliation(s)
- Hui Zhu
- Department of Internal Medicine, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Jin Xu
- School of Public Health, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Zhejiang Key Laboratory of Pathophysiology, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Penghao Wang
- School of Public Health, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Hongyi Liu
- School of Public Health, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Tao Chen
- School of Public Health, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Zhijia Zhao
- School of Public Health, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Lindan Ji
- Zhejiang Key Laboratory of Pathophysiology, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
- Department of Biochemistry and Molecular Biology, School of Basic Medical Sciences, Health Science Center, Ningbo University, Ningbo, Zhejiang, P. R. China
17
Nazzal EM, Zsidai B, Hiemstra LA, Lustig S, Samuelsson K, Musahl V. Applications of Extended Reality in Orthopaedic Surgery. J Bone Joint Surg Am 2023; 105:1721-1729. [PMID: 37713502 DOI: 10.2106/jbjs.22.00805]
Abstract
➤ Extended reality is a term that encompasses different modalities, including virtual reality, augmented reality, and mixed reality.
➤ Although fully immersive virtual reality has benefits for developing procedural memory and technical skills, augmented and mixed reality are more appropriate modalities for preoperative planning and intraoperative utilization.
➤ Current investigations on the role of extended reality in preoperative planning and intraoperative utilization are still in the early stages, but preliminarily show that extended reality technologies can help surgeons to be more accurate and efficient.
Affiliation(s)
- Ehab M Nazzal
- Department of Orthopaedic Surgery, UPMC Freddie Fu Sports Medicine Center, University of Pittsburgh, Pittsburgh, Pennsylvania
- Bálint Zsidai
- Department of Orthopaedic Surgery, UPMC Freddie Fu Sports Medicine Center, University of Pittsburgh, Pittsburgh, Pennsylvania
- Department of Orthopaedics, Institute of Clinical Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Laurie A Hiemstra
- Banff Sport Medicine, Banff, Alberta, Canada
- Department of Surgery, University of Calgary, Calgary, Alberta, Canada
- Sébastien Lustig
- Department of Orthopaedic Surgery and Sports Medicine, FIFA Medical Center of Excellence, Croix-Rousse Hospital, Lyon University Hospital, Lyon, France
- Kristian Samuelsson
- Department of Orthopaedics, Institute of Clinical Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Department of Orthopaedics, Sahlgrenska University Hospital, Mölndal, Sweden
- Volker Musahl
- Department of Orthopaedic Surgery, UPMC Freddie Fu Sports Medicine Center, University of Pittsburgh, Pittsburgh, Pennsylvania
18
Kim K, Yang H, Lee J, Lee WG. Metaverse Wearables for Immersive Digital Healthcare: A Review. Adv Sci (Weinh) 2023; 10:e2303234. [PMID: 37740417 PMCID: PMC10625124 DOI: 10.1002/advs.202303234]
Abstract
The recent exponential growth of metaverse technology has been instrumental in reshaping a myriad of sectors, not least digital healthcare. This comprehensive review critically examines the landscape and future applications of metaverse wearables toward immersive digital healthcare. The key technologies and advancements that have spearheaded the metamorphosis of metaverse wearables are categorized, encapsulating all-encompassing extended reality, such as virtual reality, augmented reality, mixed reality, and other haptic feedback systems. Moreover, the fundamentals of their deployment in assistive healthcare (especially for rehabilitation), medical and nursing education, and remote patient management and treatment are investigated. The potential benefits of integrating metaverse wearables into healthcare paradigms are multifold, encompassing improved patient prognosis, enhanced accessibility to high-quality care, and high standards of practitioner instruction. Nevertheless, these technologies are not without their inherent challenges and untapped opportunities, which span privacy protection, data safeguarding, and innovation in artificial intelligence. In summary, future research trajectories and potential advancements to circumvent these hurdles are also discussed, further augmenting the incorporation of metaverse wearables within healthcare infrastructures in the post-pandemic era.
Affiliation(s)
- Kisoo Kim
- Intelligent Optical Module Research Center, Korea Photonics Technology Institute (KOPTI), Gwangju 61007, Republic of Korea
- Hyosill Yang
- Department of Nursing, College of Nursing Science, Kyung Hee University, Seoul 02447, Republic of Korea
- Jihun Lee
- Department of Mechanical Engineering, College of Engineering, Kyung Hee University, Yongin 17104, Republic of Korea
- Won Gu Lee
- Department of Mechanical Engineering, College of Engineering, Kyung Hee University, Yongin 17104, Republic of Korea
19
Sun L, Liu D, Lian J, Yang M. Application of flipped classroom combined with virtual simulation platform in clinical biochemistry practical course. BMC Med Educ 2023; 23:771. [PMID: 37845661 PMCID: PMC10577961 DOI: 10.1186/s12909-023-04735-x]
Abstract
BACKGROUND The study explores an innovative teaching mode that integrates Icourse, DingTalk, and online experimental simulation platforms to provide online theoretical and experimental resources for clinical biochemistry practical courses. These platforms, combined with flipped classroom teaching, aim to increase student engagement and benefit in practical courses, ultimately improving the effectiveness of clinical biochemistry practical teaching. METHODS In a prospective cohort study, we examined the impact of integrating the Icourse and DingTalk platforms to provide theoretical knowledge resources and clinical cases to 48 medical laboratory science students from the 2019 and 2020 grades. Students were assigned to the experimental group using an overall sampling method, and had access to relevant videos through Icourse before and during class. Using a flipped classroom approach, students actively participated in the design, analysis, and discussion of the experimental technique. For the experimental operation part, students participated in virtual simulation experiments and actual experiments. Overall, the study aimed to evaluate students' theoretical and operational performance after completing the practical course. To collect feedback, we distributed a questionnaire to students in the experimental group. For comparison, we included 42 students from the grades of 2017 and 2018 who received traditional instruction and were evaluated using standard textbooks as the control group. RESULTS The experimental group scored significantly higher than the control group on both the theoretical and experimental operational tests (82.45 ± 3.76 vs. 76.36 ± 3.96, P = 0.0126; 92.03 ± 1.62 vs. 81.67 ± 4.19, P < 0.001). The survey revealed that the experimental group preferred the teaching mode that combined the flipped classroom with the virtual simulation platform. 
This mixed method effectively promoted understanding of basic knowledge (93.8%, 45/48), operative skills (89.6%, 43/48), learning interest (87.5%, 42/48), clinical thinking (85.4%, 41/48), self-learning ability (91.7%, 44/48), and overall satisfaction compared with traditional methods (P < 0.05). This study demonstrates that an innovative teaching approach significantly improves the quality of clinical biochemistry practical courses and promotes students' professional development and self-directed learning habits. CONCLUSION Incorporating virtual simulation with flipped classrooms into clinical biochemistry practical teaching is an efficient and well-received alternative to traditional methods.
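Group comparisons like those reported above can be checked from summary statistics alone. A minimal stdlib-only sketch of Welch's t statistic (degrees of freedom and the exact p-value are omitted, and the recomputed statistic is illustrative since the authors' actual test may differ):

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent samples, from summary stats."""
    # Standard error without pooling, so unequal variances are allowed
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Theoretical-test scores: experimental (n=48) vs. control (n=42) groups,
# using the means and SDs reported in the abstract
t_theory = welch_t(82.45, 3.76, 48, 76.36, 3.96, 42)
```

A large positive `t_theory` is consistent with the reported advantage of the experimental group; converting it to a p-value additionally requires the Welch-Satterthwaite degrees of freedom and a t distribution.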
Affiliation(s)
- Liangbo Sun
- Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
- Dong Liu
- Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
- Jiqin Lian
- Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
- Mingzhen Yang
- Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
20
Pergolizzi J, LeQuang JAK, Vasiliu-Feltes I, Breve F, Varrassi G. Brave New Healthcare: A Narrative Review of Digital Healthcare in American Medicine. Cureus 2023; 15:e46489. [PMID: 37927734 PMCID: PMC10623488 DOI: 10.7759/cureus.46489]
Abstract
The digital revolution has had a profound effect on American and global healthcare, which was accelerated by the pandemic and telehealth applications. Digital health also includes popular and more esoteric forms of wearable monitoring systems and interscatter and other wireless technologies that facilitate their telemetry. The rise in artificial intelligence (AI) and machine learning (ML) may serve to improve interpretation from imaging technologies to electrocardiography or electroencephalographic tracings, and new ML techniques may allow these systems to scan data to discern and contextualize patterns that may have evaded human physicians. The necessity of virtual care during the pandemic has morphed into new treatment paradigms, which have gained patient acceptance but still raise issues with respect to privacy laws and credentialing. Augmented and virtual reality tools can facilitate surgical planning and "hands-on" clinical training activities. Patients are working with new frontiers in digital health in the form of "Dr. Google" and patient support websites to learn or share medical information. Patient-facing digital health information is both a blessing and curse, in that it can be a boon to health-literate patients who seek to be more active in their own care. On the other hand, digital health information can lead to false conclusions, catastrophizing, misunderstandings, and "cyberchondria." The role of blockchain, familiar from cryptocurrency, may play a role in future healthcare information and would serve as a disruptive, decentralizing, and potentially beneficial change. These important changes are both exciting and perplexing as clinicians and their patients learn to navigate this new system and how we address the questions it raises, such as medical privacy in a digital age. The goal of this review is to explore the vast range of digital health and how it may impact the healthcare system.
Affiliation(s)
- Frank Breve
- Department of Pharmacy, Temple University, Philadelphia, USA
21
Aguayo C, Videla R, López-Cortés F, Rossel S, Ibacache C. Ethical enactivism for smart and inclusive STEAM learning design. Heliyon 2023; 9:e19205. [PMID: 37662760 PMCID: PMC10472010 DOI: 10.1016/j.heliyon.2023.e19205]
Abstract
Current global challenges of the 21st century promote STEAM (science, technology, engineering, arts and mathematics) education and digitalization as a means for humans to be the central actors in the construction of a sustainable society that favors a sense of worth and global wellbeing. In this scenario, new educational technology tools and immersive learning affordances (possibilities) offer unprecedented potential for the design of smart and dynamic learning systems and contexts that can enhance learning processes across varied audiences and educational settings. However, current STEAM education practice lacks attention to equipping all citizens with the necessary skills to use digital technologies in an ethical, critical and creative way. This gap calls for design processes, principles and practices that are attentive to ethical considerations and values-based approaches. In its formulation, STEAM as an educational approach rests on four fundamental pillars: creativity, inclusion, citizenship and emerging technologies, which also call for the inclusion of disadvantaged and underrepresented social groups in STEAM education design. Following the apparent need to explore ethical and inclusive design in STEAM education, and inspired by the 4E cognition framework, ethical enactivism, and embodied and ecosomaesthetic experience design, here we propose a theoretical framework grounded in systems thinking for the design of smart and dynamic STEAM learning systems and settings. The framework is aimed at STEAM educational psychologists, educational technologists, learning designers and educational practitioners who wish to address the global challenges of 21st century education by means of creative, innovative and inclusive education design.
Affiliation(s)
- Claudio Aguayo
- AppLab, Te Ara Poutama, Faculty of Māori and Indigenous Development, Auckland University of Technology, Auckland, 1010, New Zealand
- INNOVA STEAM Lab, La Serena, 17800000, Chile
- Buckminster College, Vrije Universiteit Brussel, Brussels, Belgium
- Ronnie Videla
- Escuela de Educación Diferencial, Facultad de Educación, Universidad Santo Tomás, La Serena, 17800000, Chile
- INNOVA STEAM Lab, La Serena, 17800000, Chile
- Buckminster College, Vrije Universiteit Brussel, Brussels, Belgium
- Francisco López-Cortés
- Departamento de Biología, Facultad de Ciencias, Universidad de La Serena, La Serena, 1720256, Chile
- Sebastián Rossel
- Facultad de Humanidades, Universidad de La Serena, La Serena, 17800000, Chile
| | | |
Collapse
|
22
Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023; 23:6202. PMID: 37448050. DOI: 10.3390/s23136202.
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), most devices remain focused on improving end-effector dexterity and precision, as well as on improving access to minimally invasive surgeries. This paper provides a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions for increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and clinical relevance in the literature. We outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis
- School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
- Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
23
Dos Santos TT, Piuvezam G, Medeiros GCBS, Mata ÁNDS, Silva Júnior DDN, Martínez DG, Pardo Ríos M. Extended reality as a health education strategy of adolescents at school: protocol for systematic review and meta-analysis. BMJ Open 2023; 13:e072438. PMID: 37407033. DOI: 10.1136/bmjopen-2023-072438.
Abstract
INTRODUCTION Extended reality (XR) is the ensemble of interactive experiences based on a computer-simulated environment that encompasses virtual reality and augmented reality and has proven potentially innovative in the field of health education with adolescents. The objective of this study is to present a systematic review and meta-analysis protocol that seeks to evaluate the main effects of interventions that use XR on health parameters (food intake, sound quality and physical activity) of adolescent students. METHODS AND ANALYSIS The literature search will be performed in the following databases: MEDLINE, Embase, Scopus, ERIC, ScienceDirect, Web of Science, Cochrane, LILACS, APA and ADOLEC. Intervention studies (clinical trials, randomised or non-randomised) and quasi-experimental studies will be included. The risk of bias will be assessed using the Risk of Bias in Non-randomized Studies of Interventions tool for randomised controlled trials (RCTs), non-RCTs and quasi-experimental trials. Two independent researchers will conduct all the assessments, and any disagreements will be resolved by a third reviewer. Data analysis and synthesis will be performed using RevMan V.5.4.1 software. The study will be conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols guideline. ETHICS AND DISSEMINATION Ethical approval and human consent were not required, as this is a protocol for a systematic review and only secondary data will be used. The findings will be published in a journal and presented at conferences. If any changes are made to this protocol, it will be updated on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses website, and the modifications will be explained in the final report of this review. PROSPERO REGISTRATION NUMBER CRD42022373876.
Affiliation(s)
- Thais Teixeira Dos Santos
- Postgraduate Program in Public Health (PPGSCoL) and Systematic Review and Meta-analysis Laboratory (Lab-Sys/CNPq), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Grasiela Piuvezam
- Postgraduate Program in Public Health (PPGSCoL) and Systematic Review and Meta-analysis Laboratory (Lab-Sys/CNPq), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Department of Public Health (DSC), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Gidyenne Christine Bandeira Silva Medeiros
- Postgraduate Program in Public Health (PPGSCoL) and Systematic Review and Meta-analysis Laboratory (Lab-Sys/CNPq), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Department of Nutrition (DNUT), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Ádala Nayana de Sousa Mata
- Multicampi School of Medical Sciences of Rio Grande do Norte (EMCM), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Danyllo do Nascimento Silva Júnior
- Postgraduate Program in Public Health (PPGSCoL) and Systematic Review and Meta-analysis Laboratory (Lab-Sys/CNPq), Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
- Manuel Pardo Ríos
- Faculty of Nursing, San Antonio Catholic University of Murcia (UCAM), Murcia, Spain
24
Hemme CL, Carley R, Norton A, Ghumman M, Nguyen H, Ivone R, Menon JU, Shen J, Bertin M, King R, Leibovitz E, Bergstrom R, Cho B. Developing virtual and augmented reality applications for science, technology, engineering and math education. Biotechniques 2023; 75:343-352. PMID: 37291856. PMCID: PMC10505987. DOI: 10.2144/btn-2023-0029.
Abstract
The Rhode Island IDeA Network of Biomedical Research Excellence Molecular Informatics Core at the University of Rhode Island, together with Information Technology Services Innovative Learning Technologies, developed virtual and augmented reality applications to teach concepts in biomedical science, including pharmacology, medicinal chemistry, cell culture and nanotechnology. The apps were developed as full virtual reality/augmented reality versions and as 3D gaming versions, the latter of which do not require virtual reality headsets. Development challenges included creating intuitive user interfaces, text-to-voice functionality, visualization of molecules and implementing complex science concepts. In-app quizzes are used to assess the user's understanding of topics, and user feedback was collected for several apps to improve the experience. The apps were positively reviewed by users and are being implemented into the curriculum at the University of Rhode Island.
Affiliation(s)
- Christopher L Hemme
- Rhode Island IDeA Network of Biomedical Research Excellence (RI-INBRE)
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Rachel Carley
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Arielle Norton
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Moez Ghumman
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Hannah Nguyen
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Ryan Ivone
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Jyothi U Menon
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- College of Engineering, University of Rhode Island, Kingston, RI 02881, USA
- Jie Shen
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- College of Engineering, University of Rhode Island, Kingston, RI 02881, USA
- Matthew Bertin
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Roberta King
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
- Roy Bergstrom
- Information Technology Services, Innovative Learning Technologies Program, University of Rhode Island, Kingston, RI 02881, USA
- Bongsup Cho
- Rhode Island IDeA Network of Biomedical Research Excellence (RI-INBRE)
- College of Pharmacy, University of Rhode Island, Kingston, RI 02881, USA
25
Morsch R, Landgraeber S, Strauss DJ. [New perspectives in orthopedics: Developments through neurotechnology and metaverse]. Orthopadie (Heidelb) 2023. PMID: 37289216. DOI: 10.1007/s00132-023-04400-7.
Abstract
The combination of neurotechnology and the metaverse holds great potential for orthopedics, as it offers a broad spectrum of possibilities to overcome the limits of traditional medical care. The vision of a medical metaverse providing the infrastructure that links innovative technologies opens up new opportunities for therapy, medical collaborations and practical, personalized training for aspiring physicians. However, risks and challenges remain, such as security and privacy, health-related issues, acceptance by patients and doctors, as well as technical hurdles and access to the technologies. Hence, future research and development is paramount. Nonetheless, owing to technological progress, the exploration of new research areas, and the improved availability of the technologies paired with cost reduction, the prospects for neurotechnology and the metaverse in orthopedics are promising.
Affiliation(s)
- Richard Morsch
- Systems Neuroscience & Neurotechnology Unit, Medizinische Fakultät, Universität des Saarlandes, Homburg, Germany
- Klinik für Orthopädie und Orthopädische Chirurgie, Universität des Saarlandes, Homburg, Germany
- Universität des Saarlandes, Saarbrücken, Germany
- Stefan Landgraeber
- Klinik für Orthopädie und Orthopädische Chirurgie, Universität des Saarlandes, Homburg, Germany
- Universität des Saarlandes, Saarbrücken, Germany
- Daniel J Strauss
- Systems Neuroscience & Neurotechnology Unit, Medizinische Fakultät, Universität des Saarlandes, Homburg, Germany
- Hochschule für Technik und Wirtschaft des Saarlandes, Saarbrücken, Germany
- Universität des Saarlandes, Saarbrücken, Germany
26
Yuan J, Hassan SS, Wu J, Koger CR, Packard RRS, Shi F, Fei B, Ding Y. Extended reality for biomedicine. Nat Rev Methods Primers 2023; 3:15. PMID: 37051227. PMCID: PMC10088349. DOI: 10.1038/s43586-023-00208-z.
Abstract
Extended reality (XR) refers to an umbrella of methods that allows users to be immersed in a three-dimensional (3D) or a 4D (spatial + temporal) virtual environment to different extents, including virtual reality (VR), augmented reality (AR), and mixed reality (MR). While VR allows a user to be fully immersed in a virtual environment, AR and MR overlay virtual objects over the real physical world. The immersion and interaction of XR provide unparalleled opportunities to extend our world beyond conventional lifestyles. While XR has extensive applications in fields such as entertainment and education, its numerous applications in biomedicine create transformative opportunities in both fundamental research and healthcare. This Primer outlines XR technology from instrumentation to software computation methods, delineating the biomedical applications that have been advanced by state-of-the-art techniques. We further describe the technical advances overcoming current limitations in XR and its applications, providing an entry point for professionals and trainees to thrive in this emerging field.
Affiliation(s)
- Jie Yuan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Sohail S. Hassan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Jiaojiao Wu
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Casey R. Koger
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- René R. Sevag Packard
- Division of Cardiology, Department of Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Ronald Reagan UCLA Medical Center, Los Angeles, CA, United States
- Veterans Affairs West Los Angeles Medical Center, Los Angeles, CA, United States
- Feng Shi
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Baowei Fei
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Department of Radiology, UT Southwestern Medical Center, Dallas, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Yichen Ding
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Hamon Center for Regenerative Science and Medicine, UT Southwestern Medical Center, Dallas, TX, United States
27
Sokołowska B. Impact of Virtual Reality Cognitive and Motor Exercises on Brain Health. Int J Environ Res Public Health 2023; 20:4150. PMID: 36901160. PMCID: PMC10002333. DOI: 10.3390/ijerph20054150.
Abstract
Innovative technologies of the 21st century have an extremely significant impact on all activities of modern humans. Among them, virtual reality (VR) offers great opportunities for scientific research and public health. The results of research to date both demonstrate the beneficial effects of using virtual worlds and indicate undesirable effects on bodily functions. This review presents interesting recent findings related to training/exercise in virtual environments and its impact on cognitive and motor functions. It also highlights the importance of VR as an effective tool for assessing and diagnosing these functions in both research and modern medical practice. The findings point to the enormous future potential of these rapidly developing innovative technologies. Of particular importance are applications of virtual reality in basic and clinical neuroscience.
Affiliation(s)
- Beata Sokołowska
- Bioinformatics Laboratory, Mossakowski Medical Research Institute, Polish Academy of Sciences, 02-106 Warsaw, Poland
28
Lee KN, Kim HJ, Choe K, Cho A, Kim B, Seo J, Myung W, Park JY, Oh KJ. Effects of Fetal Images Produced in Virtual Reality on Maternal-Fetal Attachment: Randomized Controlled Trial. J Med Internet Res 2023; 25:e43634. PMID: 36826976. PMCID: PMC10007014. DOI: 10.2196/43634.
Abstract
BACKGROUND Maternal-fetal attachment (MFA) has been reported to be associated with the postpartum mother-infant relationship. Seeing the fetus through ultrasound might influence MFA, and the effect could be increased by more realistic images, such as those generated in virtual reality (VR). OBJECTIVE The aim was to determine the effect of fetal images generated in VR on MFA and depressive symptoms through a prenatal-coaching mobile app. METHODS This 2-arm parallel randomized controlled trial involved a total of 80 pregnant women. Eligible women were randomly assigned to either a mobile app-only group (n=40) or an app plus VR group (n=40). The VR group experienced their own baby's images generated in VR based on images obtained from fetal ultrasonography. The prenatal-coaching mobile app recommended health behavior for the pregnant women according to gestational age, provided feedback on entered data for maternal weight, blood pressure, and glucose levels, and included a private diary service for fetal ultrasound images. Both groups received the same app, but the VR group also viewed fetal images produced in VR; these images were stored in the app. All participants filled out questionnaires to assess MFA, depressive symptoms, and other basic medical information. The questionnaires were filled out again after the interventions. RESULTS Basic demographic data were comparable between the 2 groups. Most of the assessments showed comparable results for the 2 groups, but the mean score to assess interaction with the fetus was significantly higher for the VR group than the control group (0.4 vs 0.1, P=.004). The proportion of participants with an increased score for this category after the intervention was significantly higher in the VR group than the control group (43% vs 13%, P=.005). The feedback questionnaire revealed that scores for the degree of perception of fetal appearance all increased after the intervention in the VR group. 
CONCLUSIONS The use of a mobile app with fetal images in VR significantly increased maternal interaction with the fetus. TRIAL REGISTRATION ClinicalTrials.gov NCT04942197; https://clinicaltrials.gov/ct2/show/NCT04942197.
Affiliation(s)
- Kyong-No Lee
- Department of Obstetrics and Gynecology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam-si, Gyeonggi-do, Republic of Korea
- Hyeon Ji Kim
- Department of Obstetrics and Gynecology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam-si, Gyeonggi-do, Republic of Korea
- Kiroong Choe
- Department of Computer Science and Engineering, Seoul National University, Seoul, Republic of Korea
- Aeri Cho
- Department of Computer Science and Engineering, Seoul National University, Seoul, Republic of Korea
- Bohyoung Kim
- Division of Biomedical Engineering, Hankuk University of Foreign Studies, Gyeonggi-do, Republic of Korea
- Jinwook Seo
- Department of Computer Science and Engineering, Seoul National University, Seoul, Republic of Korea
- Woojae Myung
- Department of Neuropsychiatry, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam-si, Gyeonggi-do, Republic of Korea
- Jee Yoon Park
- Department of Obstetrics and Gynecology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam-si, Gyeonggi-do, Republic of Korea
- Kyung Joon Oh
- Department of Obstetrics and Gynecology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam-si, Gyeonggi-do, Republic of Korea
29
Awori J, Friedman SD, Howard C, Kronmal R, Buddhe S. Comparative effectiveness of virtual reality (VR) vs 3D printed models of congenital heart disease in resident and nurse practitioner educational experience. 3D Print Med 2023; 9:2. PMID: 36773171. PMCID: PMC9918815. DOI: 10.1186/s41205-022-00164-6.
Abstract
BACKGROUND Medical trainees frequently note that cardiac anatomy is difficult to conceive within a two dimensional framework. The specific anatomic defects and the subsequent pathophysiology in flow dynamics may become more apparent when framed in three dimensional models. Given the evidence of improved comprehension using such modeling, this study aimed to contribute further to that understanding by comparing Virtual Reality (VR) and 3D printed models (3DP) in medical education. OBJECTIVES We sought to systematically compare the perceived subjective effectiveness of Virtual Reality (VR) and 3D printed models (3DP) in the educational experience of residents and nurse practitioners. METHODS Trainees and practitioners underwent individual 15-minute teaching sessions in which features of a developmentally typical heart as well as a congenitally diseased heart were demonstrated using both Virtual Reality (VR) and 3D printed models (3DP). Participants then briefly explored each modality before filling out a short survey in which they identified which model (3DP or VR) they felt was more effective in enhancing their understanding of cardiac anatomy and associated pathophysiology. The survey included a binary summative assessment and a series of Likert scale questions addressing usefulness of each model type and degree of comfort with each modality. RESULTS Twenty-seven pediatric residents and 3 nurse practitioners explored models of a developmentally typical heart and tetralogy of Fallot pathology. Most participants had minimal prior exposure to VR (1.1 ± 0.4) or 3D printed models (2.1 ± 1.5). Participants endorsed a greater degree of understanding with VR models (8.5 ± 1) compared with 3D Printed models (6.3 ± 1.8) or traditional models of instruction (5.5 ± 1.5) p < 0.001. Most participants felt comfortable with modern technology (7.6 ± 2.1). 87% of participants preferred VR over 3DP. 
CONCLUSIONS Our study shows that, overall, VR was preferred over 3DP models by pediatric residents and nurse practitioners for understanding cardiac anatomy and pathophysiology.
Affiliation(s)
- Jonathan Awori
- Division of Pediatric Cardiology and Radiology, Seattle Children's Hospital, Seattle, WA, USA
- Seth D. Friedman
- Division of Pediatric Cardiology and Radiology, Seattle Children's Hospital, Seattle, WA, USA
- Christopher Howard
- Division of Pediatric Cardiology and Radiology, Seattle Children's Hospital, Seattle, WA, USA
- Richard Kronmal
- Division of Pediatric Cardiology and Radiology, Seattle Children's Hospital, Seattle, WA, USA
- Sujatha Buddhe
- Division of Pediatric Cardiology and Radiology, Seattle Children's Hospital, Seattle, WA, USA
30
Wang X, Liang H, Li L, Zhou J, Song R. Contribution of the stereoscopic representation of motion-in-depth during visually guided feedback control. Cereb Cortex 2023. PMID: 36750266. DOI: 10.1093/cercor/bhad010.
Abstract
Numerous studies have focused on the neural basis of visually guided tracking movement in the frontoparallel plane, whereas the neural processes involved in real-world circumstances, where binocular disparity and motion-in-depth (MID) perception come into play, are less understood. Although the role of stereoscopic versus monoscopic MID information has been extensively described for visual processing, its influence on top-down regulation for motor execution has received little attention. Here, we orthogonally varied the visual representation (stereoscopic versus monoscopic) and motion direction (depth motion versus bias depth motion versus frontoparallel motion) during visually guided tracking movements, with simultaneous functional near-infrared spectroscopy recordings. Results show that the stereoscopic representation of MID led to more accurate movements, supported by a specific neural activity pattern. More importantly, we extend prior evidence on the role of the frontoparietal network in the brain-behavior relationship, showing that the occipital area, more specifically visual area V2/V3, was also robustly involved in the association. Furthermore, with the stereoscopic representation of MID, it is plausible to detect a robust brain-behavior relationship even with a small sample size at low executive task demand. Taken together, these findings highlight the importance of the stereoscopic representation of MID for investigating the neural correlates of visually guided feedback control.
Affiliation(s)
- Xiaolu Wang
- Key Laboratory of Sensing Technology and Biomedical Instrument of Guangdong Province, School of Biomedical Engineering, Sun Yat-sen University, Guangzhou 510006, China
- Haowen Liang
- State Key Laboratory of Optoelectronic Materials and Technology, Guangdong Marine Laboratory, School of Physics, Sun Yat-sen University, Guangzhou 510275, China
- Le Li
- Institute of Medical Research, Northwestern Polytechnical University, Xi'an 710072, China
- Department of Rehabilitation Medicine, The First Affiliated Hospital of Sun Yat-sen University, Guangzhou 510030, China
- Jianying Zhou
- State Key Laboratory of Optoelectronic Materials and Technology, Guangdong Marine Laboratory, School of Physics, Sun Yat-sen University, Guangzhou 510275, China
- Rong Song
- Key Laboratory of Sensing Technology and Biomedical Instrument of Guangdong Province, School of Biomedical Engineering, Sun Yat-sen University, Guangzhou 510006, China
31
Kasoju N, Remya NS, Sasi R, Sujesh S, Soman B, Kesavadas C, Muraleedharan CV, Varma PRH, Behari S. Digital health: trends, opportunities and challenges in medical devices, pharma and bio-technology. CSIT 2023; 11:11-30. PMCID: PMC10089382. DOI: 10.1007/s40012-023-00380-3.
Abstract
Digital health interventions refer to the use of digital technology and connected devices to improve health outcomes and healthcare delivery. This includes telemedicine, electronic health records, wearable devices, mobile health applications, and other forms of digital health technology. To this end, research and development activities in several fields are gaining momentum. For instance, in the medical devices sector, smart biomedical materials and digitally enabled medical devices are rapidly being developed and introduced into clinical settings. In the pharma and allied sectors, digital health-focused technologies are widely used throughout the stages of drug development, viz. computer-aided drug design, computational modeling for predictive toxicology, and big data analytics for clinical trial management. In the biotechnology and bioengineering fields, investigations focused on digital health are growing rapidly in areas such as omics biology, synthetic biology, systems biology, big data and personalized medicine. Though digital health-focused innovations are expanding the horizons of health in diverse ways, here developments in the medical devices, pharmaceutical technology and biotechnology sectors are reviewed, with emphasis on trends, opportunities and challenges. A perspective on the use of digital health in the Indian context is also included.
Affiliation(s)
- Naresh Kasoju
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- N. S. Remya
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- Renjith Sasi
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- S. Sujesh
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- Biju Soman
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- C. Kesavadas
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- C. V. Muraleedharan
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- P. R. Harikrishna Varma
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
- Sanjay Behari
- Sree Chitra Tirunal Institute for Medical Science and Technology, Thiruvananthapuram 695011, Kerala, India
32
33
Wang G, Badal A, Jia X, Maltz JS, Mueller K, Myers KJ, Niu C, Vannier M, Yan P, Yu Z, Zeng R. Development of metaverse for intelligent healthcare. Nat Mach Intell 2022; 4:922-929. PMID: 36935774. PMCID: PMC10015955. DOI: 10.1038/s42256-022-00549-6.
Abstract
The metaverse integrates physical and virtual realities, enabling humans and their avatars to interact in an environment supported by technologies such as high-speed internet, virtual reality, augmented reality, mixed and extended reality, blockchain, digital twins and artificial intelligence (AI), all enriched by effectively unlimited data. The metaverse recently emerged as social media and entertainment platforms, but extension to healthcare could have a profound impact on clinical practice and human health. As a group of academic, industrial, clinical and regulatory researchers, we identify unique opportunities for metaverse approaches in the healthcare domain. A metaverse of 'medical technology and AI' (MeTAI) can facilitate the development, prototyping, evaluation, regulation, translation and refinement of AI-based medical practice, especially medical imaging-guided diagnosis and therapy. Here, we present metaverse use cases, including virtual comparative scanning, raw data sharing, augmented regulatory science and metaversed medical intervention. We discuss relevant issues on the ecosystem of the MeTAI metaverse including privacy, security and disparity. We also identify specific action items for coordinated efforts to build the MeTAI metaverse for improved healthcare quality, accessibility, cost-effectiveness and patient satisfaction.
Affiliation(s)
- Ge Wang
- Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA
- Andreu Badal
- Division of Imaging, Diagnostics and Software Reliability, OSEL, CDRH, US Food and Drug Administration, Silver Spring, MD, USA
- Xun Jia
- Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD, USA
- Jonathan S. Maltz
- Molecular Imaging and Computed Tomography, GE Healthcare, Waukesha, WI, USA
- Klaus Mueller
- Computer Science Department, Stony Brook University, Stony Brook, NY, USA
- Chuang Niu
- Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA
- Pingkun Yan
- Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA
- Zhou Yu
- Canon Medical Research USA, Vernon Hills, IL, USA
- Rongping Zeng
- Division of Imaging, Diagnostics and Software Reliability, OSEL, CDRH, US Food and Drug Administration, Silver Spring, MD, USA
34
Osama M, Ateya AA, Ahmed Elsaid S, Muthanna A. Ultra-Reliable Low-Latency Communications: Unmanned Aerial Vehicles Assisted Systems. Information 2022; 13:430. [DOI: 10.3390/info13090430]
Abstract
Ultra-reliable low-latency communication (uRLLC) is a group of fifth-generation and sixth-generation (5G/6G) cellular applications with special requirements regarding latency, reliability, and availability. Most of the announced 5G/6G applications are uRLLC applications that require an end-to-end latency of milliseconds and ultra-high reliability of communicated data. Such systems face many challenges because traditional networks cannot meet these requirements; thus, novel network structures and technologies have been introduced to enable them. Since uRLLC is a promising paradigm that covers many applications, this work reviews the current state of the art of uRLLC, including the main applications, specifications, and requirements of ultra-reliable low-latency (uRLL) applications. The design challenges of uRLLC systems are discussed, and promising solutions are introduced. Virtual and augmented reality (VR/AR) are considered the main use cases of uRLLC, and current proposals for VR and AR are discussed. Moreover, unmanned aerial vehicles (UAVs) are introduced as enablers of uRLLC, and current research directions and existing proposals are discussed.
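The headline requirements in this abstract (millisecond end-to-end latency, ultra-high reliability) can be made concrete with a small sketch. The function below is purely illustrative: the 1 ms budget, the five-nines reliability floor, and the per-segment delay split are assumed example figures, not values taken from the cited survey.

```python
def meets_urllc_target(delays_ms, reliability, budget_ms=1.0, min_reliability=0.99999):
    """Check a link against an illustrative uRLLC target.

    delays_ms: per-segment one-way delays (e.g. radio, transport, processing).
    reliability: probability that a packet arrives within the latency budget.
    """
    total = sum(delays_ms)
    return total <= budget_ms and reliability >= min_reliability

# A link with 0.3 ms radio, 0.4 ms transport and 0.2 ms processing delay:
ok = meets_urllc_target([0.3, 0.4, 0.2], reliability=0.999999)
```

The point of the sketch is only that both constraints must hold at once: a sub-millisecond total delay with insufficient reliability (or vice versa) still fails the target.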
35
Abstract
COVID-19 forced humanity to think about new ways of working globally without being physically present with other people, and eXtended Reality (XR) systems (encompassing Virtual Reality, Augmented Reality and Mixed Reality) offer a potentially elegant solution. Although previously seen as mainly for gaming, XR is now being investigated by commercial and research institutions to solve real-world problems in training, simulation, mental health, data analysis, and the study of disease progression. More recently, large corporations such as Microsoft and Meta have announced that they are developing the Metaverse as a new paradigm for interacting with the digital world. This article looks at how visualization can leverage the Metaverse in bioinformatics research, the pros and cons of this technology, and what the future may hold.
Affiliation(s)
- Stephen Taylor
- Analysis, Visualization and Informatics Group, MRC Weatherall Institute of Computational Biology, MRC Weatherall Institute of Molecular Medicine, Oxford, United Kingdom
- *Correspondence: Stephen Taylor
- Shamit Soneji
- Division of Molecular Hematology, Department of Laboratory Medicine, Faculty of Medicine, BMC, Lund University, Lund, Sweden
- Lund Stem Cell Center, Faculty of Medicine, BMC, Lund University, Lund, Sweden
36
Abstract
Although substantial advancements have been achieved in robot-assisted surgery, the blueprint for existing snake robots predominantly focuses on preliminary structural design, control, and human–robot interfaces, with several features not yet explored in the literature. This paper reviews planning and operation concepts of hyper-redundant serpentine robots for surgical use, as well as future challenges and solutions for better manipulation. Researchers in the manufacture and navigation of snake robots have faced issues such as the low dexterity of end-effectors around delicate organs, state estimation, and the lack of depth perception on two-dimensional screens. A wide range of robots is analysed, such as the i²Snake robot, inspiring the use of force and position feedback, visual servoing and augmented reality (AR). We present the types of actuation methods, robot kinematics, dynamics, sensing, and prospects of AR integration in snake robots, while addressing their shortcomings to facilitate the surgeon's task. For smoother gait control, validation and optimization algorithms such as deep learning databases are examined to mitigate redundancy in module linkage backlash and accidental self-collision. In essence, we aim to provide an outlook on robot configurations during motion by enhancing their material compositions within anatomical biocompatibility standards.
37
Buyego P, Katwesigye E, Kebirungi G, Nsubuga M, Nakyejwe S, Cruz P, McCarthy MC, Hurt D, Kambugu A, Arinaitwe JW, Ssekabira U, Jjingo D. Feasibility of virtual reality based training for optimising COVID-19 case handling in Uganda. BMC Med Educ 2022; 22:274. [PMID: 35418070] [PMCID: PMC9006530] [DOI: 10.1186/s12909-022-03294-x]
Abstract
BACKGROUND Epidemics and pandemics are causing high morbidity and mortality on a still-evolving scale, as exemplified by the COVID-19 pandemic. Infection prevention and control (IPC) training for frontline health workers is thus essential. However, classroom or hospital ward-based training carries an infection risk due to the in-person interaction of participants. We explored the use of Virtual Reality (VR) simulations for frontline health worker training, since they train participants without exposing them to the infections that would arise from in-person training. VR also removes the need for expensive personal protective equipment (PPE), which has been in acute shortage, and improves learning, retention, and recall. This represents the first attempt at deploying VR-based pedagogy in a Ugandan medical education context. METHODS We used animated VR-based simulations of bedside and ward-based training scenarios for frontline health workers. The training covered the donning and doffing of PPE, the case management of COVID-19-infected individuals, and hand hygiene. It used VR headsets to deliver an immersive experience via a hybrid of fully interactive VR and 360° videos. The level of knowledge acquisition of individuals trained with this method was compared to that of similar cohorts previously trained in a classroom setting. That evaluation was supplemented by a qualitative assessment based on feedback from participants about their experience. RESULTS The effort resulted in a COVID-19 IPC curriculum adapted into VR, corresponding VR content, and a pioneer cohort of VR-trained frontline health workers. The formalized comparison with classroom-trained cohorts showed relatively better outcomes in skills acquired, speed of learning, and rates of information retention (P-value = 4.0e-09). In the qualitative assessment, 90% of the participants rated the method as very good, 58.1% strongly agreed that the activities met the course objectives, and 97.7% strongly indicated willingness to recommend the course to colleagues. CONCLUSION VR-based COVID-19 IPC training is feasible and effective, and achieves enhanced learning while protecting participants from infection within a pandemic setting in Uganda. It is a delivery medium transferable to the contexts of other highly infectious diseases.
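The cohort comparison reported above can be illustrated with a hedged sketch: a Welch's t statistic for two independent samples with unequal variances, written with the standard library only. The scores below are hypothetical stand-ins, not the study's data, and the paper's actual statistical analysis may differ.

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples of unequal variance."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / ((va / na + vb / nb) ** 0.5)

# Hypothetical assessment scores for a VR-trained and a classroom-trained cohort:
t = welch_t([82, 85, 88, 90, 84], [70, 74, 68, 72, 75])
```

A large positive t (here roughly 7.3) indicates a mean difference many standard errors wide; converting it to a p-value additionally requires the t-distribution with Welch–Satterthwaite degrees of freedom.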
Affiliation(s)
- Paul Buyego
- Infectious Diseases Institute, Makerere University, Kampala, Uganda
- Grace Kebirungi
- Infectious Diseases Institute, Makerere University, Kampala, Uganda
- The African Center of Excellence in Bioinformatics and Data Intensive Sciences, Makerere University, Kampala, Uganda
- Mike Nsubuga
- Infectious Diseases Institute, Makerere University, Kampala, Uganda
- The African Center of Excellence in Bioinformatics and Data Intensive Sciences, Makerere University, Kampala, Uganda
- Shirley Nakyejwe
- The African Center of Excellence in Bioinformatics and Data Intensive Sciences, Makerere University, Kampala, Uganda
- Phillip Cruz
- Office of Cyber Infrastructure and Computational Biology, National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, MD, 20892, USA
- Meghan C McCarthy
- Office of Cyber Infrastructure and Computational Biology, National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, MD, 20892, USA
- Darrell Hurt
- Office of Cyber Infrastructure and Computational Biology, National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, MD, 20892, USA
- Andrew Kambugu
- Infectious Diseases Institute, Makerere University, Kampala, Uganda
- Umaru Ssekabira
- Infectious Diseases Institute, Makerere University, Kampala, Uganda
- Daudi Jjingo
- Infectious Diseases Institute, Makerere University, Kampala, Uganda
- The African Center of Excellence in Bioinformatics and Data Intensive Sciences, Makerere University, Kampala, Uganda
- Department of Computer Science, College of Computing and Information Sciences, Makerere University, Kampala, Uganda
38
Cao Y, Yang Y, Qu X, Shi B, Xu L, Xue J, Wang C, Bai Y, Gai Y, Luo D, Li Z. A Self-Powered Triboelectric Hybrid Coder for Human-Machine Interaction. Small Methods 2022; 6:e2101529. [PMID: 35084114] [DOI: 10.1002/smtd.202101529]
Abstract
Human-machine interfaces have penetrated various academic and industrial fields such as smartphones, robotics, virtual reality, and wearable electronics, owing to their abundant functional sensors and information interaction methods. Nevertheless, the complex structural design, limited parameter-detection capability, and single-mode information coding of most sensors hinder their rapid development. As the frontier of self-powered sensors, the triboelectric nanogenerator (TENG) has multiple working modes and high structural adaptability, making it a potential solution for multi-parameter sensing and for the miniaturization of traditional interactive electronic devices. Herein, a self-powered hybrid coder (SHC) based on a TENG is reported that encodes two action parameters, touch and press, and can be used as a smart interface for human-machine interaction. The top-down hollow structure of the SHC not only constructs a compositing mode to generate stable touch and press signals but also builds a hybrid coding platform for generating action codes in synergy mode. When a finger touches or presses the SHC, Morse code and Gray code can be transmitted for text information or the remote control of electric devices. This self-powered coder is of reference value for designing alternative human-machine interfaces and has the potential to contribute to the next generation of highly integrated portable smart electronics.
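The two coding schemes the SHC transmits, Morse code and reflected Gray code, are standard and easy to sketch in software. The mapping below is illustrative only (a short touch read as a dot, a long press as a dash); it is not the device's actual signal-processing pipeline, and the `MORSE` table is just an excerpt of the ITU alphabet.

```python
MORSE = {"S": "...", "O": "---", "H": "....", "C": "-.-."}  # excerpt of the ITU table

def to_morse(text):
    """Encode text as dots (short touches) and dashes (long presses)."""
    return " ".join(MORSE[ch] for ch in text.upper())

def to_gray(n):
    """Convert a binary press count to its reflected Gray code."""
    return n ^ (n >> 1)

sos = to_morse("sos")                   # "... --- ..."
gray = [to_gray(i) for i in range(4)]   # [0, 1, 3, 2]
```

Gray code is attractive for such a device because consecutive counts differ in exactly one bit, so a single mis-registered press corrupts at most one bit of the transmitted code.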
Affiliation(s)
- Yu Cao
- Center on Nanoenergy Research, School of Physical Science and Technology, Guangxi University, Nanning, 530004, China
- Yuan Yang
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Xuecheng Qu
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Bojing Shi
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- Beijing Advanced Innovation Centre for Biomedical Engineering, Key Laboratory for Biomechanics and Mechanobiology of Ministry of Education, School of Biological Science and Medical Engineering, Beihang University, Beijing, 100191, China
- Lingling Xu
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- Jiangtao Xue
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- Institute of Engineering Medicine, Beijing Institute of Technology, Beijing, 100081, China
- Chan Wang
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- Yuan Bai
- Center on Nanoenergy Research, School of Physical Science and Technology, Guangxi University, Nanning, 530004, China
- Yansong Gai
- Center on Nanoenergy Research, School of Physical Science and Technology, Guangxi University, Nanning, 530004, China
- Dan Luo
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- Zhou Li
- Center on Nanoenergy Research, School of Physical Science and Technology, Guangxi University, Nanning, 530004, China
- Beijing Key Laboratory of Micro-nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Institute for Stem Cell and Regeneration, Chinese Academy of Sciences, Beijing, 100101, China
39
Laudanski K. Quo Vadis Anesthesiologist? The Value Proposition of Future Anesthesiologists Lies in Preserving or Restoring Presurgical Health after Surgical Insult. J Clin Med 2022; 11:1135. [PMID: 35207406] [PMCID: PMC8879076] [DOI: 10.3390/jcm11041135]
Abstract
This Special Issue of the Journal of Clinical Medicine is devoted to anesthesia and perioperative care [...].
Affiliation(s)
- Krzysztof Laudanski
- Department of Anesthesiology and Critical Care, University of Pennsylvania, Philadelphia, PA 19104, USA; Tel.: +1-215-662-8000
- Leonard Davis Institute for Healthcare Economics, University of Pennsylvania, Philadelphia, PA 19104, USA
- Department of Neurology, University of Pennsylvania, Philadelphia, PA 19104, USA
40
Yin G, Zhang L, Dai T. Application and Visualization of Human 3D Anatomy Teaching for Healthy People Based on a Hybrid Network Model. J Healthc Eng 2022; 2022:3702479. [PMID: 35399827] [PMCID: PMC8984063] [DOI: 10.1155/2022/3702479]
Abstract
With the development of computer technology, information technology, and 3D reconstruction of the medical human body, 3D virtual digital human body technology has been widely used in various fields of medicine, especially in teaching anatomy. Its advantage is that 3D human anatomy models can be viewed from any angle and cut in any direction. In this paper, we propose an improved algorithm based on a hybrid density network and an element-level attention mechanism. The hybrid density network is used to generate feasible hypotheses for multiple 3D poses, resolving the ambiguity in inferring 3D poses from 2D images, and the performance of the network is improved by adding the AReLU activation function combined with an element-wise attention mechanism. This makes students' anatomy learning more convenient and teachers' explanations more vivid. Comparative experiments show that the accuracy of 3D human pose estimation from a single image input is better than that of the two-stage methods compared against.
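The AReLU activation mentioned above pairs a rectified linear unit with learnable element-wise attention. The sketch below shows only the general idea under an assumed parameterization (a sigmoid gate scaling the positive part, a clamped leak on the negative part); the cited paper defines its own exact form, which may differ.

```python
import math

def clamp(x, lo=0.01, hi=0.99):
    """Keep a learnable weight inside a safe range."""
    return max(lo, min(hi, x))

def attention_relu(x, alpha=0.9, beta=2.0):
    """Illustrative attention-gated ReLU: the positive part is scaled by a
    sigmoid attention gate on the learnable beta, while the negative part
    leaks through a clamped learnable weight alpha.
    (Assumed form -- the cited AReLU paper defines its own parameterization.)"""
    gate = 1.0 / (1.0 + math.exp(-beta))  # sigmoid attention weight in (0, 1)
    return gate * x if x >= 0 else clamp(alpha) * x
```

In training, `alpha` and `beta` would be learned per layer; clamping keeps the negative-side slope from vanishing or exceeding the positive side.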
Affiliation(s)
- Gang Yin
- School of Integrated Traditional Chinese and Western Medicine, Anhui University of Chinese Medicine, Hefei, Anhui 230012, China
- Luyao Zhang
- School of Medical Information Engineering, Anhui University of Chinese Medicine, Hefei, Anhui 230012, China
- Tingting Dai
- School of Medical Economics and Management, Anhui University of Chinese Medicine, Hefei, Anhui 230012, China
41
Bianchi I, Stefani CJM, Santiago P, Zanatta AL, Rieder R. AnemiaAR: a serious game to support teaching of haematology. J Vis Commun Med 2022; 45:134-153. [PMID: 35129054] [DOI: 10.1080/17453054.2021.2021798]
Abstract
Serious games can be suitable tools for educational support in different areas of knowledge, such as Medicine. These applications, combined with technologies like mixed and augmented reality, provide a differentiated user experience that can maintain or improve the interest and motivation of students and teachers during the teaching and learning process. In this context, this study presents the development of AnemiaAR, a mixed reality serious game to support the teaching of haematology, helpful for students and professors in the visualisation and presentation of anaemia concepts. Fourteen medical students from the University of Passo Fundo participated in a pilot study to evaluate the application, using a sociodemographic questionnaire, a questionnaire based on the Technology Acceptance Model, and two modules of the Game Experience Questionnaire. The preliminary results were satisfactory, showing good acceptance of and a positive experience with the game, along with suggestions for improvement. The study also pointed out differences in the game evaluation among participants, considering prior experience with games, previous attendance of the haematology subject, and the time spent performing the game tasks.
Affiliation(s)
- Isabela Bianchi
- Graduate Program in Applied Computing, University of Passo Fundo, Passo Fundo, Brazil
- Institute of Exact Sciences and Geosciences, University of Passo Fundo, Passo Fundo, Brazil
- Cassiano J M Stefani
- Institute of Exact Sciences and Geosciences, University of Passo Fundo, Passo Fundo, Brazil
- Pablo Santiago
- Faculty of Medicine, University of Passo Fundo, Passo Fundo, Brazil
- Alexandre L Zanatta
- Graduate Program in Applied Computing, University of Passo Fundo, Passo Fundo, Brazil
- Institute of Exact Sciences and Geosciences, University of Passo Fundo, Passo Fundo, Brazil
- Rafael Rieder
- Graduate Program in Applied Computing, University of Passo Fundo, Passo Fundo, Brazil
- Institute of Exact Sciences and Geosciences, University of Passo Fundo, Passo Fundo, Brazil
42
Guérinot C, Marcon V, Godard C, Blanc T, Verdier H, Planchon G, Raimondi F, Boddaert N, Alonso M, Sailor K, Lledo PM, Hajj B, El Beheiry M, Masson JB. New Approach to Accelerated Image Annotation by Leveraging Virtual Reality and Cloud Computing. Front Bioinform 2022; 1:777101. [PMID: 36303792] [PMCID: PMC9580868] [DOI: 10.3389/fbinf.2021.777101]
Abstract
Three-dimensional imaging is at the core of medical imaging and is becoming a standard in biological research. As a result, there is an increasing need to visualize, analyze and interact with data in a natural three-dimensional context. By combining stereoscopy and motion tracking, commercial virtual reality (VR) headsets provide a solution to this critical visualization challenge by allowing users to view volumetric image stacks in a highly intuitive fashion. While optimizing the visualization and interaction process in VR remains an active topic, one of the most pressing issues is how to utilize VR for the annotation and analysis of data. Annotating data is often a required step for training machine learning algorithms, and the ability to annotate complex three-dimensional data is especially valuable in biological research, where newly acquired data may come in limited quantities. Similarly, medical data annotation is often time-consuming and requires expert knowledge to identify structures of interest correctly. Moreover, simultaneous data analysis and visualization in VR is computationally demanding. Here, we introduce a new procedure to visualize, interact with, annotate and analyze data by combining VR with cloud computing. VR is leveraged to provide natural interactions with volumetric representations of experimental imaging data. In parallel, cloud computing performs costly computations to accelerate the data annotation with minimal input required from the user. We demonstrate multiple proof-of-concept applications of our approach on volumetric fluorescence microscopy images of mouse neurons and on tumor and organ annotations in medical images.
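The division of labour described here (natural interaction locally, costly computation remotely) can be sketched with a background executor standing in for the cloud. Everything below is hypothetical: `segment_volume` is a toy thresholding stand-in for the paper's actual remote annotation models, and a real system would submit jobs over the network rather than to a local thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

def segment_volume(volume):
    """Stand-in for the costly remote computation: label every voxel above
    a threshold. (Hypothetical -- the cited pipeline runs real segmentation
    models on cloud hardware.)"""
    return [[1 if v > 0.5 else 0 for v in row] for row in volume]

def annotate_async(volume, executor):
    """Offload annotation so the VR render loop is never blocked."""
    return executor.submit(segment_volume, volume)

volume = [[0.1, 0.9], [0.7, 0.2]]
with ThreadPoolExecutor(max_workers=2) as pool:
    future = annotate_async(volume, pool)
    labels = future.result()  # a VR client would poll the future instead of blocking
```

The design point is that the interactive process only submits work and polls for results, so frame rate in the headset is decoupled from annotation cost.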
Affiliation(s)
- Corentin Guérinot
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Sorbonne Université, Collège Doctoral, Paris, France
- Valentin Marcon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Charlotte Godard
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
- Thomas Blanc
- Sorbonne Université, Collège Doctoral, Paris, France
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Hippolyte Verdier
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Histopathology and Bio-Imaging Group, Sanofi R&D, Vitry-Sur-Seine, France
- Université de Paris, UFR de Physique, Paris, France
- Guillaume Planchon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Francesca Raimondi
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Unité Médicochirurgicale de Cardiologie Congénitale et Pédiatrique, Centre de Référence des Malformations Cardiaques Congénitales Complexes M3C, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
- Nathalie Boddaert
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
- Mariana Alonso
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Kurt Sailor
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Pierre-Marie Lledo
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Bassam Hajj
- Sorbonne Université, Collège Doctoral, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
- Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
43
Cmiel V, Chmelikova L, Zumberg I, Kralik M. A Novel Gesture-Based Control System for Fluorescence Volumetric Data in Virtual Reality. Sensors (Basel) 2021; 21:8329. [PMID: 34960422] [DOI: 10.3390/s21248329]
Abstract
With the development of light microscopy, it is becoming increasingly easy to obtain detailed multicolor fluorescence volumetric data, and their appropriate visualization has become an integral part of fluorescence imaging. Virtual reality (VR) technology provides a new way of visualizing multidimensional image data or models so that an entire 3D structure can be intuitively observed, together with different object features or details on or within the object. As the need to image advanced volumetric data grows, so do the demands on controlling virtual object properties, especially for multicolor objects obtained by fluorescence microscopy. Existing solutions based on universal VR controllers, or software-based controllers that require sufficient free space for the user to manipulate data in VR, are not usable in many practical applications. We therefore developed a custom gesture-based VR control system, built around a multitouch sensor disk, with a custom controller connected to the FluoRender visualization environment. Our control system may be a good choice for easier and more comfortable manipulation of virtual objects and their properties, especially with confocal microscopy, which is so far the most widely used technique for acquiring volumetric fluorescence data.
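The core software idea, routing sensor-disk gestures to properties of a virtual object, can be sketched as a small dispatcher. The gesture names and the channel/rotation/zoom properties below are assumptions for illustration, not the authors' actual controller API or FluoRender's interface.

```python
def make_controller(props):
    """Map multitouch gestures to viewer-property updates (illustrative)."""
    def next_channel(p):
        p["channel"] += 1                       # cycle fluorescence channels
    def rotate(p):
        p["rotation"] = (p["rotation"] + 15) % 360  # rotate the volume in steps
    def zoom_out(p):
        p["zoom"] = round(p["zoom"] * 0.9, 3)   # shrink the view slightly
    handlers = {"swipe_right": next_channel, "rotate_cw": rotate, "pinch_in": zoom_out}

    def handle(gesture):
        if gesture in handlers:                 # unknown gestures are ignored
            handlers[gesture](props)
        return props
    return handle

props = {"channel": 0, "rotation": 0, "zoom": 1.0}
handle = make_controller(props)
handle("swipe_right")
handle("rotate_cw")
```

A table-driven dispatcher like this keeps the hardware-reading loop trivial: the sensor firmware only has to classify a gesture name, and all property logic lives in one place.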