1. Cheon H, Song JA, Kim J, Jung S, Kim GJ. Virtual Reality-Based Education Program for Managing Behavioral and Psychological Symptoms of Dementia: Development and Feasibility Test. Comput Inform Nurs 2024; 42:118-126. [PMID: 38129321] [DOI: 10.1097/cin.0000000000001096] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
This study aims to develop a virtual reality-based education program on managing behavioral and psychological symptoms of dementia for family carers of persons living with dementia and to investigate its feasibility for users. The program was developed through a literature review, interviews with family carers, surveys, and expert content validity assessment. User feasibility was evaluated quantitatively through a questionnaire on usefulness, ease of use, and satisfaction, and qualitatively through participant interviews. The program was produced in two parts, Type 1 and Type 2, consisting of three and six episodes, respectively. Participants showed a high level of satisfaction, with overall program scores of 4.28 ± 0.66 and 4.34 ± 0.41 for the two evaluations. Participants also reported that both programs were helpful: Type 1 for fostering attitude change through a better understanding of persons living with dementia, and Type 2 for acquiring coping methods through communication training. Use of the virtual reality device was not perceived as inconvenient and was considered helpful because of the highly immersive experience. The results of this study confirmed that family carers had no resistance to education using new technologies such as virtual reality devices and that virtual reality-based education could be effective for training family carers.
Affiliation(s)
- Hongjin Cheon
- Author Affiliations: College of Nursing (Drs Cheon and Song) and BK21 FOUR R&E Center for Learning Health Systems (Dr Song), Korea University, Seoul; Department of Nursing, Seojeong University (Dr Kim), Yangju; College of Nursing, Chonnam National University (Dr Jung), Gwangju; and College of Informatics, Korea University (Dr Kim), Seoul, Republic of Korea
2. Würstle S, Spanke LM, Mehlhase N, Stanley G, Koff J, Dimitriadis S, König S, Hann A. Evaluation of a Virtual Reality-Based Open Educational Resource Software. J Med Educ Curric Dev 2024; 11:23821205241242220. [PMID: 38572090] [PMCID: PMC10989036] [DOI: 10.1177/23821205241242220] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
OBJECTIVES Virtual reality (VR) teaching methods have the potential to support medical students in acquiring increasing amounts of knowledge. EVENT (Easy VR EducatioN Tool) is an open educational resource software for immersive VR environments, designed for use without programming skills. In this work, EVENT was used in a medical student VR course on pancreatic cancer. METHODS Medical students were invited to participate in the course. Before and after the VR simulation, participants completed a multiple-choice knowledge assessment, with a maximum score of 10, and a VR experience questionnaire. The primary endpoint compared pre- and post-VR simulation test scores. Secondary endpoints included usability and factors that could affect learning growth and test results. RESULTS Data from 117 of the 135 participating students were available for analysis. Student test scores improved by an average of 3.4 points (95% CI 3.1-3.7, P < 0.001) after the VR course. The secondary endpoints of gender, age, prior knowledge of the medical subject, completed professional training in the medical field, video game play, three-dimensional imagination skills, and cybersickness had no major impact on test scores or final ranking (top or bottom 25%). The 27 students whose post-VR simulation test scores ranked in the top 25% had no prior experience with VR. The average System Usability Scale score was 86.1, which corresponds to an excellent outcome for user-friendliness. Questionnaire responses after the VR simulation showed that most students (81.2% [95/117]) were interested in more VR options in medical school. CONCLUSIONS We present a freely available software tool that allows VR teaching lessons to be developed without programming skills.
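As a rough illustration of the analyses summarized above (not the authors' code), the Python sketch below computes a System Usability Scale score with the standard 0-100 scoring rule and runs a paired t-test with a 95% confidence interval on simulated pre-/post-course knowledge scores; the sample data, effect size, and variable names are assumptions made for the example.

```python
# Minimal sketch (not the study's code): SUS scoring and a paired pre/post
# comparison on simulated data. Requires numpy and scipy.
import numpy as np
from scipy import stats

def sus_score(responses):
    """Standard System Usability Scale scoring for one respondent.

    `responses` holds the ten 1-5 Likert answers in questionnaire order.
    Odd items contribute (answer - 1), even items contribute (5 - answer);
    the sum is scaled to 0-100 by multiplying by 2.5.
    """
    responses = np.asarray(responses)
    odd = responses[0::2] - 1          # items 1, 3, 5, 7, 9
    even = 5 - responses[1::2]         # items 2, 4, 6, 8, 10
    return 2.5 * (odd.sum() + even.sum())

rng = np.random.default_rng(0)

# Hypothetical 0-10 knowledge-test scores before and after the VR lesson.
pre = rng.integers(2, 7, size=117).astype(float)
post = np.clip(pre + rng.normal(3.4, 1.5, size=117), 0, 10)

t, p = stats.ttest_rel(post, pre)                 # paired t-test
diff = post - pre
ci = stats.t.interval(0.95, len(diff) - 1,
                      loc=diff.mean(),
                      scale=stats.sem(diff))      # 95% CI of the mean gain
print(f"mean gain = {diff.mean():.2f}, "
      f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), p = {p:.2g}")

# Example SUS score for one simulated respondent (all answers = 4).
print("SUS:", sus_score([4] * 10))
```

On real data, the per-student pre and post arrays would simply replace the simulated ones.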
Affiliation(s)
- Silvia Würstle
- Department of Internal Medicine II, Infectious Diseases, University Hospital Frankfurt, Goethe University Frankfurt, Frankfurt, Germany
- Department of Internal Medicine II, University Hospital rechts der Isar, School of Medicine, Technical University of Munich, Munich, Germany
- Lisa-Marie Spanke
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, Gastroenterology, University Hospital Würzburg, Würzburg, Germany
- Institute of Medical Teaching and Medical Education Research, University Hospital Würzburg, Würzburg, Germany
- Niklas Mehlhase
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, Gastroenterology, University Hospital Würzburg, Würzburg, Germany
- Gail Stanley
- Department of Internal Medicine, Yale School of Medicine, New Haven, CT 06520, USA
- Jonathan Koff
- Department of Internal Medicine, Yale School of Medicine, New Haven, CT 06520, USA
- Stavros Dimitriadis
- Department of Gastroenterology, University Hospital Coventry and Warwickshire, Coventry, CV2 2DX, UK
- Sarah König
- Institute of Medical Teaching and Medical Education Research, University Hospital Würzburg, Würzburg, Germany
- Alexander Hann
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, Gastroenterology, University Hospital Würzburg, Würzburg, Germany
3. Román-Belmonte JM, Rodríguez-Merchán EC, De la Corte-Rodríguez H. Metaverse applied to musculoskeletal pathology: Orthoverse and Rehabverse. Postgrad Med 2023; 135:440-448. [PMID: 36786393] [DOI: 10.1080/00325481.2023.2180953] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
The Metaverse is 'an integrated network of 3D virtual worlds.' It incorporates digitally created realities into the real world, involves virtual copies of existing places, and augments physical reality by superimposing digital elements, allowing its users to interact with these elements in an immersive, real-time experience. The applications of the Metaverse are numerous, with a growing number of experiences in the field of musculoskeletal disease management. In medical training, the Metaverse can facilitate the learning experience and help develop complex clinical skills. In clinical care, it can help clinicians perform orthopedic surgery more accurately and safely and can improve pain management, the performance of rehabilitation techniques, and the promotion of healthy lifestyles. Virtualization can also optimize aspects of healthcare information and management, increasing the effectiveness of procedures and the functioning of organizations. This optimization can be especially relevant in departments under significant care pressure. However, we must not lose sight of the fundamental challenges that still need to be solved, such as ensuring patient privacy and fairness. Several studies are underway to assess the feasibility and safety of the Metaverse.
Affiliation(s)
- Juan M Román-Belmonte
- Department of Physical Medicine and Rehabilitation, Cruz Roja San José y Santa Adela University Hospital, Madrid, Spain
- E Carlos Rodríguez-Merchán
- Department of Orthopedic Surgery, La Paz University Hospital, Madrid, Spain
- Osteoarticular Surgery Research, Hospital La Paz Institute for Health Research - IdiPAZ (La Paz University Hospital - Autonomous University of Madrid), Madrid, Spain
4. Kim J, Kim W, Oh H, Lee S. Progressive Contextual Aggregation Empowered by Pixel-Wise Confidence Scoring for Image Inpainting. IEEE Trans Image Process 2023; 32:1200-1214. [PMID: 37022427] [DOI: 10.1109/tip.2023.3238317] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Image inpainting methods leverage the similarity of adjacent pixels to create alternative content. However, as the missing region becomes larger, pixels deep inside the hole become difficult to infer from the surrounding signal, making the result more prone to visual artifacts. To address this, we adopt a progressive hole-filling scheme that hierarchically fills the corrupted region in both feature and image space. This technique allows us to utilize reliable contextual information from the surrounding pixels, even for samples with large holes, and then gradually complete the details as the resolution increases. For a more realistic representation of the completed region, we devise a pixel-wise dense detector. By classifying whether each pixel belongs to the masked region and passing the gradient to all resolutions, the generator further enhances the quality of the composited content. Furthermore, the completed images at different resolutions are merged using a proposed structure transfer module (STM) that incorporates fine-grained local and coarse-grained global interactions. In this mechanism, each completed image attends, at fine granularity, to its closest composition at the adjacent resolution, and can thus capture global continuity by modeling both short- and long-range dependencies. By comparing our solution qualitatively and quantitatively with state-of-the-art methods, we conclude that our model exhibits significantly improved visual quality, even in the case of large holes.
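The learned generator described above is not reproduced here, but the coarse-to-fine intuition behind progressive hole filling can be sketched with classical tools: complete a downsampled copy first, use the upsampled result to seed the hole, and refine at progressively finer resolutions. In the sketch below, OpenCV's classical cv2.inpaint stands in for the learned generator, and the file names, pyramid depth, and inpainting radius are illustrative assumptions.

```python
# Illustrative sketch of progressive (coarse-to-fine) hole filling using
# OpenCV's classical inpainting as a stand-in for a learned generator.
import cv2
import numpy as np

def progressive_inpaint(image, mask, levels=3, radius=3):
    """Fill `mask` (255 = hole) in `image` from coarse to fine resolution."""
    h, w = image.shape[:2]
    filled = image.copy()
    for level in reversed(range(levels)):       # coarsest level first
        scale = 1.0 / (2 ** level)
        small = cv2.resize(filled, None, fx=scale, fy=scale,
                           interpolation=cv2.INTER_AREA)
        small_mask = cv2.resize(mask, (small.shape[1], small.shape[0]),
                                interpolation=cv2.INTER_NEAREST)
        # Fill the hole at this resolution; a large hole is easier to
        # complete when it occupies fewer pixels.
        small = cv2.inpaint(small, small_mask, radius, cv2.INPAINT_TELEA)
        # Upsample the completed content and paste it into the hole so the
        # next (finer) level only has to refine details.
        up = cv2.resize(small, (w, h), interpolation=cv2.INTER_LINEAR)
        filled[mask > 0] = up[mask > 0]
    return filled

if __name__ == "__main__":
    img = cv2.imread("corrupted.png")                        # hypothetical input
    hole = cv2.imread("hole_mask.png", cv2.IMREAD_GRAYSCALE) # 255 inside the hole
    cv2.imwrite("completed.png", progressive_inpaint(img, hole))
```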
5. Lee K, Kim W, Lee S. From Human Pose Similarity Metric to 3D Human Pose Estimator: Temporal Propagating LSTM Networks. IEEE Trans Pattern Anal Mach Intell 2023; 45:1781-1797. [PMID: 35377839] [DOI: 10.1109/tpami.2022.3164344] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Predicting a 3D pose directly from a monocular image is a challenging problem. Most pose estimation methods proposed in recent years have shown 'quantitatively' good results (below ∼50 mm). However, these methods remain 'perceptually' flawed because their performance is measured only via a simple distance metric. Although this fact is well understood, the reliance on 'quantitative' information has slowed the development of 3D pose estimation methods. To address this issue, we first propose a perceptual Pose SIMilarity (PSIM) metric, assuming that human perception (HP) is highly adapted to extracting structural information from a given signal. Second, we present a perceptually robust 3D pose estimation framework: Temporal Propagating Long Short-Term Memory networks (TP-LSTMs). To this end, we analyze information-theoretic spatio-temporal posture correlations, including joint interdependency, temporal consistency, and HP. The experimental results clearly show that the proposed PSIM metric achieves a stronger correlation with users' subjective opinions than conventional pose metrics. Furthermore, we demonstrate significant quantitative and perceptual performance improvements of TP-LSTMs over existing state-of-the-art methods.
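The abstract contrasts a plain distance metric with a perceptual, structure-aware one without giving the PSIM formulation; the sketch below illustrates that contrast in a simplified way, comparing mean per-joint position error (MPJPE) with a similarity built from limb-direction agreement. The toy five-joint skeleton, the cosine-based similarity, and the simulated poses are assumptions for illustration, not the authors' metric.

```python
# Illustrative contrast between a plain distance metric (MPJPE) and a
# structure-aware pose similarity built from limb-direction agreement.
# The 5-joint skeleton below is a toy assumption, not the paper's PSIM.
import numpy as np

# Parent of each joint (-1 = root) for a toy 5-joint kinematic chain.
PARENTS = [-1, 0, 1, 2, 3]

def mpjpe(pred, gt):
    """Mean per-joint position error in the input's units (e.g. mm)."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def limb_direction_similarity(pred, gt, eps=1e-8):
    """Mean cosine similarity between corresponding bone direction vectors."""
    sims = []
    for joint, parent in enumerate(PARENTS):
        if parent < 0:
            continue
        bp = pred[joint] - pred[parent]
        bg = gt[joint] - gt[parent]
        cos = np.dot(bp, bg) / (np.linalg.norm(bp) * np.linalg.norm(bg) + eps)
        sims.append(cos)
    return float(np.mean(sims))

rng = np.random.default_rng(1)
gt = rng.normal(size=(5, 3)) * 100.0             # ground-truth joints (mm)
pred = gt + rng.normal(scale=20.0, size=(5, 3))  # noisy estimate

print("MPJPE (mm):", round(mpjpe(pred, gt), 1))
print("limb-direction similarity:", round(limb_direction_similarity(pred, gt), 3))
```

Two estimates can have nearly identical MPJPE yet very different limb-direction similarity, which is the kind of perceptual gap a structure-aware metric is meant to expose.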
6. Li N, Sun N, Cao C, Hou S, Gong Y. Review on visualization technology in simulation training system for major natural disasters. Nat Hazards (Dordr) 2022; 112:1851-1882. [PMID: 35308193] [PMCID: PMC8923969] [DOI: 10.1007/s11069-022-05277-z] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7]
Abstract
Major natural disasters have occurred frequently in the last few years, resulting in increased loss of life and economic damage. Most emergency responders do not have first-hand experience with major natural disasters, and thus there is an urgent need for pre-disaster training. Traditional emergency drills suffer from unrealistic scenes and often fail to appeal to their target audience; in addition, the novel coronavirus pandemic has forced people to maintain safe social distancing, making it difficult to carry out transregional or transnational emergency drills in many countries under lockdown. There is therefore an increasing demand for simulation training systems that use virtual reality, augmented reality, and mixed reality visualization technologies to simulate major natural disasters. Such systems provide a new way to deliver public emergency-avoidance education and to help emergency rescue personnel master their responsibilities and improve their emergency response capabilities. However, to our knowledge, there is no overview of simulation training systems for major natural disasters. Hence, this paper surveys the visualization techniques commonly used in simulation training systems and compares, analyzes, and summarizes the architecture and functions of existing simulation training systems for different emergency phases of common natural disasters. In addition, the limitations of existing simulation training systems in practical applications and future development directions are discussed to provide a reference for researchers seeking to better understand modern simulation training systems.
Affiliation(s)
- Ning Li
- Institute of Disaster and Emergency Medicine, Tianjin University, Tianjin, 300072 China
- Wenzhou Safety (Emergency) Institute, Tianjin University, Wenzhou, 325000 China
- Tianjin Key Laboratory of Disaster Medicine Technology, Tianjin, 300072 China
- Na Sun
- Institute of Disaster and Emergency Medicine, Tianjin University, Tianjin, 300072 China
- Wenzhou Safety (Emergency) Institute, Tianjin University, Wenzhou, 325000 China
- Tianjin Key Laboratory of Disaster Medicine Technology, Tianjin, 300072 China
- Chunxia Cao
- Institute of Disaster and Emergency Medicine, Tianjin University, Tianjin, 300072 China
- Wenzhou Safety (Emergency) Institute, Tianjin University, Wenzhou, 325000 China
- Tianjin Key Laboratory of Disaster Medicine Technology, Tianjin, 300072 China
- Shike Hou
- Institute of Disaster and Emergency Medicine, Tianjin University, Tianjin, 300072 China
- Wenzhou Safety (Emergency) Institute, Tianjin University, Wenzhou, 325000 China
- Tianjin Key Laboratory of Disaster Medicine Technology, Tianjin, 300072 China
- Yanhua Gong
- Institute of Disaster and Emergency Medicine, Tianjin University, Tianjin, 300072 China
- Wenzhou Safety (Emergency) Institute, Tianjin University, Wenzhou, 325000 China
- Tianjin Key Laboratory of Disaster Medicine Technology, Tianjin, 300072 China
7. Cybersickness and Its Severity Arising from Virtual Reality Content: A Comprehensive Study. Sensors (Basel) 2022; 22:s22041314. [PMID: 35214216] [PMCID: PMC8963115] [DOI: 10.3390/s22041314] [Citation(s) in RCA: 17] [Impact Index Per Article: 5.7]
Abstract
Virtual reality (VR) experiences often elicit a negative effect, cybersickness, which results in nausea, disorientation, and visual discomfort. To quantitatively analyze the degree of cybersickness depending on various attributes of VR content (i.e., camera movement, field of view, path length, reference frame, and controllability), we generated cybersickness reference (CYRE) content with 52 VR scenes that represent different content attributes. A protocol for cybersickness evaluation was designed to collect subjective opinions from 154 participants as reliably as possible, in conjunction with objective data such as rendered VR scenes and biological signals. By investigating the data obtained through the experiment, the statistically significant relationships (the degree to which cybersickness varies with each isolated content factor) are identified separately. We showed that cybersickness severity was highly correlated with six biological features reflecting brain activity (i.e., relative power spectral densities of Fp1 delta, Fp1 beta, Fp2 delta, Fp2 gamma, T4 delta, and T4 beta waves), with a coefficient of determination greater than 0.9. Moreover, our experimental results show that individual characteristics (age and susceptibility) are also quantitatively associated with cybersickness level. Notably, the constructed dataset contains labels (i.e., subjective cybersickness scores) corresponding to each VR scene. We used these labels to build cybersickness prediction models and obtained reliable predictive performance. Hence, the proposed dataset is expected to be widely applicable in general-purpose scenarios involving cybersickness quantification.
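As a simplified illustration of the kind of biological feature the study relates to cybersickness, namely relative band power of EEG channels, the sketch below estimates relative delta power with Welch's method and correlates it with simulated sickness scores; the sampling rate, band edges, channel choice, and all simulated data are assumptions, and the study's regression achieving a coefficient of determination above 0.9 is not reproduced here.

```python
# Illustrative sketch: relative EEG band power via Welch's method and its
# correlation with (simulated) cybersickness scores. Not the study's pipeline.
import numpy as np
from scipy.signal import welch

FS = 256                      # sampling rate in Hz (assumed)
BANDS = {"delta": (1, 4), "beta": (13, 30), "gamma": (30, 45)}

def relative_band_power(signal, fs=FS):
    """Return each band's power divided by the total 1-45 Hz power."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    total = psd[(freqs >= 1) & (freqs <= 45)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

rng = np.random.default_rng(2)
n_subjects = 30

# Simulated per-subject 60-second recordings of one frontal channel and
# simulated 0-100 sickness scores; real data would replace both.
features = []
scores = rng.uniform(0, 100, size=n_subjects)
for _ in range(n_subjects):
    eeg = rng.normal(size=FS * 60)
    features.append(relative_band_power(eeg)["delta"])

r = np.corrcoef(features, scores)[0, 1]   # Pearson correlation
print(f"correlation between relative delta power and sickness score: r = {r:.2f}")
```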