1
Baaden M, Glowacki DR. Virtual reality in drug design: Benefits, applications and industrial perspectives. Curr Opin Struct Biol 2025;92:103044. PMID: 40199042. DOI: 10.1016/j.sbi.2025.103044.
Abstract
Virtual reality (VR) has transformative potential in domains that involve visualizing complex 3D data, such as structure-based drug design (SBDD), where it offers new ways to visualize and manipulate complex molecular structures in three dimensions and enables intuitive exploration of protein-ligand complexes. In this article, we outline three levels of interaction available in immersive VR environments for drug discovery and provide illustrative case studies with applications in COVID-19 research and protein-ligand docking. We discuss VR's role in drug discovery based on conversations with experts from the pharmaceutical industry. While industry experts are mostly optimistic about VR's potential, they point to challenges related to integration with existing workflows, the need for improved hardware ergonomics, and ensuring a synergistic relationship between VR and an expanding suite of artificial intelligence (AI) tools.
Affiliation(s)
- Marc Baaden
- Université Paris Cité, CNRS, Laboratoire de Biochimie Théorique, 13 rue Pierre et Marie Curie, 75005, Paris, France.
- David R Glowacki
- Intangible Realities Laboratory, CiTIUS∼Centro Singular de Investigación en Tecnoloxías Intelixentes da USC, Rúa de Jenaro de la Fuente Domínguez s/n, 15782, Santiago de Compostela, Spain.
2
Li Y, Chen X, Huang YD, He Q, Li D, Hu S, Xu P, Chen T, Ran X. Comparing between virtual reality based pre-clinical implantation training and traditional learning methods. PeerJ 2025;13:e18891. PMID: 40017652. PMCID: PMC11867032. DOI: 10.7717/peerj.18891.
Abstract
Objective As dental implant treatment becomes increasingly in demand among patients with tooth loss, efficient and effective training for students is necessary. We therefore investigated the application of virtual reality (VR) technology to pre-clinical implantation training (PCIT) to improve students' learning efficiency and effectiveness. Methods Twenty subjects were divided evenly into two groups, VR-based PCIT (experimental group) and traditional PCIT (control group), after completing a background survey (BS) before the PCITs to confirm there was no apparent background variation in prior learning of oral implantology and VR technology, learning habits, interests, and hobbies. All subjects took identical professional tests (T-1, T-2, T-3) before, during, and after the PCITs to assess knowledge mastery and retention. Alongside both PCITs, subjective evaluation tests (SET) were distributed to collect feedback data and analyze preference for each PCIT. Total interaction time and learning duration per subject were also recorded for the performance analysis. Results From T-1 to T-2, scores in the VR-based PCIT group increased significantly (p < 0.05). The SET results show that subjects in VR-based PCIT generally scored over one point higher than those in traditional PCIT on the items "Convenience", "Interest", "Comfort", "Confidence", and "Subjective initiative", but not "Precision". Across both PCITs, VR-based PCIT showed a shorter learning duration and sufficient one-on-one interaction opportunities. Conclusion Compared with traditional PCIT, VR-based PCIT has a clear positive influence on students' knowledge mastery, willingness to study, and learning efficiency.
Affiliation(s)
- Yangjie Li
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Xu Chen
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Yuanding Huang
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Qingqing He
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Dize Li
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Shanshan Hu
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Peng Xu
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Tao Chen
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
- Xiongwen Ran
- Stomatological Hospital of Chongqing Medical University, Chongqing Medical University, Chongqing, China
3
Blanc T, Godard C, Grevent D, El Beheiry M, Salomon LJ, Hajj B, Masson JB. Photorealistic rendering of fetal faces from raw magnetic resonance imaging data. Ultrasound Obstet Gynecol 2025. PMID: 39825872. DOI: 10.1002/uog.29165.
Affiliation(s)
- T Blanc
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, Paris, France
- C Godard
- Decision and Bayesian Computation, Neuroscience & Computational Biology Departments, CNRS UMR 3571, Institut Pasteur, Paris, France
- Epiméthée, INRIA, Paris, France
- AVATAR MEDICAL, Paris, France
- D Grevent
- LUMIERE Platform, EA Fetus 7328, Université de Paris Cité, Paris, France
- Department of Radiology, Necker-Enfants Malades Hospital, AP-HP, Paris, France
- M El Beheiry
- Decision and Bayesian Computation, Neuroscience & Computational Biology Departments, CNRS UMR 3571, Institut Pasteur, Paris, France
- Epiméthée, INRIA, Paris, France
- AVATAR MEDICAL, Paris, France
- L J Salomon
- LUMIERE Platform, EA Fetus 7328, Université de Paris Cité, Paris, France
- Department of Obstetrics, Fetal Medicine and Surgery, Necker-Enfants Malades Hospital, AP-HP, Paris, France
- B Hajj
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, Paris, France
- J-B Masson
- Decision and Bayesian Computation, Neuroscience & Computational Biology Departments, CNRS UMR 3571, Institut Pasteur, Paris, France
- Epiméthée, INRIA, Paris, France
- AVATAR MEDICAL, Paris, France
4
Streuber M, Allgaier M, Schwab R, Behme D, Saalfeld S. A VR neurointerventional setup for catheter-based interventions focusing on visualizing the risk of radiation. Comput Biol Med 2024;183:109224. PMID: 39427425. DOI: 10.1016/j.compbiomed.2024.109224.
Abstract
Interventional neuroradiologists carry out their minimally invasive procedures using X-rays within the setup of a biplane digital subtraction angiography system. This work provides an immersive virtual reality (VR) environment in which physicians can perform a simulated catheter-based intervention. Since radiation is invisible, the risk of radiation exposure can be made visible virtually. Our goal was to see whether radiation visualization influences medical professionals to take a more mindful approach toward interactions at the operating table than without it. We tested our scenario in an expert study in which ten neuroradiologists participated and solved intervention-related tasks. The study found that while the visualization does not affect the physician's placement of the radiation shield, overall radiation exposure does decrease with visualization, as users standing very close move to a greater distance from the table. Furthermore, our System Usability Scale evaluation revealed a high usability score for this approach.
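The study's observation that users stepping back from the table reduced exposure follows from distance-dependent falloff of scattered radiation. The abstract does not specify the dose model used; as a minimal sketch, assuming simple inverse-square falloff (the function name and reference distance below are illustrative):

```python
# Hedged sketch: inverse-square falloff as a first approximation for scattered
# radiation around the table (no shielding or absorption modeled). It shows why
# moving back, as observed in the study, cuts exposure sharply.

def relative_exposure(distance_m: float, reference_m: float = 0.5) -> float:
    """Exposure relative to standing `reference_m` from the scatter source,
    assuming pure inverse-square falloff."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return (reference_m / distance_m) ** 2

# Moving from 0.5 m to 1.5 m drops relative exposure to roughly 11%.
for d in (0.5, 1.0, 1.5, 2.0):
    print(f"{d:.1f} m -> {relative_exposure(d):.3f}")
```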
Affiliation(s)
- Marcus Streuber
- Research Campus STIMULATE, University of Magdeburg, 39104, Germany.
- Mareen Allgaier
- Research Campus STIMULATE, University of Magdeburg, 39104, Germany
- Roland Schwab
- Research Campus STIMULATE, University of Magdeburg, 39104, Germany; Clinic for Neuroradiology, University Hospital of Magdeburg, 39120, Germany
- Daniel Behme
- Research Campus STIMULATE, University of Magdeburg, 39104, Germany; Clinic for Neuroradiology, University Hospital of Magdeburg, 39120, Germany
- Sylvia Saalfeld
- Research Campus STIMULATE, University of Magdeburg, 39104, Germany; University Hospital Schleswig-Holstein Campus Kiel, Kiel, 24118, Germany
5
Algarni YA, Saini RS, Vaddamanu SK, Quadri SA, Gurumurthy V, Vyas R, Baba SM, Avetisyan A, Mosaddad SA, Heboyan A. The impact of virtual reality simulation on dental education: A systematic review of learning outcomes and student engagement. J Dent Educ 2024;88:1549-1562. PMID: 38807268. DOI: 10.1002/jdd.13619.
Abstract
PURPOSE Virtual reality (VR) simulations have been increasingly employed to train dental students prior to clinical practice. According to the literature, blended learning designs in the form of VR simulations can be utilized by both dental students and instructors to provide quality education; they can also save time and improve motor skills before students enter the clinical stages. This study was therefore designed to review the available VR simulators and their impact on student learning and outcomes. METHODS The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed to review the literature systematically, and databases including PubMed, ScienceDirect, Cochrane Library, Scopus, and Google Scholar were searched (up to December 2023) for relevant articles using the keywords "virtual reality," "virtual reality simulators," "virtual reality simulation," and "dental education." The Mixed Methods Appraisal Tool was used to assess study quality. RESULTS The literature search identified 1477 research articles, of which 16 were included in the present study. Compared to conventional training methods, a significant improvement was observed in students' learning outcomes and engagement, specifically in their knowledge, performance, confidence, and psychomotor skills. CONCLUSION The findings suggest that VR simulators enhance the overall learning abilities of dental students and should be regarded as an integral component of the current curriculum. However, VR simulators cannot fully substitute for traditional training methods; rather, they can effectively complement them.
Affiliation(s)
- Youssef Abdullah Algarni
- Department of Restorative Dental Sciences, College of Dentistry, King Khalid University, Abha, Saudi Arabia
- Ravinder S Saini
- Department of Dental Technology, COAMS, King Khalid University, Abha, Saudi Arabia
- Rajesh Vyas
- Department of Dental Technology, COAMS, King Khalid University, Abha, Saudi Arabia
- Suheel Manzoor Baba
- Department of Restorative Dental Sciences, College of Dentistry, King Khalid University, Abha, Saudi Arabia
- Anna Avetisyan
- Department of Therapeutic Stomatology, Faculty of Stomatology, Yerevan State Medical University after Mkhitar Heratsi, Yerevan, Armenia
- Seyed Ali Mosaddad
- Department of Research Analytics, Saveetha Dental College and Hospitals, Saveetha Institute of Medical and Technical Sciences, Saveetha University, Chennai, India
- Student Research Committee, School of Dentistry, Shiraz University of Medical Sciences, Shiraz, Iran
- Artak Heboyan
- Department of Research Analytics, Saveetha Dental College and Hospitals, Saveetha Institute of Medical and Technical Sciences, Saveetha University, Chennai, India
- Department of Prosthodontics, Faculty of Stomatology, Yerevan State Medical University after Mkhitar Heratsi, Yerevan, Armenia
- Department of Prosthodontics, School of Dentistry, Tehran University of Medical Sciences, Tehran, Iran
6
Huang Y, Deng C, Peng M, Hao Y. Experiences and perceptions of palliative care patients receiving virtual reality therapy: a meta-synthesis of qualitative studies. BMC Palliat Care 2024;23:182. PMID: 39044242. PMCID: PMC11267777. DOI: 10.1186/s12904-024-01520-5.
Abstract
BACKGROUND The combination of virtual reality (VR) and palliative care potentially represents a new opportunity for palliative care. Many previous studies have evaluated the application of VR therapy to patients with advanced disease receiving palliative care. However, patient-perspective reviews that comprehensively capture patients' actual experiences and feelings, and provide practical guidance for designing future studies, are currently lacking. This review of qualitative evidence aimed to explore the experiences and perceptions of patients receiving VR therapy in palliative care. METHODS This study was conducted in accordance with the Enhancing Transparency in Reporting the Synthesis of Qualitative Research (ENTREQ) statement guidelines. Ten databases, namely, PubMed, Web of Science, EBSCO, OVID MEDLINE, Scopus, John Wiley, ProQuest, CNKI, WANFANG DATA, and SinoMed, were searched, and qualitative and mixed-methods studies from the establishment of each database to June 30, 2023 were included. The Joanna Briggs Institute Critical Appraisal Checklist for Qualitative Research was used to assess the quality of the included studies. The data were analyzed and integrated by thematic synthesis to formalize the identification and development of themes. RESULTS The nine selected studies altogether included 156 participants from seven hospice care facilities of different types and two oncology centers. Three key themes were identified: the experiences of palliative care patients in VR therapy, the perceived value palliative care patients gain from VR therapy, and the perspectives of palliative care patients toward using VR therapy. CONCLUSIONS Patients' feedback covered discomfort caused by VR devices, positive experiences, and situations that affected the interactive experience. Some patients were unable to tolerate VR therapy or reported new forms of discomfort. The findings indicate that VR therapy may be an effective approach to relieve patients' physical and psychological pain and help them gain self-awareness. Moreover, patients showed a preference for personalized VR therapy.
Affiliation(s)
- Yufei Huang
- College of Nursing, Guangzhou Medical University, Guangzhou, Guangdong, China
- Cunqing Deng
- College of Nursing, Guangzhou Medical University, Guangzhou, Guangdong, China
- Meifang Peng
- Department of Internal Medicine, Affiliated Cancer Hospital and Institute, Guangzhou Medical University, Guangzhou, Guangdong, China
- Yanping Hao
- College of Nursing, Guangzhou Medical University, Guangzhou, Guangdong, China.
7
Gudapati V, Chen A, Meyer S, Jay Kuo CC, Ding Y, Hsiai TK, Wang M. Development of a Machine Learning-Enabled Virtual Reality Tool for Preoperative Planning of Functional Endoscopic Sinus Surgery. J Neurol Surg Rep 2024;85:e118-e123. PMID: 39104747. PMCID: PMC11300101. DOI: 10.1055/a-2358-8928.
Abstract
Objectives Virtual reality (VR) is an increasingly valuable teaching tool, but current simulators are not typically clinically scalable due to their reliance on inefficient manual segmentation. The objective of this project was to leverage a high-throughput and accurate machine learning method to automate data preparation for a patient-specific VR simulator used to explore preoperative sinus anatomy. Methods An endoscopic VR simulator was designed in Unity to enable interactive exploration of sinus anatomy. The Saak transform, a data-efficient machine learning method, was adapted to accurately segment sinus computed tomography (CT) scans using minimal training data, and the resulting data were reconstructed into three-dimensional (3D) patient-specific models that could be explored in the simulator. Results Using minimal training data, the Saak transform-based machine learning method offers accurate soft-tissue segmentation. When explored with an endoscope in the VR simulator, the anatomical models generated by the algorithm accurately capture key sinus structures and showcase patient-specific variability in anatomy. Conclusion By offering an automatic means of preparing VR models from a patient's raw CT scans, this pipeline takes a key step toward clinical scalability. In addition to preoperative planning, the system also enables virtual endoscopy, a tool that is particularly useful in the COVID-19 era. As VR technology continues to develop, such a foundation will help ensure that future innovations remain clinically accessible.
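The pipeline the abstract describes (learned CT segmentation, then reconstruction into a patient-specific 3D model) can be sketched in miniature. This is not the authors' Saak-transform method: a plain intensity threshold stands in for the learned classifier, and a surface-voxel count stands in for the meshing step; all names and the Hounsfield threshold are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a CT -> binary mask -> surface pipeline. A real system would
# replace segment_ct with the trained model and mesh the mask (e.g. marching
# cubes) into triangles the VR engine can render.

def segment_ct(volume: np.ndarray, air_hu: float = -400.0) -> np.ndarray:
    """Label air-filled sinus voxels in a CT volume given in Hounsfield units.
    Stand-in for the learned segmentation; returns a boolean mask."""
    return volume < air_hu

def surface_voxels(mask: np.ndarray) -> int:
    """Count mask voxels with at least one non-mask 6-neighbour; these are the
    voxels a surface-meshing step would triangulate."""
    padded = np.pad(mask, 1, constant_values=False)
    core = padded[1:-1, 1:-1, 1:-1]
    all_six = (
        padded[:-2, 1:-1, 1:-1] & padded[2:, 1:-1, 1:-1]
        & padded[1:-1, :-2, 1:-1] & padded[1:-1, 2:, 1:-1]
        & padded[1:-1, 1:-1, :-2] & padded[1:-1, 1:-1, 2:]
    )
    return int(np.count_nonzero(core & ~all_six))

# Toy volume: soft tissue (~40 HU) containing an air cavity (~-1000 HU).
vol = np.full((20, 20, 20), 40.0)
vol[5:15, 5:15, 5:15] = -1000.0
mask = segment_ct(vol)
print(mask.sum(), surface_voxels(mask))  # 1000 voxels in the cavity, 488 on its surface
```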
Affiliation(s)
- Varun Gudapati
- David Geffen School of Medicine, UCLA, Los Angeles, California, United States
- Alexander Chen
- David Geffen School of Medicine, UCLA, Los Angeles, California, United States
- Scott Meyer
- David Geffen School of Medicine, UCLA, Los Angeles, California, United States
- Chung-Chieh Jay Kuo
- Ming-Hsieh Department of Electrical Engineering, USC, Los Angeles, California, United States
- Yichen Ding
- David Geffen School of Medicine, UCLA, Los Angeles, California, United States
- Tzung K. Hsiai
- David Geffen School of Medicine, UCLA, Los Angeles, California, United States
- Marilene Wang
- David Geffen School of Medicine, UCLA, Los Angeles, California, United States
8
Warchoł J, Tetych A, Tomaszewski R, Kowalczyk B, Olchowik G. Virtual Reality-Induced Modification of Vestibulo-Ocular Reflex Gain in Posturography Tests. J Clin Med 2024;13:2742. PMID: 38792284. PMCID: PMC11122614. DOI: 10.3390/jcm13102742.
Abstract
Background: The aim of this study was to demonstrate the influence of virtual reality (VR) exposure on postural stability and to determine the mechanism of this influence. Methods: Twenty-six male participants aged 21-23 years were included; each underwent postural stability assessment twice, before and after a single VR exposure lasting a few minutes. The VR projection was a computer-generated simulation of the surrounding scenery. Postural stability was assessed with the Sensory Organization Test (SOT) using Computerized Dynamic Posturography (CDP). Results: The findings indicated that VR exposure affects the visual and vestibular systems. Significant differences (p < 0.05) between results before and after VR exposure were observed in tests on an unstable surface. VR exposure was confirmed to have a positive influence on postural stability, attributed to an increase in the sensory weight of the vestibular system. Partial evidence suggested that a reduction in vestibulo-ocular reflex (VOR) gain may result in an adaptive shift to the optokinetic reflex (OKR). Conclusions: By modifying the process of environmental perception through artificial sensory simulation, the influence of VR on postural stability has been demonstrated. The validity of this type of research is determined by the effectiveness of VR techniques in the field of vestibular rehabilitation.
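The abstract does not give formulas, but CDP-based SOT protocols conventionally score each trial with an equilibrium score that compares the subject's anterior-posterior sway range to a theoretical stability limit of about 12.5°. A minimal sketch assuming that convention (the function name and limit are assumptions, not taken from this paper):

```python
# Hedged sketch of the conventional SOT equilibrium score: 100 means perfectly
# still, 0 means sway spanning the full theoretical limit of stability.

def equilibrium_score(theta_deg: list[float], sway_limit_deg: float = 12.5) -> float:
    """Score a trial from its sway-angle trace (degrees, anterior-posterior)."""
    sway_range = max(theta_deg) - min(theta_deg)
    return max(0.0, 100.0 * (1.0 - sway_range / sway_limit_deg))

# A trace swaying between -1.0 and +1.5 degrees (2.5 degree range) scores 80.
print(equilibrium_score([-1.0, 0.2, 1.5, -0.4]))
```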
Affiliation(s)
- Jan Warchoł
- Department of Biophysics, Medical University of Lublin, K. Jaczewskiego 4, 20-090 Lublin, Poland
- Anna Tetych
- Department of Biophysics, Medical University of Lublin, K. Jaczewskiego 4, 20-090 Lublin, Poland
- Robert Tomaszewski
- Department of Computer Science, University of Applied Sciences in Biala Podlaska, Sidorska 95/97, 21-500 Biala Podlaska, Poland
- Bartłomiej Kowalczyk
- Department of Biophysics, Medical University of Lublin, K. Jaczewskiego 4, 20-090 Lublin, Poland
- Grażyna Olchowik
- Department of Biophysics, Medical University of Lublin, K. Jaczewskiego 4, 20-090 Lublin, Poland
9
Ogata Y, Kolchiba M. Virtual reality images created on the back and front of a display. Opt Lett 2024;49:1632-1635. PMID: 38489469. DOI: 10.1364/ol.515883.
Abstract
To better investigate the biological mechanism of microorganisms, we developed a novel, to the best of our knowledge, virtual reality (VR) microscope that incorporates a head-mounted display (HMD) that creates VR images with a digital microscope. This type of VR microscope can be used with any type of optical microscope. The fabricated microscope is quite different from a common bifocal device because it can create VR images on the back and front of a display. If the VR images are displayed with object (OBJ) images, they are observable in [2 × 2] (back and front VR images and OBJ images; 2 × 2 = 4 images). This feature can provide important information on microscopic OBJs, which can be employed in 3D biological analysis. Furthermore, if a laser light source is added to this microscope, the images can be observed in [3 × 2] (back and front laser VR images, VR images, and OBJ images; 3 × 2 = 6 images). The lasers would also enable optical trapping and tracking, leading to improved biological analysis.
10
Javvaji CK, Reddy H, Vagha JD, Taksande A, Kommareddy A, Reddy NS. Immersive Innovations: Exploring the Diverse Applications of Virtual Reality (VR) in Healthcare. Cureus 2024;16:e56137. PMID: 38618363. PMCID: PMC11016331. DOI: 10.7759/cureus.56137.
Abstract
Virtual reality (VR) has evolved remarkably over recent decades, from its initial applications in specific military domains to a ubiquitous and easily accessible technology. This review delves into the domain of VR within healthcare, seeking to offer a comprehensive understanding of its historical evolution, theoretical foundations, and current adoption status. It explores the advantages of VR in enhancing the educational experience for medical students, with a particular focus on skill acquisition and retention. Within this exploration, the review dissects the applications of VR across diverse medical disciplines, highlighting its role in surgical training and anatomy/physiology education. While navigating the expansive landscape of VR, the review addresses challenges related to technology and pedagogy, providing insights into overcoming technical hurdles and seamlessly integrating VR into healthcare practices. Finally, it looks ahead to future directions and emerging trends, examining the potential impact of technological advancements and innovative applications. Overall, the review illuminates the transformative potential of VR as a tool poised to revolutionize healthcare practices.
Affiliation(s)
- Chaitanya Kumar Javvaji
- Pediatrics, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Harshitha Reddy
- Internal Medicine, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Jayant D Vagha
- Pediatrics, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Amar Taksande
- Pediatrics, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Anirudh Kommareddy
- Pediatrics, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
- Naramreddy Sudheesh Reddy
- Pediatrics, Jawaharlal Nehru Medical College, Datta Meghe Institute of Higher Education and Research, Wardha, IND
11
Priya S, La Russa D, Walling A, Goetz S, Hartig T, Khayat A, Gupta P, Nagpal P, Ashwath R. From Vision to Reality: Virtual Reality's Impact on Baffle Planning in Congenital Heart Disease. Pediatr Cardiol 2024;45:165-174. PMID: 37932525. DOI: 10.1007/s00246-023-03323-6.
Abstract
This study evaluates the feasibility and utility of virtual reality (VR) for baffle planning in congenital heart disease (CHD) by creating patient-specific 3D heart models and assessing a user-friendly VR interface. Patient-specific 3D heart models were created from high-resolution imaging data, and a VR interface was developed for baffle planning; both were assessed for feasibility, usability, and clinical relevance, and collaborative, interactive planning within the VR space was also explored. The models provided valuable insight into complex spatial relationships, and the interface allowed clinicians to interact with the models, simulate different baffle configurations, and assess their impact on blood flow. Collaborative and interactive planning in the VR space further enhanced the process. These findings demonstrate that patient-specific 3D heart models and a user-friendly VR interface can enhance surgical planning and, potentially, patient outcomes. Further research and development are warranted to harness the full benefits of VR technology in CHD surgical management.
Affiliation(s)
- Sarv Priya
- Department of Radiology, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA.
- Dan La Russa
- Realize Medical Inc., Ottawa, Canada
- Department of Radiology, Radiation Oncology and Medical Physics, University of Ottawa, Ottawa, Canada
- Abigail Walling
- Department of Radiology, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA
- Sawyer Goetz
- Department of Radiology, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA
- Tyler Hartig
- Department of Radiology, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA
- Pankaj Gupta
- Division of Pediatric Cardiology, The Royal Hospital for Children, Glasgow, UK
- Prashant Nagpal
- Department of Radiology, University of Wisconsin School of Medicine and Public Health, Madison, USA
- Ravi Ashwath
- Division of Pediatric Cardiology, Department of Pediatrics, University of Iowa Hospitals and Clinics, Iowa City, IA, USA
12
Roudot P, Legant WR, Zou Q, Dean KM, Isogai T, Welf ES, David AF, Gerlich DW, Fiolka R, Betzig E, Danuser G. u-track3D: Measuring, navigating, and validating dense particle trajectories in three dimensions. Cell Rep Methods 2023;3:100655. PMID: 38042149. PMCID: PMC10783629. DOI: 10.1016/j.crmeth.2023.100655.
Abstract
We describe u-track3D, a software package that extends the versatile u-track framework established in 2D to address the specific challenges of 3D particle tracking. First, we present the performance of the new package in quantifying a variety of intracellular dynamics imaged by multiple 3D microscopy platforms and on the standard 3D test dataset of the particle tracking challenge. These analyses indicate that u-track3D presents a tracking solution that is competitive with both conventional and deep-learning-based approaches. We then present the concept of the dynamic region of interest (dynROI), which allows an experimenter to interact with dynamic 3D processes in 2D views amenable to visual inspection. Third, we present an estimator of trackability that automatically defines a score for every trajectory, thereby overcoming the challenges of trajectory validation by visual inspection. With these combined strategies, u-track3D provides a complete framework for unbiased studies of molecular processes in complex volumetric sequences.
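The core task u-track3D addresses, building trajectories from per-frame 3D detections, can be illustrated with a toy frame-to-frame linker. This is a hypothetical minimal stand-in, not the package's algorithm or API: u-track solves linking as a linear assignment problem, whereas the sketch below matches greedily by distance with a gating radius.

```python
import numpy as np

# Hedged sketch: greedy, distance-gated nearest-neighbour linking of 3D
# detections between two consecutive frames. Real trackers (u-track3D included)
# solve an assignment problem and handle gaps, merges, and splits.

def link_frames(prev: np.ndarray, curr: np.ndarray, max_dist: float) -> list[tuple[int, int]]:
    """Match each detection in `prev` (N x 3) to its nearest unclaimed
    detection in `curr` (M x 3), skipping pairs farther than `max_dist`."""
    dists = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    links: list[tuple[int, int]] = []
    taken: set[int] = set()
    # Consider candidate pairs from closest to farthest.
    for i, j in zip(*np.unravel_index(np.argsort(dists, axis=None), dists.shape)):
        if dists[i, j] > max_dist:
            break  # all remaining pairs are even farther apart
        if i not in {a for a, _ in links} and j not in taken:
            links.append((int(i), int(j)))
            taken.add(int(j))
    return links

# Two particles swap list order between frames; linking recovers the identities.
frame0 = np.array([[0.0, 0.0, 0.0], [5.0, 5.0, 5.0]])
frame1 = np.array([[5.2, 5.0, 5.1], [0.1, 0.0, 0.0]])
print(link_frames(frame0, frame1, max_dist=1.0))  # [(0, 1), (1, 0)]
```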
Affiliation(s)
- Philippe Roudot
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA; Aix Marseille University, CNRS, Centrale Marseille, I2M, Turing Centre for Living Systems, Marseille, France.
| | - Wesley R Legant
- Joint Department of Biomedical Engineering, University of North Carolina at Chapel Hill, North Carolina State University, Chapel Hill, NC, USA; Department of Pharmacology, University of North Carolina, Chapel Hill, NC, USA
| | - Qiongjing Zou
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
| | - Kevin M Dean
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
| | - Tadamoto Isogai
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
| | - Erik S Welf
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
| | - Ana F David
- Institute of Molecular Biotechnology of the Austrian Academy of Sciences, Vienna BioCenter, Vienna, Austria
| | - Daniel W Gerlich
- Institute of Molecular Biotechnology of the Austrian Academy of Sciences, Vienna BioCenter, Vienna, Austria
| | - Reto Fiolka
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
| | - Eric Betzig
- Department of Molecular & Cell Biology, University of California, Berkeley, Berkeley, CA, USA
| | - Gaudenz Danuser
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA.
| |
Collapse
13
Im JE, Gu JY, Bae JH, Lee JG. Comparative study of 360° virtual reality and traditional two-dimensional video in nonface-to-face dental radiology classes: focusing on learning satisfaction and self-efficacy. BMC MEDICAL EDUCATION 2023; 23:855. [PMID: 37953275 PMCID: PMC10642063 DOI: 10.1186/s12909-023-04851-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/26/2023] [Accepted: 11/07/2023] [Indexed: 11/14/2023]
Abstract
BACKGROUND Acquiring adequate theoretical knowledge in the field of dental radiography (DR) is essential for establishing a good foundation at the prepractical stage. Currently, nonface-to-face DR education predominantly relies on two-dimensional (2D) videos, highlighting the need for educational resources that address the inherent limitations of this method. We developed a virtual reality (VR) learning medium using 360° video with a prefabricated head-mounted display (pHMD) for nonface-to-face DR learning and compared it with a 2D video medium. METHODS Forty-four participants were randomly assigned to a control group (n = 23; 2D video) and an experimental group (n = 21; 360° VR). DR was re-enacted by the operator and recorded using 360° video. A survey was performed to assess learning satisfaction and self-efficacy. Nonparametric statistical tests comparing the groups were conducted using SPSS statistical analysis software. RESULTS Learners in the experimental group could experience VR for DR by attaching their smartphones to the pHMD. The 360° VR video with the pHMD provided a step-by-step guide for DR learning from the point of view of an operator. Learning satisfaction and self-efficacy were statistically significantly higher in the experimental group than in the control group (p < 0.001). CONCLUSIONS The 360° VR videos were associated with greater learning satisfaction and self-efficacy than conventional 2D videos. These findings do not necessarily substantiate the educational effects of this medium, but they suggest that it may be a suitable alternative for DR education in a nonface-to-face environment; further examination of the extent of DR knowledge gained in such a setting is nevertheless warranted. Future research should aim to develop simulation tools based on 3D objects and explore additional uses of 360° VR videos as prepractical learning mediums.
Affiliation(s)
- Ji-Eun Im: Department of Dental Hygiene, Graduate School of Namseoul University, Cheonan, Republic of Korea
- Ja-Young Gu: Department of Dental Hygiene, Sahmyook Health University, Seoul, Republic of Korea
- Jung-Hee Bae: Department of Dental Hygiene, College of Health and Health Care, Namseoul University, Cheonan, Republic of Korea
- Jae-Gi Lee: Department of Dental Hygiene, College of Health and Health Care, Namseoul University, Cheonan, Republic of Korea
14
Sun L, Liu D, Lian J, Yang M. Application of flipped classroom combined with virtual simulation platform in clinical biochemistry practical course. BMC MEDICAL EDUCATION 2023; 23:771. [PMID: 37845661 PMCID: PMC10577961 DOI: 10.1186/s12909-023-04735-x] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/15/2023] [Accepted: 09/28/2023] [Indexed: 10/18/2023]
Abstract
BACKGROUND The study explores an innovative teaching mode that integrates Icourse, DingTalk, and online experimental simulation platforms to provide online theoretical and experimental resources for clinical biochemistry practical courses. These platforms, combined with flipped classroom teaching, aim to increase student engagement and benefit in practical courses, ultimately improving the effectiveness of clinical biochemistry practical teaching. METHODS In a prospective cohort study, we examined the impact of integrating the Icourse and DingTalk platforms to provide theoretical knowledge resources and clinical cases to 48 medical laboratory science students from the classes of 2019 and 2020. Students were assigned to the experimental group using an overall sampling method and had access to relevant videos through Icourse before and during class. Using a flipped classroom approach, students actively participated in the design, analysis, and discussion of the experimental technique. For the experimental operation part, students completed both virtual simulation experiments and actual experiments. The study evaluated students' theoretical and operational performance after completion of the practical course. To collect feedback, we distributed a questionnaire to students in the experimental group. For comparison, we included as the control group 42 students from the classes of 2017 and 2018 who received traditional instruction and were evaluated using standard textbooks. RESULTS The experimental group scored significantly higher than the control group on both the theoretical and experimental operational tests (82.45 ± 3.76 vs. 76.36 ± 3.96, P = 0.0126; 92.03 ± 1.62 vs. 81.67 ± 4.19, P < 0.001). The survey revealed that the experimental group preferred the teaching mode combining the flipped classroom with the virtual simulation platform. This mixed method effectively promoted understanding of basic knowledge (93.8%, 45/48), operative skills (89.6%, 43/48), learning interest (87.5%, 42/48), clinical thinking (85.4%, 41/48), self-learning ability (91.7%, 44/48), and overall satisfaction compared with traditional methods (P < 0.05). The study demonstrates that this innovative teaching approach significantly improves the quality of clinical biochemistry practical courses and promotes students' professional development and self-directed learning habits. CONCLUSION Incorporating virtual simulation with flipped classrooms into clinical biochemistry practical teaching is an efficient and well-received alternative to traditional methods.
Affiliation(s)
- Liangbo Sun: Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
- Dong Liu: Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
- Jiqin Lian: Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
- Mingzhen Yang: Department of Clinical Biochemistry, Army Medical University, No. 30, Gaotanyan Street, Shapingba District, Chongqing 400038, China
15
Wu Y, Yi A, Ma C, Chen L. Artificial intelligence for video game visualization, advancements, benefits and challenges. MATHEMATICAL BIOSCIENCES AND ENGINEERING : MBE 2023; 20:15345-15373. [PMID: 37679183 DOI: 10.3934/mbe.2023686] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 09/09/2023]
Abstract
In recent years, the field of artificial intelligence (AI) has witnessed remarkable progress, and its applications have extended to the realm of video games. The incorporation of AI in video games enhances visual experiences, optimizes gameplay, and fosters more realistic and immersive environments. In this review paper, following the PRISMA guidelines as our review methodology, we systematically explore the diverse applications of AI in video game visualization, encompassing machine learning algorithms for character animation, terrain generation, and lighting effects. Furthermore, we discuss the benefits, challenges, and ethical implications associated with AI in video game visualization, as well as potential future trends. We anticipate that the future of AI in video gaming will feature increasingly sophisticated and realistic AI models, heightened utilization of machine learning, and greater integration with other emerging technologies, leading to more engaging and personalized gaming experiences.
Affiliation(s)
- Yueliang Wu: School of Architecture and Art Design, Hunan University of Science and Technology, Xiangtan 411100, China
- Aolong Yi: School of Architecture and Art Design, Hunan University of Science and Technology, Xiangtan 411100, China
- Chengcheng Ma: School of Architecture and Art Design, Hunan University of Science and Technology, Xiangtan 411100, China
- Ling Chen: College of Engineering and Design, Hunan Normal University, Changsha 410081, China
16
Wang H, Li D, Gu C, Wei W, Chen J. Research on high school students' behavior in art course within a virtual learning environment based on SVVR. Front Psychol 2023; 14:1218959. [PMID: 37519396 PMCID: PMC10379639 DOI: 10.3389/fpsyg.2023.1218959] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/08/2023] [Accepted: 06/28/2023] [Indexed: 08/01/2023] Open
Abstract
Introduction: Students who use spherical video-based virtual reality (SVVR) teaching materials are able to gain more self-regulated, explorative, and immersive experiences in a virtual environment. Using SVVR teaching materials in art courses can produce diverse and distinctive teaching effects while also fostering the emergence of students' flow states. Methods: Through an art course teaching experiment, this study surveyed 380 high school students and used structural equation modeling to analyze the antecedents and outcomes of students' flow state when using SVVR teaching materials. Results: The results show that when using SVVR teaching materials in art courses, more attention should be paid to control and telepresence among the antecedents of students' flow state. Discussion: Only when students obtain better flow experiences do they report higher perceived usefulness of and satisfaction with the art course content, as well as stronger intentions to continue using the materials. These results can serve as a reference for the development and use of SVVR teaching materials in high school art courses.
Affiliation(s)
- Hongya Wang: School of Design, Jiangnan University, Wuxi, China
- Dongning Li: School of Design, Jiangnan University, Wuxi, China
- Chao Gu: Department of Culture and Arts Management, Honam University, Gwangju, Republic of Korea
- Wei Wei: School of Textile Garment and Design, Changshu Institute of Technology, Changshu, China
17
Cho KH, Park JB, Kang A. Metaverse for Exercise Rehabilitation: Possibilities and Limitations. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2023; 20:ijerph20085483. [PMID: 37107765 PMCID: PMC10138806 DOI: 10.3390/ijerph20085483] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 03/01/2023] [Revised: 04/05/2023] [Accepted: 04/10/2023] [Indexed: 05/11/2023]
Abstract
OBJECTIVES This study aimed to obtain a consensus agreement from an expert panel on the metaverse for exercise rehabilitation in stroke patients using the Delphi technique. METHODS This study recruited twenty-two experts and conducted three rounds of online surveys between January and February 2023. The Delphi consensus technique was performed online to review and evaluate the framework module. A panel of experts, including scholars, physicians, physical therapists, and physical education specialists in the Republic of Korea, was invited to participate in this study. For each round, expert consensus was defined as more than 90% of the expert panel agreeing or strongly agreeing with the proposed items. RESULTS A total of twenty experts completed the three Delphi rounds. First, virtual reality (VR)-assisted treadmill walking could improve cognitive function, concentration, muscular endurance, stroke prevention, proper weight maintenance, and cardiorespiratory function. Second, the related technology, safety, price, place, and the securing of experts would be obstacles or challenges in VR-assisted treadmill walking for stroke patients. Third, the role of exercise instructors in exercise planning, performance, and assessment for VR-assisted treadmill walking is equally important, and reeducation for them is required. Fourth, VR-assisted treadmill walking for stroke patients should be performed at least five times a week, for about one hour each session. CONCLUSIONS This study showed that the metaverse for exercise rehabilitation for stroke patients could be successfully developed and would be feasible to implement in the future. However, limitations related to technology, safety, price, place, and expert factors would have to be overcome.
Collapse
Affiliation(s)
- Kyoung-Hwan Cho
- Department of Special Physical Education, Daelim University College, Anyang 13916, Republic of Korea
| | - Jeong-Beom Park
- Department of Special Physical Education, Daelim University College, Anyang 13916, Republic of Korea
| | - Austin Kang
- Department of Medicine, Seoul National University, Seoul 08826, Republic of Korea
- Correspondence: ; Tel.: +82-1027230519
| |
Collapse
18
Klowait N. On the Multimodal Resolution of a Search Sequence in Virtual Reality. HUMAN BEHAVIOR AND EMERGING TECHNOLOGIES 2023. [DOI: 10.1155/2023/8417012] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/30/2023]
Abstract
In virtual reality (VR), participants may not always have hands, bodies, eyes, or even voices—using VR helmets and two controllers, participants control an avatar through virtual worlds that do not necessarily obey familiar laws of physics; moreover, the avatar’s bodily characteristics may not neatly match our bodies in the physical world. Despite these limitations and specificities, humans get things done through collaboration and the creative use of the environment. While multiuser interactive VR is attracting greater numbers of participants, there are currently few attempts to analyze the in situ interaction systematically. This paper proposes a video-analytic detail-oriented methodological framework for studying virtual reality interaction. Using multimodal conversation analysis, the paper investigates a nonverbal, embodied, two-person interaction: two players in a survival game strive to gesturally resolve a misunderstanding regarding an in-game mechanic—however, both of their microphones are turned off for the duration of play. The players’ inability to resort to complex language to resolve this issue results in a dense sequence of back-and-forth activity involving gestures, object manipulation, gaze, and body work. Most crucially, timing and modified repetitions of previously produced actions turn out to be the key to overcome both technical and communicative challenges. The paper analyzes these action sequences, demonstrates how they generate intended outcomes, and proposes a vocabulary to speak about these types of interaction more generally. The findings demonstrate the viability of multimodal analysis of VR interaction, shed light on unique challenges of analyzing interaction in virtual reality, and generate broader methodological insights about the study of nonverbal action.
19
Hill JE, Twamley J, Breed H, Kenyon R, Casey R, Zhang J, Clegg A. Scoping review of the use of virtual reality in intensive care units. Nurs Crit Care 2022; 27:756-771. [PMID: 34783134 DOI: 10.1111/nicc.12732] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2021] [Revised: 10/28/2021] [Accepted: 10/29/2021] [Indexed: 10/19/2022]
Abstract
BACKGROUND A wide range of reviews have demonstrated the effectiveness and tolerability of virtual reality (VR) in a range of clinical areas and subpopulations. However, no previous review has explored the current maturity, acceptability, tolerability, and effectiveness of VR with intensive care patients. AIMS To identify the range of uses of VR for intensive care patients and to classify their current phase of development, effectiveness, acceptability, and tolerability. METHODS A scoping review was conducted. A multi-database search was undertaken (inception to January 2021). Any type of study that examined the use of VR with intensive care patients was included. Screening, data extraction, and assessment of quality were undertaken by a single reviewer. A meta-analysis and a descriptive synthesis were undertaken. RESULTS Six hundred and forty-seven records were identified; after duplicate removal and screening, 21 studies were included (of weak quality). The majority of studies, covering relaxation, delirium, and post-traumatic stress disorder (PTSD), were at the early stages of assessing acceptability, tolerability, and initial clinical efficacy. Virtual reality for relaxation and for delirium was well tolerated, with completion rates of the target treatment of 73.6% (95% CI: 51.1%-96%, I2 = 98.52%) and 52.7% (95% CI: 52.7%-100%, I2 = 96.8%), respectively. The majority of reasons for non-completion were external clinical factors. There were some potential benefits demonstrated for the use of VR for relaxation, delirium, and sleep. CONCLUSION Virtual reality for intensive care is a new domain of research, with the majority of areas of application being in the early stages of development. There is great potential for the use of VR in this clinical environment, but further robust assessment of effectiveness is required before any clinical recommendations can be made.
RELEVANCE TO CLINICAL PRACTICE Virtual reality for ICU patients is in its infancy and is not at a stage where it should be used as routine practice. However, there is early evidence to suggest that virtual reality interventions have good acceptability and tolerability in intensive care patients for relaxation, delirium, and improving sleep.
Collapse
Affiliation(s)
- James Edward Hill
- Synthesis, Economic Evaluation and Decision Science (SEEDS) Group, University of Central Lancashire, Preston, UK
| | - Jacqueline Twamley
- Intensive Care Nurse/Academic Research and Innovation Manager, Centre for Health Research and Innovation, NIHR Lancashire Clinical Research Facility, UK
| | - Hetty Breed
- Faculty of Biology, Medicine and Health, Manchester University, Manchester, UK
| | - Roger Kenyon
- Community Engagement & Service User Support, University of Central Lancashire, Preston, UK
| | - Rob Casey
- Digital Therapy Solutions to Empower Stroke, Dementia, Parkinson's Rehabilitation, DancingMind Pte Ltd, Singapore
| | | | - Andrew Clegg
- Synthesis, Economic Evaluation and Decision Science (SEEDS) Group, University of Central Lancashire, Preston, UK
| |
Collapse
20
Staubli SM, Maloca P, Kuemmerli C, Kunz J, Dirnberger AS, Allemann A, Gehweiler J, Soysal S, Droeser R, Däster S, Hess G, Raptis D, Kollmar O, von Flüe M, Bolli M, Cattin P. Magnetic resonance cholangiopancreatography enhanced by virtual reality as a novel tool to improve the understanding of biliary anatomy and the teaching of surgical trainees. Front Surg 2022; 9:916443. [PMID: 36034383 PMCID: PMC9411984 DOI: 10.3389/fsurg.2022.916443] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2022] [Accepted: 07/19/2022] [Indexed: 11/13/2022] Open
Abstract
OBJECTIVE The novel picture archiving and communication system (PACS), compatible with virtual reality (VR) software, displays cross-sectional images in VR. VR magnetic resonance cholangiopancreatography (MRCP) was tested to improve the anatomical understanding and intraoperative performance of minimally invasive cholecystectomy (CHE) in surgical trainees. DESIGN We used an immersive VR environment to display volumetric MRCP data (Specto VR™). First, we evaluated the tolerability and comprehensibility of the anatomy with a validated simulator sickness questionnaire (SSQ) and examined anatomical landmarks. Second, we compared conventional MRCP and VR MRCP by matching three-dimensional (3D) printed models and by identifying and measuring common bile duct stones (CBDS) using VR MRCP. Third, surgical trainees prepared for CHE with either conventional MRCP or VR MRCP, and we measured perioperative parameters and surgical performance (validated GOALS score). SETTING The study was conducted at Clarunis, University Center for Gastrointestinal and Liver Disease, Basel, Switzerland. PARTICIPANTS For the first and second study steps, doctors from all specialties and years of experience could participate; the third step included surgical trainees exclusively. Of 74 participating clinicians, 34, 27, and 13 contributed data to the first, second, and third study phases, respectively. RESULTS All participants identified the relevant biliary structures with VR MRCP. The median SSQ score was 0.75 (IQR: 0, 3.5), indicating good tolerability. Participants selected the corresponding 3D printed model faster and more reliably after studying VR MRCP than after conventional MRCP: a median of 90 s (IQR: 55, 150) and 72.7% correct answers with VR MRCP versus 150 s (IQR: 100, 208) and 49.6% correct answers with conventional MRCP (p < 0.001). CBDS were correctly identified in 90.5% of VR MRCP cases. The median GOALS score was higher after preparation with VR MRCP than with conventional MRCP: 16 (IQR: 13, 22) versus 11 (IQR: 11, 18), although this difference was not statistically significant (p = 0.27). CONCLUSIONS VR MRCP allows for a faster, more accurate understanding of the displayed anatomy than conventional MRCP and potentially leads to improved surgical performance in CHE among surgical trainees.
Affiliation(s)
- Sebastian M Staubli: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland; Clinical Service of HPB Surgery and Liver Transplantation, Royal Free London Hospital, NHS Foundation Trust, London, United Kingdom
- Peter Maloca: Department of Ophthalmology, University of Basel, Basel, Switzerland; Institute of Molecular and Clinical Ophthalmology Basel (IOB), Basel, Switzerland; Moorfields Eye Hospital NHS Foundation Trust, London, United Kingdom
- Christoph Kuemmerli: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Julia Kunz: Faculty of Medicine, University of Basel, Basel, Switzerland
- Amanda S Dirnberger: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Andreas Allemann: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Julian Gehweiler: Department of Radiology, University Hospital Basel, Basel, Switzerland
- Savas Soysal: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Raoul Droeser: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Silvio Däster: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Gabriel Hess: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Dimitri Raptis: Clinical Service of HPB Surgery and Liver Transplantation, Royal Free London Hospital, NHS Foundation Trust, London, United Kingdom
- Otto Kollmar: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Markus von Flüe: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Martin Bolli: Clarunis, University Center for Gastrointestinal and Liver Diseases, St. Clara Hospital and University Hospital Basel, Basel, Switzerland
- Philippe Cattin: Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
21
Yang X, Fan Y, Chu H, Yan L, Wiederhold BK, Wiederhold M, Liao Y. Preliminary Study of Short-Term Visual Perceptual Training Based on Virtual Reality and Augmented Reality in Postoperative Strabismic Patients. CYBERPSYCHOLOGY, BEHAVIOR AND SOCIAL NETWORKING 2022; 25:465-470. [PMID: 35647873 DOI: 10.1089/cyber.2022.0113] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
The study aimed to explore the potential effect of short-term visual perceptual training based on virtual reality (VR) and augmented reality (AR) platforms in postoperative strabismic patients. We enrolled 236 postoperative strabismic patients: 111 received VR-based training and 125 received AR-based training. VR training improved stereoacuity at 1.5 m and dynamic stereopsis, while AR training produced greater improvement in stereoacuity at 0.8 m and 1.5 m and in dynamic and coarse stereopsis. These results suggest that visual perceptual training based on VR and AR technology can potentially be applied in postoperative strabismus treatment to promote the recovery of binocular vision.
Affiliation(s)
- Xubo Yang: Department of Ophthalmology, West China Hospital, Sichuan University, Chengdu, China
- Yuchen Fan: Department of Ophthalmology, West China Hospital, Sichuan University, Chengdu, China
- Hang Chu: National Engineering Research Center for Healthcare Devices, Guangzhou, China
- Li Yan: National Engineering Research Center for Healthcare Devices, Guangzhou, China
- Brenda K Wiederhold: Virtual Reality Medical Center, Scripps Memorial Hospital, La Jolla, California
- Mark Wiederhold: Virtual Reality Medical Center, Scripps Memorial Hospital, La Jolla, California
- Yongchuan Liao: Department of Ophthalmology, West China Hospital, Sichuan University, Chengdu, China
22
Taylor S, Soneji S. Bioinformatics and the Metaverse: Are We Ready? FRONTIERS IN BIOINFORMATICS 2022; 2:863676. [PMID: 36304263 PMCID: PMC9580841 DOI: 10.3389/fbinf.2022.863676] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2022] [Accepted: 04/20/2022] [Indexed: 02/01/2023] Open
Abstract
COVID-19 forced humanity to think about new ways of working globally without being physically present with other people, and eXtended Reality (XR) systems (defined as Virtual Reality, Augmented Reality, and Mixed Reality) offer a potentially elegant solution. Although previously seen as mainly for gaming, XR is now being investigated by commercial and research institutions to solve real-world problems in training, simulation, mental health, data analysis, and the study of disease progression. More recently, large corporations such as Microsoft and Meta have announced that they are developing the Metaverse as a new paradigm for interacting with the digital world. This article looks at how visualization can leverage the Metaverse in bioinformatics research, the pros and cons of this technology, and what the future may hold.
Affiliation(s)
- Stephen Taylor: Analysis, Visualization and Informatics Group, MRC Weatherall Institute of Computational Biology, MRC Weatherall Institute of Molecular Medicine, Oxford, United Kingdom (Correspondence)
- Shamit Soneji: Division of Molecular Hematology, Department of Laboratory Medicine, Faculty of Medicine, BMC, Lund University, Lund, Sweden; Lund Stem Cell Center, Faculty of Medicine, BMC, Lund University, Lund, Sweden
23
Comparison of Standard Training to Virtual Reality Training in Nuclear Radiation Emergency Medical Rescue Education. Disaster Med Public Health Prep 2022; 17:e197. [PMID: 35509180 DOI: 10.1017/dmp.2022.65] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/08/2023]
Abstract
OBJECTIVE Because of the particular nature of nuclear radiation emergencies, professional technical training is necessary; however, nuclear radiation emergency medical rescue nurses are not well prepared to respond in time. This study aims to explore the effect of virtual reality (VR) in training nurses for nuclear radiation emergency medical rescue. METHODS Thirty nurses who received traditional nuclear radiation rescue training from May 2020 to October 2020 were selected as the control group, and another 30 nurses who received VR-based nuclear radiation emergency medical rescue training from November 2020 to April 2021 were selected as the experimental group. Examination results, learning enthusiasm, training effect evaluation, and training satisfaction were compared between the two groups. RESULTS The experimental group had significantly higher examination scores, learning enthusiasm, training effect evaluation, and training satisfaction than the control group (P < 0.05). CONCLUSIONS The application of VR in training for nuclear radiation emergency medical rescue can improve trainees' performance, learning enthusiasm, training effect, and satisfaction. Given these advantages, VR could be widely used in nuclear radiation emergency medical rescue training in the future.
24
An impact of three dimensional techniques in virtual reality. Int J Health Sci (Qassim) 2022. [DOI: 10.53730/ijhs.v6ns4.6481] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Three-dimensional (3D) imaging plays a prominent role in the diagnosis, treatment planning, and post-therapeutic monitoring of patients with rheumatic heart disease (RHD) or mitral valve disease. More interactive and realistic medical experiences take advantage of advanced visualization techniques such as augmented, mixed, and virtual reality to analyze 3D models. Further, 3D-printed mitral valve models are being used in the medical field. All of these technologies improve the understanding of the complex morphologies of mitral valve disease. Real-time 3D echocardiography has attracted considerable attention in medical research because it provides interactive feedback for acquiring high-quality images as well as timely spatial information about the scanned area, and it is therefore necessary for intraoperative ultrasound examinations. In this article, three-dimensional techniques and their impact on mitral valve disease are reviewed. Specifically, data acquisition techniques and reconstruction algorithms, together with their clinical applications, are presented. Moreover, the advantages and disadvantages of state-of-the-art approaches are discussed in detail.
Collapse
|
25
|
Exploration and Assessment of Interaction in an Immersive Analytics Module: A Software-Based Comparison. APPLIED SCIENCES-BASEL 2022. [DOI: 10.3390/app12083817] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The focus of computer systems in the field of visual analytics is to make results clear and understandable. However, enhancing human-computer interaction (HCI) in the field is less investigated. Data visualization and visual analytics (VA) are usually performed in traditional desktop settings with mouse interaction. These methods are based on the window, icon, menu, and pointer (WIMP) interface, which often results in information clutter that is difficult to analyze and understand, especially for novice users. Researchers believe that introducing adequate, natural interaction techniques to the field is necessary for building effective and enjoyable visual analytics systems. This work introduces a novel virtual reality (VR) module for performing basic visual analytics tasks and aims to explore new interaction techniques in the field. A pilot study was conducted to measure the time it takes students to perform basic analytics tasks using the developed VR module, compared with the time it takes them to perform the same tasks in a traditional desktop environment, to assess the effectiveness of the VR module in enhancing students' performance. The results show that novice users (participants with less programming experience) took about 50% less time to complete tasks using the developed VR module as compared to a programming language, namely R. Experts (participants with advanced programming experience) took about the same time to complete tasks under both conditions (R and VR).
Collapse
|
26
|
Seiler A, Schettle M, Amann M, Gaertner S, Wicki S, Christ SM, Theile G, Feuz M, Hertler C, Blum D. Virtual Reality Therapy in Palliative Care: A Case Series. J Palliat Care 2022:8258597221086767. [PMID: 35293818 DOI: 10.1177/08258597221086767] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
OBJECTIVES Virtual reality (VR) opens a variety of therapeutic options to improve symptom burden in patients with advanced disease. To date, only a few studies have evaluated the use of VR therapy in the context of palliative care. This case series aims to evaluate the feasibility and acceptability of VR therapy in a population of palliative care patients. METHODS In this single-site case series, we report on six palliative care patients undergoing VR therapy. The VR therapy consisted of a one-time session lasting between 20 and 60 minutes, depending on the patient's needs and the content chosen for the VR session. A semi-structured survey was conducted, and the Edmonton Symptom Assessment System (ESAS) and the Distress Thermometer were administered pre- and post-intervention. RESULTS Overall, VR therapy was well accepted by all patients. Five out of six patients reported having appreciated VR therapy. There were individual differences in the perceived effects of VR therapy. The semi-structured survey revealed that some patients felt a temporary detachment from their body and that patients were able to experience the VR session as a break from omnipresent worries and the hospital environment ("I completely forgot where I am"). There was a considerable reduction in the total ESAS score post-treatment (T0 ESASTot = 27.2; T1 ESASTot = 18.8) and a slight reduction in distress (T0 DTTot = 4.4; T1 DTTot = 3.8). However, two patients were more tired after the intervention. SIGNIFICANCE OF RESULTS Our preliminary results demonstrate that VR therapy is acceptable, feasible, and safe for use within a palliative care population and appears to be a viable treatment option. Clinical trials are warranted and necessary to confirm any therapeutic effects of VR therapy, as is the need to better tailor VR systems for use in palliative care settings.
Collapse
Affiliation(s)
- A Seiler
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
- Department of Consultation-Liaison Psychiatry and Psychosomatic Medicine, University Hospital Zurich and University of Zurich, Zurich, Switzerland
| | - M Schettle
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| | - M Amann
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| | - Sophie Gaertner
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| | - Stefan Wicki
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
- Internal Medicine Centre, Hirslanden Klinik Aarau, Switzerland
| | - S M Christ
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| | - G Theile
- Clinic Susenberg, Zurich, Switzerland
| | - M Feuz
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| | - C Hertler
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| | - D Blum
- Department of Radiation Oncology, Competence Center Palliative Care, University Hospital Zurich, Zurich, Switzerland
| |
Collapse
|
27
|
Guérinot C, Marcon V, Godard C, Blanc T, Verdier H, Planchon G, Raimondi F, Boddaert N, Alonso M, Sailor K, Lledo PM, Hajj B, El Beheiry M, Masson JB. New Approach to Accelerated Image Annotation by Leveraging Virtual Reality and Cloud Computing. FRONTIERS IN BIOINFORMATICS 2022; 1:777101. [PMID: 36303792 PMCID: PMC9580868 DOI: 10.3389/fbinf.2021.777101] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2021] [Accepted: 12/15/2021] [Indexed: 01/02/2023] Open
Abstract
Three-dimensional imaging is at the core of medical imaging and is becoming a standard in biological research. As a result, there is an increasing need to visualize, analyze and interact with data in a natural three-dimensional context. By combining stereoscopy and motion tracking, commercial virtual reality (VR) headsets provide a solution to this critical visualization challenge by allowing users to view volumetric image stacks in a highly intuitive fashion. While optimizing the visualization and interaction process in VR remains an active topic, one of the most pressing issues is how to utilize VR for the annotation and analysis of data. Annotating data is often a required step for training machine learning algorithms; in biological research, for example, the ability to annotate complex three-dimensional data is especially valuable because newly acquired data may come in limited quantities. Similarly, medical data annotation is often time-consuming and requires expert knowledge to correctly identify structures of interest. Moreover, simultaneous data analysis and visualization in VR is computationally demanding. Here, we introduce a new procedure to visualize, interact with, annotate and analyze data by combining VR with cloud computing. VR is leveraged to provide natural interactions with volumetric representations of experimental imaging data. In parallel, cloud computing performs costly computations to accelerate data annotation with minimal input required from the user. We demonstrate multiple proof-of-concept applications of our approach on volumetric fluorescence microscopy images of mouse neurons and on tumor and organ annotations in medical images.
Collapse
Affiliation(s)
- Corentin Guérinot
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Sorbonne Université, Collège Doctoral, Paris, France
| | - Valentin Marcon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| | - Charlotte Godard
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
| | - Thomas Blanc
- Sorbonne Université, Collège Doctoral, Paris, France
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
| | - Hippolyte Verdier
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Histopathology and Bio-Imaging Group, Sanofi R&D, Vitry-Sur-Seine, France
- Université de Paris, UFR de Physique, Paris, France
| | - Guillaume Planchon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| | - Francesca Raimondi
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Unité Médicochirurgicale de Cardiologie Congénitale et Pédiatrique, Centre de Référence des Malformations Cardiaques Congénitales Complexes M3C, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
| | - Nathalie Boddaert
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
| | - Mariana Alonso
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
| | - Kurt Sailor
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
| | - Pierre-Marie Lledo
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
| | - Bassam Hajj
- Sorbonne Université, Collège Doctoral, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
| | - Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| | - Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| |
Collapse
|
28
|
Use of Different Digitization Methods for the Analysis of Cut Marks on the Oldest Bone Found in Brittany (France). APPLIED SCIENCES-BASEL 2022. [DOI: 10.3390/app12031381] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/07/2022]
Abstract
Archaeological 3D digitization of skeletal elements is an essential aspect of the discipline. Its objectives are various: archiving of data (especially before destructive sampling, for example for biomolecular studies), study, and pedagogical purposes that require manipulating the specimens. As techniques are rapidly evolving, the question that arises is which methods are appropriate to answer the different questions while guaranteeing sufficient quality of information. The combined use of different 3D technologies for the study of a single Mesolithic bone fragment from Brittany (France) is here an opportunity to compare different 3D digitization methods. This clavicle, the oldest human bone found in Brittany, consists of two pieces and was dug up from the Mesolithic shell midden of Beg-er-Vil in Quiberon; it is dated from ca. 8200 to 8000 years BP. The pieces bear traces of post-mortem processing, carried out on fresh bone in order to remove the integuments, which needed to be better characterized. The clavicle was studied through a process that combines advanced 3D image acquisition, 3D processing, and 3D printing, with the goal of providing relevant support for the experts involved in the work. The bones were first examined with metallographic microscopy, scanned with a CT scanner, and digitized with photogrammetry in order to obtain a high-quality textured model. The CT scan proved insufficient for a detailed analysis; the study was thus completed with a µ-CT, providing a very accurate 3D model of the bone. Several 3D-printed copies of the collarbone were produced to support knowledge sharing between the experts involved in the study. The 3D models generated from µ-CT and photogrammetry were combined to provide an accurate and detailed 3D model. This model was used to study desquamation and the different cut marks, including their angle of attack. The cut marks were also studied with traditional binoculars and digital microscopy. This last technique allowed characterizing their type, revealing a probable meat-cutting process with a flint tool. This work of crossed analyses allows us to document a fundamental patrimonial piece and to ensure its preservation. Copies are also available for the regional museums.
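The angle-of-attack measurement mentioned above can be illustrated with basic vector geometry: fit a plane to 3D points sampled on the local bone surface, then take the angle between the cut-mark direction and that plane. This is a minimal sketch under assumed conventions; the function names and the exact procedure are illustrative, not taken from the paper.

```python
import numpy as np

def plane_normal(points):
    """Least-squares normal of a plane fitted to 3D points (n >= 3)."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]

def angle_of_attack(surface_points, mark_direction):
    """Angle (degrees) between a cut-mark direction and the local bone surface."""
    n = plane_normal(surface_points)
    d = np.asarray(mark_direction, dtype=float)
    d = d / np.linalg.norm(d)
    # Angle to the surface = 90 degrees minus the angle to the surface normal.
    cos_to_normal = abs(np.dot(n, d))
    return 90.0 - np.degrees(np.arccos(np.clip(cos_to_normal, -1.0, 1.0)))

# A mark running parallel to a flat horizontal surface attacks at ~0 degrees;
# a mark perpendicular to it attacks at ~90 degrees.
flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
print(round(angle_of_attack(flat, (1, 1, 0)), 1))  # -> 0.0
print(round(angle_of_attack(flat, (0, 0, 1)), 1))  # -> 90.0
```

In practice the surface points and mark direction would be picked on the combined µ-CT/photogrammetry model; the fitted-plane approach only assumes the surface is locally flat around the mark.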
Collapse
|
29
|
Blanc T, Verdier H, Regnier L, Planchon G, Guérinot C, El Beheiry M, Masson JB, Hajj B. Towards Human in the Loop Analysis of Complex Point Clouds: Advanced Visualizations, Quantifications, and Communication Features in Virtual Reality. FRONTIERS IN BIOINFORMATICS 2022; 1:775379. [PMID: 36303735 PMCID: PMC9580855 DOI: 10.3389/fbinf.2021.775379] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/13/2021] [Accepted: 12/24/2021] [Indexed: 11/13/2022] Open
Abstract
Multiple fields in biological and medical research produce large amounts of point cloud data with high dimensionality and complexity. A wide range of experiments generate point clouds, including segmented medical data and single-molecule localization microscopy, in which individual molecules are observed within their natural cellular environment. Analyzing this type of experimental data is a complex task and presents unique challenges, where providing extra physical dimensions for visualization and analysis can be beneficial. Furthermore, whether highly noisy data comes from single-molecule recordings or segmented medical data, the need to guide analysis with user intervention creates both an ergonomic challenge, to facilitate this interaction, and a computational challenge, to provide fluid interactions while information is being processed. Several applications, including our software DIVA for image stacks and our platform Genuage for point clouds, have leveraged Virtual Reality (VR) to visualize and interact with data in 3D. While the visualization aspects can be made compatible with different types of data, quantifications, on the other hand, are far from standard. In addition, complex analysis can require significant computational resources, making the real-time VR experience uncomfortable. Moreover, visualization software is mainly designed to represent a set of data points but lacks flexibility in manipulating and analyzing the data. This paper introduces new libraries that enhance the interaction and human-in-the-loop analysis of point cloud data in virtual reality and integrates them into the open-source platform Genuage. We first detail a new toolbox of communication tools that enhance the user experience and improve flexibility. Then, we introduce a mapping toolbox allowing the representation of physical properties in space overlaid on a 3D mesh while maintaining a shader dedicated to point clouds. We later introduce a new, programmable video capture tool in VR and desktop modes for intuitive data dissemination. Finally, we highlight the protocols that allow simultaneous analysis and fluid manipulation of data at a high refresh rate. We illustrate this principle by performing real-time inference of the random walk properties of recorded trajectories with a pre-trained Graph Neural Network running in Python.
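The random-walk property inference mentioned above is performed with a pre-trained Graph Neural Network in the paper; a much simpler classical baseline for the same kind of question is estimating the anomalous diffusion exponent from the time-averaged mean squared displacement (MSD) of a trajectory. The sketch below is that baseline only, not the authors' GNN pipeline, and the function names are illustrative.

```python
import numpy as np

def msd(trajectory, max_lag=None):
    """Time-averaged mean squared displacement of a (T, d) trajectory."""
    traj = np.asarray(trajectory, dtype=float)
    max_lag = max_lag or len(traj) // 4
    lags = np.arange(1, max_lag + 1)
    out = np.empty(len(lags))
    for i, lag in enumerate(lags):
        disp = traj[lag:] - traj[:-lag]          # displacements at this lag
        out[i] = np.mean(np.sum(disp**2, axis=1))
    return lags, out

def anomalous_exponent(lags, msd_values):
    """Slope of log MSD vs log lag: ~1 for Brownian motion, <1 subdiffusive."""
    alpha, _ = np.polyfit(np.log(lags), np.log(msd_values), 1)
    return alpha

# Synthetic 3D Brownian trajectory: the fitted exponent should be close to 1.
rng = np.random.default_rng(0)
brownian = np.cumsum(rng.normal(size=(5000, 3)), axis=0)
lags, curve = msd(brownian, max_lag=100)
print(round(anomalous_exponent(lags, curve), 2))  # close to 1.0
```

Per-trajectory statistics of this kind are what a human-in-the-loop VR tool can overlay on selected point-cloud trajectories in real time.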
Collapse
Affiliation(s)
- Thomas Blanc
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, Paris, France
| | - Hippolyte Verdier
- Decision and Bayesian Computation, CNRS USR 3756, Department of Computational Biology and Neuroscience, CNRS UMR 3571, Université de Paris, Institut Pasteur, Université de Paris, Paris, France
| | - Louise Regnier
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, Paris, France
| | - Guillaume Planchon
- Decision and Bayesian Computation, CNRS USR 3756, Department of Computational Biology and Neuroscience, CNRS UMR 3571, Université de Paris, Institut Pasteur, Université de Paris, Paris, France
| | - Corentin Guérinot
- Decision and Bayesian Computation, CNRS USR 3756, Department of Computational Biology and Neuroscience, CNRS UMR 3571, Université de Paris, Institut Pasteur, Université de Paris, Paris, France
- Sorbonne Universités, Collège Doctoral, Paris, France
| | - Mohamed El Beheiry
- Decision and Bayesian Computation, CNRS USR 3756, Department of Computational Biology and Neuroscience, CNRS UMR 3571, Université de Paris, Institut Pasteur, Université de Paris, Paris, France
| | - Jean-Baptiste Masson
- Decision and Bayesian Computation, CNRS USR 3756, Department of Computational Biology and Neuroscience, CNRS UMR 3571, Université de Paris, Institut Pasteur, Université de Paris, Paris, France
- *Correspondence: Jean-Baptiste Masson; Bassam Hajj
| | - Bassam Hajj
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, Paris, France
- *Correspondence: Jean-Baptiste Masson; Bassam Hajj
| |
Collapse
|
30
|
El Beheiry M, Gaillard T, Girard N, Darrigues L, Osdoit M, Feron JG, Sabaila A, Laas E, Fourchotte V, Laki F, Lecuru F, Couturaud B, Binder JP, Masson JB, Reyal F, Malhaire C. Breast Magnetic Resonance Image Analysis for Surgeons Using Virtual Reality: A Comparative Study. JCO Clin Cancer Inform 2021; 5:1127-1133. [PMID: 34767435 DOI: 10.1200/cci.21.00048] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2021] [Revised: 08/23/2021] [Accepted: 09/29/2021] [Indexed: 12/24/2022] Open
Abstract
PURPOSE The treatment of breast cancer, the leading cause of cancer and cancer mortality among women worldwide, is mainly based on surgery. In this study, we describe the use by surgeons of a virtual reality (VR)-based medical image visualization tool, entitled DIVA, in the context of breast cancer tumor localization. The aim of this study was to evaluate the speed and accuracy of surgeons using DIVA for the analysis of breast magnetic resonance imaging (MRI) scans relative to standard slice-based visualization tools. MATERIALS AND METHODS In our study, residents and practicing surgeons used two breast MRI reading modalities: the common slice-based radiology interface and the DIVA system in its VR mode. The measured metrics were compared against postoperative anatomical-pathologic reports. RESULTS Eighteen breast surgeons from the Institut Curie performed all the analyses presented. MRI analysis time was significantly lower with the DIVA system than with slice-based visualization for residents, practitioners, and consequently the entire group (P < .001). The accuracy of determining which breast contained the lesion increased significantly with DIVA for residents (P = .003) and practitioners (P = .04). There was little difference between DIVA and slice-based visualization in determining the number of lesions. The accuracy of quadrant determination was significantly improved by DIVA for practicing surgeons (P = .01) but not significantly for residents (P = .49). CONCLUSION This study indicates that VR visualization of medical images systematically improves surgeons' analysis of preoperative breast MRI scans across several different metrics, irrespective of surgeon seniority.
Collapse
Affiliation(s)
- Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) and Neuroscience Department CNRS UMR 3571, Institut Pasteur and CNRS, Paris, France
| | - Thomas Gaillard
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | - Noémie Girard
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | - Lauren Darrigues
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | - Marie Osdoit
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | | | - Anne Sabaila
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | - Enora Laas
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | | | - Fatima Laki
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | - Fabrice Lecuru
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | - Benoit Couturaud
- Surgery Department, Institut Curie, PSL Research University, Paris, France
| | | | - Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) and Neuroscience Department CNRS UMR 3571, Institut Pasteur and CNRS, Paris, France
| | - Fabien Reyal
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- U932, Immunity and Cancer, INSERM, Institut Curie, Paris, France
| | - Caroline Malhaire
- Department of Medical Imaging, Institut Curie, PSL Research University, Paris, France
- Institut Curie, INSERM, LITO Laboratory, Orsay, France
| |
Collapse
|
31
|
Qin X, Chen C, Wang L, Chen X, Liang Y, Jin X, Pan W, Liu Z, Li H, Yang G. In-vivo 3D imaging of Zebrafish's intersegmental vessel development by a bi-directional light-sheet illumination microscope. Biochem Biophys Res Commun 2021; 557:8-13. [PMID: 33857842 DOI: 10.1016/j.bbrc.2021.03.160] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2021] [Accepted: 03/29/2021] [Indexed: 11/30/2022]
Abstract
Precise quantification of vascular development in zebrafish requires continuous in-vivo 3D imaging. Here we employed a bi-directional light-sheet illumination microscope to characterize the development of the zebrafish's intersegmental vessels (ISVs). A virtual reality-based method was used to measure the lengths of the ISVs. The quantified growth rates of typical ISVs could be plotted, and unusual growth of some specific vessels was also observed.
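The VR-based length measurement described above presumably reduces to summing segment lengths along a vessel centerline traced as ordered 3D points; the paper does not give its implementation, so the following is a minimal sketch with illustrative names.

```python
import numpy as np

def polyline_length(points):
    """Total length of a 3D polyline given ordered points,
    e.g. an intersegmental vessel centerline traced in VR."""
    pts = np.asarray(points, dtype=float)
    segments = np.diff(pts, axis=0)               # vector of each segment
    return float(np.sum(np.linalg.norm(segments, axis=1)))

# Two straight segments of lengths 3 and 4 give a total length of 7.
trace = [(0, 0, 0), (3, 0, 0), (3, 4, 0)]
print(polyline_length(trace))  # -> 7.0
```

Repeating the measurement on the same vessel across imaging time points would yield the growth curves the abstract refers to.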
Collapse
Affiliation(s)
- Xiaofei Qin
- Changchun University of Science and Technology, Changchun, Jilin, 130022, China
| | - Chong Chen
- Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China
| | - Linbo Wang
- Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China
| | - Xiaohu Chen
- Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China
| | - Yong Liang
- Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China
| | - Xin Jin
- Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China
| | - Weijun Pan
- Shanghai Institute of Nutrition and Health, Chinese Academy of Sciences, Shanghai, 200031, China
| | - Zhiying Liu
- Changchun University of Science and Technology, Changchun, Jilin, 130022, China
| | - Hui Li
- Changchun University of Science and Technology, Changchun, Jilin, 130022, China; Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China
| | - Guang Yang
- Jiangsu Key Laboratory of Medical Optics, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu, 215163, China.
| |
Collapse
|
32
|
Raimondi F, Vida V, Godard C, Bertelli F, Reffo E, Boddaert N, El Beheiry M, Masson JB. Fast-track virtual reality for cardiac imaging in congenital heart disease. J Card Surg 2021; 36:2598-2602. [PMID: 33760302 DOI: 10.1111/jocs.15508] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2021] [Accepted: 02/03/2021] [Indexed: 02/06/2023]
Abstract
BACKGROUND AND AIM OF THE STUDY We sought to evaluate the appropriateness of cardiac anatomy renderings produced by a new virtual reality (VR) technology, entitled DIVA, which is directly applicable to raw magnetic resonance imaging (MRI) data without intermediate segmentation steps, in comparison to standard three-dimensional (3D) rendering techniques (3D PDF and 3D printing). Differences in post-processing times were also evaluated. METHODS We reconstructed 3D models (STL, 3D PDF, and 3D-printed) and VR models for three patients with different types of complex congenital heart disease (CHD). We then asked a senior pediatric heart surgeon to compare and grade the results obtained. RESULTS All anatomical structures were well visualized in both the VR and the 3D PDF/printed models. Ventricular-arterial connections and their relationship with the great vessels were better visualized with the VR model (Case 2); aortic arch anatomy and details were also better visualized with the VR model (Case 3). The median post-processing time to obtain VR models using DIVA was 5 min, compared with 8 h (range 8-12 h, including printing time) for the 3D models (PDF/printed). CONCLUSIONS VR applied directly to a non-segmented 3D-MRI data set is a promising technique for advanced 3D modeling in CHD. It is systematically more consistent and faster when compared to standard 3D-modeling techniques.
Collapse
Affiliation(s)
- Francesca Raimondi
- Unité médico-chirurgicale de cardiologie congénitale et pédiatrique, centre de référence des maladies cardiaques congénitales complexes-M3C, Hôpital universitaire Necker-Enfants Malades, Université de Paris, France.,Decision and Bayesian Computation, Computation Biology Department, CNRS, URS 3756, Neuroscience Department, CNRS UMR 3571, Institut Pasteur, Paris, France.,Pediatric Radiology Unit, Hôpital universitaire Necker-Enfants Malades, Université de Paris, France
| | - Vladimiro Vida
- Pediatric and Congenital Cardiac Surgery Unit, University of Padua, Italy
| | - Charlotte Godard
- Decision and Bayesian Computation, Computation Biology Department, CNRS, URS 3756, Neuroscience Department, CNRS UMR 3571, Institut Pasteur, Paris, France
| | - Francesco Bertelli
- Pediatric and Congenital Cardiac Surgery Unit, University of Padua, Italy
| | - Elena Reffo
- Pediatric Cardiology Unit, University of Padua, Italy
| | - Nathalie Boddaert
- Pediatric Radiology Unit, Hôpital universitaire Necker-Enfants Malades, Université de Paris, France
| | - Mohamed El Beheiry
- Decision and Bayesian Computation, Computation Biology Department, CNRS, URS 3756, Neuroscience Department, CNRS UMR 3571, Institut Pasteur, Paris, France
| | - Jean-Baptiste Masson
- Decision and Bayesian Computation, Computation Biology Department, CNRS, URS 3756, Neuroscience Department, CNRS UMR 3571, Institut Pasteur, Paris, France
| |
Collapse
|
33
|
Steiniger BS, Pfeffer H, Guthe M, Lobachev O. Exploring human splenic red pulp vasculature in virtual reality: details of sheathed capillaries and the open capillary network. Histochem Cell Biol 2021; 155:341-354. [PMID: 33074357 PMCID: PMC8021519 DOI: 10.1007/s00418-020-01924-3] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/18/2020] [Indexed: 02/07/2023]
Abstract
We reconstructed serial sections of a representative adult human spleen to clarify the unknown arrangement of the splenic microvasculature, such as terminal arterioles, sheathed capillaries, the red pulp capillary network and venules. The resulting 3D model was evaluated in virtual reality (VR). Capillary sheaths often occurred after the second or third branching of a terminal arteriole and covered its capillary side or end branches. The sheaths started directly after the final smooth muscle cells of the arteriole and consisted of cuboidal CD271++ stromal sheath cells surrounded and infiltrated by B lymphocytes and macrophages. Some sheaths covered up to four sequential capillary bifurcations thus forming bizarre elongated structures. Each sheath had a unique form. Apart from symmetric dichotomous branchings inside the sheath, sheathed capillaries also gave off side branches, which crossed the sheath and freely ended at its surface. These side branches are likely to distribute materials from the incoming blood to sheath-associated B lymphocytes and macrophages and thus represent the first location for recognition of blood-borne antigens in the spleen. A few non-sheathed bypasses from terminal arterioles to the red pulp capillary network also exist. Red pulp venules are primarily supplied by sinuses, but they also exhibit a few connections to the capillary network. Thus, the human splenic red pulp harbors a primarily open microcirculation with a very minor closed part.
Collapse
Affiliation(s)
- Birte S Steiniger
- Institute of Anatomy and Cell Biology, University of Marburg, Robert-Koch-Str.8, 35037, Marburg, Germany.
| | - Henriette Pfeffer
- Institute of Anatomy and Cell Biology, University of Marburg, Robert-Koch-Str.8, 35037, Marburg, Germany
| | - Michael Guthe
- Visual Computing, Institute of Computer Science, University of Bayreuth, 95440, Bayreuth, Germany
| | - Oleg Lobachev
- Visual Computing, Institute of Computer Science, University of Bayreuth, 95440, Bayreuth, Germany
- Institute of Functional and Applied Anatomy, Hannover Medical School, 30625, Hannover, Germany
- Leibniz-Fachhochschule School of Business, 30539, Hannover, Germany
| |
Collapse
|
34
|
Bouaoud J, El Beheiry M, Jablon E, Schouman T, Bertolus C, Picard A, Masson JB, Khonsari RH. DIVA, a 3D virtual reality platform, improves undergraduate craniofacial trauma education. JOURNAL OF STOMATOLOGY, ORAL AND MAXILLOFACIAL SURGERY 2020; 122:367-371. [PMID: 33007493 DOI: 10.1016/j.jormas.2020.09.009] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/12/2020] [Accepted: 09/11/2020] [Indexed: 12/15/2022]
Abstract
Craniofacial fracture management is challenging to teach due to the complex anatomy of the head, even when using three-dimensional CT-scan images. DIVA is a software tool allowing the straightforward visualization of CT scans in a user-friendly three-dimensional virtual reality environment. Here, we assess DIVA as an educational tool on craniofacial trauma for undergraduate medical students. Three craniofacial trauma cases (a jaw fracture, a naso-orbito-ethmoid complex fracture, and a Le Fort 3 fracture) were submitted to 50 undergraduate medical students, who had to provide diagnoses and treatment plans. Each student then filled in an 8-item questionnaire assessing satisfaction, potential benefit, ease of use and tolerance. Additionally, 4 postgraduate students were asked to explore these cases and to place 6 anatomical landmarks on both the virtual reality renderings and the usual slice-based three-dimensional CT-scan visualizations. High degrees of satisfaction (98%) without specific tolerance issues (86%) were reported. A potential benefit for a better understanding of craniofacial trauma using virtual reality was reported by almost all students (98%). Virtual reality allowed a reliable localization of key anatomical landmarks when compared with standard three-dimensional CT-scan visualization. Virtual reality interfaces such as DIVA are beneficial to medical students for a better understanding of craniofacial trauma and allow a reliable rendering of craniofacial anatomy.
Collapse
Affiliation(s)
- Jebrane Bouaoud
- Assistance Publique - Hôpitaux de Paris, Service de Chirurgie Maxillo-Faciale et Chirurgie Plastique, Hôpital Universitaire Necker - Enfants Malades, Université Paris Descartes, Université de Paris, Paris, France; Assistance Publique - Hôpitaux de Paris, Service de Chirurgie Maxillo-Faciale et Stomatologie, Hôpital Universitaire Pitié-Salpêtrière, Université Pierre et Marie Curie, Sorbonne Université, Paris, France.
| | - Mohamed El Beheiry
- Decision and Bayesian Computation, Neuroscience Department UMR 3571 & USR 3756 (C3BI/DBC), Institut Pasteur & CNRS, Paris, France
| | - Eve Jablon
- Université Paris Descartes, Université de Paris, Paris, France
| | - Thomas Schouman
- Assistance Publique - Hôpitaux de Paris, Service de Chirurgie Maxillo-Faciale et Stomatologie, Hôpital Universitaire Pitié-Salpêtrière, Université Pierre et Marie Curie, Sorbonne Université, Paris, France
| | - Chloé Bertolus
- Assistance Publique - Hôpitaux de Paris, Service de Chirurgie Maxillo-Faciale et Stomatologie, Hôpital Universitaire Pitié-Salpêtrière, Université Pierre et Marie Curie, Sorbonne Université, Paris, France
| | - Arnaud Picard
- Assistance Publique - Hôpitaux de Paris, Service de Chirurgie Maxillo-Faciale et Chirurgie Plastique, Hôpital Universitaire Necker - Enfants Malades, Université Paris Descartes, Université de Paris, Paris, France
| | - Jean-Baptiste Masson
- Decision and Bayesian Computation, Neuroscience Department UMR 3571 & USR 3756 (C3BI/DBC), Institut Pasteur & CNRS, Paris, France
| | - Roman H Khonsari
- Assistance Publique - Hôpitaux de Paris, Service de Chirurgie Maxillo-Faciale et Chirurgie Plastique, Hôpital Universitaire Necker - Enfants Malades, Université Paris Descartes, Université de Paris, Paris, France
| |
Collapse
|
35
|
Genuage: visualize and analyze multidimensional single-molecule point cloud data in virtual reality. Nat Methods 2020; 17:1100-1102. [PMID: 32958921 DOI: 10.1038/s41592-020-0946-1] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/30/2020] [Accepted: 08/10/2020] [Indexed: 01/10/2023]
Abstract
Experimentally recorded point cloud data, such as those generated by single-molecule localization microscopy, are continuously increasing in size and dimension. Gaining an intuitive understanding of such multidimensional data, and facilitating their analysis, remains challenging. Here we report a new open-source software platform, Genuage, that enables intuitive perception of, interaction with, and analysis of multidimensional point clouds in virtual reality. Genuage is compatible with arbitrary multidimensional data extending beyond single-molecule localization microscopy.
Collapse
|
36
|
El Beheiry M, Godard C, Caporal C, Marcon V, Ostertag C, Sliti O, Doutreligne S, Fournier S, Hajj B, Dahan M, Masson JB. DIVA: Natural Navigation Inside 3D Images Using Virtual Reality. J Mol Biol 2020; 432:4745-4749. [DOI: 10.1016/j.jmb.2020.05.026] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2020] [Revised: 05/26/2020] [Accepted: 05/27/2020] [Indexed: 12/18/2022]
|
37
|
Seeing Your Way to New Insights in Biology. J Mol Biol 2019; 431:2485-2486. [PMID: 31034886 DOI: 10.1016/j.jmb.2019.04.033] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
|