1. Cianciulli AR, Sulentic A, Wang Y, Daemer M, Amin S, Joyce J, Lasso A, Otero HJ, Cohen MS, Fuller S, Nuri MAK, Tang J, O'Byrne ML, Jolley MA. Volume Rendering of CT Images to Inform Closure of Complex Ventricular Septal Defects. JACC Case Rep 2025;30:102827. [PMID: 40118631] [PMCID: PMC12011141] [DOI: 10.1016/j.jaccas.2024.102827]
Abstract
The anatomy of structurally complex ventricular septal defects (SC-VSD) can be difficult to assess using 2-dimensional (2D) images and traditional 2D multiplanar views of 3D images. Direct identification and visualization are not always possible via the standard tricuspid approach in surgical repair. Volume rendering is a near-instant method for 3D visualization of computed tomography angiography images, but application of this method to the planning of SC-VSD closure has not been described. We describe the integration of virtual patch and device placement within volume-rendered computed tomography angiography images in SlicerHeart to inform surgical and transcatheter closure of SC-VSDs in 3 patients. Virtual heart models were created and examined by a multidisciplinary team. Virtual device placement and surgical patch design were applied to better inform patient candidacy and procedural planning. Volume rendering-based visualization of SC-VSDs is feasible and may inform understanding of anatomy and conceptualization of the optimal repair. Further study is needed to demonstrate improvement in outcomes.
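For orientation, the volume-rendering step described above can be reproduced in the 3D Slicer Python console (SlicerHeart builds on the same platform). The sketch below is a minimal illustration only; the file path and the "CT-Cardiac" preset name are assumptions, and the paper's virtual patch and device placement workflow is not reproduced here.

```python
# Minimal sketch: volume render a cardiac CT angiogram in 3D Slicer.
# Run inside the Slicer Python console; path and preset name are hypothetical.
import slicer

volumeNode = slicer.util.loadVolume("/path/to/ct_angiogram.nrrd")

vrLogic = slicer.modules.volumerendering.logic()
displayNode = vrLogic.CreateDefaultVolumeRenderingNodes(volumeNode)

# Copy a built-in CT preset into the display's volume property (preset name assumed)
preset = vrLogic.GetPresetByName("CT-Cardiac")
if preset:
    displayNode.GetVolumePropertyNode().Copy(preset)

displayNode.SetVisibility(True)  # show the rendering in the 3D view
```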
Affiliation(s)
- Alana R Cianciulli
- Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Analise Sulentic
- Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Yan Wang
- Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Matthew Daemer
- Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Silvani Amin
- Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Jeremiah Joyce
- Division of Cardiology, The Children's Hospital of Philadelphia, Pennsylvania, USA; Department of Pediatrics Perelman School of Medicine at The University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Andras Lasso
- Laboratory for Percutaneous Surgery, Queens University, Kingston, Ontario, Canada
- Hansel J Otero
- Department of Radiology, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Meryl S Cohen
- Division of Cardiology, The Children's Hospital of Philadelphia, Pennsylvania, USA; Department of Pediatrics Perelman School of Medicine at The University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Stephanie Fuller
- Division of Pediatric Cardiac Surgery, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Mohammad A K Nuri
- Division of Pediatric Cardiac Surgery, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Jessica Tang
- Division of Cardiology, The Children's Hospital of Philadelphia, Pennsylvania, USA; Department of Pediatrics Perelman School of Medicine at The University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Michael L O'Byrne
- Division of Cardiology, The Children's Hospital of Philadelphia, Pennsylvania, USA; Department of Pediatrics Perelman School of Medicine at The University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Matthew A Jolley
- Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA; Division of Cardiology, The Children's Hospital of Philadelphia, Pennsylvania, USA; Department of Pediatrics Perelman School of Medicine at The University of Pennsylvania, Philadelphia, Pennsylvania, USA.
2. Spiegler P, Abdelsalam H, Hellum O, Hadjinicolaou A, Weil AG, Xiao Y. PreVISE: an efficient virtual reality system for SEEG surgical planning. Virtual Reality 2024;29:13. [PMID: 39735694] [PMCID: PMC11669611] [DOI: 10.1007/s10055-024-01088-8]
Abstract
Epilepsy is a neurological disorder characterized by recurring seizures that can cause a wide range of symptoms. Stereo-electroencephalography (SEEG) is a diagnostic procedure where multiple electrodes are stereotactically implanted within predefined brain regions to identify the seizure onset zone, which needs to be surgically removed or disconnected to achieve remission of focal epilepsy. This procedure is complex and challenging for two main reasons. First, because electrode placement requires good accuracy in the desired brain regions, excellent knowledge and understanding of 3D brain anatomy is required. Second, because multiple SEEG electrodes typically need to be implanted, the positioning of intracerebral electrodes must avoid critical structures (e.g., blood vessels) to ensure patient safety. Traditional SEEG surgical planning relies on 2D display of multi-contrast volumetric medical imaging data and places a high cognitive demand on surgeons' spatial understanding, resulting in potentially sub-optimal surgical plans and extensive planning time (~15 min per electrode). In contrast, virtual reality (VR) presents an intuitive and immersive approach that can offer more natural visualization of 3D data as well as potentially enhanced efficiency for neurosurgical planning. Unfortunately, existing VR systems for SEEG surgery focus only on the visualization of post-surgical scans to confirm electrode placement. To address this need, we introduce the first VR system for SEEG planning that integrates user-friendly and efficient visualization and interaction strategies while providing real-time feedback metrics, including distances to the nearest blood vessels, angles of insertion, and overall surgical quality scores. The system reduces surgical planning time by 91%.
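The real-time feedback metrics mentioned above are, at their core, simple geometry. The NumPy sketch below (hypothetical inputs, not the PreVISE implementation) computes the minimum distance from a straight entry-to-target trajectory to a cloud of vessel points and the insertion angle relative to a local scalp normal.

```python
import numpy as np

def trajectory_metrics(entry, target, vessel_pts, scalp_normal):
    """Toy SEEG planning metrics: minimum vessel distance (mm) and insertion
    angle (degrees). All inputs are hypothetical arrays, for illustration."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    d = target - entry
    length = np.linalg.norm(d)
    u = d / length  # unit direction of the electrode trajectory

    # Closest point on the trajectory segment for every vessel point
    t = np.clip((vessel_pts - entry) @ u, 0.0, length)
    closest = entry + t[:, None] * u
    min_vessel_dist = np.linalg.norm(vessel_pts - closest, axis=1).min()

    # Angle between the trajectory and the scalp surface normal
    n = np.asarray(scalp_normal, float)
    n /= np.linalg.norm(n)
    angle_deg = np.degrees(np.arccos(np.clip(abs(u @ n), 0.0, 1.0)))
    return min_vessel_dist, angle_deg

# Example with made-up coordinates (mm)
dist, angle = trajectory_metrics(entry=[0, 0, 0], target=[5, 10, 60],
                                 vessel_pts=np.random.rand(500, 3) * 60,
                                 scalp_normal=[0, 0.2, 1.0])
```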
Affiliation(s)
- Pascal Spiegler
- Department of Computer Science and Software Engineering, Concordia University, Montreal, Québec Canada
- Haitham Abdelsalam
- Department of Computer Science and Software Engineering, Concordia University, Montreal, Québec Canada
- Owen Hellum
- Department of Computer Science and Software Engineering, Concordia University, Montreal, Québec Canada
- Aristides Hadjinicolaou
- Department of Pediatrics, Division of Neurology, Sainte-Justine University Hospital Center, Montreal, Québec Canada
- Alexander G. Weil
- Department of Surgery, Division of Neurosurgery, Sainte-Justine University Hospital Center, Montreal, Québec Canada
- Department of Surgery, University of Montreal Hospital Center (CHUM), Montreal, Québec Canada
- Yiming Xiao
- Department of Computer Science and Software Engineering, Concordia University, Montreal, Québec Canada
3. Kantor T, Mahajan P, Murthi S, Stegink C, Brawn B, Varshney A, Reddy RM. Role of eXtended Reality use in medical imaging interpretation for pre-surgical planning and intraoperative augmentation. J Med Imaging (Bellingham) 2024;11:062607. [PMID: 39649776] [PMCID: PMC11618384] [DOI: 10.1117/1.jmi.11.6.062607]
Abstract
Purpose eXtended Reality (XR) technology, including virtual reality (VR), augmented reality (AR), and mixed reality (MR), is a growing field in healthcare. Each modality offers unique benefits and drawbacks for medical education, simulation, and clinical care. We review current studies to understand how XR technology uses medical imaging to enhance surgical diagnostics, planning, and performance. We also highlight current limitations and future directions. Approach We reviewed the literature on immersive XR technologies for surgical planning and intraoperative augmentation, excluding studies on telemedicine and 2D video-based training. We cited publications highlighting XR's advantages and limitations in these categories. Results A review of 556 papers on XR for medical imaging in surgery yielded 155 relevant papers, which were reviewed with the aid of ChatGPT. XR technology may improve procedural times, reduce errors, and enhance surgical workflows. It aids in preoperative planning, surgical navigation, and real-time data integration, improving surgeon ergonomics and enabling remote collaboration. However, adoption faces challenges such as high costs, infrastructure needs, and regulatory hurdles. Despite these, XR shows significant potential in advancing surgical care. Conclusions Immersive technologies in healthcare enhance visualization and understanding of medical conditions, promising better patient outcomes and innovative treatments, but they face adoption challenges such as cost, technological constraints, and regulatory hurdles. Addressing these requires strategic collaborations and improvements in image quality, hardware, integration, and training.
Affiliation(s)
- Taylor Kantor
- University of Michigan, Department of Surgery, Section of Thoracic Surgery, Ann Arbor, Michigan, United States
- Center for Surgical Innovation, Department of Surgery, Ann Arbor, Michigan, United States
- Prashant Mahajan
- University of Michigan, Department of Emergency Medicine, Ann Arbor, Michigan, United States
- Sarah Murthi
- University of Maryland, Section of Trauma Surgery, Department of Surgery, Baltimore, Maryland, United States
- Candice Stegink
- Center for Surgical Innovation, Department of Surgery, Ann Arbor, Michigan, United States
- Barbara Brawn
- University of Maryland, University of Maryland Institute for Advanced Computer Studies, College of Computer, Mathematical, and Natural Sciences, College Park, Maryland, United States
- Amitabh Varshney
- University of Maryland, University of Maryland Institute for Advanced Computer Studies, College of Computer, Mathematical, and Natural Sciences, College Park, Maryland, United States
- Rishindra M. Reddy
- University of Michigan, Department of Surgery, Section of Thoracic Surgery, Ann Arbor, Michigan, United States
- Center for Surgical Innovation, Department of Surgery, Ann Arbor, Michigan, United States
4. Chen X, Thakur T, Jeyasekharan AD, Benoukraf T, Meruvia-Pastor O. ColocZStats: a z-stack signal colocalization extension tool for 3D slicer. Front Physiol 2024;15:1440099. [PMID: 39296518] [PMCID: PMC11408364] [DOI: 10.3389/fphys.2024.1440099]
Abstract
Confocal microscopy has evolved to be a widely adopted imaging technique in molecular biology and is frequently utilized to achieve accurate subcellular localization of proteins. Applying colocalization analysis on image z-stacks obtained from confocal fluorescence microscopes is a dependable method of revealing the relationship between different molecules. In addition, despite the established advantages and growing adoption of 3D visualization software in various microscopy research domains, there have been few systems that can support colocalization analysis within a user-specified region of interest (ROI). In this context, several broadly employed biological image visualization platforms are meticulously explored in this study to understand the current landscape. It has been observed that while these applications can generate three-dimensional (3D) reconstructions for z-stacks, and in some cases transfer them into an immersive virtual reality (VR) scene, there is still little support for performing quantitative colocalization analysis on such images based on a user-defined ROI and thresholding levels. To address these issues, an extension called ColocZStats (pronounced Coloc-Zee-Stats) has been developed for 3D Slicer, a widely used free and open-source software package for image analysis and scientific visualization. With a custom-designed user-friendly interface, ColocZStats allows investigators to conduct intensity thresholding and ROI selection on imported 3D image stacks. It can deliver several essential colocalization metrics for structures of interest and produce reports in the form of diagrams and spreadsheets.
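As a rough illustration of the kind of quantity such a tool reports, the sketch below computes Pearson and Manders-style colocalization coefficients over a thresholded region of interest of a two-channel z-stack. It is a generic NumPy example with synthetic data; the exact metrics and thresholding behaviour of ColocZStats may differ.

```python
import numpy as np

def coloc_metrics(ch1, ch2, roi_mask, thr1, thr2):
    """Generic colocalization metrics on a two-channel z-stack (z, y, x),
    restricted to an ROI mask, with per-channel intensity thresholds."""
    roi = roi_mask.astype(bool)
    a, b = ch1[roi].astype(float), ch2[roi].astype(float)

    pearson = np.corrcoef(a, b)[0, 1]  # Pearson correlation over the ROI

    # Manders-style coefficients: fraction of one channel's intensity located
    # where the other channel exceeds its threshold
    m1 = a[b > thr2].sum() / a.sum() if a.sum() > 0 else np.nan
    m2 = b[a > thr1].sum() / b.sum() if b.sum() > 0 else np.nan
    return pearson, m1, m2

# Example with synthetic 10-slice stacks and a box-shaped ROI
ch1 = np.random.rand(10, 64, 64)
ch2 = 0.5 * ch1 + 0.5 * np.random.rand(10, 64, 64)
roi = np.zeros_like(ch1, dtype=bool)
roi[:, 16:48, 16:48] = True
print(coloc_metrics(ch1, ch2, roi, thr1=0.3, thr2=0.3))
```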
Affiliation(s)
- Xiang Chen
- Division of BioMedical Sciences, Faculty of Medicine, Memorial University of Newfoundland, St. John's, NL, Canada
- Department of Computer Science, Faculty of Science, Memorial University of Newfoundland, St. John's, NL, Canada
- Teena Thakur
- Cancer Science Institute of Singapore, National University of Singapore, Singapore, Singapore
- Anand D Jeyasekharan
- Cancer Science Institute of Singapore, National University of Singapore, Singapore, Singapore
- Touati Benoukraf
- Division of BioMedical Sciences, Faculty of Medicine, Memorial University of Newfoundland, St. John's, NL, Canada
- Cancer Science Institute of Singapore, National University of Singapore, Singapore, Singapore
- Oscar Meruvia-Pastor
- Department of Computer Science, Faculty of Science, Memorial University of Newfoundland, St. John's, NL, Canada
5. Li P, Xu B, Zhang X, Fang D, Zhang J. Design and development of a personalized virtual reality-based training system for vascular intervention surgery. Computer Methods and Programs in Biomedicine 2024;249:108142. [PMID: 38547688] [DOI: 10.1016/j.cmpb.2024.108142]
Abstract
BACKGROUND AND OBJECTIVES Virtual training has emerged as an exceptionally effective approach for training healthcare practitioners in the field of vascular intervention surgery. By providing a simulated environment and blood vessel model that enables repeated practice, virtual training facilitates the acquisition of surgical skills in a safe and efficient manner for trainees. However, the current state of research in this area is characterized by limitations in the fidelity of blood vessel and guidewire models, which restricts the effectiveness of training. Additionally, existing approaches lack the necessary real-time responsiveness and precision, while the blood vessel models suffer from incompleteness and a lack of scientific rigor. METHODS To address these challenges, this paper integrates position-based dynamics (PBD) and its extensions, shape matching, and Cosserat elastic rods. By combining these approaches within a unified particle framework, accurate and realistic deformation simulation of personalized blood vessel and guidewire models is achieved, thereby enhancing the training experience. Furthermore, a multi-level progressive continuous collision detection method, leveraging spatial hashing, is proposed to improve the accuracy and efficiency of collision detection. RESULTS Our proposed blood vessel model demonstrated acceptable performance, with deformation simulation response times reduced to 7 ms, improving real-time capability by at least 43.75%. Experimental validation confirmed that the guidewire model proposed in this paper can dynamically adjust the density of its elastic rods to alter the degree of bending and torsion. It also exhibited a deformation process comparable to that of real guidewires, with an average response time of 6 ms. In the interaction of the blood vessel and guidewire models, the simulator blood vessel model used for coronary vascular intervention training exhibited an average response time of 15.42 ms, with a frame rate of approximately 64 FPS. CONCLUSIONS The method presented in this paper achieves deformation simulation of both vascular and guidewire models, demonstrating sufficient real-time performance and accuracy. The interaction efficiency between vascular and guidewire models is enhanced through the unified simulation framework and collision detection. Furthermore, it can be integrated with virtual training scenarios within the system, making it suitable for developing more advanced vascular interventional surgery training systems.
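The collision-detection idea referenced above can be illustrated with a single-level uniform spatial hash, shown in the sketch below. This is a simplified, generic Python example of broad-phase neighbour search between particles, not the paper's multi-level progressive continuous collision detection.

```python
import numpy as np
from collections import defaultdict

def spatial_hash_pairs(points, cell_size, radius):
    """Broad-phase collision candidates via a uniform spatial hash:
    bin particles into grid cells, then test only neighbouring cells."""
    cells = np.floor(points / cell_size).astype(int)
    grid = defaultdict(list)
    for i, c in enumerate(map(tuple, cells)):
        grid[c].append(i)

    offsets = [(dx, dy, dz) for dx in (-1, 0, 1)
               for dy in (-1, 0, 1) for dz in (-1, 0, 1)]
    pairs = []
    for i, c in enumerate(map(tuple, cells)):
        for off in offsets:
            for j in grid.get((c[0] + off[0], c[1] + off[1], c[2] + off[2]), []):
                if j > i and np.linalg.norm(points[i] - points[j]) < radius:
                    pairs.append((i, j))
    return pairs

# Example: 200 random particles (e.g., guidewire and vessel-wall nodes)
pts = np.random.rand(200, 3) * 50.0
contacts = spatial_hash_pairs(pts, cell_size=2.0, radius=2.0)
```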
Affiliation(s)
- Pan Li
- Tianjin Key Lab of Integrated Design and On-line Monitoring for Light Industry & Food Machinery and Equipment, College of Mechanical Engineering, Tianjin University of Science & Technology, Tianjin 300222, China.
- Boxuan Xu
- Tianjin Key Lab of Integrated Design and On-line Monitoring for Light Industry & Food Machinery and Equipment, College of Mechanical Engineering, Tianjin University of Science & Technology, Tianjin 300222, China
- Xinxin Zhang
- Tianjin Key Lab of Integrated Design and On-line Monitoring for Light Industry & Food Machinery and Equipment, College of Mechanical Engineering, Tianjin University of Science & Technology, Tianjin 300222, China
- Delei Fang
- Tianjin Key Lab of Integrated Design and On-line Monitoring for Light Industry & Food Machinery and Equipment, College of Mechanical Engineering, Tianjin University of Science & Technology, Tianjin 300222, China
- Junxia Zhang
- Tianjin Key Lab of Integrated Design and On-line Monitoring for Light Industry & Food Machinery and Equipment, College of Mechanical Engineering, Tianjin University of Science & Technology, Tianjin 300222, China
6. Botha BS, De wet L. CyPVICS: A framework to prevent or minimise cybersickness in immersive virtual clinical simulation. Heliyon 2024;10:e29595. [PMID: 38665591] [PMCID: PMC11044044] [DOI: 10.1016/j.heliyon.2024.e29595]
Abstract
Cybersickness is a global issue affecting users of immersive virtual reality. However, there is no agreement on the exact cause of cybersickness, and because it can differ greatly from one person to another, determining the exact cause or finding a solution is even more difficult. Because cybersickness excludes so many prospective users, including healthcare professionals, from using immersive virtual reality as a learning tool, this research sought to find solutions in existing literature and construct a framework that can be used to prevent or minimise cybersickness during immersive virtual clinical simulation (CyPVICS). The 'best fit' framework approach of Carroll and colleagues was used to construct the CyPVICS framework. The process started by conducting two separate literature searches using the BeHEMoTh (for models, theories, and frameworks) and SPIDER (for primary research articles) search techniques. Once the literature searches were completed, the models, theories, and frameworks were used to construct an a priori framework. The models, theories, and frameworks were analysed to determine aspects relevant to causing, reducing, eliminating, and detecting cybersickness. The a priori framework was then expanded by first coding the findings of the primary research studies into its existing aspects; aspects that could not be coded were added under the relevant category, for example causes. After reviewing 1567 titles and abstracts from the BeHEMoTh search and 19 full-text articles, a total of 15 papers containing models, theories, and frameworks were used to construct the initial CyPVICS framework. Once the initial CyPVICS framework was created, a total of 904 primary research studies (SPIDER) were evaluated based on their titles and abstracts, of which 100 were reviewed in full text. In total, 67 articles were accepted and coded to expand the initial CyPVICS framework. This paper presents the CyPVICS framework for use not only in health professions' education but also in other disciplines, since the incorporated models, theories, frameworks, and primary research studies were not specific to virtual clinical simulation.
Affiliation(s)
- Benjamin Stephanus Botha
- Department of Computer Science and Informatics, Faculty of Natural and Agricultural Sciences, University of the Free State, Bloemfontein, Free State, South Africa
- Lizette De wet
- Department of Computer Science and Informatics, Faculty of Natural and Agricultural Sciences, University of the Free State, Bloemfontein, Free State, South Africa
7. Peng MJ, Chen HY, Chen P, Tan Z, Hu Y, To MKT, He E. Virtual reality-based surgical planning simulator for tumorous resection in FreeForm Modeling: an illustrative case of clinical teaching. Quant Imaging Med Surg 2024;14:2060-2068. [PMID: 38415160] [PMCID: PMC10895132] [DOI: 10.21037/qims-23-1151]
Abstract
The importance of virtual reality (VR) has been emphasized by many medical studies, yet it has been relatively under-applied to surgical operations. This study characterized how VR has been applied in clinical education and evaluated its tutorial utility by designing a surgical model of tumorous resection as a simulator for preoperative planning and medical teaching. A 36-year-old male patient with a femoral tumor who was admitted to the Affiliated Jiangmen Traditional Chinese Medicine Hospital was randomly selected and scanned by computed tomography (CT). The data in Digital Imaging and Communications in Medicine (*.DICOM) format were imported into Mimics to reconstruct a femoral model, which was exported in *.stl format for use in the computer-aided design (CAD) software SenSable FreeForm Modeling (SFM). A bony tumor was simulated by adding clay to the femur, the procedure of tumorous resection was virtually performed with a toolkit called Phantom, and the bony defect was filled with virtual cement. A 3D workspace was created to enable individual multimodality manipulation, and a virtual operation of tumorous excision was successfully carried out and could be repeated indefinitely. Precise delineation of the surgical margins was achieved, by expert and inexperienced hands alike, in 43 of 50 participants. This simulation-based educator presented a high-definition imitation: those trained on VR models achieved a success rate of 86%, higher than the 74% achieved by those trained by conventional methods. The tumorous resection, including the establishment of the surgical strategy, could be handled repeatedly in SFM, and participants felt that the force feedback was beneficial to surgical teaching programs, enabling engaging learning experiences through immersive events that mimic real-world circumstances to reinforce didactic and clinical concepts.
Affiliation(s)
- Matthew Jianqiao Peng
- Department of Spinal Surgery, Affiliated Jiangmen Traditional Chinese Medicine Hospital of Jinan University, Jiangmen, China
- Hai-Yan Chen
- Department of Orthopedics, Huidong People’s Hospital, Huizhou, China
- Peikai Chen
- Department of Orthopedics and Traumatology, The University of Hong Kong-Shenzhen Hospital, Hong Kong, China
- Zhijia Tan
- Department of Orthopedics and Traumatology, The University of Hong Kong-Shenzhen Hospital, Hong Kong, China
- Yong Hu
- Department of Orthopedics and Traumatology, The University of Hong Kong-Shenzhen Hospital, Hong Kong, China
- Michael Kai-Tsun To
- Department of Orthopedics and Traumatology, The University of Hong Kong-Shenzhen Hospital, Hong Kong, China
- Erxing He
- Department of Spinal Surgery, Affiliated 4th Hospital of Guangzhou Medical University, Guangzhou, China
8. Worlikar H, Coleman S, Kelly J, O'Connor S, Murray A, McVeigh T, Doran J, McCabe I, O'Keeffe D. Mixed Reality Platforms in Telehealth Delivery: Scoping Review. JMIR Biomedical Engineering 2023;8:e42709. [PMID: 38875694] [PMCID: PMC11041465] [DOI: 10.2196/42709]
Abstract
BACKGROUND The distinctive features of the digital reality platforms, namely augmented reality (AR), virtual reality (VR), and mixed reality (MR), have extended to medical education, training, simulation, and patient care. Furthermore, this digital reality technology seamlessly merges with information and communication technology, creating an enriched telehealth ecosystem. This review provides a composite overview of the prospects of telehealth delivered using the MR platform in clinical settings. OBJECTIVE This review identifies various clinical applications of high-fidelity digital display technology, namely AR, VR, and MR, delivered using telehealth capabilities. Next, the review focuses on the technical characteristics, hardware, and software technologies used in the composition of AR, VR, and MR in telehealth. METHODS We conducted a scoping review using the methodological framework and reporting design of the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) guidelines. Full-length articles in English were obtained from the Embase, PubMed, and Web of Science databases. The search protocol was based on the following keywords and Medical Subject Headings to obtain relevant results: "augmented reality," "virtual reality," "mixed-reality," "telemedicine," "telehealth," and "digital health." Predefined inclusion-exclusion criteria were developed for filtering the obtained results and the final selection of the articles, followed by data extraction and construction of the review. RESULTS We identified 4407 articles, of which 320 were eligible for full-text screening. A total of 134 full-text articles were included in the review. Telerehabilitation, telementoring, teleconsultation, telemonitoring, telepsychiatry, telesurgery, and telediagnosis were the segments of the telehealth division that explored the use of AR, VR, and MR platforms. Telerehabilitation using VR was the most commonly recurring segment in the included studies. AR and MR have mainly been used for telementoring and teleconsultation. The most important technical features of digital reality technology to emerge with telehealth were virtual environment, exergaming, 3D avatars, telepresence, anchoring annotations, and first-person viewpoint. Different arrangements of technology (3D modeling and viewing tools, communication and streaming platforms, file transfer and sharing platforms, sensors, high-fidelity displays, and controllers) formed the basis of most systems. CONCLUSIONS This review constitutes a recent overview of the evolving digital AR and VR in various clinical applications using the telehealth setup. This combination of telehealth with AR, VR, and MR allows for remote facilitation of clinical expertise and further development of home-based treatment. This review explores the rapidly growing suite of technologies available to users within the digital health sector and examines the opportunities and challenges they present.
Affiliation(s)
- Hemendra Worlikar
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Sean Coleman
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Department of Medicine, University Hospital Galway, Galway, Ireland
- Jack Kelly
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Department of Medicine, University Hospital Galway, Galway, Ireland
- Sadhbh O'Connor
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Department of Medicine, University Hospital Galway, Galway, Ireland
- Aoife Murray
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Terri McVeigh
- Cancer Genetics Unit, The Royal Marsden National Health Service Foundation Trust, London, United Kingdom
- Jennifer Doran
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Ian McCabe
- Health Innovation Via Engineering Laboratory, Cúram Science Foundation Ireland Research Centre for Medical Devices, University of Galway, Galway, Ireland
- Derek O'Keeffe
- Department of Medicine, University Hospital Galway, Galway, Ireland
- School of Medicine, College of Medicine Nursing and Health Sciences, University of Galway, Galway, Ireland
- Lero, Science Foundation Ireland Centre for Software Research, University of Limerick, Limerick, Ireland
9. Arjomandi Rad A, Subbiah Ponniah H, Shah V, Nanchahal S, Vardanyan R, Miller G, Malawana J. Leading Transformation in Medical Education Through Extended Reality. Advances in Experimental Medicine and Biology 2023;1421:161-173. [PMID: 37524987] [DOI: 10.1007/978-3-031-30379-1_7]
Abstract
Extended reality (XR) has developed exponentially over the past decades to incorporate technology whereby users can visualise, explore, and interact with 3-dimensional computer-generated environments, and superimpose virtual reality (VR) content onto real-world environments, thus displaying information and data on various levels of the reality-virtuality continuum. In the context of medicine, VR tools allow for anatomical assessment and diagnosis, surgical training through lifelike procedural simulations, planning of surgeries and biopsies, intraprocedural guidance, and medical education. The following chapter aims to provide an overview of the currently available evidence and perspectives on the application of XR within medical education. It will focus on undergraduate and postgraduate teaching, medical education within Low-Middle Income Countries, key practical steps in implementing a successful XR programme, and the limitations and future of extended reality within medical education.
Affiliation(s)
- Arian Arjomandi Rad
- Medical Sciences Division, University of Oxford, Oxford, UK
- The Healthcare Leadership Academy, London, UK
- Viraj Shah
- Faculty of Medicine, Department of Medicine, Imperial College London, London, UK
- Sukanya Nanchahal
- Faculty of Medicine, Department of Medicine, Imperial College London, London, UK
- Robert Vardanyan
- The Healthcare Leadership Academy, London, UK
- Faculty of Medicine, Department of Medicine, Imperial College London, London, UK
- George Miller
- The Healthcare Leadership Academy, London, UK
- University of Central Lancashire Medical School, Preston, UK
- Johann Malawana
- The Healthcare Leadership Academy, London, UK.
- University of Central Lancashire Medical School, Preston, UK.
10. Platt A, Lutton EJ, Offord E, Bretschneider T. MiCellAnnGELo: annotate microscopy time series of complex cell surfaces with 3D virtual reality. Bioinformatics 2023;39:btad013. [PMID: 36629475] [PMCID: PMC9869652] [DOI: 10.1093/bioinformatics/btad013]
Abstract
SUMMARY Advances in 3D live cell microscopy are enabling high-resolution capture of previously unobserved processes. Unleashing the power of modern machine learning methods to fully benefit from these technologies is, however, frustrated by the difficulty of manually annotating 3D training data. MiCellAnnGELo virtual reality software offers an immersive environment for viewing and interacting with 4D microscopy data, including efficient tools for annotation. We present tools for labelling cell surfaces with a wide range of applications, including cell motility, endocytosis and transmembrane signalling. AVAILABILITY AND IMPLEMENTATION MiCellAnnGELo employs the cross-platform (Mac/Unix/Windows) Unity game engine and is available under the MIT licence at https://github.com/CellDynamics/MiCellAnnGELo.git, together with sample data. MiCellAnnGELo can be run in desktop mode on a 2D screen or in 3D using a standard VR headset with a compatible GPU. SUPPLEMENTARY INFORMATION Supplementary data are available at Bioinformatics online.
Affiliation(s)
- Adam Platt
- Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
- E Josiah Lutton
- Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
- Edward Offord
- Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
- Till Bretschneider
- Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
11. Lasso A, Herz C, Nam H, Cianciulli A, Pieper S, Drouin S, Pinter C, St-Onge S, Vigil C, Ching S, Sunderland K, Fichtinger G, Kikinis R, Jolley MA. SlicerHeart: An open-source computing platform for cardiac image analysis and modeling. Front Cardiovasc Med 2022;9:886549. [PMID: 36148054] [PMCID: PMC9485637] [DOI: 10.3389/fcvm.2022.886549]
Abstract
Cardiovascular disease is a significant cause of morbidity and mortality in the developed world. 3D imaging of the heart's structure is critical to the understanding and treatment of cardiovascular disease. However, open-source tools for image analysis of cardiac images, particularly 3D echocardiographic (3DE) data, are limited. We describe the rationale, development, implementation, and application of SlicerHeart, a cardiac-focused toolkit for image analysis built upon 3D Slicer, an open-source image computing platform. We designed and implemented multiple Python scripted modules within 3D Slicer to import, register, and view 3DE data, including new code to volume render and crop 3DE. In addition, we developed dedicated workflows for the modeling and quantitative analysis of multi-modality image-derived heart models, including heart valves. Finally, we created and integrated new functionality to facilitate the planning of cardiac interventions and surgery. We demonstrate application of SlicerHeart to a diverse range of cardiovascular modeling and simulation including volume rendering of 3DE images, mitral valve modeling, transcatheter device modeling, and planning of complex surgical intervention such as cardiac baffle creation. SlicerHeart is an evolving open-source image processing platform based on 3D Slicer initiated to support the investigation and treatment of congenital heart disease. The technology in SlicerHeart provides a robust foundation for 3D image-based investigation in cardiovascular medicine.
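For context, SlicerHeart's functionality is delivered as 3D Slicer "scripted modules". The skeleton below shows that general mechanism only; the class and module names are placeholders, not actual SlicerHeart code, and it runs only inside the Slicer application.

```python
# Generic skeleton of a 3D Slicer scripted module (placeholder names).
from slicer.ScriptedLoadableModule import (
    ScriptedLoadableModule, ScriptedLoadableModuleWidget)

class ExampleCardiacModule(ScriptedLoadableModule):
    def __init__(self, parent):
        ScriptedLoadableModule.__init__(self, parent)
        parent.title = "Example Cardiac Module"    # name shown in the module list
        parent.categories = ["Cardiac"]
        parent.contributors = ["Hypothetical Author"]
        parent.helpText = "Illustrative skeleton only, not part of SlicerHeart."

class ExampleCardiacModuleWidget(ScriptedLoadableModuleWidget):
    def setup(self):
        ScriptedLoadableModuleWidget.setup(self)
        # GUI elements (volume selectors, buttons, etc.) would be created here.
```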
Affiliation(s)
- Andras Lasso
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, ON, Canada
- Christian Herz
- Department of Anesthesiology and Critical Care Medicine, Children's Hospital of Philadelphia, Philadelphia, PA, United States
- Hannah Nam
- Department of Anesthesiology and Critical Care Medicine, Children's Hospital of Philadelphia, Philadelphia, PA, United States
- Alana Cianciulli
- Department of Anesthesiology and Critical Care Medicine, Children's Hospital of Philadelphia, Philadelphia, PA, United States
- Simon Drouin
- Software and Information Technology Engineering, École de Technologie Supérieure, Montreal, QC, Canada
- Samuelle St-Onge
- Software and Information Technology Engineering, École de Technologie Supérieure, Montreal, QC, Canada
- Chad Vigil
- Department of Anesthesiology and Critical Care Medicine, Children's Hospital of Philadelphia, Philadelphia, PA, United States
- Stephen Ching
- Department of Anesthesiology and Critical Care Medicine, Children's Hospital of Philadelphia, Philadelphia, PA, United States
- Kyle Sunderland
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, ON, Canada
- Gabor Fichtinger
- Laboratory for Percutaneous Surgery, School of Computing, Queen's University, Kingston, ON, Canada
- Ron Kikinis
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, United States
- Matthew A. Jolley
- Department of Anesthesiology and Critical Care Medicine, Children's Hospital of Philadelphia, Philadelphia, PA, United States; Division of Cardiology, Children's Hospital of Philadelphia, Philadelphia, PA, United States
12. Ghosh RM, Jolley MA, Mascio CE, Chen JM, Fuller S, Rome JJ, Silvestro E, Whitehead KK. Clinical 3D modeling to guide pediatric cardiothoracic surgery and intervention using 3D printed anatomic models, computer aided design and virtual reality. 3D Print Med 2022;8:11. [PMID: 35445896] [PMCID: PMC9027072] [DOI: 10.1186/s41205-022-00137-9]
Abstract
BACKGROUND Surgical and catheter-based interventions for congenital heart disease require precise understanding of complex anatomy. The use of three-dimensional (3D) printing and virtual reality to enhance visuospatial understanding has been well documented, but integration of these methods into routine clinical practice has not been well described. We review the growth and development of a clinical 3D modeling service to inform procedural planning within a high-volume pediatric heart center. METHODS Clinical 3D modeling was performed using cardiac magnetic resonance (CMR) or computed tomography (CT) derived data. Image segmentation and post-processing was performed using FDA-approved software. Patient-specific anatomy was visualized using 3D printed models, digital flat screen models and virtual reality. Surgical repair options were digitally designed using proprietary and open-source computer aided design (CAD) based modeling tools. RESULTS From 2018 to 2020 there were 112 individual 3D modeling cases performed, 16 for educational purposes and 96 clinically utilized for procedural planning. Over the 3-year period, demand for clinical modeling tripled and in 2020, 3D modeling was requested in more than one-quarter of STAT category 3, 4 and 5 cases. The most common indications for modeling were complex biventricular repair (n = 30, 31%) and repair of multiple ventricular septal defects (VSD) (n = 11, 12%). CONCLUSIONS Using a multidisciplinary approach, clinical application of 3D modeling can be seamlessly integrated into pre-procedural care for patients with congenital heart disease. Rapid expansion and increased demand for utilization of these tools within a high-volume center demonstrate the high value conferred on these techniques by surgeons and interventionalists alike.
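The paper's models were segmented with FDA-approved software; purely as a generic open-source illustration of the final label-map-to-printable-mesh step, the VTK sketch below extracts one label from a segmentation, smooths it lightly and writes an STL file. File names and the label value are hypothetical.

```python
import vtk

# Read a segmented label map (hypothetical file name)
reader = vtk.vtkNIFTIImageReader()
reader.SetFileName("heart_segmentation.nii.gz")

# Extract the surface of one label (label value 1 assumed)
surface = vtk.vtkDiscreteMarchingCubes()
surface.SetInputConnection(reader.GetOutputPort())
surface.GenerateValues(1, 1, 1)

# Light smoothing so the mesh prints cleanly without erasing anatomy
smoother = vtk.vtkWindowedSincPolyDataFilter()
smoother.SetInputConnection(surface.GetOutputPort())
smoother.SetNumberOfIterations(20)
smoother.SetPassBand(0.1)
smoother.NonManifoldSmoothingOn()
smoother.NormalizeCoordinatesOn()

# Write an STL suitable for slicing and 3D printing
writer = vtk.vtkSTLWriter()
writer.SetInputConnection(smoother.GetOutputPort())
writer.SetFileName("heart_model.stl")
writer.Write()
```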
Affiliation(s)
- Reena M Ghosh
- Division of Pediatric Cardiology, Children's Hospital of Philadelphia, 3401 Civic Center Blvd, Philadelphia, 19104, PA, USA.
- Matthew A Jolley
- Division of Pediatric Cardiology, Children's Hospital of Philadelphia, 3401 Civic Center Blvd, Philadelphia, 19104, PA, USA.,Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Christopher E Mascio
- Division of Cardiothoracic Surgery, Children's Hospital of Philadelphia, Philadelphia, PA, USA.,Division of Cardiovascular and Thoracic Surgery, West Virginia University School of Medicine, Morgantown, WV, USA
- Jonathan M Chen
- Division of Cardiothoracic Surgery, Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Stephanie Fuller
- Division of Cardiothoracic Surgery, Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Jonathan J Rome
- Division of Pediatric Cardiology, Children's Hospital of Philadelphia, 3401 Civic Center Blvd, Philadelphia, 19104, PA, USA
- Elizabeth Silvestro
- Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Kevin K Whitehead
- Division of Pediatric Cardiology, Children's Hospital of Philadelphia, 3401 Civic Center Blvd, Philadelphia, 19104, PA, USA
13. Guérinot C, Marcon V, Godard C, Blanc T, Verdier H, Planchon G, Raimondi F, Boddaert N, Alonso M, Sailor K, Lledo PM, Hajj B, El Beheiry M, Masson JB. New Approach to Accelerated Image Annotation by Leveraging Virtual Reality and Cloud Computing. Frontiers in Bioinformatics 2022;1:777101. [PMID: 36303792] [PMCID: PMC9580868] [DOI: 10.3389/fbinf.2021.777101]
Abstract
Three-dimensional imaging is at the core of medical imaging and is becoming a standard in biological research. As a result, there is an increasing need to visualize, analyze and interact with data in a natural three-dimensional context. By combining stereoscopy and motion tracking, commercial virtual reality (VR) headsets provide a solution to this critical visualization challenge by allowing users to view volumetric image stacks in a highly intuitive fashion. While optimizing the visualization and interaction process in VR remains an active topic, one of the most pressing issues is how to utilize VR for annotation and analysis of data. Annotating data is often a required step for training machine learning algorithms, and enhancing the ability to annotate complex three-dimensional data is particularly valuable in biological research, where newly acquired data may come in limited quantities. Similarly, medical data annotation is often time-consuming and requires expert knowledge to identify structures of interest correctly. Moreover, simultaneous data analysis and visualization in VR is computationally demanding. Here, we introduce a new procedure to visualize, interact, annotate and analyze data by combining VR with cloud computing. VR is leveraged to provide natural interactions with volumetric representations of experimental imaging data. In parallel, cloud computing performs costly computations to accelerate the data annotation with minimal input required from the user. We demonstrate multiple proof-of-concept applications of our approach on volumetric fluorescent microscopy images of mouse neurons and tumor or organ annotations in medical images.
Affiliation(s)
- Corentin Guérinot
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Sorbonne Université, Collège Doctoral, Paris, France
- Valentin Marcon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Charlotte Godard
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
- Thomas Blanc
- Sorbonne Université, Collège Doctoral, Paris, France
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Hippolyte Verdier
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Histopathology and Bio-Imaging Group, Sanofi R&D, Vitry-Sur-Seine, France
- Université de Paris, UFR de Physique, Paris, France
- Guillaume Planchon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Francesca Raimondi
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Unité Médicochirurgicale de Cardiologie Congénitale et Pédiatrique, Centre de Référence des Malformations Cardiaques Congénitales Complexes M3C, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
- Nathalie Boddaert
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
- Mariana Alonso
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Kurt Sailor
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Pierre-Marie Lledo
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Bassam Hajj
- Sorbonne Université, Collège Doctoral, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
- Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
14. Filipov I, Chirila L, Sandulescu M, Cristache CM. A Predictable Approach of a Rare and Frequently Misdiagnosed Entity: Laryngeal Nerve Schwannoma. Healthcare (Basel) 2021;10:59. [PMID: 35052223] [PMCID: PMC8775822] [DOI: 10.3390/healthcare10010059]
Abstract
(1) Background: Schwannoma, a mesenchymal neoplasm derived from the Schwann cells that line peripheral nerve sheaths, is challenging to diagnose because of its non-specific medical history and clinical examination. Nowadays, virtual reality (VR) is increasingly used to enhance diagnosis and preoperative planning of surgical procedures. With VR, the surgeon can interact, before any surgery, with a virtual environment that is completely generated by a computer, offering a real experience inside a virtual 3D model. (2) Methods and Results: The aim of the present paper was to present a case of surgical removal of a schwannoma originating from the fibers of the superior laryngeal nerve, performed in a predictable and minimally invasive fashion using VR for diagnosis and surgical procedure planning. (3) Conclusions: The current clinical report draws attention to the importance of including schwannoma in the differential diagnosis of a swelling in the anterior cervical region, mainly when a nonspecific radiological appearance is noticed, even with the use of multiple imaging modalities. Virtual reality can increase the predictability and success rate of the surgical procedure, while also serving as a good tool for communication with the patient.
Affiliation(s)
- Iulian Filipov
- Department of Maxillofacial Surgery, “Queen Maria” Military Emergency Hospital, 9 Pietii Str., 500007 Brasov, Romania;
- Department of Dental Techniques, “Carol Davila” University of Medicine and Pharmacy, 8, Eroilor Sanitari Blvd., 050474 Bucharest, Romania
- Lucian Chirila
- Department of Oral and Maxillofacial Surgery, “Carol Davila” University of Medicine and Pharmacy, 19 Plevnei Ave., 010221 Bucharest, Romania
- Mihai Sandulescu
- Department of Implant Prosthetic Therapy, “Carol Davila” University of Medicine and Pharmacy, 19 Plevnei Ave., 010221 Bucharest, Romania;
- Corina Marilena Cristache
- Department of Dental Techniques, “Carol Davila” University of Medicine and Pharmacy, 8, Eroilor Sanitari Blvd., 050474 Bucharest, Romania
15. Ito T, Kawashima Y, Yamazaki A, Tsutsumi T. Application of a virtual and mixed reality-navigation system using commercially available devices to the lateral temporal bone resection. Ann Med Surg (Lond) 2021;72:103063. [PMID: 34824840] [PMCID: PMC8604738] [DOI: 10.1016/j.amsu.2021.103063]
Abstract
Background Lateral temporal bone resection (LTBR) is performed for stage T1-2 malignant tumors of the external ear and requires spatial anatomical knowledge of a rare surgical field. Objective This paper presents a novel virtual reality (VR) based surgical simulation and navigation system, using only commercially available display devices and online software, to assist in understanding the anatomy pre- and intraoperatively. Results and conclusion A VR model created with 3D Slicer modules and visualized on a head-mounted display enabled users to simulate and learn the surgical techniques of a rare surgical case. A 3D hologram viewed through HoloLens assisted the surgeon in comprehending the spatial relationship between crucial vital structures and the pathological lesion during the operation. This platform does not require users to possess specific programming skills or knowledge and is therefore applicable in daily clinical use. LTBR is the standard operative procedure for early-stage malignant tumors of the external ear canal; however, many surgeons lack the opportunity to learn the surgical technique because of its rarity. We report the use of a novel VR-based surgical simulation and navigation system for studying the anatomy and the operative steps in LTBR. 3D holograms on a head-mounted display will provide a revolutionary tool for assisting surgical planning, intraoperative referencing, and navigation in otologic and skull base surgery.
Affiliation(s)
- Taku Ito
- Department of Otolaryngology, Tokyo Medical and Dental University, Tokyo, Japan
- Yoshiyuki Kawashima
- Department of Otolaryngology, Tokyo Medical and Dental University, Tokyo, Japan
- Ayame Yamazaki
- Department of Otolaryngology, Tokyo Medical and Dental University, Tokyo, Japan
- Takeshi Tsutsumi
- Department of Otolaryngology, Tokyo Medical and Dental University, Tokyo, Japan
16. Ghosh RM, Mascio CE, Rome JJ, Jolley MA, Whitehead KK. Use of Virtual Reality for Hybrid Closure of Multiple Ventricular Septal Defects. JACC Case Rep 2021;3:1579-1583. [PMID: 34729504] [PMCID: PMC8543163] [DOI: 10.1016/j.jaccas.2021.07.033]
Abstract
A 28-month-old girl with multiple ventricular septal defects previously underwent surgical and transcatheter attempts at repair. Three-dimensional models were created from cardiac magnetic resonance–derived images. Viewing the models in virtual reality allowed the team to precisely locate the defects and decide on a hybrid transcatheter and surgical approach to ensure successful repair. (Level of Difficulty: Advanced.)
Affiliation(s)
- Reena M. Ghosh
- Division of Cardiology, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Address for correspondence: Dr Reena M. Ghosh, Division of Cardiology, Children’s Hospital of Philadelphia, 3401 Civic Center Boulevard, Philadelphia, Pennsylvania 19104, USA
- Christopher E. Mascio
- Division of Cardiothoracic Surgery, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Jonathan J. Rome
- Division of Cardiology, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Matthew A. Jolley
- Division of Cardiology, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Department of Anesthesiology and Critical Care Medicine, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Kevin K. Whitehead
- Division of Cardiology, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
17. Deng S, Wheeler G, Toussaint N, Munroe L, Bhattacharya S, Sajith G, Lin E, Singh E, Chu KYK, Kabir S, Pushparajah K, Simpson JM, Schnabel JA, Gomez A. A Virtual Reality System for Improved Image-Based Planning of Complex Cardiac Procedures. J Imaging 2021;7:151. [PMID: 34460787] [PMCID: PMC8404926] [DOI: 10.3390/jimaging7080151]
Abstract
The intricate nature of congenital heart disease requires understanding of the complex, patient-specific three-dimensional dynamic anatomy of the heart from imaging data, such as three-dimensional echocardiography, to achieve successful outcomes from surgical and interventional procedures. Conventional clinical systems use flat screens, so the display remains two-dimensional, which undermines full understanding of the three-dimensional dynamic data. Additionally, controlling three-dimensional visualisation with two-dimensional tools is often difficult, so it tends to be used only by imaging specialists. In this paper, we describe a virtual reality system for immersive surgery planning using dynamic three-dimensional echocardiography, which enables fast prototyping for visualisation such as volume rendering, multiplanar reformatting and flow visualisation, and advanced interaction such as three-dimensional cropping, windowing, measurement, haptic feedback, automatic image orientation and multiuser interactions. The available features were evaluated by imaging and nonimaging clinicians, showing that the virtual reality system can help improve the understanding and communication of three-dimensional echocardiography imaging and potentially benefit congenital heart disease treatment.
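One of the listed visualisation features, multiplanar reformatting, can be sketched outside VR in a few lines of VTK. The example below reslices a synthetic volume along an arbitrary oblique plane; it is a generic illustration and not the authors' implementation.

```python
import vtk

# Synthetic 3D image stands in for a 3D echocardiography volume
source = vtk.vtkRTAnalyticSource()
source.Update()

# Reslice axes define the oblique cut plane (values chosen arbitrarily)
axes = vtk.vtkMatrix4x4()
axes.DeepCopy((1, 0, 0, 0,
               0, 0.707, -0.707, 0,
               0, 0.707,  0.707, 0,
               0, 0, 0, 1))

reslice = vtk.vtkImageReslice()
reslice.SetInputConnection(source.GetOutputPort())
reslice.SetResliceAxes(axes)
reslice.SetOutputDimensionality(2)      # produce a single 2D reformatted slice
reslice.SetInterpolationModeToLinear()
reslice.Update()

mpr_slice = reslice.GetOutput()         # 2D vtkImageData of the oblique plane
```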
Affiliation(s)
- Shujie Deng
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Gavin Wheeler
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Nicolas Toussaint
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Lindsay Munroe
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Suryava Bhattacharya
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Gina Sajith
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Ei Lin
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Eeshar Singh
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Ka Yee Kelly Chu
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| | - Saleha Kabir
- Department of Congenital Heart Disease, Evelina London Children’s Hospital, Guy’s and St Thomas’ National Health Service Foundation Trust, London SE1 7EH, UK;
| | - Kuberan Pushparajah
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
- Department of Congenital Heart Disease, Evelina London Children’s Hospital, Guy’s and St Thomas’ National Health Service Foundation Trust, London SE1 7EH, UK;
| | - John M. Simpson
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
- Department of Congenital Heart Disease, Evelina London Children’s Hospital, Guy’s and St Thomas’ National Health Service Foundation Trust, London SE1 7EH, UK;
| | - Julia A. Schnabel
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
- Department of Informatics, Technische Universität München, 85748 Garching, Germany
- Helmholtz Zentrum München—German Research Center for Environmental Health, 85764 Neuherberg, Germany
| | - Alberto Gomez
- School of Biomedical Engineering & Imaging Sciences, King’s College London, London SE1 7EU, UK; (S.D.); (G.W.); (N.T.); (L.M.); (S.B.); (G.S.); (E.L.); (E.S.); (K.Y.K.C.); (K.P.); (J.M.S.); (J.A.S.)
| |
|
18
|
Vigil C, Lasso A, Ghosh RM, Pinter C, Cianciulli A, Nam HH, Abid A, Herz C, Mascio CE, Chen J, Fuller S, Whitehead K, Jolley MA. Modeling Tool for Rapid Virtual Planning of the Intracardiac Baffle in Double-Outlet Right Ventricle. Ann Thorac Surg 2021; 111:2078-2083. [PMID: 33689734 DOI: 10.1016/j.athoracsur.2021.02.058] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/19/2021] [Accepted: 02/13/2021] [Indexed: 11/29/2022]
Abstract
PURPOSE Biventricular repair of double-outlet right ventricle (DORV) necessitates the creation of a complex intracardiac baffle. Creation of the optimal baffle design and placement thereof can be challenging to conceptualize, even with 2-dimensional and 3-dimensional images. This report describes a recently developed methodology for creating virtual baffles to inform intraoperative repair. DESCRIPTION A total of 3 heart models of DORV were created from cardiac magnetic resonance images. Baffles were created and visualized using custom software. EVALUATION This report demonstrates application of the tool to virtual planning of the baffle for repair of DORV in 3 cases. Models were examined by a multidisciplinary team, on screen and in virtual reality. Baffles could be rapidly created and revised to facilitate planning of the surgical procedure. CONCLUSIONS Virtual modeling of the baffle pathway by using cardiac magnetic resonance, creation of physical templates for the baffle, and visualization in virtual reality are feasible and may be beneficial for preoperative planning of complex biventricular repairs in DORV. Further work is needed to demonstrate clinical benefit or improvement in outcomes.
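As a rough illustration of what virtual baffle creation involves, the sketch below fits a simple triangulated patch through a few rim points using VTK. It is a simplified stand-in, not the custom planning tool described in the report; the coordinates and output file name are made up for illustration.

```python
# Simplified stand-in (not the report's custom baffle-planning software): build a
# flat triangulated patch spanning a handful of "rim" points that a planner might
# place around a defect, to illustrate rapid prototyping of a virtual baffle surface.
import vtk

rim_points = [          # hypothetical coordinates in millimetres
    (0.0, 0.0, 0.0),
    (10.0, 0.0, 1.0),
    (12.0, 8.0, 0.5),
    (6.0, 12.0, -0.5),
    (-2.0, 7.0, 0.0),
]

points = vtk.vtkPoints()
for x, y, z in rim_points:
    points.InsertNextPoint(x, y, z)

polydata = vtk.vtkPolyData()
polydata.SetPoints(points)

# vtkDelaunay2D projects the points onto the XY plane and triangulates them,
# producing a flat patch; a real planning tool would use a smoother, editable surface.
delaunay = vtk.vtkDelaunay2D()
delaunay.SetInputData(polydata)
delaunay.Update()

writer = vtk.vtkSTLWriter()
writer.SetFileName("baffle_patch.stl")  # hypothetical output for review or 3D printing
writer.SetInputConnection(delaunay.GetOutputPort())
writer.Write()
```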
Affiliation(s)
- Chad Vigil
- Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Andras Lasso
- School of Computing, Queen's University, Kingston, Ontario, Canada
| | - Reena M Ghosh
- Division of Pediatric Cardiology, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Alana Cianciulli
- Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Hannah H Nam
- Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Ashraful Abid
- Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Christian Herz
- Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Christopher E Mascio
- Division of Pediatric Cardiac Surgery, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Jonathan Chen
- Division of Pediatric Cardiac Surgery, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Stephanie Fuller
- Division of Pediatric Cardiac Surgery, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Kevin Whitehead
- Division of Pediatric Cardiology, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
| | - Matthew A Jolley
- Department of Anesthesia and Critical Care, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania; Division of Pediatric Cardiology, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania.
| |
|
19
|
Narang A, Hitschrich N, Mor-Avi V, Schreckenberg M, Schummers G, Tiemann K, Hitschrich D, Sodian R, Addetia K, Lang RM, Mumm B. Virtual Reality Analysis of Three-Dimensional Echocardiographic and Cardiac Computed Tomographic Data Sets. J Am Soc Echocardiogr 2020; 33:1306-1315. [DOI: 10.1016/j.echo.2020.06.018] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/06/2020] [Revised: 06/20/2020] [Accepted: 06/22/2020] [Indexed: 12/13/2022]
|