1
Żydowicz WM, Skokowski J, Marano L, Polom K. Current Trends and Beyond Conventional Approaches: Advancements in Breast Cancer Surgery through Three-Dimensional Imaging, Virtual Reality, Augmented Reality, and the Emerging Metaverse. J Clin Med 2024; 13:915. [PMID: 38337610 PMCID: PMC10856583 DOI: 10.3390/jcm13030915]
Abstract
Breast cancer stands as the most prevalent cancer globally, necessitating comprehensive care. A multidisciplinary approach proves crucial for precise diagnosis and treatment, ultimately leading to effective disease management. While surgical interventions continue to evolve and remain integral for curative treatment, imaging assumes a fundamental role in breast cancer detection. Advanced imaging techniques not only facilitate improved diagnosis but also contribute significantly to the overall enhancement of breast cancer management. This review article aims to provide an overview of innovative technologies such as virtual reality, augmented reality, and three-dimensional imaging, utilized in the medical field to elevate the diagnosis and treatment of breast cancer. Additionally, the article delves into an emerging technology known as the metaverse, still under development. Through the analysis of impactful research and comparison of their findings, this study offers valuable insights into the advantages of each innovative technique. The goal is to provide physicians, surgeons, and radiologists with information on how to enhance breast cancer management.
Affiliation(s)
- Weronika Magdalena Żydowicz
- Department of General Surgery and Surgical Oncology, “Saint Wojciech” Hospital, “Nicolaus Copernicus” Health Center, Jana Pawła II 50, 80-462 Gdańsk, Poland
- Jaroslaw Skokowski
- Department of General Surgery and Surgical Oncology, “Saint Wojciech” Hospital, “Nicolaus Copernicus” Health Center, Jana Pawła II 50, 80-462 Gdańsk, Poland
- Department of Medicine, Academy of Applied Medical and Social Sciences, Akademia Medycznych I Spolecznych Nauk Stosowanych (AMiSNS), 2 Lotnicza Street, 82-300 Elbląg, Poland
- Luigi Marano
- Department of General Surgery and Surgical Oncology, “Saint Wojciech” Hospital, “Nicolaus Copernicus” Health Center, Jana Pawła II 50, 80-462 Gdańsk, Poland
- Department of Medicine, Academy of Applied Medical and Social Sciences, Akademia Medycznych I Spolecznych Nauk Stosowanych (AMiSNS), 2 Lotnicza Street, 82-300 Elbląg, Poland
- Karol Polom
- Department of Medicine, Academy of Applied Medical and Social Sciences, Akademia Medycznych I Spolecznych Nauk Stosowanych (AMiSNS), 2 Lotnicza Street, 82-300 Elbląg, Poland
- Department of Gastrointestinal Surgical Oncology, Greater Poland Cancer Centre, Garbary 15, 61-866 Poznan, Poland
2
Zhang J, Lu V, Khanduja V. The impact of extended reality on surgery: a scoping review. Int Orthop 2023; 47:611-621. [PMID: 36645474 PMCID: PMC9841146 DOI: 10.1007/s00264-022-05663-z]
Abstract
PURPOSE Extended reality (XR) is defined as a spectrum of technologies that range from purely virtual environments to enhanced real-world environments. In the past two decades, XR-assisted surgery has seen an increase in use as well as in research and development. This scoping review aims to map out the historical trends in these technologies and their future prospects, with an emphasis on reported outcomes and the ethical considerations surrounding their use. METHODS A systematic search of PubMed, Scopus, and Embase for literature related to XR-assisted surgery and telesurgery was performed following the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for scoping reviews (PRISMA-ScR) guidelines. Primary studies and peer-reviewed articles that described procedures performed by surgeons on human subjects and cadavers, as well as studies describing general surgical education, were included. Non-surgical procedures, bedside procedures, veterinary procedures, procedures performed by medical students, and review articles were excluded. Studies were classified into the following categories: impact on surgery (pre-operative planning and intra-operative navigation/guidance), impact on the patient (pain and anxiety), and impact on the surgeon (surgical training and surgeon confidence). RESULTS One hundred and sixty-eight studies were included for analysis. Thirty-one studies investigating the use of XR for pre-operative planning concluded that virtual reality (VR) enhanced the surgeon's spatial awareness of important anatomical landmarks, leading to shorter operating sessions and reduced surgical insult. Forty-nine studies explored the use of XR for intra-operative planning and noted that augmented reality (AR) headsets highlight key landmarks, as well as important structures to avoid, which lowers the chance of accidental surgical trauma. Eleven studies investigated patients' pain and noted that VR is able to generate a meditative state, which benefits patients by reducing the need for analgesics. Ten studies commented on patient anxiety, suggesting that VR is unsuccessful at altering patients' physiological parameters such as mean arterial blood pressure or cortisol levels. Sixty studies investigated surgical training, whilst seven studies suggested that the use of XR-assisted technology increased surgeon confidence. CONCLUSION The growth of XR-assisted surgery is driven by advances in hardware and software. Whilst augmented virtuality and mixed reality remain underexplored, the use of VR is growing, especially in the fields of surgical training and pre-operative planning. Real-time intra-operative guidance is key to surgical precision and is increasingly supplemented with AR technology. XR-assisted surgery is likely to take on a greater role in the near future, given the effect of COVID-19 in limiting physical presence and the increasing complexity of surgical procedures.
Affiliation(s)
- James Zhang
- School of Clinical Medicine, University of Cambridge, Cambridge, CB2 0SP, UK
- Victor Lu
- School of Clinical Medicine, University of Cambridge, Cambridge, CB2 0SP, UK
- Vikas Khanduja
- Young Adult Hip Service, Department of Trauma and Orthopaedics, Addenbrooke’s Hospital, Cambridge University Hospital, Hills Road, Cambridge, CB2 0QQ, UK
3
Guérinot C, Marcon V, Godard C, Blanc T, Verdier H, Planchon G, Raimondi F, Boddaert N, Alonso M, Sailor K, Lledo PM, Hajj B, El Beheiry M, Masson JB. New Approach to Accelerated Image Annotation by Leveraging Virtual Reality and Cloud Computing. Front Bioinform 2022; 1:777101. [PMID: 36303792 PMCID: PMC9580868 DOI: 10.3389/fbinf.2021.777101]
Abstract
Three-dimensional imaging is at the core of medical imaging and is becoming a standard in biological research. As a result, there is an increasing need to visualize, analyze and interact with data in a natural three-dimensional context. By combining stereoscopy and motion tracking, commercial virtual reality (VR) headsets provide a solution to this critical visualization challenge by allowing users to view volumetric image stacks in a highly intuitive fashion. While optimizing the visualization and interaction process in VR remains an active topic, one of the most pressing issues is how to utilize VR for the annotation and analysis of data. Annotating data is often a required step for training machine learning algorithms; in biological research, for example, the ability to annotate complex three-dimensional data is especially important because newly acquired data may come in limited quantities. Similarly, medical data annotation is often time-consuming and requires expert knowledge to identify structures of interest correctly. Moreover, simultaneous data analysis and visualization in VR is computationally demanding. Here, we introduce a new procedure to visualize, interact with, annotate and analyze data by combining VR with cloud computing. VR is leveraged to provide natural interactions with volumetric representations of experimental imaging data. In parallel, cloud computing performs costly computations to accelerate data annotation with minimal input required from the user. We demonstrate multiple proof-of-concept applications of our approach on volumetric fluorescent microscopy images of mouse neurons and tumor or organ annotations in medical images.
Affiliation(s)
- Corentin Guérinot
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Sorbonne Université, Collège Doctoral, Paris, France
- Valentin Marcon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Charlotte Godard
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
- Thomas Blanc
- Sorbonne Université, Collège Doctoral, Paris, France
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
- Hippolyte Verdier
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Histopathology and Bio-Imaging Group, Sanofi R&D, Vitry-Sur-Seine, France
- Université de Paris, UFR de Physique, Paris, France
- Guillaume Planchon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Francesca Raimondi
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Unité Médicochirurgicale de Cardiologie Congénitale et Pédiatrique, Centre de Référence des Malformations Cardiaques Congénitales Complexes M3C, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
- Nathalie Boddaert
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
- Mariana Alonso
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Kurt Sailor
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Pierre-Marie Lledo
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Bassam Hajj
- Sorbonne Université, Collège Doctoral, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
- Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
4
Zhu T, Jiang S, Yang Z, Zhou Z, Li Y, Ma S, Zhuo J. A neuroendoscopic navigation system based on dual-mode augmented reality for minimally invasive surgical treatment of hypertensive intracerebral hemorrhage. Comput Biol Med 2022; 140:105091. [PMID: 34872012 DOI: 10.1016/j.compbiomed.2021.105091]
Abstract
BACKGROUND AND OBJECTIVE Hypertensive intracerebral hemorrhage is characterized by high rates of morbidity, mortality, disability and recurrence. Neuroendoscopy has been utilized for treatment as an advanced technology. However, traditional neuroendoscopy allows professionals to see only tissue surfaces, and the field of vision is limited, so it cannot provide spatial guidance. In this study, an AR-based neuroendoscopic navigation system is proposed to assist surgeons in locating and clearing hematomas. METHODS The neuroendoscope is registered through a vector closed-loop algorithm. A single-shot method is designed to register medical images with patients precisely. Real-time AR is realized based on video stream fusion. Dual-mode AR navigation is proposed to provide comprehensive guidance from catheter implantation to hematoma removal. A series of experiments was designed to validate the accuracy and significance of this system. RESULTS The average root mean square error of the registration between medical images and patients is 0.784 mm, and the variance is 0.1426 mm. The pixel mismatching degrees are less than 1% in different AR modes. In catheter implantation experiments, the average distance error is 1.28 mm with a variance of 0.43 mm, while the average angle error is 1.34° with a variance of 0.45°. Comparative experiments were also conducted to evaluate the feasibility of this system. CONCLUSION This system can provide stereo images with depth information, fused with the patient, to guide surgeons in locating targets and removing hematomas. It has been validated to have high accuracy and feasibility.
Affiliation(s)
- Tao Zhu
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Shan Jiang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Zhiyong Yang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Zeyang Zhou
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Yuhua Li
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Shixing Ma
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Jie Zhuo
- Department of Neurosurgery, Tianjin Huanhu Hospital, Tianjin, 300200, China
5
El Beheiry M, Gaillard T, Girard N, Darrigues L, Osdoit M, Feron JG, Sabaila A, Laas E, Fourchotte V, Laki F, Lecuru F, Couturaud B, Binder JP, Masson JB, Reyal F, Malhaire C. Breast Magnetic Resonance Image Analysis for Surgeons Using Virtual Reality: A Comparative Study. JCO Clin Cancer Inform 2021; 5:1127-1133. [PMID: 34767435 DOI: 10.1200/cci.21.00048]
Abstract
PURPOSE The treatment of breast cancer, the leading cause of cancer and cancer mortality among women worldwide, is mainly based on surgery. In this study, we describe the use of a virtual reality (VR)-based medical image visualization tool, entitled DIVA, in the context of breast cancer tumor localization among surgeons. The aim of this study was to evaluate the speed and accuracy of surgeons using DIVA for the analysis of breast magnetic resonance imaging (MRI) scans relative to standard slice-based visualization tools. MATERIALS AND METHODS In our study, residents and practicing surgeons used two breast MRI reading modalities: the common slice-based radiology interface and the DIVA system in its VR mode. The measured metrics were compared against postoperative anatomical-pathologic reports. RESULTS Eighteen breast surgeons from the Institut Curie performed all the analyses presented. MRI analysis time was significantly lower with the DIVA system than with slice-based visualization for residents, practitioners, and consequently the entire group (P < .001). The accuracy of determining which breast contained the lesion significantly increased with DIVA for residents (P = .003) and practitioners (P = .04). There was little difference between DIVA and slice-based visualization in determining the number of lesions. The accuracy of quadrant determination was significantly improved by DIVA for practicing surgeons (P = .01) but not significantly for residents (P = .49). CONCLUSION This study indicates that VR visualization of medical images systematically improves surgeons' analysis of preoperative breast MRI scans across several different metrics, irrespective of surgeon seniority.
Affiliation(s)
- Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) and Neuroscience Department CNRS UMR 3571, Institut Pasteur and CNRS, Paris, France
- Thomas Gaillard
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Noémie Girard
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Lauren Darrigues
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Marie Osdoit
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Anne Sabaila
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Enora Laas
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Fatima Laki
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Fabrice Lecuru
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Benoit Couturaud
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) and Neuroscience Department CNRS UMR 3571, Institut Pasteur and CNRS, Paris, France
- Fabien Reyal
- Surgery Department, Institut Curie, PSL Research University, Paris, France
- U932, Immunity and Cancer, INSERM, Institut Curie, Paris, France
- Caroline Malhaire
- Department of Medical Imaging, Institut Curie, PSL Research University, Paris, France
- Institut Curie, INSERM, LITO Laboratory, Orsay, France
6
Zhou Z, Yang Z, Jiang S, Zhang F, Yan H. Design and validation of a surgical navigation system for brachytherapy based on mixed reality. Med Phys 2019; 46:3709-3718. [PMID: 31169914 DOI: 10.1002/mp.13645]
Abstract
PURPOSE Accurate needle positioning is vitally important in low-dose-rate seed implantation brachytherapy. This paper aims to implement a mixed reality navigation system to assist with needle placement during I125 seed implantation brachytherapy for thoracoabdominal tumors, and to validate the accuracy and quality of this type of method. METHODS With the surgical navigation system based on mixed reality, a novel modified multi-information fusion method achieved the fusion of virtual organs and the preoperative plan with the real patient, as well as real-time tracking of surgical tools. Personalized image recognition and pose estimation were used to track needle punctures in real time and to perform registration processes. After a one-time registration with a hexagonal prism tracker using an iterative closest point algorithm, all information, including medical images and volume renderings of organs, needles, and seeds, was precisely merged with the patient. Doctors were able to observe the tumor target and visualize the preoperative plan. This system was validated in both phantom and animal experiments. The accuracy of the system was validated by calculating the positional and rotational error of each needle insertion. The accuracy of implantation of each seed was determined in an animal experiment to test accuracy in low-dose-rate brachytherapy. The efficiency of the system was also validated through time consumption assessments. RESULTS In the phantom experiment, the average error of the needle locations was 0.664 mm and the angle error was 4.74°; the average time consumption was 16.1 min with six needles inserted. In the animal experiment, the accuracy of needle insertion was 1.617 mm, the angle error was 5.574°, and the average error of the seed positions was 1.925 mm. CONCLUSIONS This paper describes the design and experimental validation of a novel surgical navigation system based on mixed reality for I125 seed brachytherapy of thoracoabdominal tumors. The system was validated through a series of phantom and animal experiments. Compared with traditional image-guided systems, the procedure presented here is convenient, displays clinically acceptable accuracy and reduces the number of CT scans, allowing doctors to perform surgery based on a visualized plan. All experimental results indicate that the procedure is ready to be applied in further clinical studies.
Affiliation(s)
- Zeyang Zhou
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Zhiyong Yang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Shan Jiang
- School of Mechanical Engineering, Tianjin University, Tianjin, 300350, China
- Centre for Advanced Mechanisms and Robotics, Tianjin University, Tianjin, 300350, China
- Fujun Zhang
- Sun Yat-sen University Cancer Center, Guangzhou, 510060, China
- State Key Laboratory of Oncology in South China, Guangzhou, 510060, China
- Collaborative Innovation Center for Cancer Medicine, Guangzhou, 510060, China
- Huzheng Yan
- Sun Yat-sen University Cancer Center, Guangzhou, 510060, China
- State Key Laboratory of Oncology in South China, Guangzhou, 510060, China
- Collaborative Innovation Center for Cancer Medicine, Guangzhou, 510060, China
7
Navigated Breast Tumor Excision Using Electromagnetically Tracked Ultrasound and Surgical Instruments. IEEE Trans Biomed Eng 2016; 63:600-6. [DOI: 10.1109/tbme.2015.2466591]
8
Application of desorption electrospray ionization mass spectrometry imaging in breast cancer margin analysis. Proc Natl Acad Sci U S A 2014; 111:15184-9. [PMID: 25246570 DOI: 10.1073/pnas.1408129111]
Abstract
Distinguishing tumor from normal glandular breast tissue is an important step in breast-conserving surgery. Because this distinction can be challenging in the operative setting, up to 40% of patients require an additional operation when traditional approaches are used. Here, we present a proof-of-concept study to determine the feasibility of using desorption electrospray ionization mass spectrometry imaging (DESI-MSI) for identifying and differentiating tumor from normal breast tissue. We show that tumor margins can be identified using the spatial distributions and varying intensities of different lipids. Several fatty acids, including oleic acid, were more abundant in the cancerous tissue than in normal tissues. The cancer margins delineated by the molecular images from DESI-MSI were consistent with those margins obtained from histological staining. Our findings prove the feasibility of classifying cancerous and normal breast tissues using ambient ionization MSI. The results suggest that an MS-based method could be developed for the rapid intraoperative detection of residual cancer tissue during breast-conserving surgery.
9
Kersten-Oertel M, Jannin P, Collins DL. The state of the art of visualization in mixed reality image guided surgery. Comput Med Imaging Graph 2013; 37:98-112. [PMID: 23490236 DOI: 10.1016/j.compmedimag.2013.01.009]
Abstract
This paper presents a review of the state of the art of visualization in mixed reality image guided surgery (IGS). We used the DVV (data, visualization processing, view) taxonomy to classify a large unbiased selection of publications in the field. The goal of this work was not only to give an overview of current visualization methods and techniques in IGS but more importantly to analyze the current trends and solutions used in the domain. In surveying the current landscape of mixed reality IGS systems, we identified a strong need to assess which of the many possible data sets should be visualized at particular surgical steps, to focus on novel visualization processing techniques and interface solutions, and to evaluate new systems.
Affiliation(s)
- Marta Kersten-Oertel
- Department of Biomedical Engineering, McGill University, McConnell Brain Imaging Center, Montreal Neurological Institute, Montréal, Canada
10
Image-guided laparoscopic surgery in an open MRI operating theater. Surg Endosc 2013; 27:2178-84. [DOI: 10.1007/s00464-012-2737-y]
11
Larrieux G, Cupp JA, Liao J, Scott-Conner CEH, Weigel RJ. Effect of introducing hematoma ultrasound-guided lumpectomy in a surgical practice. J Am Coll Surg 2012; 215:237-43. [PMID: 22632911 DOI: 10.1016/j.jamcollsurg.2012.04.018]
Abstract
BACKGROUND Preoperative needle localization (NL) is the gold standard for lumpectomy of nonpalpable breast cancer. Hematoma ultrasound-guided (HUG) lumpectomy can offer several advantages. The purpose of this study was to compare the use of HUG with NL lumpectomy in a single surgical practice. STUDY DESIGN Patients with nonpalpable lesions who underwent NL or HUG lumpectomy from January 2007 to December 2009 by a single surgeon were identified from a breast surgery database. Ease of scheduling, volume excised, re-excision rates, operating room time, and health care charges were the main outcome variables. Univariate and multivariate analyses were performed to compare the 2 groups. RESULTS Lumpectomy was performed in 110 patients: 55 underwent HUG and 55 underwent NL. HUG lumpectomy was associated with a nearly 3-fold increase in the odds ratio of additional tissue being submitted to pathology (p = 0.039), but neither the total amount of breast tissue removed nor the need for a second procedure differed significantly between the 2 groups. Duration of the surgical procedure did not vary between the 2 groups; however, the time from biopsy to surgery was shorter for HUG by an expected 9.7 days (p = 0.019), implying greater ease of scheduling. Mean charges averaged $250 less for HUG than for NL, but this difference was not statistically significant. CONCLUSIONS HUG lumpectomy is equivalent to NL with regard to volume of tissue excised, need for operative re-excision, and operating room time. Adoption of HUG in our practice allowed for more timely surgical care.
Affiliation(s)
- Gregory Larrieux
- Department of Surgery, University of Iowa, Iowa City, IA 52242, USA
12
Seraglia B, Gamberini L, Priftis K, Scatturin P, Martinelli M, Cutini S. An exploratory fNIRS study with immersive virtual reality: a new method for technical implementation. Front Hum Neurosci 2011; 5:176. [PMID: 22207843 PMCID: PMC3246589 DOI: 10.3389/fnhum.2011.00176]
Abstract
For over two decades, virtual reality (VR) has been used as a tool in several fields, from medical and psychological treatments to industrial and military applications. Only in recent years have researchers begun to study the neural correlates that underlie VR experiences. Although functional magnetic resonance imaging (fMRI) is the most commonly used technique, it suffers from several limitations. Here we present a methodology that involves the use of a new and growing brain imaging technique, functional near-infrared spectroscopy (fNIRS), while participants experience immersive VR. To allow proper fNIRS probe application, a custom-made VR helmet was created. To test the adapted helmet, a virtual version of the line bisection task was used. Participants could bisect lines in virtual peripersonal or extrapersonal space by manipulating a Nintendo Wiimote® controller to move a virtual laser pointer. Although no neural correlates of the dissociation between peripersonal and extrapersonal space were found, significant hemodynamic activity with respect to baseline was present in the right parietal and occipital areas. Both advantages and disadvantages of the presented methodology are discussed.
Affiliation(s)
- Bruno Seraglia
- Department of General Psychology, University of Padua, Padua, Italy
13
Magnetic resonance imaging-guided navigation with a thermoplastic shell for breast-conserving surgery. Eur J Surg Oncol 2011; 37:950-5. [DOI: 10.1016/j.ejso.2011.07.006]