1
Qualkenbush E, Perry AG, Kumar N, Thomas CS, Pak RW, Hemal S, Pathak RA. Virtual Reality as an Adjunct to Traditional Patient Counseling in Patients With Newly Diagnosed Localized Prostate Cancer. Urology 2025; 196:1-8. [PMID: 39307432 DOI: 10.1016/j.urology.2024.09.025]
Abstract
OBJECTIVE To determine the utility of a virtual reality (VR) model constructed from patient-derived clinical imaging to improve patient understanding of a localized prostate cancer (PCa) diagnosis and surgical plan. METHODS Patients undergoing robotic radical prostatectomy were selected and demographic data were recorded. Patients completed a questionnaire to assess baseline knowledge of their diagnosis after consultation and shared decision-making with their surgeon. A trained non-clinical staff member then guided each patient through a VR experience to view patient-specific anatomy in a 3-dimensional space. Patients then completed the same questionnaire, followed by an additional post-VR questionnaire evaluating patient satisfaction. Questions 1-7 (patient understanding of prostate cancer and treatment plan) and 11-17 (patient opinion of VR) used a standard Likert scale, and Questions 8-10 were multiple choice with 1 correct answer. RESULTS In total, 15 patients were included, with an average age of 64.1 years. Six of 7 questions showed improvement after VR (P <.001). The percentage of correct responses on Questions 8-10 was higher after VR but not statistically significant (P >.13). Mean responses ranged from 4.3 to 4.8 (Likert scale, 1 through 5) on the post-VR questionnaire, with a mean total of 31.9 out of 35. CONCLUSION This small preliminary investigation of a novel technology to improve the patient experience showed potential as an adjunct to traditional patient counseling. However, due to the small sample size and study design, further research is needed to determine the value VR adds to prostate cancer surgical counseling.
Affiliation(s)
- Alan G Perry
- Department of Urology, Mayo Clinic, Jacksonville, FL
- Neal Kumar
- Department of Urology, Mayo Clinic, Jacksonville, FL
- Raymond W Pak
- Department of Urology, Mayo Clinic, Jacksonville, FL
- Sij Hemal
- Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA
- Ram A Pathak
- Department of Urology, Mayo Clinic, Jacksonville, FL
2
Makiyama K, Komeya M, Tatenuma T, Noguchi G, Ohtake S. Patient-specific simulations and navigation systems for partial nephrectomy. Int J Urol 2023; 30:1087-1095. [PMID: 37622340 DOI: 10.1111/iju.15287]
Abstract
Partial nephrectomy (PN) is the standard treatment for T1 renal cell carcinoma. PN is affected more by surgical variation and requires greater surgical experience than radical nephrectomy. Patient-specific simulations and navigation systems may help to reduce the surgical experience required for PN. Recent advances in three-dimensional (3D) virtual reality (VR) imaging and 3D printing technology have enabled accurate patient-specific simulations and navigation systems. We reviewed previous studies of patient-specific simulations and navigation systems for PN. Image reconstruction technology has advanced recently, and commercial software that converts two-dimensional images into 3D images has become available, so many urologists can now view 3DVR images when preparing for PN. Surgical simulations based on 3DVR images can change surgical plans, improve surgical outcomes, and are useful during patient consultations. Patient-specific simulators capable of simulating surgical procedures, the gold-standard form of patient-specific simulation, have also been reported. Besides VR, 3D printing is also useful for conveying patient-specific information, and some studies have reported simulation and navigation systems for PN based on solid 3D models. Patient-specific simulations are a form of preoperative preparation, whereas patient-specific navigation is used intraoperatively. Navigation-assisted PN using 3DVR images has become increasingly common, especially in robotic surgery, and some studies found that these systems improved surgical outcomes. Once its accuracy has been confirmed, it is hoped that this technology will spread further and see more generalized use.
Affiliation(s)
- Kazuhide Makiyama
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Mitsuru Komeya
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Tomoyuki Tatenuma
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Go Noguchi
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
- Shinji Ohtake
- Department of Urology, Yokohama City University Graduate School of Medicine, Yokohama, Kanagawa, Japan
4
Ng PY, Bing EG, Cuevas A, Aggarwal A, Chi B, Sundar S, Mwanahamuntu M, Mutebi M, Sullivan R, Parham GP. Virtual reality and surgical oncology. Ecancermedicalscience 2023; 17:1525. [PMID: 37113716 PMCID: PMC10129400 DOI: 10.3332/ecancer.2023.1525]
Abstract
More than 80% of people diagnosed with cancer will require surgery, yet fewer than 5% have access to safe, affordable and timely surgery in low- and middle-income country (LMIC) settings, mostly due to the lack of a trained workforce. Since its creation, virtual reality (VR) has been heralded as a viable adjunct to surgical training, but its adoption in surgical oncology to date is poorly understood. We undertook a systematic review to determine the application of VR across different surgical specialties, modalities and stages of the cancer pathway globally between January 2011 and 2021, reviewing the characteristics and validation methods of 24 articles. The results revealed gaps in the application and accessibility of VR, with a proclivity for high-income countries and high-risk, complex oncological surgeries. There is a lack of standardisation in the clinical evaluation of VR, both in terms of clinical trials and implementation science. While all VR systems demonstrated face and content validity, only around two-thirds exhibited construct validity, and predictive validity was lacking overall. In conclusion, the asynchrony between VR development and actual global cancer surgery demand means the technology is not effectively, efficiently and equitably utilised to realise its surgical capacity-building potential. Future research should prioritise cost-effective VR technologies with predictive validity for the high-demand, open cancer surgeries required in LMICs.
Affiliation(s)
- Peng Yun Ng
- King’s College London, London WC2R 2LS, UK
- Guy’s and St Thomas’ Trust, London SE1 9R, UK
- Eric G Bing
- Institute for Leadership Impact, Southern Methodist University, Dallas, TX 75205, USA
- Anthony Cuevas
- Department of Teaching and Learning, Technology-Enhanced Immersive Learning Cluster, Annette Simmons School of Education and Human Development, Southern Methodist University, Dallas, TX 75205, USA
- Ajay Aggarwal
- King’s College London, London WC2R 2LS, UK
- Guy’s and St Thomas’ Trust, London SE1 9R, UK
- London School of Hygiene and Tropical Medicine, London WC1E 7HT, UK
- Benjamin Chi
- Icahn School of Medicine, New York, NY 10029-6574, USA
- Sudha Sundar
- Institute of Cancer and Genomic Sciences, University of Birmingham, Birmingham B152TT, UK
- Pan Birmingham Gynaecological Cancer Centre, City Hospital, Birmingham, B187QH, UK
- Miriam Mutebi
- Department of Surgery, Aga Khan University Hospital, Nairobi 30270-00100, Kenya
- Richard Sullivan
- Conflict & Health Research Group, King’s College London, London WC2R 2LS, UK
- Groesbeck P Parham
- Department of Surgery, Aga Khan University Hospital, Nairobi 30270-00100, Kenya
5
Houshyar R, Glavis-Bloom J, Bui TL, Chahine C, Bardis MD, Ushinsky A, Liu H, Bhatter P, Lebby E, Fujimoto D, Grant W, Tran-Harding K, Landman J, Chow DS, Chang PD. Outcomes of Artificial Intelligence Volumetric Assessment of Kidneys and Renal Tumors for Preoperative Assessment of Nephron Sparing Interventions. J Endourol 2021; 35:1411-1418. [PMID: 33847156 DOI: 10.1089/end.2020.1125]
Abstract
Background Renal cell carcinoma is the most common kidney cancer and the 13th most common cause of cancer death worldwide. Partial nephrectomy and percutaneous ablation, increasingly utilized to treat small renal masses and preserve renal parenchyma, require precise preoperative imaging interpretation. We sought to develop and evaluate a convolutional neural network (CNN), a type of deep learning artificial intelligence, to act as a surgical planning aid by determining renal tumor and kidney volumes via segmentation on single-phase computed tomography (CT). Materials and Methods After institutional review board approval, the CT images of 319 patients were retrospectively analyzed. Two distinct CNNs were developed for (1) bounding cube localization of the right and left hemi-abdomen and (2) segmentation of the renal parenchyma and tumor within each bounding cube. Training was performed on a randomly selected cohort of 269 patients. CNN performance was evaluated on a separate cohort of 50 patients using Sorensen-Dice coefficients (which measures the spatial overlap between the manually segmented and neural network derived segmentations) and Pearson correlation coefficients. Experiments were run on a GPU-optimized workstation with a single NVIDIA GeForce GTX Titan X (12GB, Maxwell architecture). Results Median Dice coefficients for kidney and tumor segmentation were 0.970 and 0.816, respectively; Pearson correlation coefficients between CNN-generated and human-annotated estimates for kidney and tumor volume were 0.998 and 0.993 (p < 0.001), respectively. End-to-end trained CNNs were able to perform renal parenchyma and tumor segmentation on a new test case in an average of 5.6 seconds. Conclusions Initial experience with automated deep learning artificial intelligence demonstrates that it is capable of rapidly and accurately segmenting kidneys and renal tumors on single-phase contrast-enhanced CT scans and calculating tumor and renal volumes.
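The Sorensen-Dice coefficient used above to evaluate the CNN's segmentations can be sketched in a few lines. This is a minimal illustration of the metric itself on binary masks, not the authors' implementation; the function name, the toy 3x3 masks, and the convention that two empty masks score 1.0 are assumptions for the example.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Spatial overlap between two binary segmentation masks:
    Dice = 2*|A ∩ B| / (|A| + |B|), from 0 (disjoint) to 1 (identical)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement (a convention)
    return 2.0 * intersection / total

# Toy masks: 3 voxels each, overlapping in 2 -> Dice = 2*2 / (3+3) ≈ 0.667
a = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
b = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
print(round(dice_coefficient(a, b), 3))  # → 0.667
```

A tumor Dice of 0.816, as reported here, thus means the network's tumor mask and the human annotation overlap in roughly four-fifths of their combined voxel mass, whereas the kidney Dice of 0.970 indicates near-complete agreement.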
Affiliation(s)
- Roozbeh Houshyar
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Justin Glavis-Bloom
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Thanh-Lan Bui
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Chantal Chahine
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Michelle D Bardis
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- University of California Irvine Center for Artificial Intelligence in Diagnostic Medicine, Irvine, California, United States
- Alexander Ushinsky
- Washington University in St Louis School of Medicine, Mallinckrodt Institute of Radiology, St Louis, Missouri, United States
- Hanna Liu
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Param Bhatter
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Elliott Lebby
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Dylann Fujimoto
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- William Grant
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Karen Tran-Harding
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- Jaime Landman
- University of California Irvine, Urology, 333 City Blvd West, Orange, California, United States, 92868
- Daniel S Chow
- University of California Irvine School of Medicine, Radiological Sciences, 101 The City Dr S, Orange, California, United States, 92697-3950
- University of California Irvine Center for Artificial Intelligence in Diagnostic Medicine, 4100 E. Peltason Dr., Irvine, California, United States, 92617
- Peter D Chang
- University of California Irvine School of Medicine, Radiological Sciences, Orange, California, United States
- University of California Irvine Center for Artificial Intelligence in Diagnostic Medicine, Irvine, California, United States