1
Lacey H, Khan A, Dheansa B. Extended reality for mapping perforator-based flaps in breast reconstruction: a systematic review and meta-analysis. JPRAS Open 2025; 44:269-283. [PMID: 40247954] [PMCID: PMC12005224] [DOI: 10.1016/j.jpra.2025.02.011]
Abstract
Introduction Extended reality technologies, including augmented reality (AR) and virtual reality (VR), can be used in surgical settings for surgical planning, perioperative visualisation of patient anatomy and simulation of operative steps. This study aimed to ascertain the role of extended reality in perforator-based breast reconstruction. Methods A systematic search of the literature was performed. Screening was conducted independently by two reviewers, with conflicts resolved through consensus. Results In total, 4957 articles were identified, of which 10 were included, comprising 229 flaps. Overall, perforator emergence points localised with AR and VR lay 4 mm (95% confidence interval [CI] 3.2-4.9) and 14 mm (95% CI 5.5-22.5), respectively, from the correct intraoperatively identified location. Two studies reported a significant reduction in harvesting time (88 min for VR and 19 min for AR). The pooled mean image processing time from four studies was 39±47 min. Preoperative VR was associated with shorter operative times than conventional Doppler ultrasound (478±56.94 vs 606.29±81.94 min, P<0.05). Use of a simulated environment for mapping perforators appeared to reduce complications (wound breakdown, flap revision, flap loss, infection); however, this failed to reach statistical significance (odds ratio 0.6; 95% CI 0.3-1.3; P=0.20). Conclusions This study suggests that AR and VR offer limited benefit in improving the accuracy of perforator identification; however, they may reduce flap harvesting and total operative times. Key limitations include the heterogeneity and quality of the included studies. With larger sample sizes and higher-quality evidence, definitive benefits and longitudinal outcomes relating to the use of extended reality in perforator-based breast reconstruction may be established.
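The pooled complication estimate quoted above (odds ratio 0.6; 95% CI 0.3-1.3) is the kind of figure obtained by combining per-study 2x2 tables. As a minimal sketch of one common approach, an inverse-variance fixed-effect pooling of log odds ratios over hypothetical study counts (not the review's actual data):

```python
import numpy as np

# Hypothetical per-study 2x2 counts (not the review's data):
# columns = events / no-events in the XR arm, then events / no-events in the control arm.
studies = [
    (3, 27, 5, 25),
    (2, 38, 4, 36),
    (4, 46, 6, 44),
]

log_or, weights = [], []
for a, b, c, d in studies:
    log_or.append(np.log((a * d) / (b * c)))      # per-study log odds ratio
    weights.append(1 / (1/a + 1/b + 1/c + 1/d))   # inverse of its variance

log_or, weights = np.array(log_or), np.array(weights)
pooled = np.sum(weights * log_or) / np.sum(weights)   # fixed-effect pooled log OR
se = np.sqrt(1 / np.sum(weights))                     # standard error of the pooled log OR

ci_low, ci_high = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```

A random-effects model would add a between-study variance term to each weight; the abstract does not state which model the authors used.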
Affiliation(s)
- Hester Lacey
- University Hospitals Sussex NHS Foundation Trust, Eastern Rd, Brighton and Hove, Brighton, BN2 5BE, UK
- Department of Plastic Surgery, Queen Victoria Hospital, Holtye Rd, East Grinstead, RH19 3DZ, UK
- Anas Khan
- University Hospitals Sussex NHS Foundation Trust, Eastern Rd, Brighton and Hove, Brighton, BN2 5BE, UK
- Baljit Dheansa
- Department of Plastic Surgery, Queen Victoria Hospital, Holtye Rd, East Grinstead, RH19 3DZ, UK
2
Landau M, Tsoukas M, Somani AK, Goldust M. Augmented and virtual reality in dermatologic surgery. Int J Dermatol 2025; 64:945-946. [PMID: 39564681] [DOI: 10.1111/ijd.17582]
Affiliation(s)
- Marina Landau
- Arena Dermatology and Department of Plastic Surgery, Shamir Medical Center, Be'er Ya'akov, Israel
- Maria Tsoukas
- Department of Dermatology, University of Illinois College of Medicine at Chicago, Chicago, Illinois, USA
- Ally-Khan Somani
- Department of Dermatology, Indiana University School of Medicine, Indianapolis, Indiana, USA
- Mohamad Goldust
- Department of Dermatology, Yale University School of Medicine, New Haven, Connecticut, USA
3
Konovalov AN, Okishev DN, Pilipenko YV, Eliava SS, Artemiev AA, Ivanov VM, Smirnov AY, Strelkov SV. [Augmented reality as a method of neuronavigation in microsurgical treatment of cerebrovascular diseases: description of the method and clinical experience]. Zhurnal Voprosy Neirokhirurgii Imeni N. N. Burdenko 2025; 89:37-45. [PMID: 39907665] [DOI: 10.17116/neiro20258901137]
Abstract
Augmented reality (AR) is a promising area in the microsurgical treatment of cerebrovascular pathologies that can significantly facilitate preoperative planning and intraoperative understanding of anatomy. OBJECTIVE To describe AR-assisted neuronavigation in the microsurgical treatment of intracranial aneurysms, arteriovenous malformations and cavernomas, and to evaluate the accuracy and applicability of AR-assisted neuronavigation. MATERIAL AND METHODS The study involved 22 patients with cerebral aneurysms, arteriovenous malformations and cavernous malformations. A Microsoft HoloLens 2 head-mounted display and «Medgital» software were used for AR navigation. Registration accuracy (target registration error [TRE] and fiducial registration error [FRE]) and preoperative preparation time were evaluated. RESULTS Mean TRE was 0.6±0.2 cm when registering with a QR code and 1.4±0.6 cm when registering via craniometric points. Preoperative image processing took 24.7±5.1 minutes, and application setup in the operating theatre took 1.6±0.2 minutes. Registration using a QR code provided higher accuracy than registration via craniometric points. AR-assisted navigation improved visualization and planning of surgeries for aneurysms, arteriovenous malformations, microvascular anastomoses and cavernous angiomas. CONCLUSION AR-assisted navigation is an innovative method with specific advantages that can potentially improve the microsurgical treatment of cerebrovascular diseases. Further research is needed to confirm these findings and to develop AR technology in neurosurgery.
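TRE and FRE, reported above, are standard registration-accuracy measures: FRE is the residual misfit at the fiducials used to compute the registration, whereas TRE is the error at a clinically relevant target that was not used for registration. A minimal sketch of both, assuming a rigid registration and hypothetical coordinates (none of these numbers come from the study):

```python
import numpy as np

def apply_rigid(R, t, pts):
    """Apply a rigid transform (rotation R, translation t) to Nx3 points."""
    return pts @ R.T + t

# Hypothetical rigid registration estimated from fiducials (identity rotation + small offset).
R_est = np.eye(3)
t_est = np.array([0.4, -0.2, 0.1])  # mm

# Fiducials: positions in preoperative image space and (noisy) measured positions in patient space.
fid_image = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 60.0, 0.0], [0.0, 0.0, 40.0]])
fid_patient = apply_rigid(R_est, t_est, fid_image) + np.random.normal(0, 0.5, (4, 3))

# FRE: RMS distance between registered fiducials and their measured patient positions.
fre = np.sqrt(np.mean(np.sum((apply_rigid(R_est, t_est, fid_image) - fid_patient) ** 2, axis=1)))

# TRE: error at a surgical target that was NOT used to compute the registration.
target_image = np.array([20.0, 30.0, 25.0])          # planned target in image space
target_patient_true = np.array([20.5, 29.6, 25.3])   # intraoperatively identified location
tre = np.linalg.norm(apply_rigid(R_est, t_est, target_image) - target_patient_true)

print(f"FRE = {fre:.2f} mm, TRE = {tre:.2f} mm")
```

A low FRE does not guarantee a low TRE, which is why both are usually reported.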
Affiliation(s)
- A N Konovalov
- Burdenko Neurosurgical Center, Moscow, Russia
- Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- D N Okishev
- Burdenko Neurosurgical Center, Moscow, Russia
- A A Artemiev
- Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- V M Ivanov
- Peter the Great St. Petersburg Polytechnic University, Saint Petersburg, Russia
- A Yu Smirnov
- Peter the Great St. Petersburg Polytechnic University, Saint Petersburg, Russia
4
Guruswamy J, Chhina A, Mitchell JD, Shah S, Uribe-Marquez S. Virtual Reality and Augmented Reality in Anesthesiology Education. Int Anesthesiol Clin 2024; 62:64-70. [PMID: 38798152] [DOI: 10.1097/aia.0000000000000445]
Affiliation(s)
- Jayakar Guruswamy
- Department of Anesthesiology, Pain Management, and Perioperative Medicine, Henry Ford Health, Michigan State University, Detroit, Michigan
5
Begagić E, Bečulić H, Pugonja R, Memić Z, Balogun S, Džidić-Krivić A, Milanović E, Salković N, Nuhović A, Skomorac R, Sefo H, Pojskić M. Augmented Reality Integration in Skull Base Neurosurgery: A Systematic Review. Medicina (Kaunas, Lithuania) 2024; 60:335. [PMID: 38399622] [PMCID: PMC10889940] [DOI: 10.3390/medicina60020335]
Abstract
Background and Objectives: To investigate the role of augmented reality (AR) in skull base (SB) neurosurgery. Materials and Methods: Utilizing PRISMA methodology, the PubMed and Scopus databases were searched to extract data related to AR integration in SB surgery. Results: The largest share of the 19 included studies (42.1%) were conducted in the United States, with most published in the last five years (77.8%). Studies used phantom skull models (n = 6; 31.6%), human cadavers (n = 3; 15.8%), or human patients (n = 10; 52.6%). Surgical modality was specified in 18 of the 19 studies, with microscopic surgery predominant (n = 10; 52.6%). Most studies used only CT as the imaging data source (n = 9; 47.4%), and optical tracking was the most prevalent tracking modality (n = 9; 47.4%). The target registration error (TRE) ranged from 0.55 to 10.62 mm. Conclusion: Despite variations in TRE values, the studies highlighted successful outcomes and minimal complications. Challenges such as device practicality and data security were acknowledged, but the application of low-cost AR devices suggests broader feasibility.
Affiliation(s)
- Emir Begagić
- Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina;
- Hakija Bečulić
- Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina; (H.B.)
- Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina;
- Ragib Pugonja
- Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina;
- Zlatan Memić
- Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina;
- Simon Balogun
- Division of Neurosurgery, Department of Surgery, Obafemi Awolowo University Teaching Hospitals Complex, Ilesa Road PMB 5538, Ile-Ife 220282, Nigeria
- Amina Džidić-Krivić
- Department of Neurology, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Elma Milanović
- Neurology Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Naida Salković
- Department of General Medicine, School of Medicine, University of Tuzla, Univerzitetska 1, 75000 Tuzla, Bosnia and Herzegovina;
- Adem Nuhović
- Department of General Medicine, School of Medicine, University of Sarajevo, Univerzitetska 1, 71000 Sarajevo, Bosnia and Herzegovina;
- Rasim Skomorac
- Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina; (H.B.)
- Department of Surgery, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Haso Sefo
- Neurosurgery Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Mirza Pojskić
- Department of Neurosurgery, University Hospital Marburg, Baldingerstr., 35033 Marburg, Germany
6
Shaikh HJF, Hasan SS, Woo JJ, Lavoie-Gagne O, Long WJ, Ramkumar PN. Exposure to Extended Reality and Artificial Intelligence-Based Manifestations: A Primer on the Future of Hip and Knee Arthroplasty. J Arthroplasty 2023; 38:2096-2104. [PMID: 37196732] [DOI: 10.1016/j.arth.2023.05.015]
Abstract
BACKGROUND Software-infused services, from robot-assisted and wearable technologies to artificial intelligence (AI)-laden analytics, continue to augment clinical orthopaedics, namely hip and knee arthroplasty. Extended reality (XR) tools, which encompass augmented reality, virtual reality, and mixed reality technology, represent a new frontier for expanding surgical horizons to maximize technical education, expertise, and execution. The purpose of this review is to critically detail and evaluate the recent developments surrounding XR in the field of hip and knee arthroplasty and to address potential future applications as they relate to AI. METHODS In this narrative review of XR, we discuss (1) definitions, (2) techniques, (3) studies, (4) current applications, and (5) future directions. We highlight the XR subsets (augmented reality, virtual reality, and mixed reality) as they relate to AI in the increasingly digitized ecosystem within hip and knee arthroplasty. RESULTS The XR orthopaedic ecosystem and recent XR developments are summarized, with specific emphasis on hip and knee arthroplasty. XR as a tool for education, preoperative planning, and surgical execution is discussed, with future applications dependent upon AI potentially obviating the need for robotic assistance and preoperative advanced imaging without sacrificing accuracy. CONCLUSION In a field where exposure is critical to clinical success, XR represents a novel stand-alone software-infused service that optimizes technical education, execution, and expertise, but it necessitates integration with AI and previously validated software solutions to offer opportunities that improve surgical precision with or without the use of robotics and computed tomography-based imaging.
Affiliation(s)
- Sayyida S Hasan
- Donald and Barbara Zucker School of Medicine at Hofstra, Uniondale, New York
- Prem N Ramkumar
- Hospital for Special Surgery, New York, New York; Long Beach Orthopaedic Institute, Long Beach, California
7
Timóteo R, Pinto D, Martinho M, Gouveia P, Lopes DS, Mavioso C, Cardoso MJ. Skin deformation analysis for pre-operative planning of DIEAP flap reconstruction surgery. Med Eng Phys 2023; 119:104025. [PMID: 37634903] [DOI: 10.1016/j.medengphy.2023.104025]
Abstract
Deep inferior epigastric artery perforator (DIEAP) flap reconstruction surgeries can potentially benefit from augmented reality (AR) in the context of surgery planning and outcome improvement. Although three-dimensional (3D) models help visualize and map the perforators, the anchorage of the models to the patient's body during surgery does not account for the skin deformation that occurs between computed tomography angiography (CTA) data acquisition and the patient's position during surgery. In this work, we compared 3D surface data acquired supine with arms down (CTA position) and supine with arms at 90° (surgical position) to estimate the patient's skin deformation. We processed the data sets of 20 volunteers with a 3D rigid registration tool and performed a descriptive statistical analysis and statistical inference. With a root mean square of 2.45 mm and a standard deviation of 2.89 mm, the results include 30% of cases with deformation above 3 mm and 15% above 4 mm. This pose-related deformation indicates that 3D surface data from the CTA scan position differ from data acquired in loco at the surgical table. Such results indicate that research should be conducted to construct accurate 3D models from CTA data for display on the patient, while considering projection errors when using AR technology.
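The reported deformation figures come from rigidly aligning the two surface acquisitions and examining the residual per-point displacement. A minimal sketch of that pipeline, using a Kabsch (SVD-based) rigid fit over hypothetical corresponding points rather than the study's scans:

```python
import numpy as np

def kabsch_rigid(src, dst):
    """Best-fit rotation R and translation t mapping src onto dst (least squares)."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical corresponding surface points (mm) in the CTA pose and the surgical pose.
rng = np.random.default_rng(0)
cta_pose = rng.uniform(0, 300, size=(500, 3))
surgical_pose = cta_pose @ np.array([[0.999, -0.04, 0], [0.04, 0.999, 0], [0, 0, 1]]).T \
                + np.array([5.0, -3.0, 2.0]) + rng.normal(0, 2.0, size=(500, 3))

R, t = kabsch_rigid(cta_pose, surgical_pose)
residual = surgical_pose - (cta_pose @ R.T + t)     # per-point deformation left after rigid alignment
dist = np.linalg.norm(residual, axis=1)

rms = np.sqrt(np.mean(dist ** 2))
print(f"RMS deformation: {rms:.2f} mm")
print(f"points > 3 mm: {np.mean(dist > 3) * 100:.0f}%, > 4 mm: {np.mean(dist > 4) * 100:.0f}%")
```

In practice the point correspondences would come from the 3D rigid registration tool mentioned in the abstract rather than being given in advance.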
Affiliation(s)
- Rafaela Timóteo
- Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa, Portugal; Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal.
- David Pinto
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal.
- Marta Martinho
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal.
- Pedro Gouveia
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal; Faculdade de Medicina de Lisboa, Av. Prof. Egas Moniz MB, 1649-028 Lisboa, Portugal.
- Daniel Simões Lopes
- Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa, Portugal; INESC ID, Rua Alves Redol 9, 1000-029 Lisboa, Portugal; ITI/LARSyS, Hub Criativo do Beato, Factory Lisbon, Rua da Manutenção 71, Building F S05, 1900-500 Lisboa, Portugal.
- Carlos Mavioso
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal.
- Maria João Cardoso
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal.
8
Arjomandi Rad A, Subbiah Ponniah H, Shah V, Nanchahal S, Vardanyan R, Miller G, Malawana J. Leading Transformation in Medical Education Through Extended Reality. Advances in Experimental Medicine and Biology 2023; 1421:161-173. [PMID: 37524987] [DOI: 10.1007/978-3-031-30379-1_7]
Abstract
Extended reality (XR) has developed exponentially over the past decades to incorporate technology whereby users can visualise, explore, and interact with 3-dimensional computer-generated environments, and superimpose virtual reality (VR) onto real-world environments, thus displaying information and data at various levels of the reality-virtuality continuum. In the context of medicine, VR tools allow for anatomical assessment and diagnosis, surgical training through lifelike procedural simulations, planning of surgeries and biopsies, intraprocedural guidance, and medical education. The following chapter aims to provide an overview of the currently available evidence and perspectives on the application of XR within medical education. It focuses on undergraduate and postgraduate teaching, medical education within low- and middle-income countries, key practical steps in implementing a successful XR programme, and the limitations and future of extended reality within medical education.
Affiliation(s)
- Arian Arjomandi Rad
- Medical Sciences Division, University of Oxford, Oxford, UK
- The Healthcare Leadership Academy, London, UK
- Viraj Shah
- Faculty of Medicine, Department of Medicine, Imperial College London, London, UK
- Sukanya Nanchahal
- Faculty of Medicine, Department of Medicine, Imperial College London, London, UK
- Robert Vardanyan
- The Healthcare Leadership Academy, London, UK
- Faculty of Medicine, Department of Medicine, Imperial College London, London, UK
- George Miller
- The Healthcare Leadership Academy, London, UK
- University of Central Lancashire Medical School, Preston, UK
- Johann Malawana
- The Healthcare Leadership Academy, London, UK.
- University of Central Lancashire Medical School, Preston, UK.
9
Li Z, Shu Y. A commentary on 'Augmented reality self-training system for suturing in open surgery: a randomized controlled trial' [Int J Surg 102 (2022) 106650]. Int J Surg 2023; 109:77-78. [PMID: 36799803] [PMCID: PMC10389480] [DOI: 10.1097/js9.0000000000000076]
Affiliation(s)
- Yan Shu
- Department of Clinical Laboratory, Chongqing University FuLing Hospital, Chongqing, People’s Republic of China
10
Dhar P, Rocks T, Samarasinghe RM, Stephenson G, Smith C. Augmented reality in medical education: students' experiences and learning outcomes. Medical Education Online 2021; 26:1953953. [PMID: 34259122] [PMCID: PMC8281102] [DOI: 10.1080/10872981.2021.1953953]
Abstract
Augmented reality (AR) is a relatively new technology that allows digitally generated three-dimensional representations to be integrated with real environmental stimuli. AR can make use of smartphones, tablets, or other devices to achieve a highly stimulating learning environment and a hands-on immersive experience. The use of AR in industry is becoming widespread, with applications being developed not just for entertainment and gaming but also for healthcare, retail and marketing, education, the military, travel and tourism, the automotive industry, manufacturing, architecture, and engineering. Because of the distinct learning advantages that AR offers, such as remote learning and interactive simulations, AR-based teaching programs are also increasingly being adopted within medical schools across the world. These advantages are further highlighted by the current COVID-19 pandemic, which has caused an even greater shift towards online learning. In this review, we investigate the use of AR in medical training and education and its effect on students' experiences and learning outcomes. This includes the main goals of AR-based learning, such as simplifying the delivery and enhancing the comprehension of complex information. We also describe how AR can enhance the experiences of medical students by improving knowledge and understanding, practical skills and social skills. These concepts are discussed within the context of specific AR medical training programs, such as HoloHuman, OculAR SIM, and HoloPatient. Finally, we discuss the challenges of AR in learning and teaching and propose future directions for the use of this technology in medical education.
Affiliation(s)
- Poshmaal Dhar
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Tetyana Rocks
- Institute for Innovation in Mental and Physical Health and Clinical Translation, Food and Mood Centre, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Rasika M Samarasinghe
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Garth Stephenson
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Craig Smith
- Institute for Innovation in Mental and Physical Health and Clinical Translation, School of Medicine, Faculty of Health, Deakin University, Geelong, Australia
- Contact: Craig Smith, School of Medicine, Institute for Innovation in Mental and Physical Health and Clinical Translation, Deakin University, Australia
11
Parsa S, Basagaoglu B, Mackley K, Aitson P, Kenkel J, Amirlak B. Current and Future Photography Techniques in Aesthetic Surgery. Aesthetic Surgery Journal Open Forum 2021; 4:ojab050. [PMID: 35156020] [PMCID: PMC8830310] [DOI: 10.1093/asjof/ojab050]
Abstract
Background The rapidly increasing modalities and mediums of clinical photography, the use of 3-dimensional (3D) and 4-dimensional (4D) patient modeling, and the widening implementation of cloud-based storage and artificial intelligence (AI) call for an overview of the various methods currently in use as well as future considerations in the field. Objectives Through a close look at the methods used in aesthetic surgery photography, clinicians will be able to select the modality best suited to their practice and goals. Methods Review and discussion of current data pertaining to 2-dimensional (2D) and 3D clinical photography, current photography software, augmented reality reconstruction, AI photography, and cloud-based storage. Results Important considerations for current image capture include a device with a gridded viewing screen and high megapixel resolution, a tripod with a leveling base, studio lighting with dual-sourced light, a standardized matte-finish background, and consistency in patient orientation. Currently, 3D and 4D photography devices offer advantages such as improved communication with the patient about outcome expectations and better quality of patient service and safety. AI may contribute to post-capture processing and 3D printing of postoperative outcomes. Current smartphones distort patients' perceptions of their appearance and should be used cautiously in an aesthetic surgery setting. Cloud-based storage provides flexibility, low cost, and ease of service while remaining vulnerable to data breaches. Conclusions While there are advancements to be made in the physical equipment and preparation for the photograph, the future of clinical photography will be heavily influenced by innovations in software and in 3D and 4D modeling of outcomes.
Affiliation(s)
- Shyon Parsa
- Department of Plastic Surgery, UT Southwestern Medical Center, Dallas, TX, USA
- Berkay Basagaoglu
- Department of Plastic Surgery, UT Southwestern Medical Center, Dallas, TX, USA
- Kate Mackley
- Department of Plastic Surgery, UT Southwestern Medical Center, Dallas, TX, USA
- Patricia Aitson
- Department of Plastic Surgery, UT Southwestern Medical Center, Dallas, TX, USA
- Jeffrey Kenkel
- Department of Plastic Surgery, UT Southwestern Medical Center, Dallas, TX, USA
- Bardia Amirlak
- Department of Plastic Surgery, UT Southwestern Medical Center, Dallas, TX, USA
12
Plastic Surgery Lockdown Learning during Coronavirus Disease 2019: Are Adaptations in Education Here to Stay? Plastic and Reconstructive Surgery - Global Open 2020; 8:e3064. [PMID: 32802695] [PMCID: PMC7413776] [DOI: 10.1097/gox.0000000000003064]
Abstract
Summary: The novel coronavirus disease 2019 has had a major impact on human life and livelihood. The unprecedented challenges have expanded beyond social and work life and have grown to affect resident education. In this article, we review the structure of plastic surgery education before the pandemic and the different online learning opportunities for self-directed learning. We summarize the range of platforms and approaches for online, remote-access delivery of conferences and education that emerged or expanded as a result of the crisis. We highlight the rapid initiatives and efforts of programs and of national and international societies to support continuing medical education in line with the guidelines to “shelter at home” and maintain social distancing, the possible future for expanding the reach of online academic initiatives, and the role of developing virtual technologies. The coronavirus disease 2019 crisis has created an opportunity to analyze and advance online learning options so that they overcome the associated challenges and continue as a reliable platform even after the resolution of the social distancing requirements.
13
Fusion of augmented reality imaging with the endoscopic view for endonasal skull base surgery; a novel application for surgical navigation based on intraoperative cone beam computed tomography and optical tracking. PLoS One 2020; 15:e0227312. [PMID: 31945082] [PMCID: PMC6964902] [DOI: 10.1371/journal.pone.0227312]
Abstract
Objective Surgical navigation is a well-established tool in endoscopic skull base surgery. However, navigational and endoscopic views are usually displayed on separate monitors, forcing the surgeon to focus on one or the other. Aiming to provide real-time integration of endoscopic and diagnostic imaging information, we present a new navigation technique based on augmented reality, with fusion of intraoperative cone beam computed tomography (CBCT) onto the endoscopic view. The aim of this study was to evaluate the accuracy of the method. Material and methods An augmented reality surgical navigation system (ARSN) with 3D CBCT capability was used. The navigation system incorporates an optical tracking system (OTS) with four video cameras embedded in the flat detector of the motorized C-arm. Intraoperative CBCT images were fused with the view of the surgical field obtained by the endoscope’s camera. Accuracy of CBCT image co-registration was tested using a custom-made grid with incorporated 3D spheres. Results Co-registration of the CBCT image onto the endoscopic view was performed. Accuracy of the overlay, measured as target registration error (TRE), had a mean of 0.55 mm (standard deviation 0.24 mm) and a median of 0.51 mm (interquartile range 0.39-0.68 mm). Conclusion We present a novel augmented reality surgical navigation system with fusion of intraoperative CBCT onto the endoscopic view. The system shows sub-millimeter accuracy.
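Fusing CBCT imaging with the endoscopic view amounts to expressing CBCT-space points in the endoscope camera frame (via the tracked registration) and projecting them with the camera intrinsics. A minimal pinhole-projection sketch, with every matrix and coordinate hypothetical rather than taken from the ARSN system described above:

```python
import numpy as np

# Hypothetical endoscope intrinsics (pinhole model): focal lengths and principal point in pixels.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Hypothetical rigid transform from CBCT space to the endoscope camera frame,
# as would be derived from optical tracking of the endoscope and the CBCT registration.
R = np.eye(3)
t = np.array([[5.0], [-2.0], [120.0]])  # mm; target roughly 12 cm in front of the lens

def project(points_cbct):
    """Project Nx3 CBCT-space points (mm) into the endoscopic image (pixel coordinates)."""
    cam = R @ points_cbct.T + t            # 3xN points in the camera frame
    uvw = K @ cam                          # homogeneous image coordinates
    return (uvw[:2] / uvw[2]).T            # divide by depth -> Nx2 pixel positions

# Example: overlay the outline of a segmented structure (hypothetical coordinates).
outline_cbct = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [10.0, 10.0, 0.0], [0.0, 10.0, 0.0]])
print(project(outline_cbct))               # pixel positions to draw onto the endoscope frame
```

The TRE quoted above would then quantify how far such registered targets fall from their true, physically measured positions.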
14
Abstract
Augmented reality (AR) technology is gaining popularity and scholarly interest in the rehabilitation sector because of the possibility of generating controlled, user-specific environmental and perceptual stimuli that motivate the patient, while still preserving the possibility of interacting with the real environment and other subjects, including the rehabilitation specialist. The paper presents the first wearable AR application for shoulder rehabilitation, based on Microsoft HoloLens, with real-time markerless tracking of the user’s hand. Potentialities and current limits of commercial head-mounted displays (HMDs) are described for the target medical field, and details of the proposed application are reported. A serious game was designed starting from the analysis of a traditional rehabilitation exercise, taking into account HoloLens specifications to maximize user comfort during the AR rehabilitation session. The implemented AR application consistently meets the recommended target frame rate for immersive applications on the HoloLens device: 60 fps. Moreover, the ergonomics and motivational value of the proposed application were positively evaluated by a group of five rehabilitation specialists and 20 healthy subjects. Although a larger study including real patients is necessary for clinical validation of the proposed application, the results obtained encourage further investigation and the integration of additional technical features into the proposed AR application.
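The 60 fps target mentioned above corresponds to a per-frame budget of roughly 1/60 s, about 16.7 ms, for tracking, game logic, and rendering combined. A minimal sketch of checking a per-frame update against that budget (the workload below is a placeholder, not the application's actual hand-tracking code):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET_S = 1.0 / TARGET_FPS          # ~0.0167 s available per frame

def per_frame_update():
    """Stand-in for hand tracking + game logic; replace with the real per-frame workload."""
    return sum(i * i for i in range(20_000))

over_budget = 0
N_FRAMES = 300
for _ in range(N_FRAMES):
    start = time.perf_counter()
    per_frame_update()
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET_S:
        over_budget += 1                    # this frame would have caused a dropped frame

print(f"frame budget: {FRAME_BUDGET_S * 1000:.1f} ms")
print(f"frames over budget: {over_budget}/{N_FRAMES} ({over_budget / N_FRAMES:.1%})")
```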