1
Barbour MC, Amin SN, Friedman SD, Perez FA, Bly RA, Johnson KE, Parikh SR, Richardson CM, Dahl JP, Aliseda A. Surface Reconstruction of the Pediatric Larynx via Structure from Motion Photogrammetry: A Pilot Study. Otolaryngol Head Neck Surg 2024; 170:1195-1199. [PMID: 38168480 PMCID: PMC10960702 DOI: 10.1002/ohn.635] [Received: 08/23/2023] [Revised: 11/10/2023] [Accepted: 12/07/2023]
Abstract
Endoscopy is the gold standard for characterizing pediatric airway disorders; however, it is of limited use for quantitative analysis because it lacks three-dimensional (3D) vision and offers poor stereotactic depth perception. We utilize structure from motion (SfM) photogrammetry to reconstruct 3D surfaces of pathologic and healthy pediatric larynges from monocular two-dimensional (2D) endoscopy. Models of pediatric subglottic stenosis were 3D printed and airway endoscopies were simulated. 3D surfaces were successfully reconstructed from endoscopic videos of all models using an SfM analysis toolkit. The average subglottic surface error between SfM-reconstructed surfaces and 3D-printed models was 0.65 mm, as measured by Modified Hausdorff Distance. The average volumetric similarity between SfM surfaces and printed models was 0.82, as measured by Jaccard index. SfM can be used to accurately reconstruct 3D surface renderings of the larynx from 2D endoscopy video. This technique has immense potential for quantitative analysis of airway geometry and virtual surgical planning.
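The two accuracy metrics quoted in this abstract are standard and easy to reproduce. A minimal sketch (function names and the NumPy implementation are illustrative, not the authors' toolkit): the Modified Hausdorff Distance of Dubuisson and Jain takes the larger of the two mean nearest-neighbor distances between point sets, and the Jaccard index is intersection-over-union of voxel masks.

```python
import numpy as np

def modified_hausdorff(a, b):
    """Dubuisson-Jain Modified Hausdorff Distance between point sets a (N, 3) and b (M, 3)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # (N, M) pairwise distances
    # For each point, distance to its nearest neighbor in the other set;
    # the MHD is the larger of the two directed mean distances.
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

def jaccard_index(vol_a, vol_b):
    """Volumetric similarity of two boolean voxel masks (intersection over union)."""
    a, b = np.asarray(vol_a, bool), np.asarray(vol_b, bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()
```

Both metrics are symmetric in their two arguments, which makes them convenient for comparing a reconstruction against a ground-truth model without designating either as the reference.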
Affiliation(s)
- Michael C Barbour
- Department of Mechanical Engineering, University of Washington, Seattle, Washington, USA
- Shaunak N Amin
- Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington, USA
- Seth D Friedman
- Center for Respiratory Biology and Therapeutics, Seattle Children's Hospital, Seattle, Washington, USA
- Francisco A Perez
- Department of Pediatric Radiology, Seattle Children's Hospital, Seattle, Washington, USA
- Randall A Bly
- Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington, USA
- Division of Pediatric Otolaryngology-Head and Neck Surgery, Seattle Children's Hospital, Seattle, Washington, USA
- Kaalan E Johnson
- Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington, USA
- Division of Pediatric Otolaryngology-Head and Neck Surgery, Seattle Children's Hospital, Seattle, Washington, USA
- Sanjay R Parikh
- Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington, USA
- Division of Pediatric Otolaryngology-Head and Neck Surgery, Seattle Children's Hospital, Seattle, Washington, USA
- Clare M Richardson
- Division of Pediatric Otolaryngology-Head and Neck Surgery, Seattle Children's Hospital, Seattle, Washington, USA
- Division of Pediatric Otolaryngology-Head and Neck Surgery, Phoenix Children's Hospital, Phoenix, Arizona, USA
- John P Dahl
- Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington, USA
- Division of Pediatric Otolaryngology-Head and Neck Surgery, Seattle Children's Hospital, Seattle, Washington, USA
- Alberto Aliseda
- Department of Mechanical Engineering, University of Washington, Seattle, Washington, USA
2
Widya AR, Monno Y, Okutomi M, Suzuki S, Gotoda T, Miki K. Stomach 3D Reconstruction Using Virtual Chromoendoscopic Images. IEEE J Transl Eng Health Med 2021; 9:1700211. [PMID: 33796417 PMCID: PMC8009143 DOI: 10.1109/jtehm.2021.3062226] [Received: 11/16/2020] [Revised: 01/19/2021] [Accepted: 02/15/2021]
Abstract
Gastric endoscopy is the gold standard in the clinical process that enables medical practitioners to diagnose various lesions inside a patient's stomach. If a lesion is found, successfully localizing it relative to a global view of the stomach leads to better decision making for subsequent clinical treatment. Our previous research showed that lesion localization could be achieved by reconstructing the whole stomach shape from chromoendoscopic indigo carmine (IC) dye-sprayed images using a structure-from-motion (SfM) pipeline. However, spraying IC dye over the whole stomach requires additional time, which is undesirable for both patients and practitioners. Our objective is to propose an alternative way to achieve whole-stomach 3D reconstruction without IC dye. We generate virtual IC-sprayed (VIC) images based on image-to-image style translation trained on unpaired real no-IC and IC-sprayed images, and investigate the effect of input and output color-channel selection for generating the VIC images. We validate our reconstruction results by comparing them with the results using real IC-sprayed images and confirm that the obtained stomach 3D structures are comparable to each other. We also propose a local reconstruction technique to obtain a more detailed surface and texture around a region of interest. The proposed method achieves whole-stomach reconstruction with SfM and without real IC dye. We found that translating no-IC green-channel images to IC-sprayed red-channel images gives the best SfM reconstruction result. Clinical impact: We offer a method for frame localization and local 3D reconstruction of a detected gastric lesion using standard endoscopy images, leading to better clinical decisions.
Affiliation(s)
- Aji Resindra Widya
- Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo 152-8550, Japan
- Yusuke Monno
- Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo 152-8550, Japan
- Masatoshi Okutomi
- Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo 152-8550, Japan
- Sho Suzuki
- Division of Gastroenterology and Hepatology, Department of Medicine, Nihon University School of Medicine, Tokyo 101-8309, Japan
- Takuji Gotoda
- Division of Gastroenterology and Hepatology, Department of Medicine, Nihon University School of Medicine, Tokyo 101-8309, Japan
- Kenji Miki
- Department of Internal Medicine, Tsujinaka Hospital Kashiwanoha, Kashiwa 277-0871, Japan
3
Widya AR, Monno Y, Okutomi M, Suzuki S, Gotoda T, Miki K. Stomach 3D Reconstruction Based on Virtual Chromoendoscopic Image Generation. Annu Int Conf IEEE Eng Med Biol Soc 2020; 2020:1848-1852. [PMID: 33018360 DOI: 10.1109/embc44109.2020.9176016]
Abstract
Gastric endoscopy is a standard clinical process that enables medical practitioners to diagnose various lesions inside a patient's stomach. If a lesion is found, it is very important to perceive its location relative to a global view of the stomach. Our previous research showed that this could be addressed by reconstructing the whole stomach shape from chromoendoscopic images using a structure-from-motion (SfM) pipeline, in which indigo carmine (IC) blue dye-sprayed images were used to increase feature matches for SfM by enhancing the textures of the stomach surface. However, spraying IC dye over the whole stomach requires additional time, labor, and cost, which is undesirable for patients and practitioners. In this paper, we propose an alternative way to achieve whole-stomach 3D reconstruction without IC dye by generating virtual IC-sprayed (VIC) images based on image-to-image style translation trained on unpaired real no-IC and IC-sprayed images. We specifically investigated the effect of input and output color-channel selection for generating the VIC images and found that translating no-IC green-channel images to IC-sprayed red-channel images gives the best SfM reconstruction result.
4
Widya AR, Monno Y, Imahori K, Okutomi M, Suzuki S, Gotoda T, Miki K. 3D Reconstruction of Whole Stomach from Endoscope Video Using Structure-from-Motion. Annu Int Conf IEEE Eng Med Biol Soc 2019; 2019:3900-3904. [PMID: 31946725 DOI: 10.1109/embc.2019.8857964]
Abstract
Gastric endoscopy is a common clinical practice that enables medical doctors to examine the inside of the stomach. To identify the location of a gastric lesion, such as early gastric cancer, within the stomach, this work reconstructs the 3D shape of a whole stomach with color texture information from a standard monocular endoscope video. Previous works have tried to reconstruct the 3D structures of various organs from endoscope images, but they mainly focus on partial surfaces. In this work, we investigated how to enable structure-from-motion (SfM) to reconstruct the whole shape of a stomach from a standard endoscope video. We specifically investigated the combined effect of chromoendoscopy and color-channel selection on SfM. Our study found that 3D reconstruction of the whole stomach can be achieved by using red-channel images captured under chromoendoscopy after spreading indigo carmine (IC) dye on the stomach surface.
5
Widya AR, Monno Y, Okutomi M, Suzuki S, Gotoda T, Miki K. Whole Stomach 3D Reconstruction and Frame Localization From Monocular Endoscope Video. IEEE J Transl Eng Health Med 2019; 7:3300310. [PMID: 32309059 PMCID: PMC6830857 DOI: 10.1109/jtehm.2019.2946802] [Received: 06/23/2019] [Revised: 09/03/2019] [Accepted: 09/25/2019]
Abstract
Gastric endoscopy is a common clinical practice that enables medical doctors to diagnose various lesions inside a stomach. To identify the location of a gastric lesion, such as early cancer or a peptic ulcer, within the stomach, this work reconstructs the color-textured 3D model of a whole stomach from a standard monocular endoscope video and localizes any selected video frame within the 3D model. We examine how to enable structure-from-motion (SfM) to reconstruct the whole shape of a stomach from endoscope images, a challenging task due to the texture-less nature of the stomach surface. We specifically investigate the combined effect of chromoendoscopy and color-channel selection on SfM to increase the number of feature points. We also design a plane fitting-based algorithm for removing 3D point outliers to improve the quality of the 3D model. We show that whole-stomach 3D reconstruction can be achieved (more than 90% of the frames can be reconstructed) by using red-channel images captured under chromoendoscopy after spreading indigo carmine (IC) dye on the stomach surface. In experimental results, we demonstrate the reconstructed 3D models for seven subjects and the application to lesion localization and reconstruction. The methodology and results presented in this paper could offer a valuable reference to other researchers and an excellent tool for gastric surgeons in various computer-aided diagnosis applications.
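The plane fitting-based outlier removal is described only at a high level in this abstract. One plausible minimal sketch (the function name, the SVD-based plane fit, and the fixed distance threshold are assumptions for illustration, not the authors' exact algorithm, which may fit planes locally rather than globally): fit a least-squares plane to the point cloud and discard points whose perpendicular distance to it exceeds a threshold.

```python
import numpy as np

def remove_plane_outliers(points, thresh):
    """Fit a plane to `points` (N, 3) by SVD and drop points whose
    perpendicular distance to the plane exceeds `thresh`."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Right-singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    dist = np.abs(centered @ normal)  # point-to-plane distances
    return points[dist <= thresh]
```

In an SfM pipeline this kind of filter is typically applied after triangulation, so that spurious points far from the reconstructed surface do not distort the final mesh.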
Affiliation(s)
- Aji Resindra Widya
- Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo 152-8550, Japan
- Yusuke Monno
- Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo 152-8550, Japan
- Masatoshi Okutomi
- Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo 152-8550, Japan
- Sho Suzuki
- Division of Gastroenterology and Hepatology, Department of Medicine, Nihon University School of Medicine, Tokyo 101-8309, Japan
- Takuji Gotoda
- Division of Gastroenterology and Hepatology, Department of Medicine, Nihon University School of Medicine, Tokyo 101-8309, Japan
- Kenji Miki
- Department of Internal Medicine, Tsujinaka Hospital Kashiwanoha, Kashiwa 277-0871, Japan
6
Hann A, Walter BM, Mehlhase N, Meining A. Virtual reality in GI endoscopy: intuitive zoom for improving diagnostics and training. Gut 2019; 68:957-959. [PMID: 30228217 PMCID: PMC6580767 DOI: 10.1136/gutjnl-2018-317058] [Received: 06/24/2018] [Revised: 07/29/2018] [Accepted: 08/26/2018]
Affiliation(s)
- Alexander Hann
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine I, Ulm University, Ulm, Germany
- Benjamin M Walter
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine I, Ulm University, Ulm, Germany
- Niklas Mehlhase
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine I, Ulm University, Ulm, Germany
- Alexander Meining
- Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine I, Ulm University, Ulm, Germany
7
Egger J, Gall M, Wallner J, Boechat P, Hann A, Li X, Chen X, Schmalstieg D. HTC Vive MeVisLab integration via OpenVR for medical applications. PLoS One 2017; 12:e0173972. [PMID: 28323840 PMCID: PMC5360258 DOI: 10.1371/journal.pone.0173972] [Received: 08/26/2016] [Accepted: 03/01/2017]
Abstract
Virtual Reality (VR), an immersive technology that replicates an environment via computer-simulated reality, receives a lot of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain. Examples are intervention planning, training, and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgeries. However, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the use of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing direct and uncomplicated use of the HTC Vive head-mounted display inside MeVisLab. Medical data coming from other MeVisLab modules can be connected directly, via drag-and-drop, to the Virtual Reality module, which renders the data inside the HTC Vive for immersive inspection.
Affiliation(s)
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16c/II, Graz, Austria
- BioTechMed-Graz, Krenngasse 37/1, Graz, Austria
- Markus Gall
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16c/II, Graz, Austria
- Jürgen Wallner
- Medical University of Graz, Department of Oral and Maxillofacial Surgery, Auenbruggerplatz 5/1, Graz, Austria
- Pedro Boechat
- Medical University of Graz, Department of Oral and Maxillofacial Surgery, Auenbruggerplatz 5/1, Graz, Austria
- Alexander Hann
- Department of Internal Medicine I, Ulm University, Albert-Einstein-Allee 23, Ulm, Germany
- Xing Li
- Shanghai Jiao Tong University, School of Mechanical Engineering, Shanghai, China
- Xiaojun Chen
- Shanghai Jiao Tong University, School of Mechanical Engineering, Shanghai, China
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16c/II, Graz, Austria
8
Chadebecq F, Tilmant C, Bartoli A. How big is this neoplasia? Live colonoscopic size measurement using the Infocus-Breakpoint. Med Image Anal 2014; 19:58-74. [PMID: 25277373 DOI: 10.1016/j.media.2014.09.002] [Received: 02/25/2014] [Revised: 06/26/2014] [Accepted: 09/01/2014]
Abstract
Colonoscopy is the reference medical examination for early diagnosis and treatment of colonic diseases. This minimally invasive technique allows endoscopists to explore the colon cavity and remove neoplasias, abnormal growths of tissue which may develop into malignant tumors. The size, shape and appearance of a neoplasia are essential cues for diagnosis. However, the size is difficult to estimate because the absolute scale of the observed tissue is not directly conveyed in 2D colonoscopic images, and an erroneous size estimate may lead to inappropriate treatment. There currently exist no solutions for reproducible neoplasia size measurement adapted to colonoscopy. We propose a colonoscopic size-measurement system for neoplasias. By using a simple planar geometry, the key technical problem is reduced to resolving scale. Our core contribution is the Infocus-Breakpoint (IB), which allows us to resolve scale from a regular colonoscopic video. We define the IB as the lower limit of the colonoscope's depth of field. The IB corresponds to a precise colonoscope-to-tissue distance, called the reference depth, which we calibrate preoperatively. We detect the IB intraoperatively thanks to two novel modules: deformable Blur-Estimating Tracking (BET) and Blur-Model Fitting (BMF). With our system, the endoscopist may interactively measure the length and area of a neoplasia directly in a 2D colonoscopic image. Our system needs no hardware modification to standard monocular colonoscopes, yet reaches a size-measurement accuracy on the order of a millimeter, as shown on several phantom and patient datasets.
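Once the reference depth is known, the measurement principle described above reduces to pinhole back-projection at that depth. A minimal illustrative sketch (the function and parameter names are hypothetical, and it assumes a fronto-parallel tissue plane, consistent with the simple planar geometry the abstract describes): a segment of p pixels observed at depth Z with focal length f spans p * Z / f on the tissue.

```python
def pixel_to_mm(length_px, ref_depth_mm, focal_px):
    """Back-project an image-plane measurement through a pinhole camera model:
    a segment of length_px pixels, with the tissue at the calibrated reference
    depth ref_depth_mm, spans length_px * Z / f millimetres on the tissue."""
    return length_px * ref_depth_mm / focal_px
```

This is why detecting the Infocus-Breakpoint matters: it pins the tissue to a known, preoperatively calibrated depth, turning an otherwise scale-ambiguous monocular image into a metric measurement.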
Affiliation(s)
- F Chadebecq
- ISIT UMR 6284 CNRS/Université d'Auvergne, Bâtiment 3C, Faculté de Médecine, 28 place Henri Dunant, BP 38, 63001 Clermont-Ferrand, France; Institut Pascal UMR 6602 CNRS/Université Blaise Pascal/IFMA, Complexe Universitaire des Cézeaux, 24 Avenue des Landais, BP 80026, 63171 Aubière Cedex, France
| | - C Tilmant
- Institut Pascal UMR 6602 CNRS/Université Blaise Pascal/IFMA, Complexe Universitaire des Cézeaux, 24 Avenue des Landais, BP 80026, 63171 Aubière Cedex, France
| | - A Bartoli
- ISIT UMR 6284 CNRS/Université d'Auvergne, Bâtiment 3C, Faculté de Médecine, 28 place Henri Dunant, BP 38, 63001 Clermont-Ferrand, France
| |