1
Nwoye CI, Padoy N. SurgiTrack: Fine-grained multi-class multi-tool tracking in surgical videos. Med Image Anal 2025;101:103438. PMID: 39708509. DOI: 10.1016/j.media.2024.103438.
Abstract
Accurate tool tracking is essential for the success of computer-assisted intervention. Previous efforts often modeled tool trajectories rigidly, overlooking the dynamic nature of surgical procedures, especially in scenarios such as out-of-body and out-of-camera views. Addressing this limitation, the new CholecTrack20 dataset provides detailed labels that account for multiple tool trajectories from three perspectives: (1) intraoperative, (2) intracorporeal, and (3) visibility, representing the different temporal durations of tool tracks. These fine-grained labels enhance tracking flexibility but also increase task complexity. Re-identifying tools after occlusion or re-insertion into the body remains challenging due to high visual similarity, especially among tools of the same category. This work recognizes the critical role of the tool operators in distinguishing tool track instances, especially those belonging to the same tool category. Operator information is, however, not explicitly captured in surgical videos. We therefore propose SurgiTrack, a novel deep learning method that leverages YOLOv7 for precise tool detection and employs an attention mechanism to model the originating direction of the tools, as a proxy for their operators, for tool re-identification. To handle the diverse tool trajectory perspectives, SurgiTrack employs a harmonizing bipartite matching graph, minimizing conflicts and ensuring accurate tool identity association. Experimental results on CholecTrack20 demonstrate SurgiTrack's effectiveness, outperforming baselines and state-of-the-art methods with real-time inference capability. This work sets a new standard in surgical tool tracking, providing dynamic trajectories for more adaptable and precise assistance in minimally invasive surgeries.
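The identity-association step described above lends itself to a compact illustration. The sketch below shows generic bipartite (Hungarian) matching between existing tracks and new detections under a cost that mixes box overlap with similarity of each tool's originating direction, the paper's proxy for the operator. The 0.5/0.5 weighting, the max_cost gate, and the field names (box, dir) are illustrative assumptions, not SurgiTrack's actual design.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / (union + 1e-9)

def direction_distance(u, v):
    """Cosine distance between unit direction vectors (the operator proxy)."""
    return 0.5 * (1.0 - float(np.dot(u, v)))

def associate(tracks, dets, max_cost=0.7):
    """Hungarian matching of live tracks to new detections."""
    if not tracks or not dets:
        return [], list(range(len(tracks))), list(range(len(dets)))
    # Combined cost: spatial overlap plus originating-direction similarity.
    cost = np.array([[0.5 * (1.0 - iou(t["box"], d["box"]))
                      + 0.5 * direction_distance(t["dir"], d["dir"])
                      for d in dets] for t in tracks])
    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_cost]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    un_tracks = [i for i in range(len(tracks)) if i not in matched_t]
    un_dets = [j for j in range(len(dets)) if j not in matched_d]
    return matches, un_tracks, un_dets
```

In such a scheme, unmatched detections would seed new track identities, while unmatched tracks can be kept dormant across out-of-view gaps until re-identification succeeds.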
Affiliation(s)
- Chinedu Innocent Nwoye
- University of Strasbourg, CAMMA, ICube, CNRS, INSERM, France; IHU Strasbourg, Strasbourg, France.
- Nicolas Padoy
- University of Strasbourg, CAMMA, ICube, CNRS, INSERM, France; IHU Strasbourg, Strasbourg, France.
2
Xu H, Giannarou S. Occlusion-robust markerless surgical instrument pose estimation. Healthc Technol Lett 2024;11:327-335. PMID: 39720750. PMCID: PMC11665797. DOI: 10.1049/htl2.12100.
Abstract
The estimation of the pose of surgical instruments is important in Robot-assisted Minimally Invasive Surgery (RMIS) to assist surgical navigation and enable autonomous robotic task execution. The performance of current instrument pose estimation methods deteriorates significantly in the presence of partial tool visibility, occlusions, and changes in the surgical scene. In this work, a vision-based framework is proposed for markerless estimation of the 6DoF pose of surgical instruments. To deal with partial instrument visibility, a keypoint object representation is used, and stable, accurate instrument poses are computed with a Perspective-n-Point (PnP) solver. To boost the learning process of the model under occlusion, a new mask-based data augmentation approach is proposed. To validate the model, a dataset for instrument pose estimation with highly accurate ground-truth data was generated using different surgical robotic instruments. The proposed network achieves submillimeter accuracy, and the experimental results verify its generalisability to occlusions of different shapes.
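The keypoint-to-pose step maps naturally onto a standard RANSAC PnP call. The snippet below is a minimal sketch under assumed values: it synthesises 2D keypoints by projecting 3D model keypoints with a known pose, then recovers that pose with OpenCV's solvePnPRansac. In the paper's setting the 2D points would instead come from the keypoint network, and RANSAC helps tolerate occluded or mislocalised keypoints; the intrinsics and keypoint coordinates here are placeholders.

```python
import numpy as np
import cv2

# Pinhole camera intrinsics (fx, fy, cx, cy are made-up values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume undistorted images

# 3D keypoints on the instrument model (object frame, metres) -- placeholders.
obj_pts = np.array([[0.00, 0.00, 0.00],
                    [0.01, 0.00, 0.00],
                    [0.00, 0.01, 0.00],
                    [0.01, 0.01, 0.01],
                    [0.02, 0.00, 0.01],
                    [0.00, 0.02, 0.02]])

# Ground-truth pose, used only to synthesise 2D detections for this demo.
rvec_gt = np.array([0.10, -0.20, 0.05])
tvec_gt = np.array([0.01, -0.02, 0.25])
img_pts, _ = cv2.projectPoints(obj_pts, rvec_gt, tvec_gt, K, dist)

# Recover the 6DoF pose; RANSAC downweights outlier keypoints.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, dist,
                                             reprojectionError=4.0)
print(ok, rvec.ravel(), tvec.ravel())
```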
Affiliation(s)
- Haozheng Xu
- Hamlyn Centre for Robotic Surgery, Department of Surgery and Cancer, Imperial College London, London, UK
- Stamatia Giannarou
- Hamlyn Centre for Robotic Surgery, Department of Surgery and Cancer, Imperial College London, London, UK
3
Fragoso Costa P, Shi K, Holm S, Vidal-Sicart S, Kracmerova T, Tosi G, Grimm J, Visvikis D, Knapp WH, Gnanasegaran G, van Leeuwen FWB. Surgical radioguidance with beta-emitting radionuclides; challenges and possibilities: A position paper by the EANM. Eur J Nucl Med Mol Imaging 2024;51:2903-2921. PMID: 38189911. PMCID: PMC11300492. DOI: 10.1007/s00259-023-06560-2.
Abstract
Radioguidance that makes use of β-emitting radionuclides is gaining popularity and has the potential to strengthen the range of existing radioguidance techniques. While there is a strong tendency to develop new PET radiotracers, owing to their favorable imaging characteristics and the success of theranostics research, there are practical challenges that need to be overcome when considering the use of β-emitters for surgical radioguidance. In this position paper, the EANM identifies the possibilities and challenges related to the successful implementation of β-emitters in surgical guidance, covering aspects of instrumentation, radiation protection, and modes of implementation.
Affiliation(s)
- Pedro Fragoso Costa
- Department of Nuclear Medicine, University Hospital Essen, West German Cancer Center (WTZ), University of Duisburg-Essen, Essen, Germany.
- Kuangyu Shi
- Department of Nuclear Medicine, Inselspital, Bern University Hospital, University of Bern, Bern, Switzerland
- Computer Aided Medical Procedures and Augmented Reality, Institute of Informatics I16, Technical University of Munich, Munich, Germany
- Soren Holm
- Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University Hospital Copenhagen, Copenhagen, Denmark
- Sergi Vidal-Sicart
- Nuclear Medicine Department, Hospital Clinic Barcelona, Barcelona, Spain
- Tereza Kracmerova
- Department of Medical Physics, Motol University Hospital, Prague, Czech Republic
- Giovanni Tosi
- Department of Medical Physics, Ospedale U. Parini, Aosta, Italy
- Jan Grimm
- Molecular Pharmacology Program, Memorial Sloan Kettering Cancer Center, New York, NY, USA
- Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY, USA
- Wolfram H Knapp
- Department of Nuclear Medicine, Medizinische Hochschule Hannover, Hannover, Germany
- Gopinath Gnanasegaran
- Institute of Nuclear Medicine, University College London Hospital, Tower 5, 235 Euston Road, London, NW1 2BU, UK
- Royal Free London NHS Foundation Trust Hospital, London, UK
- Fijs W B van Leeuwen
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, the Netherlands
4
Huang B, Nguyen A, Wang S, Wang Z, Mayer E, Tuch D, Vyas K, Giannarou S, Elson DS. Simultaneous depth estimation and surgical tool segmentation in laparoscopic images. IEEE Trans Med Robot Bionics 2022;4:335-338. PMID: 36148137. PMCID: PMC7613616. DOI: 10.1109/tmrb.2022.3170215.
Abstract
Surgical instrument segmentation and depth estimation are crucial steps toward improved autonomy in robotic surgery. Most recent works treat these problems separately, making deployment challenging. In this paper, we propose a unified framework for depth estimation and surgical tool segmentation in laparoscopic images. The network has an encoder-decoder architecture and comprises two branches that perform depth estimation and segmentation simultaneously. To train the network end to end, we propose a new multi-task loss function that learns to estimate depth in an unsupervised manner while requiring only semi-ground-truth labels for surgical tool segmentation. We conducted extensive experiments on different datasets to validate the approach. The results show that the end-to-end network improves on the state of the art for both tasks while reducing complexity at deployment.
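The shape of such a two-branch objective is easy to sketch. The PyTorch snippet below is a hedged illustration, not the paper's actual loss: an unsupervised photometric term compares the left image with the right image warped into the left view via the predicted depth/disparity, while a supervised cross-entropy term fires only on pixels carrying (semi-)ground-truth tool labels. The weights, the ignore index, and the assumption that the warping happens upstream are all ours.

```python
import torch
import torch.nn.functional as F

def multitask_loss(left, right_warped_from_depth, seg_logits, seg_labels,
                   w_depth=1.0, w_seg=1.0):
    # Unsupervised depth: photometric error between the left image and the
    # right image warped into the left view using the predicted disparity.
    photometric = (left - right_warped_from_depth).abs().mean()

    # Semi-supervised segmentation: cross-entropy only on annotated pixels;
    # unlabeled pixels carry the ignore_index and contribute no gradient.
    seg = F.cross_entropy(seg_logits, seg_labels, ignore_index=255)

    return w_depth * photometric + w_seg * seg

# Toy usage with random tensors (batch of 2, 3x64x64 images, 2 seg classes).
left = torch.rand(2, 3, 64, 64)
warped = torch.rand(2, 3, 64, 64)
logits = torch.randn(2, 2, 64, 64)
labels = torch.randint(0, 2, (2, 64, 64))
labels[:, ::2] = 255  # pretend every other row is unlabeled
print(multitask_loss(left, warped, logits, labels))
```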
Affiliation(s)
- Baoru Huang
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
- Anh Nguyen
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Computer Science, University of Liverpool, UK
- Siyao Wang
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Ziyang Wang
- Department of Computer Science, University of Oxford, UK
- Erik Mayer
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
- Stamatia Giannarou
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
- Daniel S Elson
- The Hamlyn Centre for Robotic Surgery, Imperial College London, SW7 2AZ, UK
- Department of Surgery & Cancer, Imperial College London, SW7 2AZ, UK
5
Gkouzionis I, Nazarian S, Kawka M, Darzi A, Patel N, Peters CJ, Elson DS. Real-time tracking of a diffuse reflectance spectroscopy probe used to aid histological validation of margin assessment in upper gastrointestinal cancer resection surgery. J Biomed Opt 2022;27(2):025001. PMID: 35106980. PMCID: PMC8804336. DOI: 10.1117/1.jbo.27.2.025001.
Abstract
Significance: Diffuse reflectance spectroscopy (DRS) allows discrimination of tissue type. Its application is limited by the inability to mark the scanned tissue and the lack of real-time measurements.
Aim: This study aimed to develop a real-time tracking system that localizes a DRS probe to aid the classification of tumor and non-tumor tissue.
Approach: A green marker attached to the DRS probe was detected using hue-saturation-value (HSV) segmentation. A live, augmented view of tracked optical biopsy sites was recorded in real time. Supervised classifiers were evaluated in terms of sensitivity, specificity, and overall accuracy. Custom software was used for data collection, processing, and statistical analysis.
Results: The root mean square error (RMSE) of DRS probe-tip tracking was 1.18 ± 0.58 mm and 1.05 ± 0.28 mm in the x and y dimensions, respectively. The diagnostic accuracy of the system in classifying tumor and non-tumor tissue in real time was 94% for the stomach and 96% for the esophagus.
Conclusions: We have successfully developed a real-time tracking and classification system for a DRS probe. When used on stomach and esophageal tissue for tumor detection, the accuracy achieved demonstrates the strength and clinical value of the technique for margin assessment in cancer resection surgery.
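The HSV marker-detection step is a classic OpenCV pattern. The function below is a minimal sketch of that step only: threshold the green marker in HSV space, clean the mask morphologically, and take the centroid of the largest contour as the tracked probe-tip reference. The HSV bounds, kernel size, and function name are illustrative guesses; the abstract does not give the paper's exact values.

```python
import cv2
import numpy as np

def track_green_marker(frame_bgr, lo=(40, 70, 70), hi=(80, 255, 255)):
    """Return the (x, y) centroid of the green marker, or None if absent."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    # Morphological opening suppresses specular speckle in the mask.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # marker not visible in this frame
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

Running this per frame yields the 2D track against which a per-axis RMSE, like the one reported above, can be computed from ground-truth tip positions.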
Affiliation(s)
- Ioannis Gkouzionis
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Imperial College London, Hamlyn Centre, London, United Kingdom
- Scarlet Nazarian
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Michal Kawka
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Ara Darzi
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Imperial College London, Hamlyn Centre, London, United Kingdom
- Nisha Patel
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Daniel S. Elson
- Imperial College London, Department of Surgery and Cancer, London, United Kingdom
- Imperial College London, Hamlyn Centre, London, United Kingdom
6
Wendler T, van Leeuwen FWB, Navab N, van Oosterom MN. How molecular imaging will enable robotic precision surgery: The role of artificial intelligence, augmented reality, and navigation. Eur J Nucl Med Mol Imaging 2021;48:4201-4224. PMID: 34185136. PMCID: PMC8566413. DOI: 10.1007/s00259-021-05445-6.
Abstract
Molecular imaging is one of the pillars of precision surgery. Its applications range from early diagnostics to therapy planning, execution, and the accurate assessment of outcomes. In particular, molecular imaging solutions are in high demand in minimally invasive surgical strategies, such as the rapidly growing field of robotic surgery. This review aims to connect the molecular imaging and nuclear medicine community to the rapidly expanding armory of surgical medical devices. Such devices entail technologies ranging from artificial intelligence and computer-aided visualization (software) to innovative molecular imaging modalities and surgical navigation (hardware). We discuss these technologies according to their role at different steps of the surgical workflow, from surgical decision-making and planning, through target localization and excision guidance, to (back-table) surgical verification. This provides a glimpse of how innovations from these technology fields can realize an exciting future for the molecular imaging and surgery communities.
Affiliation(s)
- Thomas Wendler
- Chair for Computer Aided Medical Procedures and Augmented Reality, Technische Universität München, Boltzmannstr. 3, 85748 Garching bei München, Germany
- Fijs W. B. van Leeuwen
- Department of Radiology, Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology, The Netherlands Cancer Institute - Antonie van Leeuwenhoek Hospital, Amsterdam, The Netherlands
- Orsi Academy, Melle, Belgium
- Nassir Navab
- Chair for Computer Aided Medical Procedures and Augmented Reality, Technische Universität München, Boltzmannstr. 3, 85748 Garching bei München, Germany
- Chair for Computer Aided Medical Procedures, Laboratory for Computational Sensing + Robotics, Johns Hopkins University, Baltimore, MD, USA
- Matthias N. van Oosterom
- Department of Radiology, Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology, The Netherlands Cancer Institute - Antonie van Leeuwenhoek Hospital, Amsterdam, The Netherlands
7
Azargoshasb S, Houwing KHM, Roos PR, van Leeuwen SI, Boonekamp M, Mazzone E, Bauwens K, Dell'Oglio P, van Leeuwen FWB, van Oosterom MN. Optical navigation of the Drop-In γ-probe as a means to strengthen the connection between robot-assisted and radioguided surgery. J Nucl Med 2021;62:1314-1317. PMID: 33419942. PMCID: PMC8882900. DOI: 10.2967/jnumed.120.259796.
Abstract
With the translation of the Drop-In γ-probe, radioguidance has advanced into laparoscopic robot-assisted surgery. Global-positioning-system-like navigation can further enhance the symbiosis between nuclear medicine and surgery. We therefore developed a fluorescence-video-based tracking method that integrates the Drop-In probe with navigated robotic surgery.
Methods: Fluorescent markers integrated into the Drop-In probe were automatically detected using a da Vinci Firefly laparoscope. A declipseSPECT navigation platform then calculated the Drop-In location within the surgical field. Using a phantom (n = 3), we pursued robotic navigation on SPECT/CT, whereas intraoperative feasibility was validated during porcine surgery (n = 4).
Results: Video-based tracking allowed navigation of the Drop-In probe toward all lesions detected on SPECT/CT (external iliac and common iliac artery regions). Augmented-reality visualization in the surgical console indicated the distance to these lesions in real time, confirmed by the Drop-In readout. Porcine surgery underlined the feasibility of the concept.
Conclusion: Optical navigation of the Drop-In probe provides a next step toward connecting nuclear medicine with robotic surgery.
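The real-time distance readout reduces to a simple geometric query once the probe position (from the fluorescent-marker tracking) and the SPECT/CT lesion centroids are registered into one coordinate frame. The sketch below illustrates that final step only; all coordinates are made-up placeholders, and the tracking and registration are assumed to have happened upstream.

```python
import numpy as np

def distance_to_lesions(probe_tip_mm, lesions_mm):
    """Return (index, distance in mm) of the lesion nearest the probe tip."""
    d = np.linalg.norm(lesions_mm - probe_tip_mm, axis=1)
    i = int(np.argmin(d))
    return i, float(d[i])

probe_tip = np.array([12.0, -4.5, 88.0])   # tracked probe tip, mm
lesions = np.array([[15.0, -2.0, 95.0],    # e.g. external iliac region
                    [40.0, 10.0, 120.0]])  # e.g. common iliac region
idx, dist = distance_to_lesions(probe_tip, lesions)
print(f"nearest lesion #{idx} at {dist:.1f} mm")  # value fed to the AR overlay
```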
Affiliation(s)
- Samaneh Azargoshasb
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Krijn H M Houwing
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Paul R Roos
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Sven I van Leeuwen
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Michael Boonekamp
- Instrumentele Zaken Ontwikkeling, Facilitair Bedrijf, Leiden University Medical Center, Leiden, The Netherlands
- Elio Mazzone
- Department of Urology and Division of Experimental Oncology, URI, Urological Research Institute IRCCS San Raffaele Scientific Institute, Milan, Italy
- Orsi Academy, Melle, Belgium
- Paolo Dell'Oglio
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology and Division of Experimental Oncology, URI, Urological Research Institute IRCCS San Raffaele Scientific Institute, Milan, Italy
- Department of Urology, ASST Grande Ospedale Metropolitano Niguarda, Milan, Italy
- Department of Urology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Amsterdam, The Netherlands
- Fijs W B van Leeuwen
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Orsi Academy, Melle, Belgium
- Department of Urology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Amsterdam, The Netherlands
- Matthias N van Oosterom
- Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology, Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Amsterdam, The Netherlands
8
Cartucho J, Tukra S, Li Y, Elson DS, Giannarou S. VisionBlender: a tool to efficiently generate computer vision datasets for robotic surgery. Comput Methods Biomech Biomed Eng Imaging Vis 2021. DOI: 10.1080/21681163.2020.1835546.
Affiliation(s)
- João Cartucho
- The Hamlyn Centre for Robotic Surgery, Department of Surgery and Cancer, Imperial College London, London, UK
| | - Samyakh Tukra
- The Hamlyn Centre for Robotic Surgery, Department of Surgery and Cancer, Imperial College London, London, UK
| | - Yunpeng Li
- School of Precision Instruments and Opto-Electronics Engineering, Tianjin University, Tianjin, China
| | - Daniel S. Elson
- The Hamlyn Centre for Robotic Surgery, Department of Surgery and Cancer, Imperial College London, London, UK
| | - Stamatia Giannarou
- The Hamlyn Centre for Robotic Surgery, Department of Surgery and Cancer, Imperial College London, London, UK