1
Ylisiurua S, Sipola A, Nieminen MT, Brix MAK. Deep learning enables time-efficient soft tissue enhancement in CBCT: Proof-of-concept study for dentomaxillofacial applications. Phys Med 2024;117:103184. PMID: 38016216. DOI: 10.1016/j.ejmp.2023.103184.
Abstract
PURPOSE The use of iterative and deep learning reconstruction methods, which would allow effective noise reduction, is limited in cone-beam computed tomography (CBCT); as a consequence, soft tissue visibility in CBCT is poor. This study aimed to address this limitation with time-efficient deep learning enhancement (DLE) methods. METHODS Two DLE networks, UNIT and U-Net, were trained with simulated CBCT data, and their performance was tested on three different test datasets. The quantitative evaluation measured the structural similarity index measure (SSIM) and peak signal-to-noise ratio (PSNR) of the DLE reconstructions with respect to a ground-truth iterative reconstruction (IR) method. In the second assessment, a dentomaxillofacial radiologist rated the resolution of hard tissue structures, the visibility of soft tissues, and the overall image quality of real patient data on a Likert scale. Finally, technical image quality was characterized using modulation transfer function, noise power spectrum, and noise magnitude analyses. RESULTS The study demonstrated that deep learning CBCT denoising is feasible and time-efficient. The DLE methods, trained with simulated CBCT data, generalized well, providing noise reduction quantitatively (SSIM/PSNR) and visually similar to conventional IR but with faster processing times. Through noise reduction, the DLE methods improved soft tissue visibility compared with the conventional Feldkamp-Davis-Kress (FDK) algorithm. However, for hard tissue quantification tasks, the radiologist preferred FDK over the DLE methods. CONCLUSION Post-reconstruction DLE achieved feasible reconstruction times while improving soft tissue visibility in each dataset.
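The two quantitative metrics used in this abstract, SSIM and PSNR, are standard image-fidelity measures. A minimal numpy sketch of both follows; this is illustrative only, not the authors' code, and `global_ssim` computes the single-window form of SSIM rather than the usual sliding-window average:

```python
import numpy as np

def psnr(ref, img, data_range=1.0):
    """Peak signal-to-noise ratio in dB, relative to a reference image."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

def global_ssim(x, y, data_range=1.0):
    """Single-window (global) SSIM. The standard metric averages this
    statistic over local sliding windows; the formula is the same."""
    c1 = (0.01 * data_range) ** 2  # stabilizing constants from the SSIM paper
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov = np.mean((x - mu_x) * (y - mu_y))
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

In the paper's setup, `ref` would be the ground-truth IR reconstruction and `img` the DLE output; higher PSNR and SSIM closer to 1 indicate closer agreement.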
Affiliation(s)
- Sampo Ylisiurua
- Research Unit of Health Sciences and Technology, University of Oulu, Oulu 90220, Finland; Department of Diagnostic Radiology, Oulu University Hospital, Oulu 90220, Finland.
- Annina Sipola
- Medical Research Center, University of Oulu and Oulu University Hospital, Oulu 90220, Finland; Department of Dental Imaging, Oulu University Hospital, Oulu 90220, Finland; Research Unit of Oral Health Sciences, University of Oulu, Oulu 90220, Finland.
- Miika T Nieminen
- Research Unit of Health Sciences and Technology, University of Oulu, Oulu 90220, Finland; Department of Diagnostic Radiology, Oulu University Hospital, Oulu 90220, Finland; Medical Research Center, University of Oulu and Oulu University Hospital, Oulu 90220, Finland.
- Mikael A K Brix
- Research Unit of Health Sciences and Technology, University of Oulu, Oulu 90220, Finland; Department of Diagnostic Radiology, Oulu University Hospital, Oulu 90220, Finland; Medical Research Center, University of Oulu and Oulu University Hospital, Oulu 90220, Finland.
2
Hui X, Rajendran P, Ling T, Dai X, Xing L, Pramanik M. Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth. Photoacoustics 2023;34:100575. PMID: 38174105. PMCID: PMC10761306. DOI: 10.1016/j.pacs.2023.100575.
Abstract
Accurate needle guidance is crucial for safe and effective clinical diagnosis and treatment procedures. Conventional ultrasound (US)-guided needle insertion often struggles to visualize the needle consistently and precisely, necessitating reliable needle-tracking methods. As a powerful tool in image processing, deep learning has shown promise for enhancing needle visibility in US images, although its dependence on manual annotation or simulated data as ground truth can introduce bias or hinder generalization to real US images. Photoacoustic (PA) imaging has demonstrated high-contrast needle visualization. In this study, we explore the potential of PA imaging as a reliable ground truth for deep learning network training without the need for expert annotation. Our network (UIU-Net), trained on ex vivo tissue image datasets, localized needles in US images with high precision. Needle segmentation performance was evaluated on previously unseen ex vivo data and on in vivo human data collected from an open-source data repository. For the human data, the Modified Hausdorff Distance (MHD) was approximately 3.73 and the targeting error approximately 2.03, indicating strong agreement and small orientation deviation between the predicted and actual needle locations. A key advantage of our method is that it is not limited to US images captured by a specific imaging system, extending to images from other US imaging systems.
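The Modified Hausdorff Distance reported above is a standard point-set similarity metric (Dubuisson and Jain, 1994). A minimal numpy sketch of the symmetric form follows; it is illustrative, not the authors' evaluation code, and assumes the segmented and ground-truth needles are given as 2-D point sets:

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff distance between point sets a (N, 2) and b (M, 2).
    Averages each point's nearest-neighbour distance to the other set,
    then takes the larger of the two directed values."""
    # pairwise Euclidean distances, shape (N, M)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())
```

Lower values mean the predicted needle lies closer to the annotated one; an MHD of 0 means the two point sets coincide.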
Affiliation(s)
- Xie Hui
- School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore
- Praveenbalaji Rajendran
- Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States
- Tong Ling
- School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore
- School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 637459, Singapore
- Xianjin Dai
- Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States
- Lei Xing
- Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States
- Manojit Pramanik
- Department of Electrical and Computer Engineering, Iowa State University, Ames, IA 50011, United States
3
Masoumi N, Rivaz H, Hacihaliloglu I, Ahmad MO, Reinertsen I, Xiao Y. The Big Bang of Deep Learning in Ultrasound-Guided Surgery: A Review. IEEE Trans Ultrason Ferroelectr Freq Control 2023;70:909-919. PMID: 37028313. DOI: 10.1109/TUFFC.2023.3255843.
Abstract
Ultrasound (US) imaging is a paramount modality in many image-guided surgeries and percutaneous interventions thanks to its high portability, temporal resolution, and cost-efficiency. However, owing to its imaging principles, US is often noisy and difficult to interpret. Appropriate image processing can greatly enhance the applicability of the modality in clinical practice. Compared with classic iterative optimization and machine learning (ML) approaches, deep learning (DL) algorithms have shown strong performance in terms of accuracy and efficiency for US processing. In this work, we conduct a comprehensive review of DL algorithms in the applications of US-guided interventions, summarize current trends, and suggest future directions on the topic.
5
Baker C, Xochicale M, Lin FY, Mathews S, Joubert F, Shakir DI, Miles R, Mosse CA, Zhao T, Liang W, Kunpalin Y, Dromey B, Mistry T, Sebire NJ, Zhang E, Ourselin S, Beard PC, David AL, Desjardins AE, Vercauteren T, Xia W. Intraoperative Needle Tip Tracking with an Integrated Fibre-Optic Ultrasound Sensor. Sensors (Basel) 2022;22:9035. PMID: 36501738. PMCID: PMC9739176. DOI: 10.3390/s22239035.
Abstract
Ultrasound is an essential tool for guidance of many minimally invasive surgical and interventional procedures, where accurate placement of the interventional device is critical to avoid adverse events. Needle insertion procedures for anaesthesia, fetal medicine and tumour biopsy are commonly ultrasound-guided, and misplacement of the needle may lead to complications such as nerve damage, organ injury or pregnancy loss. Clear visibility of the needle tip is therefore critical, but visibility is often precluded by tissue heterogeneities or specular reflections from the needle shaft. This paper presents the in vitro and ex vivo accuracy of a new, real-time, ultrasound needle tip tracking system for guidance of fetal interventions. A fibre-optic, Fabry-Pérot interferometer hydrophone is integrated into an intraoperative needle and used to localise the needle tip within a handheld ultrasound field. While previous, related work has been based on research ultrasound systems with bespoke transmission sequences, the new system (developed under the ISO 13485 medical devices quality standard) operates as an adjunct to a commercial ultrasound imaging system and therefore provides the image quality expected in the clinic, superimposing a cross-hair onto the ultrasound image at the needle tip position. Tracking accuracy was determined by translating the needle tip to 356 known positions in the ultrasound field of view in a water tank, and by comparison to manual labelling of the needle position in B-mode US images during an insertion into an ex vivo phantom. In water, the mean distance between tracked and true positions was 0.7 ± 0.4 mm with a mean repeatability of 0.3 ± 0.2 mm. In the tissue phantom, the mean distance between tracked and labelled positions was 1.1 ± 0.7 mm. Tracking performance was independent of needle angle. The study demonstrates the performance and clinical compatibility of ultrasound needle tip tracking, an essential step towards a first-in-human study.
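The headline accuracy figures (mean ± standard deviation of the distance between tracked and ground-truth tip positions) reduce to a simple per-position error computation. A minimal numpy sketch, illustrative only and not the authors' analysis code:

```python
import numpy as np

def tracking_error(tracked, true):
    """Per-position Euclidean error between tracked and ground-truth
    tip positions (arrays of shape (N, 2) or (N, 3), in mm).
    Returns (mean error, standard deviation)."""
    d = np.linalg.norm(np.asarray(tracked) - np.asarray(true), axis=-1)
    return d.mean(), d.std()
```

Applied to the 356 water-tank positions, this yields the reported 0.7 ± 0.4 mm style summary; repeatability would instead be computed from the spread of repeated measurements at each fixed position.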
Affiliation(s)
- Christian Baker
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Miguel Xochicale
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Fang-Yu Lin
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Sunish Mathews
- Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Francois Joubert
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Dzhoshkun I. Shakir
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Richard Miles
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Charles A. Mosse
- Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Tianrui Zhao
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Weidong Liang
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Yada Kunpalin
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Elizabeth Garrett Anderson Institute for Women’s Health, University College London, 74 Huntley Street, London WC1E 6AU, UK
- Brian Dromey
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Elizabeth Garrett Anderson Institute for Women’s Health, University College London, 74 Huntley Street, London WC1E 6AU, UK
- Talisa Mistry
- NIHR Great Ormond Street BRC and Institute of Child Health, University College London, 30 Guilford Street, London WC1N 1EH, UK
- Neil J. Sebire
- NIHR Great Ormond Street BRC and Institute of Child Health, University College London, 30 Guilford Street, London WC1N 1EH, UK
- Edward Zhang
- Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Sebastien Ourselin
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Paul C. Beard
- Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Anna L. David
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Elizabeth Garrett Anderson Institute for Women’s Health, University College London, 74 Huntley Street, London WC1E 6AU, UK
- Adrien E. Desjardins
- Department of Medical Physics and Biomedical Engineering, University College London, Gower Street, London WC1E 6BT, UK
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, UK
- Tom Vercauteren
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
- Wenfeng Xia
- School of Biomedical Engineering and Imaging Sciences, King’s College London, 4th Floor, Lambeth Wing, St Thomas’ Hospital, London SE1 7EH, UK
6
Shi M, Zhao T, West SJ, Desjardins AE, Vercauteren T, Xia W. Improving needle visibility in LED-based photoacoustic imaging using deep learning with semi-synthetic datasets. Photoacoustics 2022;26:100351. PMID: 35495095. PMCID: PMC9048160. DOI: 10.1016/j.pacs.2022.100351.
Abstract
Photoacoustic imaging has shown great potential for guiding minimally invasive procedures through accurate identification of critical tissue targets and invasive medical devices such as metallic needles. The use of light-emitting diodes (LEDs) as excitation light sources accelerates clinical translation owing to their affordability and portability. However, needle visibility in LED-based photoacoustic imaging is compromised, primarily by low optical fluence. In this work, we propose a deep learning framework based on U-Net to improve the visibility of clinical metallic needles with an LED-based photoacoustic and ultrasound imaging system. To address the difficulty of capturing ground truth for real data and the poor realism of purely simulated data, the framework generates semi-synthetic training datasets that combine simulated data representing the needles with in vivo measurements providing the tissue background. The trained network was evaluated with needle insertions into blood-vessel-mimicking phantoms and ex vivo pork joint tissue, and with measurements on human volunteers. The framework substantially improved needle visibility in photoacoustic imaging in vivo compared with conventional reconstruction, suppressing background noise and image artefacts and achieving 5.8- and 4.5-fold improvements in signal-to-noise ratio and modified Hausdorff distance, respectively. The proposed framework could thus help reduce complications during percutaneous needle insertions by accurately identifying clinical needles in photoacoustic imaging.
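The core idea of the semi-synthetic dataset, simulated needle signal composited onto a measured in vivo background, can be sketched in a few lines. This is a crude additive blend for illustration only; the paper's actual generation pipeline, normalization, and `alpha` weighting are assumptions here:

```python
import numpy as np

def make_semisynthetic(needle_sim, bg_invivo, alpha=1.0):
    """Blend a simulated needle image into a measured background frame,
    then rescale so the result peaks at 1. A toy sketch of the
    semi-synthetic training-data idea, not the published pipeline."""
    img = bg_invivo + alpha * needle_sim  # additive composite
    return img / img.max()
```

A training pair would then be (`make_semisynthetic(...)` as network input, the clean `needle_sim` or its mask as target), letting the network learn to enhance the needle against realistic tissue clutter.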
Affiliation(s)
- Mengjie Shi
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
- Tianrui Zhao
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
- Simeon J. West
- Department of Anaesthesia, University College Hospital, London NW1 2BU, United Kingdom
- Adrien E. Desjardins
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, United Kingdom
- Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT, United Kingdom
- Tom Vercauteren
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom
- Wenfeng Xia
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, United Kingdom