1
Cozzi JL, Li H, Fuhrman JD, Lan L, Williams J, Finnerty B, Fahey TJ, Tumati A, Genender J, Keutgen XM, Giger ML. Multi-institutional development and testing of attention-enhanced deep learning segmentation of thyroid nodules on ultrasound. Int J Comput Assist Radiol Surg 2025; 20:259-267. PMID: 39751996. DOI: 10.1007/s11548-024-03294-w.
Abstract
PURPOSE: Thyroid nodules are common, and ultrasound-based risk stratification using ACR's TIRADS classification is a key step in predicting nodule pathology. Determining thyroid nodule contours is necessary for the calculation of TIRADS scores and can also be used in the development of machine learning nodule diagnosis systems. This paper presents the development, validation, and multi-institutional independent testing of a machine learning system for the automatic segmentation of thyroid nodules on ultrasound. METHODS: The datasets, containing a total of 1595 thyroid ultrasound images from 520 patients with thyroid nodules, were retrospectively collected under IRB approval from University of Chicago Medicine (UCM) and Weill Cornell Medical Center (WCMC). Nodules were manually contoured by a team of UCM and WCMC physicians for ground truth. An AttU-Net, a U-Net architecture with additional attention weighting functions, was trained for the segmentations. The algorithm was validated through fivefold cross-validation by nodule and was tested on two independent test sets: one from UCM and one from WCMC. The performance metrics were the Dice similarity coefficient (DSC) and the percent Hausdorff distance (%HD), i.e., the Hausdorff distance expressed as a percentage of the nodule's effective diameter. RESULTS: On multi-institutional independent testing, the AttU-Net yielded average DSCs (std. deviation) of 0.915 (0.04) and 0.922 (0.03) and %HDs (std. deviation) of 12.9% (4.6) and 13.4% (6.3) on the UCM and WCMC test sets, respectively. Similarity testing showed the algorithm's performance on the two institutional test sets was equivalent up to margins of Δ DSC ≤ 0.013 and Δ %HD ≤ 1.73%. CONCLUSIONS: This work presents a robust automatic thyroid nodule segmentation algorithm that could be implemented for risk stratification systems. Future work is merited to incorporate this segmentation method within an automatic thyroid classification system.
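The two reported metrics are standard; a minimal pure-Python sketch follows. The mask representation (sets of (row, col) pixel coordinates), the helper names, and the toy masks are illustrative assumptions, not the authors' implementation.

```python
import math

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks,
    each given as a set of (row, col) pixel coordinates."""
    if not pred and not truth:
        return 1.0
    return 2 * len(pred & truth) / (len(pred) + len(truth))

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets."""
    def directed(src, dst):
        return max(min(math.dist(p, q) for q in dst) for p in src)
    return max(directed(a, b), directed(b, a))

def percent_hd(pred, truth):
    """Hausdorff distance as a percentage of the ground-truth nodule's
    effective diameter (diameter of the circle with the nodule's area)."""
    eff_diam = 2 * math.sqrt(len(truth) / math.pi)
    return 100 * hausdorff(pred, truth) / eff_diam

# toy example: 3x3 ground-truth mask vs. a prediction shifted one pixel right
truth = {(r, c) for r in range(3) for c in range(3)}
pred = {(r, c + 1) for r in range(3) for c in range(3)}
print(round(dice(pred, truth), 3))   # prints 0.667
```

For real masks the Hausdorff distance is usually computed on the contour points only; the brute-force nested minimum above is quadratic and meant only to make the definition concrete.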
Affiliation(s)
- Joseph L Cozzi: Department of Radiology, University of Chicago, Chicago, IL, USA
- Hui Li: Department of Radiology, University of Chicago, Chicago, IL, USA
- Jordan D Fuhrman: Department of Radiology, University of Chicago, Chicago, IL, USA
- Li Lan: Department of Radiology, University of Chicago, Chicago, IL, USA
- Jelani Williams: Division of General Surgery and Surgical Oncology, Department of Surgery, University of Chicago Medicine, Endocrine Surgery Research Program, Chicago, IL, USA
- Brendan Finnerty: Endocrine Oncology Research Program, Division of Endocrine Surgery, Department of Surgery, New York Presbyterian Hospital-Weill Cornell Medicine, New York, USA
- Thomas J Fahey: Endocrine Oncology Research Program, Division of Endocrine Surgery, Department of Surgery, New York Presbyterian Hospital-Weill Cornell Medicine, New York, USA
- Abhinay Tumati: Endocrine Oncology Research Program, Division of Endocrine Surgery, Department of Surgery, New York Presbyterian Hospital-Weill Cornell Medicine, New York, USA
- Joshua Genender: Department of Radiology, University of Chicago, Chicago, IL, USA; David Geffen School of Medicine, University of California - Los Angeles, Los Angeles, CA, USA
- Xavier M Keutgen: Division of General Surgery and Surgical Oncology, Department of Surgery, University of Chicago Medicine, Endocrine Surgery Research Program, Chicago, IL, USA
2
Luo L, Wang X, Lin Y, Ma X, Tan A, Chan R, Vardhanabhuti V, Chu WC, Cheng KT, Chen H. Deep Learning in Breast Cancer Imaging: A Decade of Progress and Future Directions. IEEE Rev Biomed Eng 2025; 18:130-151. PMID: 38265911. DOI: 10.1109/rbme.2024.3357877.
Abstract
Breast cancer has reached the highest incidence rate worldwide among all malignancies since 2020. Breast imaging plays a significant role in early diagnosis and intervention to improve the outcome of breast cancer patients. In the past decade, deep learning has shown remarkable progress in breast cancer imaging analysis, holding great promise in interpreting the rich information and complex context of breast imaging modalities. Considering the rapid improvement in deep learning technology and the increasing severity of breast cancer, it is critical to summarize past progress and identify future challenges to be addressed. This paper provides an extensive review of deep learning-based breast cancer imaging research, covering studies on mammograms, ultrasound, magnetic resonance imaging, and digital pathology images over the past decade. The major deep learning methods and applications on imaging-based screening, diagnosis, treatment response prediction, and prognosis are elaborated and discussed. Drawn from the findings of this survey, we present a comprehensive discussion of the challenges and potential avenues for future research in deep learning-based breast cancer imaging.
3
Ferreira MR, Torres HR, Oliveira B, de Araujo ARVF, Morais P, Novais P, Vilaca JL. Deep Learning Networks for Breast Lesion Classification in Ultrasound Images: A Comparative Study. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. PMID: 38083151. DOI: 10.1109/embc40787.2023.10340293.
Abstract
Accurate lesion classification as benign or malignant in breast ultrasound (BUS) images is a critical task that requires experienced radiologists and faces many challenges, such as poor image quality, artifacts, and high lesion variability. Thus, automatic lesion classification may aid professionals in breast cancer diagnosis. In this scope, computer-aided diagnosis systems have been proposed to assist in medical image interpretation, mitigating intra- and inter-observer variability. Recently, such systems using convolutional neural networks have demonstrated impressive results in medical image classification tasks. However, the lack of public benchmarks and of a standardized evaluation method hampers the performance comparison of networks. This work presents a benchmark for lesion classification in BUS images comparing six state-of-the-art networks: GoogLeNet, InceptionV3, ResNet, DenseNet, MobileNetV2, and EfficientNet. For each network, five input-data variations that include segmentation information were tested to compare their impact on the final performance. The methods were trained on a multi-center BUS dataset (BUSI and UDIAT) and evaluated using the following metrics: precision, sensitivity, F1-score, accuracy, and area under the curve (AUC). Overall, the lesion with a thin border of background provides the best performance. For this input data, EfficientNet obtained the best results: an accuracy of 97.65% and an AUC of 96.30%. Clinical Relevance: This study showed the potential of deep neural networks to be used in clinical practice for breast lesion classification, also suggesting the best model choices.
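The evaluation metrics named in this abstract follow their standard confusion-matrix definitions; a minimal sketch, with purely illustrative counts (not from the paper):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)  # also called recall
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, sensitivity, f1, accuracy

# illustrative counts for a benign/malignant classifier
p, s, f1, acc = classification_metrics(tp=80, fp=10, fn=20, tn=90)
print(round(p, 3), round(s, 3), round(f1, 3), round(acc, 3))
```

AUC is omitted here because it is computed over a sweep of decision thresholds rather than from a single confusion matrix.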
4
Ribeiro RF, Torres HR, Oliveira B, Morais P, Vilaca JL. Comparative analysis of deep learning methods for lesion detection on full screening mammography. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. PMID: 38082575. DOI: 10.1109/embc40787.2023.10340501.
Abstract
Breast cancer is the most prevalent type of cancer in women. Although mammography is used as the main imaging modality for diagnosis, robust lesion detection in mammography images is a challenging task due to the poor contrast of lesion boundaries and the widely diverse sizes and shapes of the lesions. Deep learning techniques have been explored to facilitate automatic diagnosis and have produced outstanding outcomes when applied to different medical challenges. This study provides a benchmark for breast lesion detection in mammography images. Five state-of-the-art methods were evaluated on 1592 mammograms from a publicly available dataset (CBIS-DDSM) and compared on the following six metrics: i) mean Average Precision (mAP); ii) intersection over union; iii) precision; iv) recall; v) True Positive Rate (TPR); and vi) false positives per image. The CenterNet, YOLOv5, Faster R-CNN, EfficientDet, and RetinaNet architectures were trained with a combination of the L1 and L2 localization losses. Although all evaluated networks reached mAP ratings greater than 60%, two stood out. Overall, the results demonstrate the efficiency of CenterNet with an Hourglass-104 backbone and of YOLOv5, which achieved mAP scores of 70.71% and 69.36%, and TPR scores of 96.10% and 92.19%, respectively, outperforming the other evaluated models. Clinical Relevance: This study demonstrates the effectiveness of deep learning algorithms for breast lesion detection in mammography, potentially improving the accuracy and efficiency of breast cancer diagnosis.
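Intersection over union (IoU) is the overlap criterion underlying detection metrics such as mAP and TPR; a minimal sketch, where the (x1, y1, x2, y2) box format and the sample boxes are illustrative assumptions:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # corners of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# two 10x10 boxes overlapping by half: IoU = 50 / 150
print(round(iou((0, 0, 10, 10), (5, 0, 15, 10)), 4))  # prints 0.3333
```

A detection is then typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold (commonly 0.5), which is how the TPR and mAP figures above are obtained.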
5
Costa N, Ferreira L, de Araújo ARVF, Oliveira B, Torres HR, Morais P, Alves V, Vilaça JL. Augmented Reality-Assisted Ultrasound Breast Biopsy. Sensors (Basel) 2023; 23:1838. PMID: 36850436. PMCID: PMC9961993. DOI: 10.3390/s23041838.
Abstract
Breast cancer is the most prevalent cancer in the world and the fifth-leading cause of cancer-related death. Treatment is effective in the early stages; thus, screening considerable portions of the population is crucial. When the screening procedure uncovers a suspect lesion, a biopsy is performed to assess its potential for malignancy. This procedure is usually performed under real-time ultrasound (US) imaging. This work proposes a visualization system for US breast biopsy. It consists of an application running on AR glasses that interacts with a computer application. The AR glasses track the position of QR codes mounted on a US probe and a biopsy needle. US images are shown in the user's field of view with enhanced lesion visualization and needle trajectory. To validate the system, the latency of US image transmission was evaluated. A usability assessment compared the proposed prototype with a traditional approach across different users. It showed that needle alignment was more precise, at 92.67 ± 2.32° in the prototype versus 89.99 ± 37.49° in a traditional system. The users also reached the lesion more accurately. Overall, the proposed solution presents promising results, and the use of AR glasses as a tracking and visualization device exhibited good performance.
Affiliation(s)
- Nuno Costa: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal; Algoritmi Center, School of Engineering, University of Minho, 4800-058 Guimaraes, Portugal; LASI—Associate Laboratory of Intelligent Systems, 4800-058 Guimaraes, Portugal
- Luís Ferreira: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal
- Augusto R. V. F. de Araújo: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal; Institute of Computing, Universidade Federal Fluminense (UFF), Niteroi 24210-310, Brazil
- Bruno Oliveira: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal; Algoritmi Center, School of Engineering, University of Minho, 4800-058 Guimaraes, Portugal; LASI—Associate Laboratory of Intelligent Systems, 4800-058 Guimaraes, Portugal; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, 4710-057 Braga, Portugal; ICVS/3B’s—PT Government Associate Laboratory, 4710-057 Braga/Guimaraes, Portugal
- Helena R. Torres: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal; Algoritmi Center, School of Engineering, University of Minho, 4800-058 Guimaraes, Portugal; LASI—Associate Laboratory of Intelligent Systems, 4800-058 Guimaraes, Portugal; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, 4710-057 Braga, Portugal; ICVS/3B’s—PT Government Associate Laboratory, 4710-057 Braga/Guimaraes, Portugal
- Pedro Morais: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal
- Victor Alves: Algoritmi Center, School of Engineering, University of Minho, 4800-058 Guimaraes, Portugal; LASI—Associate Laboratory of Intelligent Systems, 4800-058 Guimaraes, Portugal
- João L. Vilaça: 2Ai—School of Technology, IPCA, 4750-810 Barcelos, Portugal
6
Real A, Morais P, Barbosa LCN, Gomes-Fonseca J, Oliveira B, Moreira AHJ, Vilaca JL. A sensorized needle guide for ultrasound assisted breast biopsy. Annu Int Conf IEEE Eng Med Biol Soc 2022; 2022:865-868. PMID: 36085709. DOI: 10.1109/embc48229.2022.9871148.
Abstract
One in every eight women will get breast cancer during their lifetime. Therefore, early diagnosis of lesions is fundamental to improve the chances of recovery. To detect breast cancer, screening techniques such as mammography and ultrasound (US) imaging are often used. When a lesion is found, a breast biopsy is performed to extract a tissue sample for analysis. The breast biopsy is usually US-guided to help find the lesion and steer the needle to its location. However, identifying the needle tip in the US image is challenging, possibly resulting in puncture failures. In this paper, we study the potential of a sensorized needle guide system that provides information about the needle angle and displacement with respect to the US probe. Laboratory tests were initially conducted to evaluate the accuracy of each sensor under controlled conditions. Afterwards, a practical experiment with the US probe was performed as a proof of concept. The angle sensor showed a root mean square error (RMSE) of 0.48 degrees and the displacement sensor showed an RMSE of 0.26 mm after calibration. For the US probe tests, the displacement sensor showed high errors in the range of 1.19 mm to 2.05 mm due to mechanical reasons. Overall, the proposed system showed its potential to accurately estimate needle tip localization throughout US-guided breast biopsies, corroborating its potential clinical application. Clinical relevance: Potential for clinical application where precise needle localization in the ultrasound image is required.
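The calibration figure of merit here, root mean square error, is standard; a minimal sketch, where the sample sensor readings are illustrative and not taken from the paper:

```python
import math

def rmse(measured, reference):
    """Root mean square error between measured and reference values."""
    assert len(measured) == len(reference)
    return math.sqrt(
        sum((m - r) ** 2 for m, r in zip(measured, reference)) / len(measured)
    )

# illustrative displacement-sensor readings (mm) against known reference positions
measured = [10.2, 20.1, 29.7, 40.3]
reference = [10.0, 20.0, 30.0, 40.0]
print(round(rmse(measured, reference), 3))  # prints 0.24
```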