1. Miller R, Battle M, Wangerin K, Huff DT, Weisman AJ, Chen S, Perk TG, Ulaner GA. Evaluating Automated Tools for Lesion Detection on 18F Fluoroestradiol PET/CT Images and Assessment of Concordance with Standard-of-Care Imaging in Metastatic Breast Cancer. Radiol Imaging Cancer 2025;7:e240253. PMID: 40314583. DOI: 10.1148/rycan.240253.
Abstract
Purpose: To evaluate two automated tools for detecting lesions on fluorine 18 (18F) fluoroestradiol (FES) PET/CT images and assess concordance of 18F-FES PET/CT with standard diagnostic CT and/or 18F fluorodeoxyglucose (FDG) PET/CT in patients with breast cancer.

Materials and Methods: This retrospective analysis of a prospective study included participants with breast cancer who underwent 18F-FES PET/CT examinations (n = 52), 18F-FDG PET/CT examinations (n = 13 of 52), and diagnostic CT examinations (n = 37 of 52). A convolutional neural network was trained for lesion detection using manually contoured lesions. Concordance in lesions labeled by a nuclear medicine physician between 18F-FES and 18F-FDG PET/CT and between 18F-FES PET/CT and diagnostic CT was assessed using an automated software medical device. Lesion detection performance was evaluated using sensitivity and false positives per participant. Wilcoxon tests were used for statistical comparisons.

Results: The study included 52 participants. The lesion detection algorithm achieved a median sensitivity of 62% with 0 false positives per participant. Compared with sensitivity in overall lesion detection, sensitivity was higher for detection of high-uptake lesions (maximum standardized uptake value > 1.5, P = .002) and similar for detection of large lesions (volume > 0.5 cm3, P = .15). The artificial intelligence (AI) lesion detection tool was combined with a standardized uptake value threshold to demonstrate a fully automated method of labeling patients as having FES-avid metastases. Additionally, automated concordance analysis showed that 17 of 25 participants (68%) had over half of the detected lesions across two modalities present on 18F-FES PET/CT images.

Conclusion: An AI model was trained to detect lesions on 18F-FES PET/CT images, and an automated concordance tool measured heterogeneity between 18F-FES PET/CT and standard-of-care imaging.

Keywords: Molecular Imaging-Cancer, Neural Networks, PET/CT, Breast, Computer Applications-General (Informatics), Segmentation, 18F-FES PET, Metastatic Breast Cancer, Lesion Detection, Artificial Intelligence, Lesion Matching

Supplemental material is available for this article. Clinical trials identifier: NCT04883814. Published under a CC BY 4.0 license.
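The final step described in the results (combining per-lesion CNN detections with a standardized uptake value threshold to produce a patient-level label of FES-avid disease) can be illustrated with a minimal sketch. The authors' implementation is not given in the abstract, so the input format and the 1.5 SUVmax cutoff (borrowed from the high-uptake analysis above) are assumptions for illustration only.

```python
# Minimal sketch (not the authors' implementation): label a participant as having
# FES-avid metastases when any CNN-detected lesion exceeds an SUVmax threshold.
# The threshold of 1.5 and the array-of-SUVmax input format are assumptions.
import numpy as np

def patient_has_fes_avid_disease(lesion_suv_max: np.ndarray, suv_threshold: float = 1.5) -> bool:
    """Return True if at least one detected lesion has SUVmax above the threshold."""
    if lesion_suv_max.size == 0:  # the detector found no lesions
        return False
    return bool(np.any(lesion_suv_max > suv_threshold))

# Example: SUVmax values of three lesions detected in one participant
print(patient_has_fes_avid_disease(np.array([0.8, 1.2, 2.4])))  # -> True
```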
Affiliation(s)
- Renee Miller: GE HealthCare, Pollards Wood, Nightingales Lane, Chalfont Saint Giles HP8 4SP, United Kingdom
- Mark Battle: GE HealthCare, Pollards Wood, Nightingales Lane, Chalfont Saint Giles HP8 4SP, United Kingdom
- Kristen Wangerin: GE HealthCare, Pollards Wood, Nightingales Lane, Chalfont Saint Giles HP8 4SP, United Kingdom
- Song Chen: Department of Nuclear Medicine, The First Hospital of China Medical University, Shenyang, China
- Gary A Ulaner: Department of Molecular Imaging and Therapy, Hoag Family Cancer Institute, Irvine, Calif; Department of Radiology and Translational Genomics, University of Southern California, Los Angeles, Calif
2. Hossain A, Chowdhury SI. Breast Cancer Subtype Prediction Model Employing Artificial Neural Network and 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography. J Med Phys 2024;49:181-188. PMID: 39131430. PMCID: PMC11309150. DOI: 10.4103/jmp.jmp_181_23.
Abstract
Introduction: Although positron emission tomography/computed tomography (PET/CT) is a common tool for assessing breast cancer (BC), it does not automatically classify subtypes. The purpose of this research was therefore to use an artificial neural network (ANN) to evaluate the clinical subtypes of BC based on tumor marker values.

Materials and Methods: In our nuclear medicine facility, 122 BC patients (training and testing) underwent 18F-fluoro-D-glucose (18F-FDG) PET/CT to identify the various subtypes of the disease. Patients received 18F-FDG injections before scanning, and the scans were carried out according to protocol. Based on the tumor marker value, the ANN's output layer uses the softmax function with cross-entropy loss to detect different subtypes of BC.

Results: The ANN model achieved an accuracy of 95.77% under K-fold cross-validation. The mean specificity and sensitivity were 0.955 and 0.958, respectively, and the mean area under the curve was 0.985.

Conclusion: Subtypes of BC may be categorized using the suggested approach. When the model is implemented clinically, PET/CT could be extended to diagnose BC subtypes using the appropriate tumor marker value.
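As a rough illustration of the classifier setup named in the methods (a softmax output layer trained with cross-entropy loss on tumor-marker inputs), a minimal PyTorch sketch follows. The layer sizes, number of input markers, and number of subtypes are assumptions, not values taken from the paper.

```python
# Minimal sketch (assumptions: PyTorch, tumor-marker features as inputs, four
# BC subtypes). Illustrates a softmax + cross-entropy classification head;
# architecture details are illustrative only.
import torch
import torch.nn as nn

n_features, n_subtypes = 6, 4   # hypothetical number of marker inputs and subtypes

model = nn.Sequential(
    nn.Linear(n_features, 32),
    nn.ReLU(),
    nn.Linear(32, n_subtypes),  # raw logits; softmax is applied inside the loss
)
criterion = nn.CrossEntropyLoss()      # log-softmax + negative log-likelihood
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(8, n_features)          # dummy batch of tumor-marker values
y = torch.randint(0, n_subtypes, (8,))  # dummy subtype labels
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```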
Affiliation(s)
- Alamgir Hossain: Department of Physics, University of Rajshahi, Rajshahi-6205, Bangladesh
- Shariful Islam Chowdhury: Institute of Nuclear Medicine and Allied Sciences, Bangladesh Atomic Energy Commission, Rajshahi, Bangladesh
3. Kidera E, Koyasu S, Hirata K, Hamaji M, Nakamoto R, Nakamoto Y. Convolutional neural network-based program to predict lymph node metastasis of non-small cell lung cancer using 18F-FDG PET. Ann Nucl Med 2024;38:71-80. PMID: 37755604. DOI: 10.1007/s12149-023-01866-5.
Abstract
PURPOSE: To develop a convolutional neural network (CNN)-based program to analyze maximum intensity projection (MIP) images of 2-deoxy-2-[F-18]fluoro-D-glucose (FDG) positron emission tomography (PET) scans, aimed at predicting lymph node metastasis of non-small cell lung cancer (NSCLC), and to evaluate its effectiveness in providing diagnostic assistance to radiologists.

METHODS: We obtained PET images of NSCLC from public datasets, including those of 435 patients with available N-stage information, which were divided into a training set (n = 304) and a test set (n = 131). We generated 36 MIP images for each patient. A residual network (ResNet-50)-based CNN was trained using the MIP images of the training set to predict lymph node metastasis. Lymph node metastasis in the test set was predicted by the trained CNN as well as by seven radiologists twice: first without and then with CNN assistance. Diagnostic performance metrics, including accuracy and prediction error (the difference between the truth and the predictions), were calculated, and reading times were recorded.

RESULTS: In the test set, 67 (51%) patients exhibited lymph node metastases, and the CNN yielded a predictive accuracy of 0.748. With the assistance of the CNN, the prediction error was significantly reduced for six of the seven radiologists, although accuracy did not change significantly. The prediction time was significantly reduced for five of the seven radiologists, with a median reduction of 38.0%.

CONCLUSION: The CNN-based program could potentially assist radiologists in predicting lymph node metastasis by increasing diagnostic confidence and reducing reading time without affecting diagnostic accuracy, at least in the limited setting of MIP images.
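The methods generate 36 MIP images per patient from each PET volume; a minimal sketch of one way to do this (rotating the volume about its long axis in 10° steps and taking the maximum along the viewing direction) is shown below. This is not the study's code: the axis ordering, angular step, and interpolation settings are assumptions for illustration.

```python
# Minimal sketch (not the study code): produce 36 rotated maximum intensity
# projections (MIPs) from a 3D PET volume. Assumes axis 0 is the cranio-caudal
# (z) axis and axis 1 is the projection (viewing) direction.
import numpy as np
from scipy.ndimage import rotate

def pet_mips(volume: np.ndarray, n_views: int = 36) -> list[np.ndarray]:
    """Rotate the volume about its long axis and project each view to a 2D MIP."""
    mips = []
    for angle in np.linspace(0.0, 360.0, n_views, endpoint=False):
        # rotate within the (y, x) plane, i.e. about the z axis
        rotated = rotate(volume, angle, axes=(1, 2), reshape=False, order=1)
        mips.append(rotated.max(axis=1))  # maximum intensity along the view axis
    return mips

views = pet_mips(np.random.rand(64, 96, 96))  # dummy PET volume (z, y, x)
print(len(views), views[0].shape)             # 36 projections of shape (64, 96)
```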
Affiliation(s)
- Eitaro Kidera: Department of Radiology, Kishiwada City Hospital, Kishiwada, Japan; Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto, 606-8507, Japan
- Sho Koyasu: Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto, 606-8507, Japan
- Kenji Hirata: Department of Diagnostic Imaging, Graduate School of Medicine, Hokkaido University, Sapporo, Japan
- Masatsugu Hamaji: Department of Thoracic Surgery, Kyoto University Hospital, Kyoto University, Kyoto, Japan
- Ryusuke Nakamoto: Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto, 606-8507, Japan
- Yuji Nakamoto: Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto, 606-8507, Japan
4. Hussain S, Lafarga-Osuna Y, Ali M, Naseem U, Ahmed M, Tamez-Peña JG. Deep learning, radiomics and radiogenomics applications in the digital breast tomosynthesis: a systematic review. BMC Bioinformatics 2023;24:401. PMID: 37884877. PMCID: PMC10605943. DOI: 10.1186/s12859-023-05515-6.
Abstract
BACKGROUND: Recent advancements in computing power and state-of-the-art algorithms have enabled more accessible and accurate diagnosis of numerous diseases. In addition, newly developed areas of imaging science, such as radiomics and radiogenomics, are helping to personalize healthcare and better stratify patients; these techniques associate imaging phenotypes with related disease genes. Various imaging modalities have been used for years to diagnose breast cancer. Digital breast tomosynthesis (DBT), a state-of-the-art 3D mammography technique, has produced comparatively promising results and is rapidly replacing conventional 2D mammography. This technological advancement is key to AI algorithms for accurately interpreting medical images.

OBJECTIVE AND METHODS: This paper presents a comprehensive review of deep learning (DL), radiomics, and radiogenomics in breast image analysis. The review focuses on DBT, its extracted synthetic mammography (SM), and full-field digital mammography (FFDM). Furthermore, this survey provides systematic knowledge about DL, radiomics, and radiogenomics for beginners and advanced-level researchers.

RESULTS: A total of 500 articles were identified, of which 30 studies met the inclusion criteria. Parallel benchmarking of radiomics, radiogenomics, and DL models applied to DBT images could give clinicians and researchers greater awareness as they consider clinical deployment or development of new models. This review provides a comprehensive guide to understanding the current state of early breast cancer detection using DBT images.

CONCLUSION: Using this survey, investigators with various backgrounds can readily pursue interdisciplinary research and new DL, radiomics, and radiogenomics directions for DBT.
Affiliation(s)
- Sadam Hussain: School of Engineering and Sciences, Tecnológico de Monterrey, Ave. Eugenio Garza Sada 2501, 64849, Monterrey, Mexico
- Yareth Lafarga-Osuna: School of Engineering and Sciences, Tecnológico de Monterrey, Ave. Eugenio Garza Sada 2501, 64849, Monterrey, Mexico
- Mansoor Ali: School of Engineering and Sciences, Tecnológico de Monterrey, Ave. Eugenio Garza Sada 2501, 64849, Monterrey, Mexico
- Usman Naseem: College of Science and Engineering, James Cook University, Cairns, Australia
- Masroor Ahmed: School of Engineering and Sciences, Tecnológico de Monterrey, Ave. Eugenio Garza Sada 2501, 64849, Monterrey, Mexico
- Jose Gerardo Tamez-Peña: School of Medicine and Health Sciences, Tecnológico de Monterrey, Ave. Eugenio Garza Sada 2501, 64849, Monterrey, Mexico
5. El Naqa I, Karolak A, Luo Y, Folio L, Tarhini AA, Rollison D, Parodi K. Translation of AI into oncology clinical practice. Oncogene 2023;42:3089-3097. PMID: 37684407. DOI: 10.1038/s41388-023-02826-z.
Abstract
Artificial intelligence (AI) is a transformative technology that is capturing popular imagination and can revolutionize biomedicine. AI and machine learning (ML) algorithms have the potential to break through existing barriers in oncology research and practice, such as automating workflow processes, personalizing care, and reducing healthcare disparities. Emerging applications of AI/ML in the literature include screening and early detection of cancer, disease diagnosis, response prediction, prognosis, and accelerated drug discovery. Despite this excitement, only a few AI/ML models have been properly validated, and fewer still have become regulated products for routine clinical use. In this review, we highlight the main challenges impeding AI/ML clinical translation. We present different clinical use cases from the domains of radiology, radiation oncology, immunotherapy, and drug discovery in oncology, and dissect the unique challenges and opportunities associated with each. Finally, we summarize the general requirements for successful AI/ML implementation in the clinic, highlighting specific examples and points of emphasis, including the importance of multidisciplinary collaboration among stakeholders, the role of domain experts in AI augmentation, the transparency of AI/ML models, and the establishment of a comprehensive quality assurance program to mitigate risks of training bias and data drift, all culminating toward safer and more beneficial AI/ML applications in oncology labs and clinics.
Affiliation(s)
- Issam El Naqa: Department of Machine Learning, Moffitt Cancer Center, Tampa, FL, 33612, USA
- Aleksandra Karolak: Department of Machine Learning, Moffitt Cancer Center, Tampa, FL, 33612, USA
- Yi Luo: Department of Machine Learning, Moffitt Cancer Center, Tampa, FL, 33612, USA
- Les Folio: Diagnostic Imaging & Interventional Radiology, Moffitt Cancer Center, Tampa, FL, 33612, USA
- Ahmad A Tarhini: Cutaneous Oncology and Immunology, Moffitt Cancer Center, Tampa, FL, 33612, USA
- Dana Rollison: Department of Cancer Epidemiology, Moffitt Cancer Center, Tampa, FL, 33612, USA
- Katia Parodi: Department of Medical Physics, Ludwig-Maximilians-Universität München, Munich, Germany