1
Chen J, Qiu RLJ, Wang T, Momin S, Yang X. A review of artificial intelligence in brachytherapy. J Appl Clin Med Phys 2025; 26:e70034. PMID: 40014044. DOI: 10.1002/acm2.70034.
Abstract
Artificial intelligence (AI) has the potential to revolutionize brachytherapy's clinical workflow. This review comprehensively examines the application of AI, focusing on machine learning and deep learning, in various aspects of brachytherapy. We analyze AI's role in making brachytherapy treatments more personalized, efficient, and effective. The applications are systematically organized into seven categories: imaging, preplanning, treatment planning, applicator reconstruction, quality assurance, outcome prediction, and real-time monitoring. Each major category is further subdivided based on cancer type or specific tasks, with detailed summaries of models, data sizes, and results presented in corresponding tables. Additionally, we discuss the limitations, challenges, and ethical concerns of current AI applications, along with perspectives on future directions. This review offers insights into the current advancements, challenges, and the impact of AI on treatment paradigms, encouraging further research to expand its clinical utility.
Affiliation(s)
- Jingchu Chen
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia, USA
- Richard L J Qiu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Tonghe Wang
- Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York, USA
- Shadab Momin
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia, USA
2
Chen J, Qiu RL, Wang T, Momin S, Yang X. A Review of Artificial Intelligence in Brachytherapy. arXiv 2024: arXiv:2409.16543v1 [Preprint]. PMID: 39398213. PMCID: PMC11469420.
Abstract
Artificial intelligence (AI) has the potential to revolutionize brachytherapy's clinical workflow. This review comprehensively examines the application of AI, focusing on machine learning and deep learning, in facilitating various aspects of brachytherapy. We analyze AI's role in making brachytherapy treatments more personalized, efficient, and effective. The applications are systematically organized into seven categories: imaging, preplanning, treatment planning, applicator reconstruction, quality assurance, outcome prediction, and real-time monitoring. Each major category is further subdivided based on cancer type or specific tasks, with detailed summaries of models, data sizes, and results presented in corresponding tables. This review offers insights into the current advancements, challenges, and the impact of AI on treatment paradigms, encouraging further research to expand its clinical utility.
Affiliation(s)
- Jingchu Chen
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30308
- School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA, USA
- Richard L.J. Qiu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30308
- Tonghe Wang
- Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, NY 10065
- Shadab Momin
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30308
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30308
3
Mandal S, Balraj K, Kodamana H, Arora C, Clark JM, Kwon DS, Rathore AS. Weakly supervised large-scale pancreatic cancer detection using multi-instance learning. Front Oncol 2024; 14:1362850. PMID: 39267824. PMCID: PMC11390448. DOI: 10.3389/fonc.2024.1362850.
Abstract
Introduction: Early detection of pancreatic cancer remains a challenge because of the difficulty in identifying specific signs or symptoms that correlate with its onset. Unlike breast, colon, or prostate cancer, for which screening tests are often useful in identifying cancerous development, no screening test exists for pancreatic cancer. As a result, most pancreatic cancers are diagnosed at an advanced stage, where treatment options, whether systemic therapy, radiation, or surgical intervention, offer limited efficacy.
Methods: A two-stage weakly supervised deep learning-based model is proposed to identify pancreatic tumors on computed tomography (CT) images from Henry Ford Health (HFH) and the publicly available Memorial Sloan Kettering Cancer Center (MSKCC) data sets. In the first stage, a supervised nnU-Net segmentation model, trained on the MSKCC repository of 281 patient image sets with established pancreatic tumors, was used to crop the region around the pancreas. In the second stage, a multi-instance learning-based weakly supervised classification model was applied to the cropped pancreas region to separate pancreatic tumors from normal-appearing pancreas. The model was trained, tested, and validated on images from an HFH repository of 463 cases and 2,882 controls.
Results: The proposed two-stage architecture achieves an accuracy of 0.907 ± 0.01, sensitivity of 0.905 ± 0.01, specificity of 0.908 ± 0.02, and AUC-ROC of 0.903 ± 0.01, and automatically differentiates pancreatic tumor from non-tumor pancreas with improved accuracy on the HFH dataset.
Discussion: The proposed two-stage deep learning architecture shows significantly enhanced performance in predicting the presence of a pancreatic tumor on CT images compared with other studies reported in the literature.
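As a rough illustration of the two-stage design described in this abstract (not the authors' code), the sketch below crops a pancreas region of interest from a CT volume using a segmentation mask such as one produced by nnU-Net, then scores the cropped region with an attention-based multi-instance learning classifier that treats axial slices as instances and the patient-level label as the bag label. All function and class names, the crop margin, and the network dimensions are illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn

def crop_pancreas_roi(ct_volume: np.ndarray, pancreas_mask: np.ndarray, margin: int = 16) -> np.ndarray:
    """Stage 1 (assumed interface): crop a bounding box around the predicted
    pancreas mask (e.g., from nnU-Net), padded by a fixed voxel margin.
    Assumes the mask is non-empty and aligned with the CT volume."""
    zs, ys, xs = np.nonzero(pancreas_mask)
    z0, z1 = max(zs.min() - margin, 0), min(zs.max() + margin, ct_volume.shape[0])
    y0, y1 = max(ys.min() - margin, 0), min(ys.max() + margin, ct_volume.shape[1])
    x0, x1 = max(xs.min() - margin, 0), min(xs.max() + margin, ct_volume.shape[2])
    return ct_volume[z0:z1, y0:y1, x0:x1]

class AttentionMIL(nn.Module):
    """Stage 2 (illustrative): attention-based multi-instance learning head.
    Each axial slice of the cropped ROI is an instance; the bag label is
    tumor vs. normal-appearing pancreas, so no voxel-level annotation is needed."""
    def __init__(self, feat_dim: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(  # toy per-slice CNN encoder
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.attention = nn.Sequential(nn.Linear(feat_dim, 128), nn.Tanh(), nn.Linear(128, 1))
        self.classifier = nn.Linear(feat_dim, 1)

    def forward(self, slices: torch.Tensor) -> torch.Tensor:
        # slices: (num_slices, 1, H, W), all slices resampled to a common size
        feats = self.encoder(slices)                            # (num_slices, feat_dim)
        weights = torch.softmax(self.attention(feats), dim=0)   # attention over instances
        bag_feat = (weights * feats).sum(dim=0)                 # weighted bag embedding
        return torch.sigmoid(self.classifier(bag_feat))         # P(tumor) for the whole bag
```

The slice encoder, instance definition, and training details in the actual study differ; this only sketches how patient-level (weak) labels can supervise the second-stage classifier once the first stage has localized the pancreas.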
Affiliation(s)
- Shyamapada Mandal
- Department of Chemical Engineering, Indian Institute of Technology Delhi, New Delhi, India
- Keerthiveena Balraj
- Yardi School of Artificial Intelligence, Indian Institute of Technology Delhi, New Delhi, India
- Hariprasad Kodamana
- Department of Chemical Engineering, Indian Institute of Technology Delhi, New Delhi, India
- Yardi School of Artificial Intelligence, Indian Institute of Technology Delhi, New Delhi, India
- Chetan Arora
- Yardi School of Artificial Intelligence, Indian Institute of Technology Delhi, New Delhi, India
- Department of Computer Science and Engineering, Indian Institute of Technology Delhi, New Delhi, India
- Julie M Clark
- Henry Ford Pancreatic Cancer Center, Henry Ford Health, Detroit, MI, United States
- David S Kwon
- Henry Ford Pancreatic Cancer Center, Henry Ford Health, Detroit, MI, United States
- Department of Surgery, Henry Ford Health, Detroit, MI, United States
- Anurag S Rathore
- Department of Chemical Engineering, Indian Institute of Technology Delhi, New Delhi, India
- Yardi School of Artificial Intelligence, Indian Institute of Technology Delhi, New Delhi, India
4
Qiu RL, Chang CW, Yang X. Deep learning-assisted lesion segmentation in PET/CT imaging: A feasibility study for salvage radiation therapy in prostate cancer. Oncoscience 2024; 11:49-50. PMID: 38770445. PMCID: PMC11104407. DOI: 10.18632/oncoscience.603.
Affiliation(s)
- Richard L.J. Qiu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Chih-Wei Chang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
5
Eisazadeh R, Shahbazi-Akbari M, Mirshahvalad SA, Pirich C, Beheshti M. Application of Artificial Intelligence in Oncologic Molecular PET-Imaging: A Narrative Review on Beyond [18F]F-FDG Tracers Part II. [18F]F-FLT, [18F]F-FET, [11C]C-MET and Other Less-Commonly Used Radiotracers. Semin Nucl Med 2024; 54:293-301. PMID: 38331629. DOI: 10.1053/j.semnuclmed.2024.01.002.
Abstract
Following the previous part of this narrative review on artificial intelligence (AI) applications in positron emission tomography (PET) with tracers other than 18F-fluorodeoxyglucose ([18F]F-FDG), in this part we review the impact of PET-derived radiomics data on the diagnostic performance of other PET radiotracers: 18F-O-(2-fluoroethyl)-L-tyrosine ([18F]F-FET), 18F-fluorothymidine ([18F]F-FLT), and 11C-methionine ([11C]C-MET). [18F]F-FET-PET, which uses an artificial amino acid taken up by upregulated tumor cells, showed potential in lesion detection and tumor characterization, especially through its ability to reflect glioma heterogeneity. [18F]F-FET-PET-derived textural features appeared to provide considerable information for accurate delineation to guide biopsy and treatment, to differentiate low-grade from high-grade glioma and the related wild-type genotypes, and to distinguish pseudoprogression from true progression. In addition, models built from clinical parameters and [18F]F-FET-PET-derived radiomics features showed acceptable results for survival stratification of glioblastoma patients. [18F]F-FLT-PET-based characteristics also showed potential in evaluating glioma patients, correlating with Ki-67 and patient prognosis. AI-based PET volumetry with this radiotracer as a proliferation marker also yielded promising preliminary results for guiding bone marrow-preserving adaptive radiation therapy. Similar to [18F]F-FET, the other amino acid tracer that reflects cellular proliferation, [11C]C-MET, has also shown acceptable performance in PET-derived radiomics models for predicting tumor grade, distinguishing brain tumor recurrence from radiation necrosis, and monitoring treatment. In addition, PET-derived radiomics features of other radiotracers such as [18F]F-DOPA, [18F]F-FACBC, [18F]F-NaF, [68Ga]Ga-CXCR-4, and [18F]F-FMISO may also provide useful information for tumor characterization and prediction of disease outcome. In conclusion, AI applied to tracers beyond [18F]F-FDG could improve the diagnostic performance of PET imaging for specific indications and help clinicians in their daily routine by providing features that are often not detectable by the naked eye.
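To make the radiomics-plus-clinical modeling pattern mentioned in this abstract concrete, here is a minimal sketch, not the pipeline used in any of the reviewed studies: it computes a few first-order features from the tracer-uptake values inside a tumor mask, concatenates them with clinical covariates, and fits a logistic-regression classifier for a binary endpoint such as low- versus high-grade glioma. Feature choices, variable names, and the clinical covariates are illustrative assumptions; published studies typically use standardized radiomics toolkits with texture and shape features as well.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def first_order_radiomics(pet_volume: np.ndarray, tumor_mask: np.ndarray) -> np.ndarray:
    """Illustrative first-order features from uptake values inside the tumor mask."""
    suv = pet_volume[tumor_mask > 0]
    counts, _ = np.histogram(suv, bins=64)
    p = counts / counts.sum()
    p = p[p > 0]
    return np.array([
        suv.mean(),                    # mean uptake
        suv.max(),                     # maximum uptake (SUVmax-like)
        suv.std(),                     # dispersion
        stats.skew(suv),               # asymmetry of the uptake distribution
        stats.kurtosis(suv),           # peakedness
        -(p * np.log2(p)).sum(),       # intensity-histogram entropy
    ])

def build_features(pet_volumes, tumor_masks, clinical_rows) -> np.ndarray:
    """One row per patient: radiomics features concatenated with clinical covariates
    (hypothetical, e.g., age and performance status)."""
    rows = [np.concatenate([first_order_radiomics(v, m), np.asarray(c, dtype=float)])
            for v, m, c in zip(pet_volumes, tumor_masks, clinical_rows)]
    return np.vstack(rows)

# Standardize features, then fit a simple linear classifier on binary labels.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
# model.fit(build_features(train_vols, train_masks, train_clinical), train_labels)
# grade_prob = model.predict_proba(build_features(test_vols, test_masks, test_clinical))[:, 1]
```

The same feature matrix could instead feed a survival model (e.g., Cox regression) for the survival-stratification use case mentioned above; the key idea is simply combining image-derived and clinical predictors in one model.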
Affiliation(s)
- Roya Eisazadeh
- Division of Molecular Imaging & Theranostics, Department of Nuclear Medicine, University Hospital, Paracelsus Medical University, Salzburg, Austria
- Malihe Shahbazi-Akbari
- Division of Molecular Imaging & Theranostics, Department of Nuclear Medicine, University Hospital, Paracelsus Medical University, Salzburg, Austria; Research Center for Nuclear Medicine, Department of Nuclear Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Seyed Ali Mirshahvalad
- Division of Molecular Imaging & Theranostics, Department of Nuclear Medicine, University Hospital, Paracelsus Medical University, Salzburg, Austria; Research Center for Nuclear Medicine, Department of Nuclear Medicine, Tehran University of Medical Sciences, Tehran, Iran; Joint Department of Medical Imaging, University Medical Imaging Toronto (UMIT), University Health Network, Mount Sinai Hospital & Women's College Hospital, University of Toronto, Toronto, Ontario, Canada
- Christian Pirich
- Division of Molecular Imaging & Theranostics, Department of Nuclear Medicine, University Hospital, Paracelsus Medical University, Salzburg, Austria
- Mohsen Beheshti
- Division of Molecular Imaging & Theranostics, Department of Nuclear Medicine, University Hospital, Paracelsus Medical University, Salzburg, Austria