1
Chen J, Jiang Y, Yang K, Ye X, Cui C, Shi S, Wu H, Tian H, Song D, Yao J, Wang L, Huang S, Xu J, Xu D, Dong F. Feasibility of using AI to auto-catch responsible frames in ultrasound screening for breast cancer diagnosis. iScience 2023;26:105692. PMID: 36570770; PMCID: PMC9771726; DOI: 10.1016/j.isci.2022.105692.
Abstract
Research on AI-assisted breast diagnosis has primarily been based on static images, and it is unclear whether a single static image represents the best image for diagnosis. This study explored a method for capturing complementary responsible frames from breast ultrasound screening using artificial intelligence. We used the feature entropy breast network (FEBrNet) to select responsible frames from breast ultrasound screenings and compared the diagnostic performance of AI models based on FEBrNet-recommended frames, physician-selected frames, frames selected at 5-frame intervals, and all frames of the video, as well as that of ultrasound and mammography specialists. The AUROC of the AI model based on FEBrNet-recommended frames outperformed the other frame-set-based AI models as well as the ultrasound and mammography physicians, indicating that FEBrNet can reach the level of medical specialists in frame selection. The FEBrNet model can extract responsible frames from video for breast nodule diagnosis, with performance equivalent to responsible frames selected by doctors.
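The abstract does not spell out FEBrNet's internals. The sketch below is only a rough, hypothetical illustration of the general idea of scoring video frames with a feature-entropy criterion and keeping the top-scoring frames as candidate responsible frames; the ResNet backbone and the entropy definition are assumptions, not the published method.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

# Backbone truncated before global pooling; output is a (1, 512, h, w) feature map.
backbone = torch.nn.Sequential(*list(resnet18(weights=None).children())[:-2]).eval()

def frame_entropy(frame: torch.Tensor) -> float:
    """frame: (3, H, W) image tensor; returns entropy of the pooled channel distribution."""
    with torch.no_grad():
        feat = backbone(frame.unsqueeze(0))          # (1, 512, h, w)
        p = F.softmax(feat.mean(dim=(2, 3)), dim=1)  # (1, 512) channel distribution
        return float(-(p * p.clamp_min(1e-12).log()).sum())

def select_responsible_frames(video: torch.Tensor, k: int = 5) -> list:
    """video: (T, 3, H, W) ultrasound clip; returns indices of the k highest-entropy frames."""
    scores = torch.tensor([frame_entropy(f) for f in video])
    return scores.topk(min(k, len(scores))).indices.tolist()
```

The selected frames could then be passed to a still-image classifier, which is the comparison the study performs against physician-selected and interval-sampled frames.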
Affiliation(s)
- Jing Chen: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Yitao Jiang: Research and Development Department, Microport Prophecy, Shanghai 201203, China
- Keen Yang: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Xiuqin Ye: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Chen Cui: Research and Development Department, Illuminate, LLC, Shenzhen, Guangdong 518000, China
- Siyuan Shi: Research and Development Department, Illuminate, LLC, Shenzhen, Guangdong 518000, China
- Huaiyu Wu: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Hongtian Tian: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Di Song: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Jincao Yao: The Cancer Hospital of the University of Chinese Academy of Sciences (Zhejiang Cancer Hospital), Institute of Basic Medicine and Cancer (IBMC), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China
- Liping Wang: The Cancer Hospital of the University of Chinese Academy of Sciences (Zhejiang Cancer Hospital), Institute of Basic Medicine and Cancer (IBMC), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China
- Sijing Huang: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Jinfeng Xu: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
- Dong Xu: The Cancer Hospital of the University of Chinese Academy of Sciences (Zhejiang Cancer Hospital), Institute of Basic Medicine and Cancer (IBMC), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China
- Fajin Dong: Department of Ultrasound, Shenzhen People's Hospital (The Second Clinical School of Medicine, Jinan University; The First Affiliated Hospital of Southern University of Science and Technology), Shenzhen, Guangdong 518020, China
2
Sun P, Feng Y, Chen C, Dekker A, Qian L, Wang Z, Guo J. An AI model of sonographer's evaluation + S-Detect + elastography + clinical information improves the preoperative identification of benign and malignant breast masses. Front Oncol 2022;12:1022441. PMID: 36439410; PMCID: PMC9692079; DOI: 10.3389/fonc.2022.1022441.
Abstract
Purpose: To build an AI model with selected preoperative clinical features to further improve the accuracy of assessing benign and malignant breast nodules. Methods: Patients who underwent ultrasound, strain elastography, and S-Detect before ultrasound-guided biopsy or surgical excision were enrolled. The diagnostic model was built using logistic regression, and the diagnostic performances of the different models were evaluated and compared. Results: A total of 179 lesions (101 benign and 78 malignant) were included; the dataset was split into a training set (145 patients) and an independent test set (34 patients). The AI models based on clinical features, ultrasound features, and strain elastography had test-set ROC AUCs of 0.87, 0.81, and 0.79, respectively, for classifying benign and malignant breast nodules. The AUCs of the sonographer and S-Detect were 0.75 and 0.82, respectively, in the test set. The best-performing combined AI model reached a test-set AUC of 0.89 and showed better specificity (0.92) than the other models, while the sonographer's assessment showed the best sensitivity (0.97 in the test set). Conclusion: The combined AI model could improve the preoperative identification of benign and malignant breast masses and may reduce unnecessary breast biopsies.
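The combined model described in the Methods is a logistic regression over concatenated feature groups, evaluated by ROC AUC on a held-out test set. Below is a minimal sketch of that workflow with scikit-learn; the synthetic data and the sizes of the clinical, ultrasound, and elastography feature groups are stand-in assumptions, since the actual variables are not listed in the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 179                                  # lesions, as in the study
X_clinical = rng.normal(size=(n, 4))     # illustrative clinical variables
X_ultrasound = rng.normal(size=(n, 6))   # illustrative ultrasound descriptors
X_elasto = rng.normal(size=(n, 2))       # illustrative elastography measures
y = rng.integers(0, 2, size=n)           # 0 = benign, 1 = malignant

# Combined model: concatenate all feature groups, hold out 34 cases for testing.
X = np.hstack([X_clinical, X_ultrasound, X_elasto])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=34, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"test ROC AUC: {auc:.2f}")
```

Single-source models (clinical only, ultrasound only, elastography only) follow the same pattern with one feature group at a time, which is how the per-source AUCs in the Results can be compared against the combined model.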
Affiliation(s)
- Pengfei Sun: Department of Ultrasound, Beijing Friendship Hospital, Capital Medical University, Beijing, China
- Ying Feng: Department of Ultrasound, Beijing Friendship Hospital, Capital Medical University, Beijing, China
- Chen Chen: Department of Ultrasound, Beijing Friendship Hospital, Capital Medical University, Beijing, China
- Andre Dekker: Department of Radiation Oncology (Maastro), GROW-School for Oncology and Reproduction, Maastricht University Medical Centre, Maastricht, Netherlands
- Linxue Qian: Department of Ultrasound, Beijing Friendship Hospital, Capital Medical University, Beijing, China
- Zhixiang Wang: Department of Radiation Oncology (Maastro), GROW-School for Oncology and Reproduction, Maastricht University Medical Centre, Maastricht, Netherlands
- Jun Guo: Department of Ultrasound, Aerospace Center Hospital, Beijing, China