1. Tao X, Cao Y, Jiang Y, Wu X, Yan D, Xue W, Zhuang S, Yang X, Huang R, Zhang J, Ni D. Enhancing lesion detection in automated breast ultrasound using unsupervised multi-view contrastive learning with 3D DETR. Med Image Anal 2025;101:103466. PMID: 39854815. DOI: 10.1016/j.media.2025.103466
Abstract
The inherent variability of lesions poses challenges in leveraging AI for lesion detection in 3D automated breast ultrasound (ABUS). Traditional methods based on single scans have fallen short of the comprehensive evaluations that experienced sonologists perform using multiple scans. To address this, our study introduces an innovative approach combining the multi-view co-attention mechanism (MCAM) with unsupervised contrastive learning. Rooted in the detection transformer (DETR) architecture, our model employs a one-to-many matching strategy, significantly boosting training efficiency and lesion recall metrics. The model integrates MCAM within the decoder, facilitating the interpretation of lesion data across diverse views. Simultaneously, unsupervised multi-view contrastive learning (UMCL) aligns features consistently across scans, improving detection performance. When tested on two multi-center datasets comprising 1509 patients, our approach outperforms existing state-of-the-art 3D detection models. Notably, our model achieves a 90.3% cancer detection rate at 0.5 false positives per image (FPPI) on the external validation dataset, surpassing junior sonologists and matching the performance of seasoned experts.
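The multi-view alignment idea above can be pictured with a generic InfoNCE-style contrastive objective. The sketch below is a minimal, hypothetical illustration in PyTorch; it does not reproduce the paper's UMCL module, MCAM, or the 3D DETR backbone, and the function name and tensor shapes are assumptions for demonstration only.

```python
# Minimal sketch: pull together embeddings of the same lesion seen in two ABUS
# views, push apart embeddings of different lesions (InfoNCE-style loss).
import torch
import torch.nn.functional as F

def multiview_contrastive_loss(z_view1: torch.Tensor,
                               z_view2: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """z_view1, z_view2: (N, D) embeddings of the same N lesions in two views."""
    z1 = F.normalize(z_view1, dim=1)
    z2 = F.normalize(z_view2, dim=1)
    logits = z1 @ z2.t() / temperature          # (N, N) cross-view similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetric cross-entropy: row i should match column i (same lesion).
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Random features standing in for per-view lesion embeddings.
loss = multiview_contrastive_loss(torch.randn(8, 128), torch.randn(8, 128))
```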
Affiliation(s)
- Xing Tao, Xin Yang, Ruobing Huang: National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China; Medical Ultrasound Image Computing (MUSIC) Lab, Shenzhen University, Shenzhen, China; Marshall Laboratory of Biomedical Engineering, Shenzhen University, Shenzhen, China
- Yan Cao: Shenzhen RayShape Medical Technology Co., Ltd, Shenzhen, China
- Yanhui Jiang, Xiaoxi Wu, Dan Yan, Wen Xue, Shulian Zhuang, Jianxing Zhang: Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China
- Dong Ni: National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China; Medical Ultrasound Image Computing (MUSIC) Lab, Shenzhen University, Shenzhen, China; Marshall Laboratory of Biomedical Engineering, Shenzhen University, Shenzhen, China; School of Biomedical Engineering and Informatics, Nanjing Medical University, Nanjing 211166, China
2. Jiang X, Chen C, Yao J, Wang L, Yang C, Li W, Ou D, Jin Z, Liu Y, Peng C, Wang Y, Xu D. A nomogram for diagnosis of BI-RADS 4 breast nodules based on three-dimensional volume ultrasound. BMC Med Imaging 2025;25:48. PMID: 39953395. PMCID: PMC11829536. DOI: 10.1186/s12880-025-01580-w
Abstract
OBJECTIVES Breast nodules classified as category 4 under the Breast Imaging Reporting and Data System (BI-RADS) show substantial variability in malignancy risk, posing challenges in clinical diagnosis. This study investigates whether a nomogram prediction model incorporating the automated breast ultrasound system (ABUS) can improve the accuracy of differentiating benign from malignant BI-RADS 4 breast nodules. METHODS In this retrospective study, data were collected for a total of 257 BI-RADS 4 breast nodules in patients who underwent ABUS examination and had pathology results between January 2019 and August 2022. The nodules were divided into a benign group (188 cases) and a malignant group (69 cases). Ultrasound imaging features were recorded, and logistic regression analysis was used to screen the clinical and ultrasound characteristics. A nomogram prediction model was then established from the results of these analyses. RESULTS Age, distance between nodule and nipple, calcification, and C-plane convergence sign were independent risk factors for differentiating benign from malignant breast nodules (all P < 0.05). A nomogram model was established based on these variables. The area under the curve (AUC) values for the nomogram model, age, distance between nodule and nipple, calcification, and C-plane convergence sign were 0.86, 0.735, 0.645, 0.697, and 0.685, respectively; the AUC of the model was thus significantly higher than that of any single variable. CONCLUSIONS A nomogram based on the clinical and ultrasound imaging features of ABUS can improve the accuracy of diagnosing benign and malignant BI-RADS 4 nodules. It can function as a relatively accurate predictive tool for sonographers and clinicians and is therefore clinically useful. ADVANCES IN KNOWLEDGE STATEMENT We retrospectively analyzed the clinical and ultrasound characteristics of ABUS BI-RADS 4 nodules and established a nomogram model to improve the efficiency of ABUS readers in the diagnosis of BI-RADS 4 nodules.
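For readers who want to reproduce the general modelling step (logistic regression on a handful of clinical and ultrasound predictors, summarized by an AUC), a minimal scikit-learn sketch is shown below. The data are synthetic placeholders, not the study's dataset; the predictor columns simply mirror the four variables reported above.

```python
# Minimal sketch: logistic-regression "nomogram" on four predictors with AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 257
X = np.column_stack([
    rng.normal(50, 12, n),      # age (years)
    rng.normal(30, 15, n),      # distance between nodule and nipple (mm)
    rng.integers(0, 2, n),      # calcification present (0/1)
    rng.integers(0, 2, n),      # C-plane convergence sign (0/1)
])
y = rng.integers(0, 2, n)       # 0 = benign, 1 = malignant (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC on held-out nodules: {auc:.2f}")
```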
Affiliation(s)
- Xianping Jiang: Department of Ultrasound, Shengzhou People's Hospital (Shengzhou Branch of the First Affiliated Hospital of Zhejiang University School of Medicine, the Shengzhou Hospital of Shaoxing University), Shengzhou, 312400, China
- Chen Chen, Jincao Yao, Liping Wang, Chen Yang, Wei Li, Di Ou, Yuanzhen Liu, Chanjuan Peng, Yifan Wang, Dong Xu: Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, No.1 East Banshan Road, Gongshu District, Hangzhou, Zhejiang, 310022, China; Center of Intelligent Diagnosis and Therapy (Taizhou), Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Taizhou, 317502, China; Wenling Institute of Big Data and Artificial Intelligence in Medicine, Taizhou, 317502, China
- Zhiyan Jin: Postgraduate training base Alliance of Wenzhou Medical University, Hangzhou, 310022, China
3. Bahl M, Chang JM, Mullen LA, Berg WA. Artificial Intelligence for Breast Ultrasound: AJR Expert Panel Narrative Review. AJR Am J Roentgenol 2024;223:e2330645. PMID: 38353449. DOI: 10.2214/ajr.23.30645
Abstract
Breast ultrasound is used in a wide variety of clinical scenarios, including both diagnostic and screening applications. Limitations of ultrasound, however, include its low specificity and, for automated breast ultrasound screening, the time necessary to review whole-breast ultrasound images. As of this writing, four AI tools that are approved or cleared by the FDA address these limitations. Current tools, which are intended to provide decision support for lesion classification and/or detection, have been shown to increase specificity among nonspecialists and to decrease interpretation times. Potential future applications include triage of patients with palpable masses in low-resource settings, preoperative prediction of axillary lymph node metastasis, and preoperative prediction of neoadjuvant chemotherapy response. Challenges in the development and clinical deployment of AI for ultrasound include the limited availability of curated training datasets compared with mammography, the high variability in ultrasound image acquisition due to equipment- and operator-related factors (which may limit algorithm generalizability), and the lack of postimplementation evaluation studies. Furthermore, current AI tools for lesion classification were developed based on 2D data, but diagnostic accuracy could potentially be improved if multimodal ultrasound data were used, such as color Doppler, elastography, cine clips, and 3D imaging.
Affiliation(s)
- Manisha Bahl: Department of Radiology, Massachusetts General Hospital, 55 Fruit St, WAC 240, Boston, MA 02114
- Jung Min Chang: Department of Radiology, Seoul National University Hospital, Seoul, Korea
- Lisa A Mullen: Department of Radiology and Radiological Science, Johns Hopkins Medicine, Baltimore, MD
- Wendie A Berg: Department of Radiology, University of Pittsburgh School of Medicine, Pittsburgh, PA
4. Bajaj S, Gandhi D, Nayar D. Potential Applications and Impact of ChatGPT in Radiology. Acad Radiol 2024;31:1256-1261. PMID: 37802673. DOI: 10.1016/j.acra.2023.09.013
Abstract
Radiology has always gone hand in hand with technology, and artificial intelligence (AI) is not new to the field. While various AI devices and algorithms have already been integrated into the daily clinical practice of radiology, with applications ranging from scheduling patient appointments to detecting and diagnosing certain clinical conditions on imaging, the use of natural language processing and large language model-based software has been under discussion for some time. Tools such as ChatGPT can help improve patient outcomes, increase the efficiency of radiology interpretation, and aid the overall workflow of radiologists; here we discuss some of their potential applications.
Affiliation(s)
- Suryansh Bajaj: Department of Radiology, University of Arkansas for Medical Sciences, Little Rock, Arkansas 72205
- Darshan Gandhi: Department of Diagnostic Radiology, University of Tennessee Health Science Center, Memphis, Tennessee 38103
- Divya Nayar: Department of Neurology, University of Arkansas for Medical Sciences, Little Rock, Arkansas 72205
5. Pawlak ME, Rudnicki W, Borkowska A, Skubisz K, Rydzyk R, Łuczyńska E. Comparative Analysis of Diagnostic Performance of Automatic Breast Ultrasound, Full-Field Digital Mammography and Contrast-Enhanced Mammography in Relation to Breast Composition. Biomedicines 2023;11:3226. PMID: 38137447. PMCID: PMC10741119. DOI: 10.3390/biomedicines11123226
Abstract
This single-center study presents a comparative analysis of the diagnostic performance of full-field digital mammography (FFDM), contrast-enhanced mammography (CEM), and automatic breast ultrasound (ABUS) in patients with American College of Radiology (ACR) breast composition categories C and D, as well as A and B, on FFDM. The study involved 297 patients who underwent ABUS and FFDM. Breast types C and D were determined in 40% of patients with FFDM and low-energy CEM. CEM was performed on 76 patients. Focal lesions were found in 131 patients, of which 115 were histopathologically verified. In patients with multiple lesions, the number of lesions detected was 40 of 48 with ABUS, 13 with FFDM, and 21 with CEM. Agreement in determining the number of foci was 82% for FFDM and 91% for both CEM and ABUS. In breast types C and D, 72% of all lesions were found with ABUS, 56% with CEM, and 29% with FFDM (p = 0.008, p = 0.000); all invasive cancers were diagnosed with ABUS, 83% with CEM, and 59% with FFDM (p = 0.000, p = 0.023); 100% of DCIS were diagnosed with ABUS, 93% with CEM, and 59% with FFDM. The size of lesions on histopathology was 14-26 mm in breast ACR categories A and B and 11-37 mm in categories C and D. In breast categories C and D, the sensitivity of ABUS, FFDM, and CEM was, respectively, 78.05, 85.37, and 92.68; specificity: 40, 13.33, 8.33; PPV (positive predictive value): 78.05, 72.92, 77.55; NPV (negative predictive value): 40, 25, 25; accuracy: 67.86, 66.07, 73.58. In breast categories A and B, the sensitivity of ABUS, FFDM, and CEM was, respectively, 81.25, 93.75, and 93.48; specificity: 18.18, 18.18, 16.67; PPV: 81.25, 83.33, 89.58; NPV: 18.18, 40, 25; accuracy: 69.49, 79.66, 84.62. The sensitivity of the combination of FFDM and ABUS was 100 for all breast composition categories; the accuracy was 75 in breast types C and D and 81.36 in breast types A and B. The study confirms the predominance of the C and D breast anatomy types and the low diagnostic performance of FFDM within that group, and indicates ABUS and CEM as potential additive methods in breast cancer diagnostics.
Affiliation(s)
- Marta Ewa Pawlak: Diagnostic Imaging Department, University Hospital in Cracow, 30-688 Cracow, Poland
- Wojciech Rudnicki, Anna Borkowska, Karolina Skubisz, Elżbieta Łuczyńska: Department of Electroradiology, Jagiellonian University Medical College, 30-688 Cracow, Poland
- Rafał Rydzyk: Diagnostic Imaging Department, 5th Military Clinical Hospital in Krakow, 30-901 Cracow, Poland
6. Retson TA, Eghtedari M. Expanding Horizons: The Realities of CAD, the Promise of Artificial Intelligence, and Machine Learning's Role in Breast Imaging beyond Screening Mammography. Diagnostics (Basel) 2023;13:2133. PMID: 37443526. PMCID: PMC10341264. DOI: 10.3390/diagnostics13132133
Abstract
Artificial intelligence (AI) applications in mammography have gained significant popular attention; however, AI has the potential to revolutionize other aspects of breast imaging beyond simple lesion detection. AI has the potential to enhance risk assessment by combining conventional factors with imaging and improve lesion detection through a comparison with prior studies and considerations of symmetry. It also holds promise in ultrasound analysis and automated whole breast ultrasound, areas marked by unique challenges. AI's potential utility also extends to administrative tasks such as MQSA compliance, scheduling, and protocoling, which can reduce the radiologists' workload. However, adoption in breast imaging faces limitations in terms of data quality and standardization, generalizability, benchmarking performance, and integration into clinical workflows. Developing methods for radiologists to interpret AI decisions, and understanding patient perspectives to build trust in AI results, will be key future endeavors, with the ultimate aim of fostering more efficient radiology practices and better patient care.
Affiliation(s)
- Tara A. Retson: Department of Radiology, University of California, San Diego, CA 92093, USA
7. An Efficient USE-Net Deep Learning Model for Cancer Detection. Int J Intell Syst 2023. DOI: 10.1155/2023/8509433
Abstract
Breast cancer (BrCa) is the most common cancer in women worldwide. Classifying BrCa images is extremely important for finding BrCa at an earlier stage and monitoring BrCa during treatment. Computer-aided detection methods have been used to interpret BrCa and improve its detection during the screening and treatment stages. However, existing models may fail to classify newly generated BrCa images correctly. The main objective of this research is therefore to classify newly generated BrCa images correctly. The model performs preprocessing, segmentation, feature extraction, and classification. In preprocessing, hybrid median filtering (HMF) is used to eliminate noise in the images, and the contrast of the images is enhanced using quadrant dynamic histogram equalization (QDHE). ROI segmentation is then performed using the USE-Net deep learning model. The CaffeNet model is used for feature extraction on the segmented images, and finally, classification is performed using an improved random forest (IRF) with extreme gradient boosting (XGB). The model obtained 97.87% accuracy, 98.45% sensitivity, 95.24% specificity, 98.96% precision, and a 98.70% F1-score for ultrasound images, and 98.31% accuracy, 99.29% sensitivity, 90.20% specificity, 98.82% precision, and a 99.05% F1-score for mammogram images.
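As a rough illustration of the preprocessing stage described above, the sketch below applies a plain median filter and CLAHE as simple stand-ins for the paper's hybrid median filtering (HMF) and quadrant dynamic histogram equalization (QDHE). It is not the authors' implementation; the function name and parameters are assumptions.

```python
# Minimal sketch: denoise a grayscale ultrasound image, then enhance contrast.
import numpy as np
from scipy.ndimage import median_filter
from skimage import exposure

def preprocess(us_image: np.ndarray) -> np.ndarray:
    """us_image: 2D grayscale ultrasound image scaled to [0, 1]."""
    denoised = median_filter(us_image, size=3)                        # speckle suppression
    enhanced = exposure.equalize_adapthist(denoised, clip_limit=0.02) # adaptive contrast
    return enhanced

# Example on a synthetic noisy image.
img = np.clip(np.random.rand(256, 256), 0, 1)
out = preprocess(img)
```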
8. Villa-Camacho JC, Baikpour M, Chou SHS. Artificial Intelligence for Breast US. J Breast Imaging 2023;5:11-20. PMID: 38416959. DOI: 10.1093/jbi/wbac077
Abstract
US is a widely available, commonly used, and indispensable imaging modality for breast evaluation. It is often the primary imaging modality for the detection and diagnosis of breast cancer in low-resource settings. In addition, it is frequently employed as a supplemental screening tool via either whole breast handheld US or automated breast US among women with dense breasts. In recent years, a variety of artificial intelligence systems have been developed to assist radiologists with the detection and diagnosis of breast lesions on US. This article reviews the background and evidence supporting the use of artificial intelligence tools for breast US, describes implementation strategies and impact on clinical workflow, and discusses potential emerging roles and future directions.
Affiliation(s)
- Masoud Baikpour, Shinn-Huey S Chou: Massachusetts General Hospital, Department of Radiology, Boston, MA, USA
9. Hejduk P, Marcon M, Unkelbach J, Ciritsis A, Rossi C, Borkowski K, Boss A. Fully automatic classification of automated breast ultrasound (ABUS) imaging according to BI-RADS using a deep convolutional neural network. Eur Radiol 2022;32:4868-4878. PMID: 35147776. PMCID: PMC9213284. DOI: 10.1007/s00330-022-08558-0
Abstract
PURPOSE The aim of this study was to develop and test a post-processing technique for the detection and classification of lesions according to the BI-RADS atlas in automated breast ultrasound (ABUS) based on deep convolutional neural networks (dCNNs). METHODS AND MATERIALS In this retrospective study, 645 ABUS datasets from 113 patients were included; 55 patients had lesions classified as having a high probability of malignancy. Lesions were categorized as BI-RADS 2 (no suspicion of malignancy), BI-RADS 3 (probability of malignancy < 3%), or BI-RADS 4/5 (probability of malignancy > 3%). A deep convolutional neural network was trained after data augmentation with images of lesions and normal breast tissue, and a sliding-window approach for lesion detection was implemented. The algorithm was applied to a test dataset containing 128 images, and performance was compared with the readings of 2 experienced radiologists. RESULTS Calculations performed on single images showed an accuracy of 79.7% and an AUC of 0.91 [95% CI: 0.85-0.96] for categorization according to BI-RADS. Moderate agreement between the dCNN and the ground truth was achieved (κ: 0.57 [95% CI: 0.50-0.64]), which is comparable with human readers. Analysis of the whole dataset improved the categorization accuracy to 90.9% with an AUC of 0.91 [95% CI: 0.77-1.00], while achieving almost perfect agreement with the ground truth (κ: 0.82 [95% CI: 0.69-0.95]), performing on par with human readers. Furthermore, the object localization technique allowed detection of the lesion position slice-wise. CONCLUSIONS Our results show that a dCNN can be trained to detect and distinguish lesions in ABUS according to the BI-RADS classification with accuracy similar to that of experienced radiologists. KEY POINTS • A deep convolutional neural network (dCNN) was trained for classification of ABUS lesions according to the BI-RADS atlas. • A sliding-window approach allows accurate automatic detection and classification of lesions in ABUS examinations.
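The sliding-window detection step can be pictured with the minimal sketch below, in which a patch classifier is slid over a 2D ABUS slice to produce a coarse probability map. The patch size, stride, and placeholder classifier are assumptions for illustration, not the study's settings.

```python
# Minimal sketch: slide a patch classifier over a 2D slice to build a score map.
import numpy as np

def sliding_window_scores(slice2d, classify_patch, patch=64, stride=32):
    """Return a coarse lesion-probability map for one 2D ABUS slice."""
    h, w = slice2d.shape
    row_starts = list(range(0, h - patch + 1, stride))
    col_starts = list(range(0, w - patch + 1, stride))
    scores = np.zeros((len(row_starts), len(col_starts)))
    for i, r in enumerate(row_starts):
        for j, c in enumerate(col_starts):
            scores[i, j] = classify_patch(slice2d[r:r + patch, c:c + patch])
    return scores

# Placeholder classifier: mean intensity stands in for a trained dCNN.
dummy_classifier = lambda p: float(p.mean())
prob_map = sliding_window_scores(np.random.rand(300, 400), dummy_classifier)
```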
Affiliation(s)
- Patryk Hejduk, Magda Marcon, Alexander Ciritsis, Cristina Rossi, Karol Borkowski, Andreas Boss: Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Jan Unkelbach: Department of Radiation Oncology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
10. Tan T, Das B, Soni R, Fejes M, Yang H, Ranjan S, Szabo DA, Melapudi V, Shriram KS, Agrawal U, Rusko L, Herczeg Z, Darazs B, Tegzes P, Ferenczi L, Mullick R, Avinash G. Multi-modal trained artificial intelligence solution to triage chest X-ray for COVID-19 using pristine ground-truth, versus radiologists. Neurocomputing 2022;485:36-46. PMID: 35185296. PMCID: PMC8847079. DOI: 10.1016/j.neucom.2022.02.040
Abstract
The front-line imaging modalities computed tomography (CT) and X-ray play important roles in triaging COVID patients. Thoracic CT is accepted to have higher sensitivity than chest X-ray for COVID diagnosis. However, considering the limited access to resources (both hardware and trained personnel) and issues related to decontamination, CT may not be ideal for triaging suspected subjects. An artificial intelligence (AI)-assisted, X-ray-based application for triaging and monitoring that can identify COVID patients in a timely manner, with the additional ability to delineate and quantify the disease region, is seen as a promising solution for widespread clinical use. Our proposed solution differs from existing solutions presented by industry and academic communities. We demonstrate a functional AI model that triages by classifying and segmenting a single chest X-ray image, while the AI model is trained using both X-ray and CT data. We report on how such a multi-modal training process improves the solution compared to single-modality (X-ray only) training. The multi-modal solution increases the AUC (area under the receiver operating characteristic curve) from 0.89 to 0.93 for binary classification between COVID-19 and non-COVID-19 cases. It also improves the Dice coefficient (0.59 to 0.62) for localizing the COVID-19 pathology. To compare the performance of experienced readers to the AI model, a reader study was also conducted. The AI model showed good consistency with respect to radiologists. The Dice score between two radiologists on the COVID group was 0.53, while the AI had Dice values of 0.52 and 0.55 when compared to the segmentations of the two radiologists separately. From a classification perspective, the AUCs of the two readers were 0.87 and 0.81, while the AUC of the AI was 0.93 on the reader-study dataset. We also conducted a generalization study by comparing our method to state-of-the-art methods on independent datasets; the results show better performance from the proposed method. Leveraging multi-modal information during development benefits single-modality inference.
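The Dice values quoted above compare binary segmentation masks; for reference, a minimal NumPy implementation of the Dice coefficient is shown below on toy masks, not the study's data.

```python
# Minimal sketch: Dice coefficient between two binary masks.
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray, eps: float = 1e-7) -> float:
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return float((2.0 * intersection + eps) / (a.sum() + b.sum() + eps))

# Two overlapping toy masks.
m1 = np.zeros((10, 10), bool); m1[2:7, 2:7] = True
m2 = np.zeros((10, 10), bool); m2[4:9, 4:9] = True
print(dice_coefficient(m1, m2))  # = 0.36 for these toy masks
```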
Affiliation(s)
- Tao Tan: GE Healthcare, The Netherlands
11. Wang Q, Chen H, Luo G, Li B, Shang H, Shao H, Sun S, Wang Z, Wang K, Cheng W. Performance of novel deep learning network with the incorporation of the automatic segmentation network for diagnosis of breast cancer in automated breast ultrasound. Eur Radiol 2022;32:7163-7172. PMID: 35488916. DOI: 10.1007/s00330-022-08836-x
Abstract
OBJECTIVE To develop novel deep learning networks (DLNs) incorporating an automatic segmentation network (ASN) for morphological analysis, and to determine their performance for the diagnosis of breast cancer in automated breast ultrasound (ABUS). METHODS A total of 769 breast tumors were enrolled in this study and randomly divided into a training set and a test set at 600 vs. 169. The novel DLNs (ResNet34 v2, ResNet50 v2, ResNet101 v2) added a new ASN to the traditional ResNet networks and extracted morphological information of breast tumors. The accuracy, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), area under the receiver operating characteristic (ROC) curve (AUC), and average precision (AP) were calculated. The diagnostic performance of the novel DLNs was compared with that of two radiologists with different levels of experience. RESULTS The ResNet34 v2 model had higher specificity (76.81%) and PPV (82.22%) than the other two, the ResNet50 v2 model had higher accuracy (78.11%) and NPV (72.86%), and the ResNet101 v2 model had higher sensitivity (85.00%). According to the AUCs and APs, the novel ResNet101 v2 model produced the best result (AUC 0.85 and AP 0.90) compared with the remaining five DLNs. Compared with the novice radiologist, the novel DLNs performed better: the F1 score was increased from 0.77 to 0.78, 0.81, and 0.82 by the three novel DLNs. However, their diagnostic performance was worse than that of the experienced radiologist. CONCLUSIONS The novel DLNs performed better than traditional DLNs and may help novice radiologists improve their diagnostic performance for breast cancer in ABUS. KEY POINTS • A novel automatic segmentation network to extract morphological information was successfully developed and implemented with ResNet deep learning networks. • The novel deep learning networks in our research performed better than the traditional deep learning networks in the diagnosis of breast cancer using ABUS images. • The novel deep learning networks in our research may be useful for novice radiologists to improve their diagnostic performance.
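One simple way to picture how a segmentation network's output can feed morphological information into a ResNet classifier is to append the predicted lesion mask as an extra input channel, as sketched below. This is a hypothetical illustration, not the paper's exact architecture; the class name and channel layout are assumptions.

```python
# Minimal sketch: ResNet50 classifier that takes image (3 ch) + lesion mask (1 ch).
import torch
import torch.nn as nn
from torchvision.models import resnet50

class MaskAugmentedResNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.backbone = resnet50(weights=None)
        # Accept 4 channels: grayscale ABUS image replicated to RGB + lesion mask.
        self.backbone.conv1 = nn.Conv2d(4, 64, kernel_size=7, stride=2,
                                        padding=3, bias=False)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)

    def forward(self, image_rgb: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        x = torch.cat([image_rgb, mask], dim=1)   # (N, 4, H, W)
        return self.backbone(x)

model = MaskAugmentedResNet()
logits = model(torch.randn(2, 3, 224, 224), torch.rand(2, 1, 224, 224))
```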
Affiliation(s)
- Qiucheng Wang, He Chen, Bo Li, Haitao Shang, Hua Shao, Wen Cheng: Department of Ultrasound, Harbin Medical University Cancer Hospital, No. 150, Haping Road, Nangang District, Harbin, Heilongjiang Province, China
- Gongning Luo, Zhongshuai Wang, Kuanquan Wang: School of Computer Science and Technology, Harbin Institute of Technology, No. 92, Xidazhi Street, Nangang District, Harbin, Heilongjiang Province, China
- Shanshan Sun: Department of Breast Surgery, Harbin Medical University Cancer Hospital, No. 150, Haping Road, Nangang District, Harbin, Heilongjiang Province, China
12. Lee J, Kang BJ, Kim SH, Park GE. Evaluation of Computer-Aided Detection (CAD) in Screening Automated Breast Ultrasound Based on Characteristics of CAD Marks and False-Positive Marks. Diagnostics (Basel) 2022;12:583. PMID: 35328136. PMCID: PMC8947351. DOI: 10.3390/diagnostics12030583
Abstract
The present study evaluated the effectiveness of a computer-aided detection (CAD) system in screening automated breast ultrasound (ABUS) and analyzed the characteristics of CAD marks and the causes of false-positive marks. A total of 846 women who underwent ABUS for screening from January 2017 to December 2017 were included. A commercial CAD system was used in all ABUS examinations, and its diagnostic performance and efficacy in shortening the reading time (RT) were evaluated. In addition, we analyzed the characteristics of CAD marks and the causes of false-positive marks. A total of 1032 CAD marks were displayed on a per-patient basis and 534 CAD marks on a per-lesion basis. Five cases of breast cancer were diagnosed. The sensitivity, specificity, PPV, and NPV of CAD were 60.0%, 59.0%, 0.9%, and 99.6% for the 846 patients. In the case of a negative study, it was less time-consuming and easier to make a decision. Among the 530 false-positive marks, 459 were clearly identified as pseudo-lesions; the most common cause was marginal shadowing, followed by Cooper's ligament shadowing, peri-areolar shadowing, rib, and skin lesions. Even though CAD did not improve the performance of ABUS and a large number of false-positive marks were detected, the addition of CAD reduces RT, especially in the case of negative screening ultrasound.
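The per-patient figures above follow directly from a 2x2 confusion matrix. The sketch below shows the arithmetic; the counts are a back-calculation consistent with the reported metrics (5 cancers among 846 women), not the study's published table.

```python
# Minimal sketch: screening metrics from a 2x2 confusion matrix.
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Example: 3 of 5 cancers marked, with many false-positive CAD marks.
print(screening_metrics(tp=3, fp=345, fn=2, tn=496))
```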
13. Kim YS, Lee SE, Chang JM, Kim SY, Bae YK. Ultrasonographic morphological characteristics determined using a deep learning-based computer-aided diagnostic system of breast cancer. Medicine (Baltimore) 2022;101:e28621. PMID: 35060538. PMCID: PMC8772632. DOI: 10.1097/md.0000000000028621
Abstract
To investigate the correlations between ultrasonographic morphological characteristics quantitatively assessed using a deep learning-based computer-aided diagnostic system (DL-CAD) and the histopathologic features of breast cancer. This retrospective study included 282 women with invasive breast cancer (<5 cm; mean age, 54.4 [range, 29-85] years) who underwent surgery between February 2016 and April 2017. The morphological characteristics of breast cancer on B-mode ultrasonography were analyzed using DL-CAD, and quantitative scores (0-1) were obtained. Associations between quantitative scores and tumor histologic type, grade, size, subtype, and lymph node status were compared. Two hundred and thirty-six (83.7%) tumors were invasive ductal carcinoma, 18 (6.4%) invasive lobular carcinoma, and 28 (9.9%) micropapillary, apocrine, or mucinous. The mean size was 1.8 ± 1.0 (standard deviation) cm, and 108 (38.3%) cases were node positive. The irregular shape score was associated with tumor size (P < .001), lymph node status (P = .001), and estrogen receptor status (P = .016). Not-circumscribed margin (P < .001) and hypoechogenicity (P = .003) scores correlated with tumor size, and the non-parallel orientation score correlated with histologic grade (P = .024). Luminal A tumors exhibited more irregular features (P = .048) with non-parallel orientation (P = .002), whereas triple-negative breast cancers showed a rounder/more oval shape and parallel orientation. Quantitative morphological characteristics of breast cancers determined using DL-CAD correlated with histopathologic features and could provide useful information about breast cancer phenotypes.
Affiliation(s)
- Young Seon Kim, Seung Eun Lee: Department of Radiology, Yeungnam University Hospital, Yeungnam University College of Medicine, Daegu, South Korea
- Jung Min Chang, Soo-Yeon Kim: Department of Radiology, Seoul National University Hospital, Seoul, South Korea
- Young Kyung Bae: Department of Pathology, Yeungnam University Hospital, Yeungnam University College of Medicine, Daegu, South Korea
14. Wang K, Liang S, Zhong S, Feng Q, Ning Z, Zhang Y. Breast ultrasound image segmentation: A coarse-to-fine fusion convolutional neural network. Med Phys 2021;48:4262-4278. PMID: 34053092. DOI: 10.1002/mp.15006
Abstract
PURPOSE Breast ultrasound (BUS) image segmentation plays a crucial role in computer-aided diagnosis systems for BUS examination, which are useful for improving the accuracy of breast cancer diagnosis. However, accurate segmentation remains a challenging task owing to poor image quality and large variations in the sizes, shapes, and locations of breast lesions. In this paper, we propose a new convolutional neural network with coarse-to-fine feature fusion to address the aforementioned challenges. METHODS The proposed fusion network consists of an encoder path, a decoder path, and a core fusion stream path (FSP). The encoder path is used to capture context information, and the decoder path is used for localization prediction. The FSP is designed to generate beneficial aggregate feature representations (i.e., various-sized lesion features, aggregated coarse-to-fine information, and high-resolution edge characteristics) from the encoder and decoder paths, which are eventually used for accurate breast lesion segmentation. To better retain boundary information and alleviate the effect of image noise, we input the superpixel image along with the original image to the fusion network. Furthermore, a weighted-balanced loss function was designed to address the problem of lesion regions having different sizes. We then conducted exhaustive experiments on three public BUS datasets to evaluate the proposed network. RESULTS The proposed method outperformed state-of-the-art (SOTA) segmentation methods on the three public BUS datasets, with average Dice similarity coefficients of 84.71 (±1.07), 83.76 (±0.83), and 86.52 (±1.52); average intersection-over-union values of 76.34 (±1.50), 75.70 (±0.98), and 77.86 (±2.07); average sensitivities of 86.66 (±1.82), 85.21 (±1.98), and 87.21 (±2.51); average specificities of 97.92 (±0.46), 98.57 (±0.19), and 99.42 (±0.21); and average accuracies of 95.89 (±0.57), 97.17 (±0.3), and 98.51 (±0.3). CONCLUSIONS The proposed fusion network can effectively segment lesions from BUS images, presenting a new feature fusion strategy to handle the challenging task of segmentation while outperforming SOTA segmentation methods. The code is publicly available at https://github.com/mniwk/CF2-NET.
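A size-balanced segmentation loss in the spirit of the weighted-balanced loss mentioned above can be sketched as a foreground-reweighted BCE combined with a soft Dice term, as below. This is an assumed formulation for illustration; the paper's exact weighting scheme is not reproduced.

```python
# Minimal sketch: up-weight small lesion foregrounds in BCE and add a Dice term.
import torch
import torch.nn.functional as F

def weighted_bce_dice_loss(logits: torch.Tensor, target: torch.Tensor,
                           eps: float = 1e-6) -> torch.Tensor:
    """logits, target: (N, 1, H, W); target is a binary lesion mask."""
    prob = torch.sigmoid(logits)
    fg_fraction = target.mean().clamp(min=eps)
    pos_weight = (1.0 - fg_fraction) / fg_fraction      # up-weight small lesions
    bce = F.binary_cross_entropy_with_logits(logits, target,
                                             pos_weight=pos_weight)
    dice = (2 * (prob * target).sum() + eps) / (prob.sum() + target.sum() + eps)
    return bce + (1.0 - dice)

loss = weighted_bce_dice_loss(torch.randn(2, 1, 64, 64),
                              (torch.rand(2, 1, 64, 64) > 0.9).float())
```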
Affiliation(s)
- Ke Wang, Shujun Liang, Shengzhou Zhong, Qianjin Feng, Zhenyuan Ning, Yu Zhang: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong, 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong, 510515, China
15. Goyal S. An Overview of Current Trends, Techniques, Prospects, and Pitfalls of Artificial Intelligence in Breast Imaging. Rep Med Imaging 2021. DOI: 10.2147/rmi.s295205
16. Spear GG, Mendelson EB. Automated breast ultrasound: Supplemental screening for average-risk women with dense breasts. Clin Imaging 2020;76:15-25. PMID: 33548888. DOI: 10.1016/j.clinimag.2020.12.007
Abstract
OBJECTIVE We review ultrasound (US) options for supplemental breast cancer screening of average risk women with dense breasts. CONCLUSION Performance data of physician-performed handheld US (HHUS), technologist-performed HHUS, and automated breast ultrasound (AUS) indicate that all are appropriate for adjunctive screening. Volumetric 3D acquisitions, reduced operator dependence, protocol standardization, reliable comparison with previous studies, independence of performance and interpretation, and whole breast depiction on coronal view may favor selection of AUS. Important considerations are workflow adjustments for physicians and staff.
Affiliation(s)
- Georgia Giakoumis Spear: NorthShore University HealthSystem, The University of Chicago Pritzker School of Medicine, United States of America
- Ellen B Mendelson: Feinberg School of Medicine, Northwestern University, Chicago, IL, United States of America
17. Chiu LY, Kuo WH, Chen CN, Chang KJ, Chen A. A 2-Phase Merge Filter Approach to Computer-Aided Detection of Breast Tumors on 3-Dimensional Ultrasound Imaging. J Ultrasound Med 2020;39:2439-2455. PMID: 32567133. DOI: 10.1002/jum.15365
Abstract
OBJECTIVES The role of image analysis in 3-dimensional (3D) automated breast ultrasound (ABUS) images is increasingly important because of the widespread use of ABUS as a screening tool in whole-breast examinations. However, reviewing the large number of images acquired from ABUS is time-consuming and sometimes error-prone. The aim of this study, therefore, was to develop an efficient computer-aided detection (CADe) algorithm to assist the review process. METHODS The proposed CADe algorithm consisted of 4 major steps. First, initial tumor candidates were formed by extracting and merging hypoechoic square cells on 2-dimensional (2D) transverse images. Second, a feature-based classifier was constructed using 2D features to filter out nontumor candidates. Third, the remaining 2D candidates were merged longitudinally into 3D masses. Finally, a 3D feature-based classifier was used to further filter out nontumor masses and obtain the final detected masses. The proposed method was validated with 176 passes of breast images acquired by an Acuson S2000 automated breast volume scanner (Siemens Medical Solutions USA, Inc., Malvern, PA), including 44 normal passes and 132 abnormal passes containing 162 proven lesions (79 benign and 83 malignant). RESULTS The proposed CADe system achieved overall sensitivities of 100% and 90% with 6.71 and 5.14 false-positives (FPs) per pass, respectively. Our results also showed that the average number of FPs per normal pass (7.16) was higher than the number of FPs per abnormal pass (6.56) at 100% sensitivity. CONCLUSIONS The proposed CADe system has great potential to become a good companion tool for ABUS imaging by ensuring high sensitivity with a relatively small number of FPs.
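The longitudinal merging step (combining per-slice 2D candidates into 3D masses) can be approximated with 3D connected-component labelling, as in the sketch below. This stands in for the paper's merge filter and omits the 2D and 3D feature-based classifiers; function and variable names are illustrative.

```python
# Minimal sketch: stack per-slice candidate masks and group them into 3D masses.
import numpy as np
from scipy import ndimage

def merge_candidates_to_masses(candidate_masks_2d):
    """candidate_masks_2d: list of binary 2D masks, one per transverse slice."""
    volume = np.stack(candidate_masks_2d, axis=0).astype(bool)
    structure = np.ones((3, 3, 3), dtype=bool)            # 26-connectivity
    labels, n_masses = ndimage.label(volume, structure=structure)
    return labels, n_masses

# Toy example: a blob spanning three consecutive slices forms a single 3D mass.
slices = [np.zeros((32, 32), bool) for _ in range(5)]
for k in (1, 2, 3):
    slices[k][10:15, 10:15] = True
labels, n = merge_candidates_to_masses(slices)
print(n)  # 1
```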
Affiliation(s)
- Ling-Ying Chiu: Institute of Industrial Engineering, National Taiwan University, Taipei, Taiwan
- Wen-Hung Kuo, Chiung-Nien Chen, King-Jen Chang: Department of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan
- Argon Chen: Institute of Industrial Engineering, National Taiwan University, Taipei, Taiwan; Department of Mechanical Engineering, National Taiwan University, Taipei, Taiwan
18.
19. Wang F, Liu X, Yuan N, Qian B, Ruan L, Yin C, Jin C. Study on automatic detection and classification of breast nodule using deep convolutional neural network system. J Thorac Dis 2020;12:4690-4701. PMID: 33145042. PMCID: PMC7578508. DOI: 10.21037/jtd-19-3013
Abstract
Background Conventional manual ultrasound scanning and human diagnosis of the breast are considered operator-dependent, relatively slow, and error-prone. In this study, we used an Automated Breast Ultrasound (ABUS) machine for scanning and deep convolutional neural network (CNN) technology, a kind of Deep Learning (DL) algorithm, for the detection and classification of breast nodules, aiming to achieve automatic and accurate diagnosis of breast nodules. Methods Two hundred and ninety-three lesions from 194 patients with definite pathological diagnoses (117 benign and 176 malignant) were recruited as the case group. Another 70 patients without breast diseases were enrolled as the control group. All breast scans were carried out with the ABUS machine and then randomly divided into a training set, a validation set, and a test set in a 7:1:2 ratio. In the training set, we constructed a detection model with a three-dimensional U-shaped convolutional neural network (3D U-Net) architecture to segment the nodules from the background breast images. Residual blocks, attention connections, and hard mining were used to optimize the model, while random cropping, flipping, and rotation were used for data augmentation. In the test phase, the current model was compared with those in previously reported studies; in the validation set, the effectiveness of the detection model was evaluated. In the classification phase, multiple convolutional layers and fully connected layers were used to set up a classification model aiming to identify whether a nodule was malignant. Results Our detection model yielded a sensitivity of 91% with 1.92 false positives per automatically scanned image. The classification model achieved a sensitivity of 87.0%, a specificity of 88.0%, and an accuracy of 87.5%. Conclusions A deep CNN combined with ABUS may be a promising tool for easy detection and accurate diagnosis of breast nodules.
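The random flipping/rotation augmentation mentioned in the training strategy can be illustrated with the minimal NumPy sketch below, applied jointly to a 3D ABUS volume and its nodule mask; cropping and the 3D U-Net itself are omitted, and the shapes are placeholders.

```python
# Minimal sketch: random flips and in-plane 90-degree rotations of a 3D volume + mask.
import numpy as np

def augment_volume(volume: np.ndarray, mask: np.ndarray, rng=np.random):
    """volume, mask: 3D arrays with identical shape (depth, height, width)."""
    for axis in (0, 1, 2):                       # random flips along each axis
        if rng.rand() < 0.5:
            volume = np.flip(volume, axis)
            mask = np.flip(mask, axis)
    k = rng.randint(0, 4)                        # random 90-degree in-plane rotation
    volume = np.rot90(volume, k, axes=(1, 2))
    mask = np.rot90(mask, k, axes=(1, 2))
    return volume.copy(), mask.copy()

vol, msk = augment_volume(np.random.rand(32, 64, 64),
                          (np.random.rand(32, 64, 64) > 0.95).astype(np.float32))
```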
Affiliation(s)
- Feiqian Wang, Na Yuan, Litao Ruan: Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Xiaotong Liu, Buyue Qian, Changchang Yin, Ciping Jin: National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
20. Kim SH, Kim HH, Moon WK. Automated Breast Ultrasound Screening for Dense Breasts. Korean J Radiol 2020;21:15-24. PMID: 31920025. PMCID: PMC6960307. DOI: 10.3348/kjr.2019.0176
Abstract
Mammography is the primary screening method for breast cancers. However, the sensitivity of mammographic screening is lower for dense breasts, which are an independent risk factor for breast cancers. Automated breast ultrasound (ABUS) is used as an adjunct to mammography for screening breast cancers in asymptomatic women with dense breasts. It is an effective screening modality with diagnostic accuracy comparable to that of handheld ultrasound (HHUS). Radiologists should be familiar with the unique display mode, imaging features, and artifacts in ABUS, which differ from those in HHUS. The purpose of this study was to provide a comprehensive review of the clinical significance of dense breasts and ABUS screening, describe the unique features of ABUS, and introduce the method of use and interpretation of ABUS.
Affiliation(s)
- Sung Hun Kim: Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Hak Hee Kim: Department of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea
- Woo Kyung Moon: Department of Radiology, Seoul National University Hospital, Seoul National University College of Medicine, Seoul, Korea
21. Wang Y, Choi EJ, Choi Y, Zhang H, Jin GY, Ko SB. Breast Cancer Classification in Automated Breast Ultrasound Using Multiview Convolutional Neural Network with Transfer Learning. Ultrasound Med Biol 2020;46:1119-1132. PMID: 32059918. DOI: 10.1016/j.ultrasmedbio.2020.01.001
Abstract
To assist radiologists in breast cancer classification in automated breast ultrasound (ABUS) imaging, we propose a computer-aided diagnosis based on a convolutional neural network (CNN) that classifies breast lesions as benign or malignant. The proposed CNN adopts a modified Inception-v3 architecture to provide efficient feature extraction in ABUS imaging. Because ABUS images can be visualized in transverse and coronal views, the proposed CNN provides an efficient way to extract multiview features from both views. The proposed CNN was trained and evaluated on 316 breast lesions (135 malignant and 181 benign). An observer performance test was conducted to compare five human reviewers' diagnostic performance before and after referring to the predictions of the proposed CNN. Our method achieved an area under the curve (AUC) value of 0.9468 with five-fold cross-validation, for which the sensitivity and specificity were 0.886 and 0.876, respectively. Compared with conventional machine learning-based feature extraction schemes, particularly principal component analysis (PCA) and histogram of oriented gradients (HOG), our method achieved a significant improvement in classification performance, with a >10% increase in AUC value. During the observer performance test, the diagnostic results of all human reviewers had increased AUC values and sensitivities after referring to the classification results of the proposed CNN, and four of the five human reviewers' AUCs were significantly improved. The proposed CNN employing a multiview strategy shows promise for the diagnosis of breast cancer and could be used as a second reviewer to increase diagnostic reliability.
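A minimal transfer-learning sketch in the spirit of the modified Inception-v3 described above is shown below: a pretrained torchvision backbone is reused and only a new two-class head is trained. The multiview (transverse plus coronal) feature fusion of the paper is not reproduced, and freezing the backbone is an assumption made for illustration.

```python
# Minimal sketch: Inception-v3 transfer learning for benign vs. malignant lesions.
import torch
import torch.nn as nn
from torchvision.models import inception_v3, Inception_V3_Weights

model = inception_v3(weights=Inception_V3_Weights.DEFAULT, aux_logits=True)
for p in model.parameters():          # freeze pretrained feature extractor
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)                    # new 2-class head
model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, 2)

model.eval()
with torch.no_grad():                 # Inception-v3 expects 299x299 inputs
    logits = model(torch.randn(1, 3, 299, 299))
```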
Collapse
Affiliation(s)
- Yi Wang
- Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, Canada
| | - Eun Jung Choi
- Department of Radiology, Research Institute of Clinical Medicine of Jeonbuk National University-Biomedical Research Institute of Jeonbuk National University Hospital, Jeonbuk National University Medical School, Jeonju City, Jeollabuk-Do, South Korea
| | - Younhee Choi
- Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, Canada
| | - Hao Zhang
- Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, Canada
| | - Gong Yong Jin
- Department of Radiology, Research Institute of Clinical Medicine of Jeonbuk National University-Biomedical Research Institute of Jeonbuk National University Hospital, Jeonbuk National University Medical School, Jeonju City, Jeollabuk-Do, South Korea
| | - Seok-Bum Ko
- Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, Canada.
| |
Collapse
|
22
|
Should We Ignore, Follow, or Biopsy? Impact of Artificial Intelligence Decision Support on Breast Ultrasound Lesion Assessment. AJR Am J Roentgenol 2020; 214:1445-1452. [PMID: 32319794 DOI: 10.2214/ajr.19.21872] [Citation(s) in RCA: 58] [Impact Index Per Article: 11.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
OBJECTIVE. The objective of this study was to assess the impact of artificial intelligence (AI)-based decision support (DS) on breast ultrasound (US) lesion assessment. MATERIALS AND METHODS. A multicenter retrospective review was performed of 900 breast lesions (470/900 [52.2%] benign; 430/900 [47.8%] malignant) assessed on US by 15 physicians (11 radiologists, two surgeons, two obstetrician/gynecologists). An AI system (Koios DS for Breast, Koios Medical) evaluated images and assigned them to one of four categories: benign, probably benign, suspicious, and probably malignant. Each reader reviewed cases twice: 750 cases with US only or with US plus DS; 4 weeks later, the cases were reviewed in the opposite format. One hundred fifty additional cases were presented identically in each session. DS and reader sensitivity, specificity, and positive likelihood ratios (PLRs) were calculated, as were reader AUCs with and without DS. The Kendall τ-b correlation coefficient was used to assess intra- and interreader variability. RESULTS. Mean reader AUC for cases reviewed with US only was 0.83 (95% CI, 0.78-0.89); for cases reviewed with US plus DS, mean AUC was 0.87 (95% CI, 0.84-0.90). The PLR for the DS system was 1.98 (95% CI, 1.78-2.18), higher than the PLR for all readers but one. Fourteen readers had a better AUC with US plus DS than with US only. Mean Kendall τ-b for US-only interreader variability was 0.54 (95% CI, 0.53-0.55); for US plus DS, it was 0.68 (95% CI, 0.67-0.69). Intrareader variability improved with DS; class switching (defined as crossing from BI-RADS category 3 to BI-RADS category 4A or above) occurred in 13.6% of cases with US only versus 10.8% of cases with US plus DS (p = 0.04). CONCLUSION. AI-based DS improves the accuracy of sonographic breast lesion assessment while reducing inter- and intraobserver variability.
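For orientation, the summary statistics reported above can be reproduced in outline: sensitivity, specificity, and positive likelihood ratio from binary reader calls, and Kendall τ-b between two readers' suspicion scores via scipy (whose default kendalltau variant is τ-b). The toy arrays below are invented for illustration and are not study data.

```python
import numpy as np
from scipy.stats import kendalltau

def sens_spec_plr(y_true, y_pred):
    """Sensitivity, specificity and positive likelihood ratio from binary calls."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens / (1 - spec)   # PLR = sensitivity / (1 - specificity)

# Toy data: ground truth, one reader's binary calls, and two readers' suspicion scores.
truth   = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
calls   = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
reader1 = np.array([5, 2, 4, 3, 2, 3, 5, 1, 4, 2])
reader2 = np.array([4, 2, 5, 3, 1, 4, 5, 2, 4, 1])

print(sens_spec_plr(truth, calls))
tau_b, p = kendalltau(reader1, reader2)   # scipy's default is the tau-b variant
print(tau_b, p)
```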
Collapse
|
23
|
Wang Y, Wang N, Xu M, Yu J, Qin C, Luo X, Yang X, Wang T, Li A, Ni D. Deeply-Supervised Networks With Threshold Loss for Cancer Detection in Automated Breast Ultrasound. IEEE TRANSACTIONS ON MEDICAL IMAGING 2020; 39:866-876. [PMID: 31442972 DOI: 10.1109/tmi.2019.2936500] [Citation(s) in RCA: 56] [Impact Index Per Article: 11.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
Automated breast ultrasound (ABUS) is an innovative and promising screening method for breast examination. Compared with conventional 2D B-mode ultrasound, ABUS provides operator-independent image acquisition and 3D views of the whole breast. Nonetheless, reviewing ABUS images is time-consuming and oversight errors may occur. In this study, we propose a novel 3D convolutional network for automated cancer detection in ABUS, aiming to accelerate reviewing while achieving high detection sensitivity with low false positives (FPs). Specifically, we propose a densely deep supervision mechanism that greatly improves detection sensitivity by effectively using multi-layer features. Furthermore, we propose a threshold loss that provides a voxel-level adaptive threshold for discriminating cancer from non-cancer, attaining high sensitivity with low FPs. The efficacy of our network was verified on a collected dataset of 219 patients with 614 ABUS volumes containing 745 cancer regions, and 144 healthy women with 900 volumes without abnormal findings. Extensive experiments demonstrate that our method attains a sensitivity of 95% with 0.84 FPs per volume. The proposed network provides an effective cancer detection scheme for breast examination using ABUS by sustaining high sensitivity with low false positives. The code is publicly available at https://github.com/nawang0226/abus_code.
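The densely deep supervision idea, attaching auxiliary losses to intermediate decoder outputs so that multi-layer features are trained directly, can be sketched generically as follows. This is a minimal PyTorch illustration under assumed shapes and weights, not the authors' released code, and it omits the paper's voxel-level adaptive threshold loss.

```python
import torch
import torch.nn.functional as F

def deep_supervision_loss(side_outputs, target, weights=None):
    """Weighted sum of BCE losses over multi-resolution side outputs.

    side_outputs : list of logit volumes, each (B, 1, D_i, H_i, W_i)
    target       : binary ground-truth volume, (B, 1, D, H, W)
    """
    if weights is None:
        weights = [1.0] * len(side_outputs)
    total = 0.0
    for w, logits in zip(weights, side_outputs):
        # Upsample each side output to the target resolution before the loss.
        up = F.interpolate(logits, size=target.shape[2:], mode="trilinear",
                           align_corners=False)
        total = total + w * F.binary_cross_entropy_with_logits(up, target)
    return total

# Toy example: three decoder stages of a 3D detection/segmentation network.
target = (torch.rand(2, 1, 32, 64, 64) > 0.95).float()
sides = [torch.randn(2, 1, 8, 16, 16),
         torch.randn(2, 1, 16, 32, 32),
         torch.randn(2, 1, 32, 64, 64)]
print(deep_supervision_loss(sides, target, weights=[0.25, 0.5, 1.0]))
```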
Collapse
|
24
|
van Zelst JCM, Tan T, Mann RM, Karssemeijer N. Validation of radiologists' findings by computer-aided detection (CAD) software in breast cancer detection with automated 3D breast ultrasound: a concept study in implementation of artificial intelligence software. Acta Radiol 2020; 61:312-320. [PMID: 31324132 PMCID: PMC7059207 DOI: 10.1177/0284185119858051] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2018] [Accepted: 05/22/2019] [Indexed: 11/16/2022]
Abstract
Background Computer-aided detection software for automated breast ultrasound has been shown to have potential in improving the accuracy of radiologists. Alternative ways of implementing computer-aided detection, such as independent validation or preselecting suspicious cases, might also improve radiologists' accuracy. Purpose To investigate the effect of using computer-aided detection software to improve the performance of radiologists by validating findings reported by radiologists during screening with automated breast ultrasound. Material and Methods Unilateral automated breast ultrasound exams were performed in 120 women with dense breasts; the set included 60 randomly selected normal exams, 30 exams with benign lesions, and 30 malignant cases (20 mammography-negative). Eight radiologists were instructed to detect breast cancer and rate lesions using BI-RADS and level-of-suspiciousness scores. Computer-aided detection software was used to check the validity of radiologists' findings. Findings deemed negative by computer-aided detection were not included in the readers' performance analysis; however, the nature of these findings was further analyzed. The area under the curve and the partial area under the curve for an interval in the range of 80%–100% specificity before and after validation by computer-aided detection were compared. Sensitivity was computed for all readers at a simulated 90% specificity. Results Partial AUC improved significantly from 0.126 (95% confidence interval [CI] = 0.098–0.153) to 0.142 (95% CI = 0.115–0.169) (P = 0.037) after computer-aided detection rejected mostly benign lesions and normal tissue scored BI-RADS 3 or 4. The full areas under the curve (0.823 vs. 0.833, respectively) were not significantly different (P = 0.743). Four cancers detected by readers were completely missed by computer-aided detection, and four other cancers were detected by both readers and computer-aided detection but falsely rejected due to technical limitations of our implementation of computer-aided detection validation. In this study, validation by computer-aided detection discarded 42.6% of findings that were scored BI-RADS ≥3 by the radiologists, of which 85.5% were non-malignant findings. Conclusion Validation of radiologists' findings using computer-aided detection software for automated breast ultrasound has the potential to improve the performance of radiologists. Validation by computer-aided detection might be an efficient tool for double-reading strategies by limiting the number of discordant cases that need to be double-read.
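The partial AUC over the 80%–100% specificity interval used above corresponds to integrating the ROC curve for false-positive rates between 0 and 0.2. A minimal scikit-learn sketch on toy scores (not study data):

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

def partial_auc(y_true, scores, max_fpr=0.2):
    """Raw (non-standardised) area under the ROC curve for FPR in [0, max_fpr],
    i.e. the 80-100% specificity interval when max_fpr = 0.2."""
    fpr, tpr, _ = roc_curve(y_true, scores)
    # Interpolate the curve at the cut-off so integration stops exactly there.
    tpr_at_cut = np.interp(max_fpr, fpr, tpr)
    mask = fpr <= max_fpr
    fpr_part = np.concatenate([fpr[mask], [max_fpr]])
    tpr_part = np.concatenate([tpr[mask], [tpr_at_cut]])
    return auc(fpr_part, tpr_part)   # trapezoidal area over the restricted range

# Toy example: labels and reader suspicion scores.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
s = y * 0.8 + rng.normal(0, 0.6, 200)
fpr, tpr, _ = roc_curve(y, s)
print("full AUC:", auc(fpr, tpr))
print("partial AUC (specificity 80-100%):", partial_auc(y, s))
```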
Collapse
Affiliation(s)
- Jan CM van Zelst
- Department of Radiology and Nuclear
Medicine, Radboud University Medical Centre, the Netherlands
| | - Tao Tan
- Department of Radiology and Nuclear
Medicine, Radboud University Medical Centre, the Netherlands
| | - Ritse M Mann
- Department of Radiology and Nuclear
Medicine, Radboud University Medical Centre, the Netherlands
| | - Nico Karssemeijer
- Department of Radiology and Nuclear
Medicine, Radboud University Medical Centre, the Netherlands
| |
Collapse
|
25
|
Abd Elkhalek YI, Bassiouny AM, Hamid RWARA. Automated breast ultrasound system (ABUS): can it replace mammography as a screening tool? THE EGYPTIAN JOURNAL OF RADIOLOGY AND NUCLEAR MEDICINE 2019. [DOI: 10.1186/s43055-019-0051-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
Abstract
Background
Mammography is the most accepted, accurate, and effective modality for breast cancer screening, yet its sensitivity is affected by the density of the breast tissue. Alternative screening methods are sonography and MRI, but both have limitations. ABUS (automated breast ultrasound system) has been proposed to overcome the operator dependence and long examination time of handheld breast US, as well as the cost and time demands of MRI. The objective of the study was to evaluate the accuracy of ABUS in the detection of different breast lesions as a potential substitute for mammography. This prospective study included 25 women referred for digital mammography or handheld ultrasound examination between January 2017 and February 2018, with no restriction on age.
Results
The use of ABUS together with mammography significantly improved the detection of breast lesions, especially in dense breasts (ACR categories C and D).
Conclusion
ABUS is a promising competitor to mammography in the screening of breast lesions.
Collapse
|
26
|
Wu JY, Zhao ZZ, Zhang WY, Liang M, Ou B, Yang HY, Luo BM. Computer-Aided Diagnosis of Solid Breast Lesions With Ultrasound: Factors Associated With False-negative and False-positive Results. JOURNAL OF ULTRASOUND IN MEDICINE : OFFICIAL JOURNAL OF THE AMERICAN INSTITUTE OF ULTRASOUND IN MEDICINE 2019; 38:3193-3202. [PMID: 31077414 DOI: 10.1002/jum.15020] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/17/2018] [Revised: 04/14/2019] [Accepted: 04/19/2019] [Indexed: 06/09/2023]
Abstract
OBJECTIVES To investigate factors that may lead to false-positive or false-negative results in a computer-aided diagnostic system (S-Detect; Samsung Medison Co, Ltd, Seoul, Korea) for ultrasound (US) examinations of solid breast lesions. METHODS This prospective study was approved by the Institutional Review Board of Sun Yat-sen Memorial Hospital. All patients signed and provided written informed consent before biopsy or surgery. From September 2017 to May 2018, 269 consecutive women with 338 solid breast lesions were included. All lesions were examined with US and S-Detect before biopsy or surgical excision. The final US assessments made by radiologists and S-Detect were matched to the pathologic results. Patient and lesion factors in the "true" and "false" S-Detect groups were compared, and multivariate logistic regression analyses were used to identify the factors associated with false S-Detect results. RESULTS The mean age of the patients ± SD was 42.6 ± 12.9 years (range, 18-77 years). Of the 338 lesions, 209 (61.8%) were benign, and 129 (38.2%) were malignant. Larger lesions, the presence of lesion calcifications detected by B-mode US, and grades of 2 and 3 according to Adler et al (Ultrasound Med Biol 1990; 16:553-559) were significantly associated with false-positive S-Detect results (odds ratio [OR], 1.071; P = .006; OR, 5.851; P = .001; OR, 1.726; P = .009, respectively). Smaller lesions and the absence of calcifications detected by B-mode US in malignant solid breast lesions were significantly associated with false-negative S-Detect results (OR, 1.141; P = .015; OR, 7.434; P = .016). CONCLUSIONS Larger benign lesions, the presence of lesion calcifications, and high degrees of vascularity are likely to show false-positive S-Detect results. Smaller malignant lesions and the absence of calcifications are likely to show false-negative S-Detect results.
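The multivariate logistic regression behind the odds ratios above can be outlined with statsmodels: fit a logit model on the candidate factors and exponentiate the coefficients to obtain ORs with confidence intervals. The data frame below is an invented placeholder with assumed column names, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy data frame: outcome 1 = false S-Detect result, predictors loosely mirroring
# the study design (lesion size, calcifications on B-mode US, Adler vascularity grade).
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "size_mm":       rng.normal(15, 6, n),
    "calcification": rng.integers(0, 2, n),
    "vascularity":   rng.integers(0, 4, n),
})
linpred = 0.07 * df["size_mm"] + 1.2 * df["calcification"] - 2.5   # synthetic effect
df["false_result"] = rng.random(n) < 1 / (1 + np.exp(-linpred))

X = sm.add_constant(df[["size_mm", "calcification", "vascularity"]])
model = sm.Logit(df["false_result"].astype(float), X).fit(disp=0)

odds_ratios = np.exp(model.params)           # OR = exp(beta)
or_ci = np.exp(model.conf_int())             # 95% CI for each OR
print(pd.concat([odds_ratios.rename("OR"), or_ci], axis=1))
print(model.pvalues)
```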
Collapse
Affiliation(s)
- Jia-Yi Wu
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| | - Zi-Zhuo Zhao
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| | - Wen-Yue Zhang
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| | - Ming Liang
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| | - Bing Ou
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| | - Hai-Yun Yang
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| | - Bao-Ming Luo
- Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
| |
Collapse
|
27
|
Zhang L, Bao LY, Tan YJ, Zhu LQ, Xu XJ, Zhu QQ, Shan YN, Zhao J, Xie LS, Liu J. Diagnostic Performance Using Automated Breast Ultrasound System for Breast Cancer in Chinese Women Aged 40 Years or Older: A Comparative Study. ULTRASOUND IN MEDICINE & BIOLOGY 2019; 45:3137-3144. [PMID: 31563481 DOI: 10.1016/j.ultrasmedbio.2019.08.016] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/03/2019] [Revised: 07/09/2019] [Accepted: 08/23/2019] [Indexed: 06/10/2023]
Abstract
The purpose of this study was to investigate the diagnostic performance of the automated breast ultrasound system (ABUS) compared with hand-held ultrasonography (HHUS) and mammography (MG) for breast cancer in women aged 40 y or older. A total of 594 breasts in 385 patients were enrolled in the study. HHUS, ABUS and MG exams were performed for these patients. Follow-up and pathologic findings were used as the reference standard. Based on the reference standard, 519 breasts were benign or normal and 75 were malignant. The sensitivity, specificity, accuracy and Youden index were 97.33%, 89.79%, 90.74% and 0.87 for HHUS; 90.67%, 92.49%, 92.26% and 0.83 for ABUS; and 84.00%, 92.87%, 91.75% and 0.77 for MG, respectively. The specificity of ABUS was significantly superior to that of HHUS (p = 0.024). The area under the receiver operating characteristic curve was 0.936 for HHUS, which was the highest, followed by 0.916 for ABUS and 0.884 for MG; however, the differences were not statistically significant (p > 0.05). In conclusion, the diagnostic performance of ABUS for breast cancer was equivalent to that of HHUS and MG, and ABUS can potentially be used as an alternative method for breast cancer diagnosis.
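For reference, the measures compared above follow directly from a 2×2 confusion table. The sketch below uses counts chosen to approximate the ABUS figures quoted in the abstract (roughly 68/75 malignant breasts detected and 480/519 benign or normal breasts correctly classified); they are for illustration only.

```python
def diagnostic_summary(tp, fn, tn, fp):
    """Sensitivity, specificity, accuracy and Youden index from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    youden = sens + spec - 1
    return {"sensitivity": sens, "specificity": spec,
            "accuracy": acc, "youden_index": youden}

# Counts approximating the ABUS results (75 malignant, 519 benign/normal breasts).
print(diagnostic_summary(tp=68, fn=7, tn=480, fp=39))
```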
Collapse
Affiliation(s)
- Li Zhang
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Ling-Yun Bao
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China.
| | - Yan-Juan Tan
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Luo-Qian Zhu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Xiao-Jing Xu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Qing-Qing Zhu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Yan-Na Shan
- Department of Radiology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Jing Zhao
- Department of Radiology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Le-Si Xie
- Department of Pathology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| | - Jan Liu
- Department of Breast Surgery, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
| |
Collapse
|
28
|
Ghani KA, Sudik S, Omar AF, Mail MH, Seeni A. VIS-NIR spectral signature and quantitative analysis of HeLa and DU145 cell line. SPECTROCHIMICA ACTA. PART A, MOLECULAR AND BIOMOLECULAR SPECTROSCOPY 2019; 222:117241. [PMID: 31216502 DOI: 10.1016/j.saa.2019.117241] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/01/2019] [Revised: 06/01/2019] [Accepted: 06/05/2019] [Indexed: 06/09/2023]
Abstract
Cancer is increasing in incidence and is a leading cause of death worldwide. Controlling and reducing cancer requires early detection and techniques to accurately detect and quantify predictive biomarkers. Optical spectroscopy has shown promising non-destructive ability to display distinctive spectral characteristics between cancerous and normal tissues from different parts of human organs. Nonetheless, little information is available on the spectroscopic properties of cancer cell lines. In this research, visible-near infrared (VIS-NIR) absorbance spectroscopy measurements of cultured cervical cancer (HeLa) and prostate cancer (DU145) cell lines were performed to develop spectral signatures of cancer cells and to generate an algorithm to quantify cancer cells. Spectroscopic measurements of mouse skin fibroblasts (L929) were also taken for comparison. In the visible region, the raw cell spectra do not show any noticeable absorbance peak carrying colour information because the culture medium is colourless and transparent. NIR wavelengths between 950 and 975 nm exhibit a significant peak due to water absorbance by the medium. Developing spectral signatures for the cells through the application of a regression technique significantly enhances the distinction between L929, HeLa and DU145. The application of multiple linear regression allows accurate quantification of the cells, with a coefficient of determination above 0.94.
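The quantification step described above, a multiple linear regression mapping absorbance at selected wavelengths to cell quantity, can be outlined with scikit-learn. The synthetic "spectra", the chosen wavelength bins, and the target values below are all assumptions standing in for real VIS-NIR measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-in: 60 samples x 200 wavelength bins of absorbance,
# with cell quantity driven by a few informative bins plus noise.
n_samples, n_bins = 60, 200
spectra = rng.normal(0.5, 0.05, size=(n_samples, n_bins))
quantity = 1e5 * (spectra[:, 50] + 0.5 * spectra[:, 120]) + rng.normal(0, 2e3, n_samples)

# Multiple linear regression on a subset of wavelength bins (the bins chosen
# here are arbitrary; in the paper the selection is spectroscopically motivated).
selected_bins = [40, 50, 60, 110, 120, 130]
X = spectra[:, selected_bins]
model = LinearRegression().fit(X, quantity)

pred = model.predict(X)
print("coefficient of determination (R^2):", r2_score(quantity, pred))
```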
Collapse
Affiliation(s)
| | - Suhainah Sudik
- School of Physics, Universiti Sains Malaysia, 11800 Penang, Malaysia
| | - Ahmad Fairuz Omar
- School of Physics, Universiti Sains Malaysia, 11800 Penang, Malaysia.
| | - Mohd Hafiz Mail
- Malaysian Institute of Pharmaceuticals and Nutraceuticals, National Institute of Biotechnology Malaysia, Ministry of Energy, Science, Technology, Environment and Climate Change, 11700 Penang, Malaysia; Advanced Medical and Dental Institute, Universiti Sains Malaysia, Bertam, 13200, Pulau Pinang, Malaysia
| | - Azman Seeni
- Malaysian Institute of Pharmaceuticals and Nutraceuticals, National Institute of Biotechnology Malaysia, Ministry of Energy, Science, Technology, Environment and Climate Change, 11700 Penang, Malaysia; Advanced Medical and Dental Institute, Universiti Sains Malaysia, Bertam, 13200, Pulau Pinang, Malaysia
| |
Collapse
|
29
|
Marcon M, Ciritsis A, Rossi C, Becker AS, Berger N, Wurnig MC, Wagner MW, Frauenfelder T, Boss A. Diagnostic performance of machine learning applied to texture analysis-derived features for breast lesion characterisation at automated breast ultrasound: a pilot study. Eur Radiol Exp 2019; 3:44. [PMID: 31676937 PMCID: PMC6825080 DOI: 10.1186/s41747-019-0121-6] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/18/2019] [Accepted: 08/28/2019] [Indexed: 12/31/2022] Open
Abstract
Background Our aims were to determine whether features derived from texture analysis (TA) can distinguish normal, benign, and malignant tissue on automated breast ultrasound (ABUS); to evaluate whether machine learning (ML) applied to TA can categorise ABUS findings; and to compare ML to the analysis of single texture features for lesion classification. Methods This ethically approved retrospective pilot study included 54 women with benign (n = 38) and malignant (n = 32) solid breast lesions who underwent ABUS. After manual region of interest placement along the lesions' margin as well as the surrounding fat and glandular breast tissue, 47 texture features (TFs) were calculated for each category. Statistical analysis (ANOVA) and a support vector machine (SVM) algorithm were applied to the texture features to evaluate the accuracy in distinguishing (i) lesions versus normal tissue and (ii) benign versus malignant lesions. Results Skewness and kurtosis were the only TFs significantly different among all four categories (p < 0.000001). In subsets (i) and (ii), a maximum area under the curve of 0.86 (95% confidence interval [CI] 0.82–0.88) for energy and 0.86 (95% CI 0.82–0.89) for entropy was obtained. Using the SVM algorithm, a maximum area under the curve of 0.98 for both subsets was obtained, with a maximum accuracy of 94.4% in subset (i) and 90.7% in subset (ii). Conclusions TA in combination with ML might represent a useful diagnostic tool in the evaluation of breast imaging findings in ABUS. Applying ML techniques to TFs might be superior to the analysis of single TFs. Electronic supplementary material The online version of this article (10.1186/s41747-019-0121-6) contains supplementary material, which is available to authorized users.
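A pared-down version of the pipeline above, first-order texture features such as skewness and kurtosis per region of interest followed by an SVM evaluated with cross-validated AUC, might look as follows. The synthetic ROIs and their grey-level distributions are assumptions rather than ABUS data, and only a handful of the 47 texture features are computed.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def first_order_features(roi):
    """A few simple first-order texture features from ROI grey values."""
    v = roi.ravel().astype(float)
    return [v.mean(), v.std(), skew(v), kurtosis(v)]

# Synthetic ROIs: "lesions" drawn from a different grey-level distribution
# than "normal tissue" so the toy problem is learnable.
lesions = [rng.gamma(2.0, 12.0, size=(32, 32)) for _ in range(40)]
normals = [rng.normal(40.0, 10.0, size=(32, 32)) for _ in range(40)]

X = np.array([first_order_features(r) for r in lesions + normals])
y = np.array([1] * len(lesions) + [0] * len(normals))

clf = SVC(kernel="rbf", probability=True)
aucs = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", aucs.mean())
```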
Collapse
Affiliation(s)
- Magda Marcon
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland.
| | - Alexander Ciritsis
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Cristina Rossi
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Anton S Becker
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Nicole Berger
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Moritz C Wurnig
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Matthias W Wagner
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Thomas Frauenfelder
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| | - Andreas Boss
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland
| |
Collapse
|
30
|
Love SM, Berg WA, Podilchuk C, López Aldrete AL, Gaxiola Mascareño AP, Pathicherikollamparambil K, Sankarasubramanian A, Eshraghi L, Mammone R. Palpable Breast Lump Triage by Minimally Trained Operators in Mexico Using Computer-Assisted Diagnosis and Low-Cost Ultrasound. J Glob Oncol 2019; 4:1-9. [PMID: 30156946 PMCID: PMC6223536 DOI: 10.1200/jgo.17.00222] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022] Open
Abstract
Purpose In low- to middle-income countries (LMICs), most breast cancers present as palpable lumps; however, most palpable lumps are benign. We have developed artificial intelligence–based computer-assisted diagnosis (CADx) for an existing low-cost portable ultrasound system to triage which lumps need further evaluation and which are clearly benign. This pilot study was conducted to demonstrate that this approach can be successfully used by minimally trained health care workers in an LMIC country. Patients and Methods We recruited and trained three nonradiologist health care workers to participate in an institutional review board–approved, Health Insurance Portability and Accountability Act–compliant pilot study in Jalisco, Mexico, to determine whether they could use portable ultrasound (GE Vscan Dual Probe) to acquire images of palpable breast lumps of adequate quality for accurate computer analysis. Images from 32 women with 32 breast masses were then analyzed with a triage-CADx system, generating an output of benign or suspicious (biopsy recommended). Triage-CADx outputs were compared with radiologist readings. Results The nonradiologists were able to acquire adequate images. Triage by the CADx software was as accurate as assessment by specialist radiologists, with two (100%) of two cancers considered suspicious and 30 (100%) of 30 benign lesions classified as benign. Conclusion A portable ultrasound system with CADx software can be successfully used by first-level health care workers to triage palpable breast lumps. These results open up the possibility of implementing practical, cost-effective triage of palpable breast lumps, ensuring that scarce resources can be dedicated to suspicious lesions requiring further workup.
Collapse
Affiliation(s)
- Susan M Love
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Wendie A Berg
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Christine Podilchuk
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Ana Lilia López Aldrete
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Aarón Patricio Gaxiola Mascareño
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Krishnamohan Pathicherikollamparambil
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Ananth Sankarasubramanian
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Leah Eshraghi
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| | - Richard Mammone
- Susan M. Love and Leah Eshraghi, Dr Susan Love Research Foundation, Encino, CA; Wendie A. Berg, Magee-Womens Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA; Christine Podilchuk, Krishnamohan Pathicherikollamparambil, Ananth Sankarasubramanian, and Richard Mammone, AI Strategy, Warren, NJ; Richard Mammone, Rutgers University, New Brunswick, NJ; and Ana Lilia López Aldrete and Aarón Patricio Gaxiola Mascareño, Instituto de Seguridad y Servicios Sociales de los Trabajadores del Estado Hospital Regional Valentin Gomez Farias, Jalisco, Mexico
| |
Collapse
|
31
|
Yang S, Gao X, Liu L, Shu R, Yan J, Zhang G, Xiao Y, Ju Y, Zhao N, Song H. Performance and Reading Time of Automated Breast US with or without Computer-aided Detection. Radiology 2019; 292:540-549. [PMID: 31210612 DOI: 10.1148/radiol.2019181816] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Background: Computer-aided detection (CAD) systems may be used to help radiologists interpret automated breast (AB) US images. However, the optimal use of CAD with AB US has, to the knowledge of the authors, not been determined. Purpose: To compare the performance and reading time of different readers using an AB US CAD system to detect breast cancer in different reading modes. Materials and Methods: In this retrospective study, 1485 AB US images (282 with malignant lesions, 695 with benign lesions, and 508 healthy) in 1452 women (mean age, 43.7 years; age range, 19-82 years), including 529 (36.4%) women who were asymptomatic, were collected between 2016 and 2017. A CAD system was used to interpret the images. Three novice readers with 1-3 years of US experience and three experienced readers with 5-10 years of US experience were assigned to read AB US images without CAD, at a second reading (after the reader completed a full unaided interpretation), and at concurrent reading (use of CAD at the start of the assessment). Diagnostic performances and reading times were compared by using analysis of variance. Results: For all readers, the mean area under the receiver operating characteristic curve improved from 0.88 (95% confidence interval [CI]: 0.85, 0.91) in the without-CAD mode to 0.91 (95% CI: 0.89, 0.92; P < .001) in the second-reading mode and 0.90 (95% CI: 0.89, 0.92; P = .002) in the concurrent-reading mode. The mean sensitivity of novice readers in women who were asymptomatic improved from 67% (95% CI: 63%, 74%) in the without-CAD mode to 88% (95% CI: 84%, 89%) in both the second-reading mode and the concurrent-reading mode (P = .003). Compared with the without-CAD and second-reading modes, the mean reading time per volume with concurrent reading was 16 seconds (95% CI: 11, 22; P < .001) and 27 seconds (95% CI: 21, 32; P < .001) shorter, respectively. Conclusion: Computer-aided detection (CAD) helped novice readers improve cancer detection at automated breast US in women who were asymptomatic. CAD was more efficient when used concurrently for all readers. © RSNA, 2019. Online supplemental material is available for this article. See also the editorial by Slanetz in this issue.
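The reading-time comparison across the three modes relies on analysis of variance; a minimal scipy illustration with fabricated per-volume reading times (the means and spreads are assumptions) is shown below.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Fabricated per-volume reading times (seconds) under the three reading modes.
without_cad = rng.normal(95, 15, 100)
second_read = rng.normal(105, 15, 100)   # unaided read followed by CAD review
concurrent  = rng.normal(80, 15, 100)    # CAD shown from the start

f_stat, p_value = f_oneway(without_cad, second_read, concurrent)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4g}")
```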
Collapse
Affiliation(s)
- Shanling Yang
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Xican Gao
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Liwen Liu
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Rui Shu
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Jingru Yan
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Ge Zhang
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Yao Xiao
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Yan Ju
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Ni Zhao
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| | - Hongping Song
- From the Department of Ultrasonic Medicine, Xijing Hospital of the Fourth Military Medical University, No. 127 Changle West Road, Xi'an, Shaanxi, China 710032
| |
Collapse
|
32
|
|
33
|
Cruz-Bernal A, Flores-Barranco MM, Almanza-Ojeda DL, Ledesma S, Ibarra-Manzano MA. Analysis of the Cluster Prominence Feature for Detecting Calcifications in Mammograms. JOURNAL OF HEALTHCARE ENGINEERING 2018; 2018:2849567. [PMID: 30687489 PMCID: PMC6330822 DOI: 10.1155/2018/2849567] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/30/2018] [Revised: 10/22/2018] [Accepted: 11/06/2018] [Indexed: 11/22/2022]
Abstract
In mammograms, a calcification appears as a small but bright white region of the digital image. Early detection of malignant calcifications gives patients a much better chance of surviving the disease. Nevertheless, white regions are difficult to see by visual inspection because a mammogram is a grey-scale image of the breast. To help radiologists detect abnormal calcifications, computerised inspection methods for mammograms have been proposed; however, this remains an important open problem. In this context, we propose a strategy for detecting calcifications in mammograms based on the analysis of the cluster prominence (cp) feature histogram. The highest frequencies of the cp histogram describe the calcifications in the mammogram. We therefore obtain a function that models the behaviour of the cp histogram by applying Vandermonde interpolation twice: the first interpolation yields a global representation, and the second models the highest frequencies of the histogram. A weak classifier is used to obtain a final classification of the mammogram, that is, with or without calcifications. Experimental results are compared with real DICOM images and their corresponding diagnoses provided by expert radiologists, showing that the cp feature is highly discriminative.
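Cluster prominence is a fourth-order statistic of the grey-level co-occurrence matrix (GLCM): cp = Σ_{i,j} (i + j − μ_i − μ_j)⁴ p(i, j). The self-contained numpy sketch below computes it for a toy patch using a single horizontal pixel offset; building the cp histogram over sliding windows, the Vandermonde interpolations, and the weak classifier are left out.

```python
import numpy as np

def glcm(image, levels=8):
    """Normalised grey-level co-occurrence matrix for the offset (0, 1)."""
    span = image.max() - image.min()
    q = np.floor((image - image.min()) / span * (levels - 1)).astype(int)  # quantise
    m = np.zeros((levels, levels), dtype=float)
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):   # horizontal neighbour pairs
        m[i, j] += 1
    return m / m.sum()

def cluster_prominence(p):
    """cp = sum_{i,j} (i + j - mu_i - mu_j)^4 * p(i, j)."""
    levels = p.shape[0]
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mu_i = np.sum(i * p)
    mu_j = np.sum(j * p)
    return np.sum(((i + j - mu_i - mu_j) ** 4) * p)

# Toy "mammogram patch" with a small bright region standing in for a calcification.
rng = np.random.default_rng(0)
patch = rng.normal(80, 10, size=(64, 64))
patch[30:34, 30:34] += 120

print("cluster prominence:", cluster_prominence(glcm(patch)))
```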
Collapse
Affiliation(s)
- Alejandra Cruz-Bernal
- Laboratorio de Procesamiento Digital de Señales, Departamento de Ingeniería Electrónica, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
- Departamento de Ingeniería Robótica, Universidad Politécnica de Guanajuato, Av. Universidad Norte SN., Comunidad Juan Alonso, Cortazar 38496, Mexico
| | - Martha M. Flores-Barranco
- Laboratorio de Procesamiento Digital de Señales, Departamento de Ingeniería Electrónica, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
| | - Dora L. Almanza-Ojeda
- Laboratorio de Procesamiento Digital de Señales, Departamento de Ingeniería Electrónica, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
- Cuerpo Académico de Telemática, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
| | - Sergio Ledesma
- Cuerpo Académico de Telemática, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
| | - Mario A. Ibarra-Manzano
- Laboratorio de Procesamiento Digital de Señales, Departamento de Ingeniería Electrónica, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
- Cuerpo Académico de Telemática, DICIS, Universidad de Guanajuato, Carr. Salamanca-Valle de Santiago KM. 3.5 + 1.8 Km., Salamanca 36885, Mexico
| |
Collapse
|
34
|
Abstract
OBJECTIVE The purpose of this article is to discuss potential applications of artificial intelligence (AI) in breast imaging and limitations that may slow or prevent its adoption. CONCLUSION The algorithms of AI for workflow improvement and outcome analyses are advancing. Using imaging data of high quality and quantity, AI can support breast imagers in diagnosis and patient management, but AI cannot yet be relied on or be responsible for physicians' decisions that may affect survival. Education in AI is urgently needed for physicians.
Collapse
|
35
|
Rella R, Belli P, Giuliani M, Bufi E, Carlino G, Rinaldi P, Manfredi R. Automated Breast Ultrasonography (ABUS) in the Screening and Diagnostic Setting: Indications and Practical Use. Acad Radiol 2018; 25:1457-1470. [PMID: 29555568 DOI: 10.1016/j.acra.2018.02.014] [Citation(s) in RCA: 61] [Impact Index Per Article: 8.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2018] [Revised: 02/10/2018] [Accepted: 02/11/2018] [Indexed: 10/17/2022]
Abstract
Automated breast ultrasonography (ABUS) is a new imaging technology for automatic breast scanning through ultrasound. It was first developed to overcome the limitations of operator dependency and the lack of standardization and reproducibility of handheld ultrasound. ABUS provides a three-dimensional representation of breast tissue and allows image reformatting in three planes, and the generated coronal plane has been suggested to improve diagnostic accuracy. This technique was first used in the screening setting to improve breast cancer detection, especially in mammographically dense breasts. In recent years, numerous studies have also evaluated its use in the diagnostic setting, showing its suitability for breast cancer staging, evaluation of tumor response to neoadjuvant chemotherapy, and second-look ultrasound after magnetic resonance imaging. The purpose of this article is to provide a comprehensive review of the current body of literature about the clinical performance of ABUS, summarize available evidence, and identify gaps in knowledge for future research.
Collapse
|
36
|
Xu X, Bao L, Tan Y, Zhu L, Kong F, Wang W. 1000-Case Reader Study of Radiologists' Performance in Interpretation of Automated Breast Volume Scanner Images with a Computer-Aided Detection System. ULTRASOUND IN MEDICINE & BIOLOGY 2018; 44:1694-1702. [PMID: 29853222 DOI: 10.1016/j.ultrasmedbio.2018.04.020] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/16/2017] [Revised: 04/24/2018] [Accepted: 04/27/2018] [Indexed: 06/08/2023]
Abstract
The objective of our study was to assess, in a reader study, radiologists' performance in interpretation of automated breast volume scanner (ABVS) images with the aid of a computer-aided detection (CADe) system. Our study is a retrospective observer study with the purpose of investigating the effectiveness of using a CADe system as an aid for radiologists in interpretation of ABVS images. The multiple-reader, multiple-case study was designed to compare the diagnostic performance of radiologists with and without CADe. The study included 1000 cases selected from ABVS examinations in our institution in 2012. Among those cases were 206 malignant, 486 benign and 308 normal cases. The cancer cases were consecutive; the benign and normal cases were randomly selected. All malignant and benign cases were confirmed by biopsy or surgery, and normal cases were confirmed by 2-y follow-up. Reader performance was compared in terms of area under the receiver operating characteristic curve, sensitivity and specificity. Additionally, the reading time per case for each reader was recorded. Nine radiologists from our institution participated in the study. Three had more than 8 y of ultrasound experience and more than 4 y of ABVS experience (group A); 3 had more than 5 y of ultrasound experience (group B), and 3 had more than 1 y of ultrasound experience (group C). Neither group B nor group C had ABVS experience. The CADe system used was the QVCAD System (QView Medical, Inc., Los Altos, CA, USA). It is designed to aid radiologists in searching for suspicious areas in ABVS images. CADe results are presented to the reader simultaneously with the ABVS images; that is, the radiologists read the ABVS images concurrently with the CADe results. The cases were randomly assigned for each reader into two equal-size groups, 1 and 2. Initially the readers read their group 1 cases with the aid of CADe and their group 2 cases without CADe. After a 1-mo washout period, they re-read their group 1 cases without CADe and their group 2 cases with CADe. The areas under the receiver operating characteristic curves of all readers were 0.784 for reading with CADe and 0.747 without CADe. Areas under the curves with and without CADe were 0.833 and 0.829 for group A, 0.757 and 0.696 for group B and 0.759 and 0.718 for group C. All differences in areas under the curve were statistically significant (p < 0.05), except that for group A. The average reading time was 9.3% faster (p < 0.05) with CADe for all readers. In summary, CADe improves radiologist performance with respect to both accuracy and reading time for the detection of breast cancer using the ABVS, with the greater benefit for readers inexperienced with ABVS.
Collapse
Affiliation(s)
- Xiaojing Xu
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Nanjing. 261 Huansha Road, Hangzhou 310006, China
| | - Lingyun Bao
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Nanjing. 261 Huansha Road, Hangzhou 310006, China.
| | - Yanjuan Tan
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Nanjing. 261 Huansha Road, Hangzhou 310006, China
| | - Luoxi Zhu
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Nanjing. 261 Huansha Road, Hangzhou 310006, China
| | - Fanlei Kong
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Nanjing. 261 Huansha Road, Hangzhou 310006, China
| | - Wei Wang
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Nanjing. 261 Huansha Road, Hangzhou 310006, China
| |
Collapse
|
37
|
van Zelst JCM, Tan T, Clauser P, Domingo A, Dorrius MD, Drieling D, Golatta M, Gras F, de Jong M, Pijnappel R, Rutten MJCM, Karssemeijer N, Mann RM. Dedicated computer-aided detection software for automated 3D breast ultrasound; an efficient tool for the radiologist in supplemental screening of women with dense breasts. Eur Radiol 2018; 28:2996-3006. [PMID: 29417251 PMCID: PMC5986849 DOI: 10.1007/s00330-017-5280-3] [Citation(s) in RCA: 46] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2017] [Revised: 11/21/2017] [Accepted: 12/21/2017] [Indexed: 11/29/2022]
Abstract
OBJECTIVES To determine the effect of computer-aided-detection (CAD) software for automated breast ultrasound (ABUS) on reading time (RT) and performance in screening for breast cancer. MATERIAL AND METHODS Unilateral ABUS examinations of 120 women with dense breasts were randomly selected from a multi-institutional archive of cases including 30 malignant (20/30 mammography-occult), 30 benign, and 60 normal cases with histopathological verification or ≥ 2 years of negative follow-up. Eight radiologists read once with (CAD-ABUS) and once without CAD (ABUS) with > 8 weeks between reading sessions. Readers provided a BI-RADS score and a level of suspiciousness (0-100). RT, sensitivity, specificity, PPV and area under the curve (AUC) were compared. RESULTS Average RT was significantly shorter using CAD-ABUS (133.4 s/case, 95% CI 129.2-137.6) compared with ABUS (158.3 s/case, 95% CI 153.0-163.3) (p < 0.001). Sensitivity was 0.84 for CAD-ABUS (95% CI 0.79-0.89) and ABUS (95% CI 0.78-0.88) (p = 0.90). Three out of eight readers showed significantly higher specificity using CAD. Pooled specificity (0.71, 95% CI 0.68-0.75 vs. 0.67, 95% CI 0.64-0.70, p = 0.08) and PPV (0.50, 95% CI 0.45-0.55 vs. 0.44, 95% CI 0.39-0.49, p = 0.07) were higher in CAD-ABUS vs. ABUS, respectively, albeit not significantly. Pooled AUC for CAD-ABUS was comparable with ABUS (0.82 vs. 0.83, p = 0.53, respectively). CONCLUSION CAD software for ABUS may decrease the time needed to screen for breast cancer without compromising the screening performance of radiologists. KEY POINTS • ABUS with CAD software may speed up reading time without compromising radiologists' accuracy. • CAD software for ABUS might prevent non-detection of malignant breast lesions by radiologists. • Radiologists reading ABUS with CAD software might improve their specificity without losing sensitivity.
Collapse
Affiliation(s)
- Jan C M van Zelst
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre Nijmegen (NL), Geert Grooteplein 10, 6525 GA, Nijmegen, The Netherlands.
| | - Tao Tan
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre Nijmegen (NL), Geert Grooteplein 10, 6525 GA, Nijmegen, The Netherlands
| | - Paola Clauser
- Department of Biomedical Imaging and Image Guided Therapy, Division of Molecular and Gender Imaging, Medical University of Vienna/Vienna General Hospital (A), Vienna, Austria
| | - Angels Domingo
- Department of Radiology, Centre Diagnosi per la Imatge Tarragona (E), Tarragona, Spain
| | - Monique D Dorrius
- Center for Medical Imaging and Department of Radiology, University Medical Centre Groningen (NL), Groningen, Netherlands
| | | | - Michael Golatta
- Department of Gynaecology and Obstetrics, Universitäts-Frauenklinik Heidelberg (D), Heidelberg, Germany
| | - Francisca Gras
- Department of Radiology, Centre Diagnosi per la Imatge Tarragona (E), Tarragona, Spain
| | - Mathijn de Jong
- Department of Radiology, Jeroen Bosch Hospital, s-Hertogenbosch (NL), s-Hertogenbosch, Netherlands
| | - Ruud Pijnappel
- Department of Radiology, University Medical Centre Utrecht (NL), Utrecht, Netherlands
| | - Matthieu J C M Rutten
- Department of Radiology, Jeroen Bosch Hospital, s-Hertogenbosch (NL), s-Hertogenbosch, Netherlands
| | - Nico Karssemeijer
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre Nijmegen (NL), Geert Grooteplein 10, 6525 GA, Nijmegen, The Netherlands
| | - Ritse M Mann
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre Nijmegen (NL), Geert Grooteplein 10, 6525 GA, Nijmegen, The Netherlands
| |
Collapse
|
38
|
Phelps A, Callen AL, Marcovici P, Naeger DM, Mongan J, Webb EM. Can Radiologists Learn From Airport Baggage Screening?: A Survey About Using Fictional Patients for Quality Assurance. Acad Radiol 2018; 25:226-234. [PMID: 29122472 DOI: 10.1016/j.acra.2017.08.014] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2017] [Revised: 08/29/2017] [Accepted: 08/31/2017] [Indexed: 10/18/2022]
Abstract
RATIONALE AND OBJECTIVES For both airport baggage screeners and radiologists, low target prevalence is associated with low detection rate, a phenomenon known as "prevalence effect." In airport baggage screening, the target prevalence is artificially increased with fictional weapons that are digitally superimposed on real baggage. This strategy improves the detection rate of real weapons and also allows airport supervisors to monitor screener performance. A similar strategy using fictional patients could be applied in radiology. The purpose of this study was twofold: (1) to review the psychophysics literature regarding low target prevalence and (2) to survey radiologists' attitudes toward using fictional patients as a quality assurance tool. MATERIALS AND METHODS We reviewed the psychophysics literature on low target prevalence and airport x-ray baggage screeners. An online survey was e-mailed to all members of the Association of University Radiologists to determine their attitudes toward using fictional patients in radiology. RESULTS Of the 1503 Association of University Radiologists member recipients, there were 153 respondents (10% response rate). When asked whether the use of fictional patients was a good idea, the responses were as follows: disagree (44%), neutral (25%), and agree (31%). The most frequent concern was the time taken away from doing clinical work (89% of the respondents). CONCLUSIONS The psychophysics literature supports the use of fictional targets to mitigate the prevalence effect. However, the use of fictional patients is not a popular idea among academic radiologists.
Collapse
Affiliation(s)
- Andrew Phelps
- Department of Radiology and Biomedical Imaging, University of California, 1975 4th Street, San Francisco, CA 94158.
| | - Andrew L Callen
- Department of Radiology and Biomedical Imaging, University of California, 1975 4th Street, San Francisco, CA 94158
| | - Peter Marcovici
- Department of Radiology, Kaiser Permanente Northwest, Portland, Oregon
| | - David M Naeger
- Department of Radiology and Biomedical Imaging, University of California, 1975 4th Street, San Francisco, CA 94158
| | - John Mongan
- Department of Radiology and Biomedical Imaging, University of California, 1975 4th Street, San Francisco, CA 94158
| | - Emily M Webb
- Department of Radiology and Biomedical Imaging, University of California, 1975 4th Street, San Francisco, CA 94158
| |
Collapse
|
39
|
Maier A, Heil J, Lauer A, Harcos A, Schaefgen B, von Au A, Spratte J, Riedel F, Rauch G, Hennigs A, Domschke C, Schott S, Rom J, Schuetz F, Sohn C, Golatta M. Inter-rater reliability and double reading analysis of an automated three-dimensional breast ultrasound system: comparison of two independent examiners. Arch Gynecol Obstet 2017; 296:571-582. [PMID: 28748340 DOI: 10.1007/s00404-017-4473-y] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/08/2017] [Accepted: 07/21/2017] [Indexed: 10/19/2022]
Abstract
PURPOSE Breast ultrasound could be a valuable tool complementary to mammography in breast cancer screening. Automated 3D breast ultrasound (ABUS) addresses challenges of hand-held ultrasound and could allow double reading analysis of ultrasound images. This trial assesses the inter-rater reliability and double reading analysis of an ABUS system. METHODS To assess the reproducibility and diagnostic validity of the ABUS system SomoV™, a blinded double reading analysis was performed in 1019 patients (2038 breasts) by two examiners (examiner A and examiner B) and compared with single reading results, as well as with the reference standard, regarding its diagnostic validity. Cohen's kappa coefficients were calculated to measure the inter-rater reliability and the agreement of the different diagnostic modalities. Patient comfort and the time needed for image acquisition and reading were analyzed descriptively as secondary objectives. RESULTS Analysis of inter-rater reliability yielded agreement in 81.6% (κ = 0.37; p < 0.0001), showing fair agreement. Single reading analysis of SomoV™ exams (examiner A/examiner B) compared with the reference standard showed good specificity (examiner A: 88.3%/examiner B: 84.5%), fair inter-rater agreement (examiner A: κ = 0.31/examiner B: κ = 0.31), and adequate sensitivity (examiner A: 53.1%/examiner B: 64.2%). Double reading analysis yielded good sensitivity and specificity (73.7% and 77.7%). Mammography (n = 1911) alone detected 160 of 176 carcinomas (sensitivity 90.1%). Adding SomoV™ to mammography would have detected 12 additional carcinomas, resulting in a higher sensitivity of 97.7%. CONCLUSION SomoV™ is a promising technique with good sensitivity, high patient comfort, and fair inter-examiner reliability. It allows double reading analysis that, in combination with mammography, could increase detection rates in breast cancer screening.
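Inter-rater agreement of the kind reported above is typically quantified with Cohen's kappa; the sketch below shows the computation with scikit-learn on invented dichotomised calls from two examiners (the agreement level is an assumption, not the trial data).

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Invented dichotomised calls (0 = negative, 1 = suspicious) for 2038 breasts:
# examiner B mostly agrees with examiner A, with some random disagreement.
examiner_a = rng.integers(0, 2, 2038)
flip = rng.random(2038) < 0.15
examiner_b = np.where(flip, 1 - examiner_a, examiner_a)

kappa = cohen_kappa_score(examiner_a, examiner_b)
agreement = np.mean(examiner_a == examiner_b)
print(f"raw agreement = {agreement:.1%}, Cohen's kappa = {kappa:.2f}")
```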
Collapse
Affiliation(s)
- Anna Maier
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Joerg Heil
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Anna Lauer
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Aba Harcos
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Benedikt Schaefgen
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Alexandra von Au
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Julia Spratte
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Fabian Riedel
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Geraldine Rauch
- Institute of Medical Biometry and Informatics, University of Heidelberg, Heidelberg, Germany.,Institute of Medical Biometry and Epidemiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
| | - André Hennigs
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Christoph Domschke
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Sarah Schott
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Joachim Rom
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Florian Schuetz
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Christof Sohn
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Michael Golatta
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany.
| |
Collapse
|