1
Tao X, Cao Y, Jiang Y, Wu X, Yan D, Xue W, Zhuang S, Yang X, Huang R, Zhang J, Ni D. Enhancing lesion detection in automated breast ultrasound using unsupervised multi-view contrastive learning with 3D DETR. Med Image Anal 2025;101:103466. [PMID: 39854815] [DOI: 10.1016/j.media.2025.103466]
Abstract
The inherent variability of lesions poses challenges in leveraging AI in 3D automated breast ultrasound (ABUS) for lesion detection. Traditional methods based on single scans have fallen short compared to comprehensive evaluations by experienced sonologists using multiple scans. To address this, our study introduces an innovative approach combining the multi-view co-attention mechanism (MCAM) with unsupervised contrastive learning. Rooted in the detection transformer (DETR) architecture, our model employs a one-to-many matching strategy, significantly boosting training efficiency and lesion recall metrics. The model integrates MCAM within the decoder, facilitating the interpretation of lesion data across diverse views. Simultaneously, unsupervised multi-view contrastive learning (UMCL) aligns features consistently across scans, improving detection performance. When tested on two multi-center datasets comprising 1509 patients, our approach outperforms existing state-of-the-art 3D detection models. Notably, our model achieves a 90.3% cancer detection rate with a false positive per image (FPPI) rate of 0.5 on the external validation dataset. This surpasses junior sonologists and matches the performance of seasoned experts.
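The unsupervised multi-view contrastive objective is not specified in the abstract; a minimal InfoNCE-style sketch (the function name, shapes, and temperature are assumptions, not the authors' implementation) that pulls paired-view embeddings together while pushing apart mismatched pairs:

```python
import numpy as np

def info_nce(view_a, view_b, tau=0.1):
    """InfoNCE-style contrastive loss: row i of each view is a positive pair."""
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / tau                       # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    return float(-np.log(np.diag(probs)).mean())
```

Aligned views (identical embeddings) yield a much lower loss than randomly paired ones, which is the signal that drives feature alignment across scans.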
Affiliation(s)
- Xing Tao
- National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China; Medical Ultrasound Image Computing (MUSIC) Lab, Shenzhen University, Shenzhen, China; Marshall Laboratory of Biomedical Engineering, Shenzhen University, Shenzhen, China
- Yan Cao
- Shenzhen RayShape Medical Technology Co., Ltd, Shenzhen, China
- Yanhui Jiang
- Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China
- Xiaoxi Wu
- Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China
- Dan Yan
- Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China
- Wen Xue
- Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China
- Shulian Zhuang
- Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China
- Xin Yang
- National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China; Medical Ultrasound Image Computing (MUSIC) Lab, Shenzhen University, Shenzhen, China; Marshall Laboratory of Biomedical Engineering, Shenzhen University, Shenzhen, China
- Ruobing Huang
- National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China; Medical Ultrasound Image Computing (MUSIC) Lab, Shenzhen University, Shenzhen, China; Marshall Laboratory of Biomedical Engineering, Shenzhen University, Shenzhen, China.
- Jianxing Zhang
- Department of Ultrasound, Remote Consultation Center of ABUS, The Second Affiliated Hospital, Guangzhou University of Chinese Medicine, Guangzhou, China.
- Dong Ni
- National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China; Medical Ultrasound Image Computing (MUSIC) Lab, Shenzhen University, Shenzhen, China; Marshall Laboratory of Biomedical Engineering, Shenzhen University, Shenzhen, China; School of Biomedical Engineering and Informatics, Nanjing Medical University, Nanjing 211166, China.
2
Verma R, Kumar K, Bhatt S, Yadav M, Kumar M, Tagde P, Rajinikanth PS, Tiwari A, Tiwari V, Nagpal D, Mittal V, Kaushik D. Untangling breast cancer: trailing towards nanoformulations-based drug development. Recent Pat Nanotechnol 2025;19:76-98. [PMID: 37519201] [DOI: 10.2174/1872210517666230731091046]
Abstract
All over the world, cancer prevalence and mortality are increasing. Breast cancer (BC), defined by the aggressive progression and rapid division of breast cells, is the most common cancer in women and accounts for a major share of cancer mortality (15%). Although novel nanotechnology-based approaches have helped improve survival rates, treatment of metastatic BC still faces obstacles, with an expected overall survival rate of 23%. This paper briefly covers the epidemiology, classification (non-invasive, invasive and metastatic), risk factors (genetic and non-genetic) and treatment challenges of breast cancer. The review focuses on the importance of nanotechnology-based nanoformulations for the treatment of BC, aiming to give readers elementary insight into novel nanoformulations in BC treatment and to guide the design of new nanomedicines. We then elaborate on several types of nanoformulations used in tumor therapeutics, such as liposomes, dendrimers, polymeric nanomaterials and many others. Potential research opportunities for clinical application and current challenges related to the utility of nanoformulations in treating BC are also highlighted, and the role of artificial intelligence is elaborated in detail. Finally, we discuss the existing challenges and perspectives of nanoformulations in effective tumor management, with emphasis on the various patented nanoformulations that have been approved or are progressing through clinical trials, retrieved from various search engines.
Affiliation(s)
- Ravinder Verma
- Department of Pharmaceutical Sciences, Chaudhary Bansi Lal University, Bhiwani, Haryana, 127021, India
- Kuldeep Kumar
- Department of Pharmaceutical Sciences and Drug Research, Punjabi University, Patiala, Punjab, India
- Shailendra Bhatt
- Shrinathji Institute of Pharmacy, Shrinathji Society for Higher Education, Upali Oden, Nathdwara, Rajasmand, Rajasthan, India
- Manish Yadav
- Department of Pharmacy, G.D. Goenka University, Sohna Road, Gurugram, 122103, India
- Manish Kumar
- School of Pharmaceutical Sciences, CT University, Ludhiana, 142024, Punjab, India
- Priti Tagde
- Bhabha Pharmacy Research Institute, Bhabha University Bhopal, 462026, Madhya Pradesh, India
- PRISAL Foundation, Pharmaceutical Royal International Society, New Delhi, India
- P S Rajinikanth
- Department of Pharmaceutical Sciences, Babasaheb Bhimrao Amebdkar University, Lucknow, India
- Abhishek Tiwari
- Pharmacy Academy, IFTM University, Lodhipur Rajput, Moradabad, U.P., 244102, India
- Varsha Tiwari
- Pharmacy Academy, IFTM University, Lodhipur Rajput, Moradabad, U.P., 244102, India
- Diksha Nagpal
- Department of Pharmaceutical Sciences, Maharshi Dayanand University, Rohtak, Haryana, 124001, India
- Vineet Mittal
- Department of Pharmaceutical Sciences, Maharshi Dayanand University, Rohtak, Haryana, 124001, India
- Deepak Kaushik
- Department of Pharmaceutical Sciences, Maharshi Dayanand University, Rohtak, Haryana, 124001, India
3
Barekatrezaei S, Kozegar E, Salamati M, Soryani M. Mass detection in automated three dimensional breast ultrasound using cascaded convolutional neural networks. Phys Med 2024;124:103433. [PMID: 39002423] [DOI: 10.1016/j.ejmp.2024.103433]
Abstract
PURPOSE Early detection of breast cancer has a significant effect on reducing its mortality rate. For this purpose, automated three-dimensional breast ultrasound (3-D ABUS) has recently been used alongside mammography. The 3-D volume produced in this imaging system includes many slices, and the radiologist must review all of them to find a mass, a time-consuming task with a high probability of mistakes. Therefore, many computer-aided detection (CADe) systems have been developed to assist radiologists in this task. In this paper, we propose a novel CADe system for mass detection in 3-D ABUS images. METHODS The proposed system includes two cascaded convolutional neural networks. The goal of the first network is to achieve the highest possible sensitivity, and the second network's goal is to reduce false positives while maintaining high sensitivity. In both networks, an improved version of the 3-D U-Net architecture is utilized in which two types of modified Inception modules are used in the encoder section. In the second network, new attention units are also added to the skip connections, which receive the results of the first network as saliency maps. RESULTS The system was evaluated on a dataset containing 60 3-D ABUS volumes from 43 patients with 55 masses. A sensitivity of 91.48% and a mean of 8.85 false positives per patient were achieved. CONCLUSIONS The suggested mass detection system is fully automatic, without any user interaction. The results indicate that the CADe system outperforms competing techniques in sensitivity and mean FPs per patient.
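The sensitivity and per-patient false-positive figures reported above describe one operating point of the detector; a generic sketch of how such a point is computed from scored detections (the data layout and function name are assumptions, not the authors' code):

```python
def detection_metrics(detections, n_lesions, n_patients, thr):
    """Operating point of a lesion detector at score threshold thr.

    detections: list of (score, is_true_positive) pooled over all patients,
    where each true lesion is credited at most one true-positive detection.
    Returns (sensitivity, mean false positives per patient).
    """
    tp = sum(1 for score, is_tp in detections if score >= thr and is_tp)
    fp = sum(1 for score, is_tp in detections if score >= thr and not is_tp)
    return tp / n_lesions, fp / n_patients
```

Sweeping `thr` over all observed scores and plotting sensitivity against FPs per patient yields the FROC curve commonly used to compare CADe systems.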
Affiliation(s)
- Sepideh Barekatrezaei
- School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran.
- Ehsan Kozegar
- Department of Computer Engineering and Engineering Sciences, Faculty of Technology and Engineering, University of Guilan, Rudsar-Vajargah, Guilan, Iran.
- Masoumeh Salamati
- Department of Reproductive Imaging, Reproductive Biomedicine Research Center, Royan Institute for Reproductive Biomedicine, ACECR, Tehran, Iran.
- Mohsen Soryani
- School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran.
4
Malekmohammadi A, Barekatrezaei S, Kozegar E, Soryani M. Mass detection in automated 3-D breast ultrasound using a patch Bi-ConvLSTM network. Ultrasonics 2023;129:106891. [PMID: 36493507] [DOI: 10.1016/j.ultras.2022.106891]
Abstract
Breast cancer mortality can be significantly reduced by early detection of its symptoms. The 3-D Automated Breast Ultrasound (ABUS) has been widely used for breast screening due to its high sensitivity and reproducibility. The large number of ABUS slices, and high variation in size and shape of the masses, make the manual evaluation a challenging and time-consuming process. To assist the radiologists, we propose a convolutional BiLSTM network to classify the slices based on the presence of a mass. Because of its patch-based architecture, this model produces the approximate location of masses as a heat map. The prepared dataset consists of 60 volumes belonging to 43 patients. The precision, recall, accuracy, F1-score, and AUC of the proposed model for slice classification were 84%, 84%, 93%, 84%, and 97%, respectively. Based on the FROC analysis, the proposed detector obtained a sensitivity of 82% with two false positives per volume.
Affiliation(s)
- Amin Malekmohammadi
- School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran 16846, Iran.
- Sepideh Barekatrezaei
- School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran 16846, Iran.
- Ehsan Kozegar
- Faculty of Technology and Engineering-East of Guilan, University of Guilan, Vajargah, Rudsar, Guilan 4199613776, Iran.
- Mohsen Soryani
- School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran 16846, Iran.
5
Wang YW, Kuo TT, Chou YH, Su Y, Huang SH, Chen CJ. Breast tumor classification using short-ResNet with pixel-based tumor probability map in ultrasound images. Ultrason Imaging 2023;45:74-84. [PMID: 36951105] [DOI: 10.1177/01617346231162906]
Abstract
Breast cancer is the most common form of cancer and is still the second leading cause of cancer death for women worldwide. Early detection and treatment of breast cancer can reduce mortality rates. Breast ultrasound is commonly used to detect and diagnose breast cancer, but accurate segmentation of the breast lesion and its diagnosis as benign or malignant remain challenging tasks in ultrasound images. In this paper, we propose a classification model, short-ResNet with DC-UNet, that addresses both the segmentation and diagnosis challenges: it locates the tumor and classifies it as benign or malignant from breast ultrasound images. The proposed model achieves a dice coefficient of 83% for segmentation and an accuracy of 90% for classification of breast tumors. In our experiments, we compare segmentation and classification results across different datasets to show that the proposed model is more general and demonstrates better results. The deep learning model uses short-ResNet to classify tumors as benign or malignant, combined with DC-UNet performing the segmentation task to assist in improving the classification results.
Affiliation(s)
- You-Wei Wang
- Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan
- Tsung-Ter Kuo
- Department of Medical Imaging and Radiological Technology, Yuanpei University of Medical Technology, Hsinchu, Taiwan
- Yi-Hong Chou
- Department of Medical Imaging and Radiological Technology, Yuanpei University of Medical Technology, Hsinchu, Taiwan
- Yu Su
- Department of Medical Imaging and Radiological Technology, Yuanpei University of Medical Technology, Hsinchu, Taiwan
- Shing-Hwa Huang
- Department of Breast Surgery, En Chu Kong Hospital, New Taipei City, Taiwan
- Chii-Jen Chen
- Department of Computer Science and Information Engineering, Tamkang University, New Taipei City, Taiwan
6
Zizaan A, Idri A. Machine learning based breast cancer screening: trends, challenges, and opportunities. Comput Methods Biomech Biomed Eng Imaging Vis 2023. [DOI: 10.1080/21681163.2023.2172615]
Affiliation(s)
- Asma Zizaan
- Mohammed VI Polytechnic University, Benguerir, Morocco
- Ali Idri
- Mohammed VI Polytechnic University, Benguerir, Morocco
- Software Project Management Research Team, ENSIAS, Mohammed V University, Rabat, Morocco
7
Tan T, Rodriguez-Ruiz A, Zhang T, Xu L, Beets-Tan RGH, Shen Y, Karssemeijer N, Xu J, Mann RM, Bao L. Multi-modal artificial intelligence for the combination of automated 3D breast ultrasound and mammograms in a population of women with predominantly dense breasts. Insights Imaging 2023;14:10. [PMID: 36645507] [PMCID: PMC9842825] [DOI: 10.1186/s13244-022-01352-y]
Abstract
OBJECTIVES To assess the stand-alone and combined performance of artificial intelligence (AI) detection systems for digital mammography (DM) and automated 3D breast ultrasound (ABUS) in detecting breast cancer in women with dense breasts. METHODS 430 paired cases of DM and ABUS examinations from an Asian population with dense breasts were retrospectively collected. All cases were analyzed by two AI systems, one for DM exams and one for ABUS exams. A selected subset (n = 152) was read by four radiologists. The performance of the AI systems was assessed by analysis of the area under the receiver operating characteristic curve (AUC). The maximum Youden's index and its associated sensitivity and specificity were also reported for each AI system. Detection performance of human readers in the subcohort of the reader study was measured in terms of sensitivity and specificity. RESULTS The performance of the AI systems in a multi-modal setting, with the weights of AI-DM and AI-ABUS set to 0.25 and 0.75, respectively, was significantly better than that of each system individually in a single-modal setting (AUC-AI-Multimodal = 0.865; AUC-AI-DM = 0.832, p = 0.026; AUC-AI-ABUS = 0.841, p = 0.041). The maximum Youden's index for AI-Multimodal was 0.707 (sensitivity = 79.4%, specificity = 91.2%). In the subcohort that underwent human reading, the panel of four readers achieved a sensitivity of 93.2% and specificity of 32.7%. AI-Multimodal achieves sensitivity superior or equal to that of single human readers at the same specificity operating points on the ROC curve. CONCLUSION Multimodal (ABUS + DM) AI systems for detecting breast cancer in women with dense breasts are a potential solution for breast screening in radiologist-scarce regions.
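The weighted multi-modal combination and the Youden's index reported above reduce to a few lines; a sketch under the stated 0.25/0.75 weighting (function names are mine, and the threshold scan is a generic recipe rather than the study's code):

```python
def fuse_scores(score_dm, score_abus, w_dm=0.25, w_abus=0.75):
    """Late fusion of per-case DM and ABUS AI scores (weights from the study)."""
    return w_dm * score_dm + w_abus * score_abus

def max_youden(scores, labels):
    """Scan thresholds for the maximum Youden's J = sensitivity + specificity - 1."""
    best = -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y and s >= t)
        fn = sum(1 for s, y in zip(scores, labels) if y and s < t)
        tn = sum(1 for s, y in zip(scores, labels) if not y and s < t)
        fp = sum(1 for s, y in zip(scores, labels) if not y and s >= t)
        best = max(best, tp / (tp + fn) + tn / (tn + fp) - 1)
    return best
```

The threshold maximizing J is the operating point whose sensitivity and specificity the abstract quotes (79.4% and 91.2% give J = 0.794 + 0.912 - 1 = 0.706, matching the reported 0.707 up to rounding).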
Affiliation(s)
- Tao Tan
- Department of Radiology, Netherlands Cancer Institute (NKI), Plesmanlaan 121, 1066 CX Amsterdam, The Netherlands; Faculty of Applied Science, Macao Polytechnic University, Macao 999078, China
- Tianyu Zhang
- Department of Radiology, Netherlands Cancer Institute (NKI), Plesmanlaan 121, 1066 CX Amsterdam, The Netherlands; GROW School for Oncology and Development Biology, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
- Lin Xu
- School of Information Science and Technology, ShanghaiTech University, Shanghai 201210, China
- Regina G. H. Beets-Tan
- Department of Radiology, Netherlands Cancer Institute (NKI), Plesmanlaan 121, 1066 CX Amsterdam, The Netherlands; GROW School for Oncology and Development Biology, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
- Yingzhao Shen
- Affiliated Hangzhou First People’s Hospital, Zhejiang University School of Medicine, Xueshi Road, Hubin Street, Shangcheng District, Hangzhou 310006, Zhejiang, China
- Nico Karssemeijer
- Department of Diagnostic Imaging, Radboud University Medical Center, PO Box 9101, 6500 HB Nijmegen, The Netherlands
- Jun Xu
- Institute for AI in Medicine, School of Artificial Intelligence, Nanjing University of Information Science and Technology, Nanjing 210044, China
- Ritse M. Mann
- Department of Radiology, Netherlands Cancer Institute (NKI), Plesmanlaan 121, 1066 CX Amsterdam, The Netherlands; Department of Diagnostic Imaging, Radboud University Medical Center, PO Box 9101, 6500 HB Nijmegen, The Netherlands
- Lingyun Bao
- Affiliated Hangzhou First People’s Hospital, Zhejiang University School of Medicine, Xueshi Road, Hubin Street, Shangcheng District, Hangzhou 310006, Zhejiang, China
8
Górski K, Borowska M, Stefanik E, Polkowska I, Turek B, Bereznowski A, Domino M. Application of two-dimensional entropy measures to detect the radiographic signs of tooth resorption and hypercementosis in an equine model. Biomedicines 2022;10:2914. [PMID: 36428482] [PMCID: PMC9687516] [DOI: 10.3390/biomedicines10112914]
Abstract
Dental disorders are a serious health problem in equine medicine; their early recognition benefits the long-term general health of the horse. Most of the initial signs of Equine Odontoclastic Tooth Resorption and Hypercementosis (EOTRH) syndrome concern the alveolar aspect of the teeth, hence the need for radiographic imaging for early recognition. This study aimed to evaluate the applicability of entropy measures for quantifying the radiological signs of tooth resorption and hypercementosis, as well as for enhancing radiographic image quality in order to facilitate the identification of the signs of EOTRH syndrome. A detailed examination of the oral cavity was performed in eighty horses. Each evaluated incisor tooth was assigned to one of four grade-related EOTRH groups (0-3). Radiographs of the incisor teeth were taken and digitally processed. For each radiograph, two-dimensional sample (SampEn2D), fuzzy (FuzzEn2D), permutation (PermEn2D), dispersion (DispEn2D), and distribution (DistEn2D) entropies were measured after image filtering using Normalize, Median, and LaplacianSharpening filters. Moreover, the similarities between entropy measures and selected Gray-Level Co-occurrence Matrix (GLCM) texture features were investigated. Among the 15 returned measures, DistEn2D was EOTRH grade-related, and DistEn2D extracted after Normalize filtering was the most informative. The EOTRH grade-related similarity between DistEn2D and Difference Entropy (GLCM) confirms the higher irregularity and complexity of incisor teeth radiographs in advanced EOTRH syndrome, demonstrating the greatest sensitivity (0.50) and specificity (0.95) for detection of the EOTRH 3 group. Applying DistEn2D to Normalize-filtered incisor teeth radiographs enables identification of the radiological signs of advanced EOTRH with higher accuracy than the previously used entropy-related GLCM texture features.
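The GLCM Difference Entropy that the entropy measures are compared against has a compact definition; a minimal sketch for a single horizontal (0, 1) offset (the quantization convention and offset choice are assumptions, not the paper's exact configuration):

```python
import numpy as np

def glcm_difference_entropy(img, levels):
    """Difference entropy (bits) of the GLCM for horizontal neighbor pairs.

    img: 2-D integer array with gray levels in [0, levels).
    """
    left, right = img[:, :-1].ravel(), img[:, 1:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (left, right), 1)   # co-occurrence counts
    glcm /= glcm.sum()                  # joint probability p(i, j)
    diff = np.abs(np.subtract.outer(np.arange(levels), np.arange(levels)))
    p = np.array([glcm[diff == d].sum() for d in range(levels)])  # p_{|i-j|}(d)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```

A uniform region scores 0 (all neighbor differences identical), while textures mixing several difference magnitudes score higher, which is why the feature tracks radiographic irregularity.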
Affiliation(s)
- Kamil Górski
- Department of Large Animal Diseases and Clinic, Institute of Veterinary Medicine, Warsaw University of Life Sciences, 02-787 Warsaw, Poland; (E.S.); (B.T.)
- Marta Borowska
- Institute of Biomedical Engineering, Faculty of Mechanical Engineering, Białystok University of Technology, 15-351 Bialystok, Poland;
- Elżbieta Stefanik
- Department of Large Animal Diseases and Clinic, Institute of Veterinary Medicine, Warsaw University of Life Sciences, 02-787 Warsaw, Poland; (E.S.); (B.T.)
- Izabela Polkowska
- Department and Clinic of Animal Surgery, Faculty of Veterinary Medicine, University of Life Sciences, 20-950 Lublin, Poland;
- Bernard Turek
- Department of Large Animal Diseases and Clinic, Institute of Veterinary Medicine, Warsaw University of Life Sciences, 02-787 Warsaw, Poland; (E.S.); (B.T.)
- Andrzej Bereznowski
- Division of Veterinary Epidemiology and Economics, Institute of Veterinary Medicine, Warsaw University of Life Sciences, Nowoursynowska 159c, 02-776 Warsaw, Poland;
- Małgorzata Domino
- Department of Large Animal Diseases and Clinic, Institute of Veterinary Medicine, Warsaw University of Life Sciences, 02-787 Warsaw, Poland; (E.S.); (B.T.)
9
Cheng Z, Li Y, Chen H, Zhang Z, Pan P, Cheng L. DSGMFFN: Deepest semantically guided multi-scale feature fusion network for automated lesion segmentation in ABUS images. Comput Methods Programs Biomed 2022;221:106891. [PMID: 35623209] [DOI: 10.1016/j.cmpb.2022.106891]
Abstract
BACKGROUND AND OBJECTIVE Automated breast ultrasound (ABUS) imaging technology has been widely used in clinical diagnosis. Accurate lesion segmentation in ABUS images is essential in computer-aided diagnosis (CAD) systems. Although deep learning-based approaches have been widely employed in medical image analysis, the large variety of lesions and the imaging interference make ABUS lesion segmentation challenging. METHODS In this paper, we propose a novel deepest semantically guided multi-scale feature fusion network (DSGMFFN) for lesion segmentation in 2D ABUS slices. In order to cope with the large variety of lesions, a deepest semantically guided decoder (DSGNet) and a multi-scale feature fusion model (MFFM) are designed, where the deepest semantics is fully utilized to guide the decoding and feature fusion. That is, the deepest information is given the highest weight in the feature fusion process, and participates in every decoding stage. Aiming at the challenge of imaging interference, a novel mixed attention mechanism is developed, integrating spatial self-attention and channel self-attention to obtain the correlation among pixels and channels to highlight the lesion region. RESULTS The proposed DSGMFFN is evaluated on 3742 slices of 170 ABUS volumes. The experimental result indicates that DSGMFFN achieves 84.54% and 73.24% in Dice similarity coefficient (DSC) and intersection over union (IoU), respectively. CONCLUSIONS The proposed method shows better performance than the state-of-the-art methods in ABUS lesion segmentation. Incorrect segmentation caused by lesion variety and imaging interference in ABUS images can be alleviated.
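The DSC and IoU figures above are deterministically related for the same masks (IoU = DSC / (2 - DSC); the reported 84.54% and 73.24% are consistent, since 0.8454 / 1.1546 ≈ 0.732). A sketch of both metrics on binary masks:

```python
import numpy as np

def dice_iou(pred, gt):
    """Dice similarity coefficient and intersection-over-union for boolean masks."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / union
    return float(dice), float(iou)
```

Reporting both is largely redundant for a single pair of masks, but the two averages can diverge when metrics are averaged over many slices of differing difficulty.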
Affiliation(s)
- Zhanyi Cheng
- School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing, China
- Yanfeng Li
- School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing, China.
- Houjin Chen
- School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing, China
- Zilu Zhang
- School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing, China
- Pan Pan
- School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing, China
- Lin Cheng
- Center for Breast, People's Hospital of Peking University, Beijing, China
10
Tan T, Das B, Soni R, Fejes M, Yang H, Ranjan S, Szabo DA, Melapudi V, Shriram KS, Agrawal U, Rusko L, Herczeg Z, Darazs B, Tegzes P, Ferenczi L, Mullick R, Avinash G. Multi-modal trained artificial intelligence solution to triage chest X-ray for COVID-19 using pristine ground-truth, versus radiologists. Neurocomputing 2022;485:36-46. [PMID: 35185296] [PMCID: PMC8847079] [DOI: 10.1016/j.neucom.2022.02.040]
Abstract
The front-line imaging modalities computed tomography (CT) and X-ray play important roles in triaging COVID patients. Thoracic CT is accepted to have higher sensitivity than chest X-ray for COVID diagnosis, but considering the limited access to resources (both hardware and trained personnel) and issues related to decontamination, CT may not be ideal for triaging suspected subjects. An artificial intelligence (AI)-assisted, X-ray-based application for triaging and monitoring, one that helps experienced radiologists identify COVID patients in a timely manner and can additionally delineate and quantify the disease region, is seen as a promising solution for widespread clinical use. Our proposed solution differs from existing solutions presented by industry and academic communities. We demonstrate a functional AI model that triages by classifying and segmenting a single chest X-ray image, while the AI model is trained using both X-ray and CT data, and we report on how such a multi-modal training process improves the solution compared to single-modality (X-ray only) training. The multi-modal solution increases the AUC (area under the receiver operating characteristic curve) from 0.89 to 0.93 for binary classification between COVID-19 and non-COVID-19 cases. It also positively impacts the Dice coefficient (0.59 to 0.62) for localizing the COVID-19 pathology. To compare the performance of experienced readers to the AI model, a reader study was also conducted. The AI model showed good consistency with respect to radiologists: the Dice score between two radiologists on the COVID group was 0.53, while the AI had Dice values of 0.52 and 0.55 when compared to the segmentations done by the two radiologists separately. From a classification perspective, the AUCs of the two readers were 0.87 and 0.81, while the AUC of the AI was 0.93 on the reader study dataset. We also conducted a generalization study by comparing our method to state-of-the-art methods on independent datasets; the results show better performance from the proposed method. Leveraging multi-modal information for development benefits single-modal inferencing.
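The AUCs compared above can be computed without an explicit ROC sweep via the Mann-Whitney statistic, AUC as the probability that a random positive outscores a random negative; a compact sketch (not the authors' implementation):

```python
def auc(scores, labels):
    """AUC as the probability a positive outscores a negative (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

This O(n_pos * n_neg) form is fine for reader-study-sized cohorts; large datasets would sort once and use rank sums instead.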
Affiliation(s)
- Tao Tan
- GE Healthcare, The Netherlands
11
Liu H, Cui G, Luo Y, Guo Y, Zhao L, Wang Y, Subasi A, Dogan S, Tuncer T. Artificial intelligence-based breast cancer diagnosis using ultrasound images and grid-based deep feature generator. Int J Gen Med 2022;15:2271-2282. [PMID: 35256855] [PMCID: PMC8898057] [DOI: 10.2147/ijgm.s347491]
Abstract
Purpose Breast cancer is a prominent cancer type with high mortality. Early detection of breast cancer could serve to improve clinical outcomes. Ultrasonography is a digital imaging technique used to differentiate benign and malignant tumors. Several artificial intelligence techniques have been suggested in the literature for breast cancer detection using breast ultrasonography (BUS). Nowadays, deep learning methods in particular have been applied to biomedical images to achieve high classification performance. Patients and Methods This work presents a new deep feature generation technique for breast cancer detection using BUS images. Sixteen widely known pre-trained CNN models are used in this framework as feature generators. In the feature generation phase, the input image is divided into rows and columns, and the deep feature generators (pre-trained models) are applied to each row and column; the method is therefore called a grid-based deep feature generator. The proposed generator calculates the error value of each deep feature generator and then selects the best three feature vectors as a final feature vector. In the feature selection phase, iterative neighborhood component analysis (INCA) chooses 980 features as the optimal number of features. Finally, these features are classified using a deep neural network (DNN). Results The developed grid-based deep feature generation-based image classification model reached 97.18% classification accuracy on the ultrasonic images for three classes, namely malignant, benign, and normal. Conclusion The findings clearly show that the proposed grid deep feature generator and INCA-based feature selection model successfully classified breast ultrasonic images.
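The grid-based generation step, dividing the input into row and column strips and running each through a feature extractor, can be sketched generically (the extractor, strip count, and function name here are placeholders, not the paper's exact configuration):

```python
import numpy as np

def grid_features(img, extractor, n_strips=4):
    """Concatenate features extracted from row strips and column strips of an image."""
    rows = np.array_split(img, n_strips, axis=0)   # horizontal bands
    cols = np.array_split(img, n_strips, axis=1)   # vertical bands
    feats = [extractor(strip) for strip in rows + cols]
    return np.concatenate(feats)
```

With a toy extractor such as `lambda s: np.array([s.mean(), s.std()])`, an 8x8 image and 4 strips per axis yield a 16-dimensional vector; the paper plugs pre-trained CNNs into the extractor slot instead.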
Affiliation(s)
- Haixia Liu
- Department of Ultrasound, Cangzhou Central Hospital, Cangzhou, Hebei Province, 061000, People's Republic of China
- Guozhong Cui
- Department of Surgical Oncology, Cangzhou Central Hospital, Cangzhou, Hebei Province, 061000, People's Republic of China
- Yi Luo
- Medical Statistics Room, Cangzhou Central Hospital, Cangzhou, Hebei Province, 061000, People's Republic of China
- Yajie Guo
- Department of Ultrasound, Cangzhou Central Hospital, Cangzhou, Hebei Province, 061000, People's Republic of China
- Lianli Zhao
- Department of Internal Medicine Teaching and Research Group, Cangzhou Central Hospital, Cangzhou, Hebei Province, 061000, China
- Yueheng Wang
- Department of Ultrasound, The Second Hospital of Hebei Medical University, Shijiazhuang, Hebei Province, 050000, People's Republic of China
- Abdulhamit Subasi
- Institute of Biomedicine, Faculty of Medicine, University of Turku, Turku, 20520, Finland; Department of Computer Science, College of Engineering, Effat University, Jeddah, 21478, Saudi Arabia
- Sengul Dogan
- Department of Digital Forensics Engineering, College of Technology, Firat University, Elazig, 23119, Turkey
- Turker Tuncer
- Department of Digital Forensics Engineering, College of Technology, Firat University, Elazig, 23119, Turkey
12
Luo X, Xu M, Tang G, Wang Y, Wang N, Ni D, Li X, Li AH. The lesion detection efficacy of deep learning on automatic breast ultrasound and factors affecting its efficacy: a pilot study. Br J Radiol 2022; 95:20210438. [PMID: 34860574 PMCID: PMC8822545 DOI: 10.1259/bjr.20210438] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/03/2023] Open
Abstract
OBJECTIVES The aim of this study was to investigate the detection efficacy of deep learning (DL) for automatic breast ultrasound (ABUS) and factors affecting its efficacy. METHODS Females who underwent ABUS and handheld ultrasound from May 2016 to June 2017 (N = 397) were enrolled and divided into training (n = 163 patients with breast cancer and 33 with benign lesions), test (n = 57) and control (n = 144) groups. A convolutional neural network was optimized to detect lesions in ABUS. The sensitivity and false positives (FPs) were evaluated and compared for different breast tissue compositions, lesion sizes, morphologies and echo patterns. RESULTS In the training set, with 688 lesion regions (LRs), the network achieved sensitivities of 93.8%, 97.2% and 100%, based on volume, lesion and patient, respectively, with 1.9 FPs per volume. In the test group with 247 LRs, the sensitivities were 92.7%, 94.5% and 96.5%, respectively, with 2.4 FPs per volume. The control group, with 900 volumes, showed 0.24 FPs per volume. The sensitivity was 98% for lesions > 1 cm3, but 87% for those ≤1 cm3 (p < 0.05). Similar sensitivities and FPs were observed for different breast tissue compositions (homogeneous, 97.5%, 2.1; heterogeneous, 93.6%, 2.1), lesion morphologies (mass, 96.3%, 2.1; non-mass, 95.8%, 2.0) and echo patterns (homogeneous, 96.1%, 2.1; heterogeneous 96.8%, 2.1). CONCLUSIONS DL had high detection sensitivity with a low FP but was affected by lesion size. ADVANCES IN KNOWLEDGE DL is technically feasible for the automatic detection of lesions in ABUS.
Collapse
Affiliation(s)
- Yi Wang PhD
- National Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China, and also with the Medical UltraSound Image Computing (MUSIC) Lab, Shenzhen, China
- Na Wang
- National Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China, and also with the Medical UltraSound Image Computing (MUSIC) Lab, Shenzhen, China
- Dong Ni PhD
- National Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China, and also with the Medical UltraSound Image Computing (MUSIC) Lab, Shenzhen, China
13
Xing J, Li Z, Wang B, Qi Y, Yu B, Zanjani FG, Zheng A, Duits R, Tan T. Lesion Segmentation in Ultrasound Using Semi-Pixel-Wise Cycle Generative Adversarial Nets. IEEE/ACM Trans Comput Biol Bioinform 2021; 18:2555-2565. [PMID: 32149651 DOI: 10.1109/tcbb.2020.2978470] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Breast cancer is the most common invasive cancer, with the highest occurrence in females. Handheld ultrasound is one of the most efficient ways to identify and diagnose breast cancer, and the area and shape information of a lesion are very helpful for clinicians making diagnostic decisions. In this study we propose a new deep-learning scheme, the semi-pixel-wise cycle generative adversarial net (SPCGAN), for segmenting lesions in 2D ultrasound. The method takes advantage of a fully convolutional neural network (FCN) and a generative adversarial net to segment a lesion using prior knowledge. We compared the proposed method to an FCN and the level set segmentation method on a test dataset consisting of 32 malignant lesions and 109 benign lesions. Our proposed method achieved a Dice similarity coefficient (DSC) of 0.92, while the FCN and the level set achieved 0.90 and 0.79, respectively. In particular, for malignant lesions, our method significantly increased the DSC of the FCN from 0.90 to 0.93 (p < 0.001). The results show that SPCGAN can obtain robust segmentation results and that its framework is particularly effective, compared to the FCN, when sufficient training samples are not available. Our proposed method may be used to relieve radiologists' annotation burden.
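The Dice similarity coefficient (DSC) used to compare these methods is twice the overlap divided by the total foreground; a minimal sketch (not the authors' evaluation code, with toy masks):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    total = a.sum() + b.sum()
    return (2.0 * np.logical_and(a, b).sum() / total) if total else 1.0

# Two overlapping 6x6 squares on a 10x10 grid (toy masks, not study data).
pred = np.zeros((10, 10), dtype=bool); pred[2:8, 2:8] = True
gt = np.zeros((10, 10), dtype=bool); gt[3:9, 3:9] = True
print(round(dice(pred, gt), 3))  # 2*25 / (36+36) ≈ 0.694
```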
14

15
Zhang P, Ma Z, Zhang Y, Chen X, Wang G. Improved Inception V3 method and its effect on radiologists' performance of tumor classification with automated breast ultrasound system. Gland Surg 2021; 10:2232-2245. [PMID: 34422594 PMCID: PMC8340346 DOI: 10.21037/gs-21-328] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2021] [Accepted: 06/17/2021] [Indexed: 11/06/2022]
Abstract
BACKGROUND The automated breast ultrasound system (ABUS) is recognized as a valuable detection tool in addition to mammography. The purpose of this study was to propose a novel computer-aided diagnosis (CAD) system by extracting textural features from ABUS images and to investigate the efficiency of using this CAD system for breast cancer detection. METHODS This retrospective study involved 149 breast nodules [maximum diameter: mean 18.89 mm, standard deviation (SD) 10.238, range 5-59 mm] in 135 patients. We assigned 3 novice readers (<3 years of experience) and 3 experienced readers (≥10 years of experience) to review the imaging data and stratify the 149 breast nodules as either malignant or benign. The Improved Inception V3 (II3) method was developed and used as an assistant tool to help the 6 readers re-interpret the images. RESULTS Our method (II3) achieved an accuracy of 88.6% for the final result. The 3 novice readers had an average accuracy of 71.37%±4.067%, while that of the 3 experienced readers was 83.03%±3.371% on the first reading. With the help of II3 on the second reading, the average accuracy of the novice readers increased to 84.13%±1.662% and that of the experienced readers increased to 89.50%±0.346%. The areas under the curve (AUCs) were similar compared with linear algorithms. The mean AUC of the novice readers improved from 0.7751 (without II3) to 0.8232 (with II3), and the mean AUC of the experienced readers improved from 0.8939 (without II3) to 0.9211 (with II3). The mean AUC for all readers also improved in the second-reading mode (from 0.8345 to 0.8722, P=0.0081). CONCLUSIONS With the help of II3, the diagnostic accuracy of both groups improved, and II3 was more helpful for novice readers than for experienced readers. Our results show that II3 is valuable in differentiating benign from malignant breast nodules and that it also improves the skill of some novice radiologists. II3 cannot completely replace the influence of experience in the diagnostic process and will retain an auxiliary role in the clinic at present.
Affiliation(s)
- Panpan Zhang
- Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
- Zhaosheng Ma
- Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
- Yingtao Zhang
- Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
- Xiaodan Chen
- Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
- Gang Wang
- Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
16
Rajathi GM. Optimized Radial Basis Neural Network for Classification of Breast Cancer Images. Curr Med Imaging 2021; 17:97-108. [PMID: 32416697 DOI: 10.2174/1573405616666200516172118] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2020] [Revised: 04/18/2020] [Accepted: 04/25/2020] [Indexed: 11/22/2022]
Abstract
BACKGROUND Breast cancer is a curable disease if diagnosed at an early stage. The chance of developing breast cancer is lowest in married women after the breast-feeding phase, because the cancer can form from blocked milk ducts. INTRODUCTION Nowadays, cancer is considered the leading cause of death globally, and breast cancer is the most common cancer among females. It is possible, though rare, to develop breast cancer while breast-feeding. Mammography is one of the most effective methods used in hospitals and clinics for early detection of breast cancer, and various artificial intelligence-based mammogram techniques have been proposed by researchers. Such techniques can reduce the death rate of patients affected by breast cancer and are improved through image analysis, detection, screening, diagnosis, and other performance measures. METHODS A radial basis neural network (RBNN) is used for classification. The RBNN is designed with the help of an optimization algorithm, which tunes the classifier to reduce the error rate while minimizing training time; the cuckoo search algorithm is used for this purpose. RESULTS The proposed optimized RBNN is used to classify breast cancer images. Three sets of properties were classified by performing feature extraction and feature reduction, and the breast images were classified as normal, benign, or malignant. The minimum fitness value is determined to evaluate the optimum value of possible locations, and the radial basis function is evaluated with the cuckoo search algorithm to optimize the feature reduction process. The proposed methodology is compared with the traditional radial basis neural network using evaluation parameters such as accuracy, precision, recall, and F1-score. The whole system is modeled in MATLAB 2018a, and the proposed system is more efficient than most recent related approaches. CONCLUSION An efficient RBNN classification process using the cuckoo search algorithm is presented for breast cancer images. Mammogram images were chosen because breast cancer is a major issue for women. The process classifies various features for three sets of properties, and the optimized classifier improves performance and provides better results. In the proposed work, the input image is filtered using a Wiener filter, and the classifier extracts features based on the breast image.
Affiliation(s)
- G M Rajathi
- Department of Electronics and Communication Engineering, Sri Ramakrishna Engineering College, Coimbatore, Tamil Nadu, India
17
Lei Y, He X, Yao J, Wang T, Wang L, Li W, Curran WJ, Liu T, Xu D, Yang X. Breast tumor segmentation in 3D automatic breast ultrasound using Mask scoring R-CNN. Med Phys 2021; 48:204-214. [PMID: 33128230 DOI: 10.1002/mp.14569] [Citation(s) in RCA: 57] [Impact Index Per Article: 14.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2020] [Revised: 10/20/2020] [Accepted: 10/20/2020] [Indexed: 12/24/2022] Open
Abstract
PURPOSE Automatic breast ultrasound (ABUS) imaging has become an essential tool in breast cancer diagnosis since it provides complementary information to other imaging modalities. Lesion segmentation on ABUS is a prerequisite step of breast cancer computer-aided diagnosis (CAD). This work aims to develop a deep learning-based method for automatic breast tumor segmentation using three-dimensional (3D) ABUS. METHODS For breast tumor segmentation in ABUS, we developed a Mask scoring region-based convolutional neural network (R-CNN) that consists of five subnetworks: a backbone, a region proposal network, a region convolutional neural network head, a mask head, and a mask score head. A network block building a direct correlation between mask quality and region class was integrated into the Mask scoring R-CNN based framework for the segmentation of new ABUS images with ambiguous regions of interest (ROIs). For segmentation accuracy evaluation, we retrospectively investigated 70 patients with breast tumors confirmed with needle biopsy and manually delineated on ABUS, of which 40 were used for fivefold cross-validation and 30 for a hold-out test. The comparison between the automatic breast tumor segmentations and the manual contours was quantified by (I) six metrics, including Dice similarity coefficient (DSC), Jaccard index, 95% Hausdorff distance (HD95), mean surface distance (MSD), residual mean square distance (RMSD), and center of mass distance (CMD); and (II) Pearson correlation analysis and Bland-Altman analysis. RESULTS The mean (median) DSC was 85% ± 10.4% (89.4%) and 82.1% ± 14.5% (85.6%) for cross-validation and the hold-out test, respectively. The corresponding HD95, MSD, RMSD, and CMD of the two tests were 1.646 ± 1.191 and 1.665 ± 1.129 mm, 0.489 ± 0.406 and 0.475 ± 0.371 mm, 0.755 ± 0.755 and 0.751 ± 0.508 mm, and 0.672 ± 0.612 and 0.665 ± 0.729 mm, respectively. The mean volumetric difference (mean and ±1.96 standard deviation) was 0.47 cc ([-0.77, 1.71]) for cross-validation and 0.23 cc ([-0.23, 0.69]) for the hold-out test. CONCLUSION We developed a novel Mask scoring R-CNN approach for the automated segmentation of breast tumors in ABUS images and demonstrated its accuracy for breast tumor segmentation. Our learning-based method can potentially assist the clinical CAD of breast cancer using 3D ABUS imaging.
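The 95% Hausdorff distance (HD95) among the metrics above can be illustrated with a brute-force point-set version; this is a sketch only, on toy 2D points, and does not reproduce the study's surface extraction from 3D masks.

```python
import numpy as np

def hd95(a, b):
    """95th-percentile symmetric Hausdorff distance between two point sets (N, d):
    for each point, take the distance to the nearest point of the other set,
    then take the 95th percentile of those distances in each direction."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    return max(np.percentile(d.min(axis=1), 95), np.percentile(d.min(axis=0), 95))

# Two parallel point "surfaces" separated by 0.5 (toy data).
a = np.array([[0.0, 0.0], [1.0, 0.0]])
b = np.array([[0.0, 0.5], [1.0, 0.5]])
print(hd95(a, b))  # 0.5
```

Taking the 95th percentile rather than the maximum makes the metric robust to a few outlier boundary points.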
Affiliation(s)
- Yang Lei
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Xiuxiu He
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Jincao Yao
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital
- Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Tonghe Wang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Lijing Wang
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital
- Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Wei Li
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital
- Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Walter J Curran
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Tian Liu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Dong Xu
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital
- Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
18
Chiu LY, Kuo WH, Chen CN, Chang KJ, Chen A. A 2-Phase Merge Filter Approach to Computer-Aided Detection of Breast Tumors on 3-Dimensional Ultrasound Imaging. J Ultrasound Med 2020; 39:2439-2455. [PMID: 32567133 DOI: 10.1002/jum.15365] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/07/2019] [Revised: 05/13/2020] [Accepted: 05/15/2020] [Indexed: 06/11/2023]
Abstract
OBJECTIVES The role of image analysis in 3-dimensional (3D) automated breast ultrasound (ABUS) images is increasingly important because of its widespread use as a screening tool in whole-breast examinations. However, reviewing the large number of images acquired from ABUS is time-consuming and sometimes error-prone. The aim of this study, therefore, was to develop an efficient computer-aided detection (CADe) algorithm to assist the review process. METHODS The proposed CADe algorithm consisted of 4 major steps. First, initial tumor candidates were formed by extracting and merging hypoechoic square cells on 2-dimensional (2D) transverse images. Second, a feature-based classifier was constructed using 2D features to filter out nontumor candidates. Third, the remaining 2D candidates were merged longitudinally into 3D masses. Finally, a 3D feature-based classifier was used to further filter out nontumor masses and obtain the final detected masses. The proposed method was validated with 176 passes of breast images acquired by an Acuson S2000 automated breast volume scanner (Siemens Medical Solutions USA, Inc., Malvern, PA), including 44 normal passes and 132 abnormal passes containing 162 proven lesions (79 benign and 83 malignant). RESULTS The proposed CADe system achieved overall sensitivities of 100% and 90% with 6.71 and 5.14 false positives (FPs) per pass, respectively. Our results also showed that the average number of FPs per normal pass (7.16) was higher than that per abnormal pass (6.56) at 100% sensitivity. CONCLUSIONS The proposed CADe system has great potential to become a companion tool for ABUS imaging by ensuring high sensitivity with a relatively small number of FPs.
Affiliation(s)
- Ling-Ying Chiu
- Institute of Industrial Engineering, National Taiwan University, Taipei, Taiwan
- Wen-Hung Kuo
- Department of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan
- Chiung-Nien Chen
- Department of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan
- King-Jen Chang
- Department of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan
- Argon Chen
- Institute of Industrial Engineering, National Taiwan University, Taipei, Taiwan
- Department of Mechanical Engineering, National Taiwan University, Taipei, Taiwan
19

20
Wang F, Liu X, Yuan N, Qian B, Ruan L, Yin C, Jin C. Study on automatic detection and classification of breast nodule using deep convolutional neural network system. J Thorac Dis 2020; 12:4690-4701. [PMID: 33145042 PMCID: PMC7578508 DOI: 10.21037/jtd-19-3013] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Background Conventional manual ultrasound scanning and human diagnosis of the breast are operator-dependent, slow, and error-prone. In this study, we used an Automated Breast Ultrasound (ABUS) machine for scanning and deep convolutional neural network (CNN) technology, a kind of Deep Learning (DL) algorithm, for the detection and classification of breast nodules, aiming to achieve automatic and accurate diagnosis of breast nodules. Methods Two hundred and ninety-three lesions from 194 patients with definite pathological diagnoses (117 benign and 176 malignant) were recruited as the case group. Another 70 patients without breast diseases were enrolled as the control group. All breast scans were carried out by an ABUS machine and then randomly divided into training, validation, and test sets in a 7:1:2 ratio. In the training set, we constructed a detection model with a three-dimensional U-shaped convolutional neural network (3D U-Net) architecture to segment nodules from background breast tissue. Residual blocks, attention connections, and hard example mining were used to optimize the model, while random cropping, flipping, and rotation were used for data augmentation. In the test phase, the current model was compared with those in previously reported studies. In the validation set, the effectiveness of the detection model was evaluated. In the classification phase, multiple convolutional layers and fully connected layers were applied to build a classification model identifying whether a nodule was malignant. Results Our detection model yielded a sensitivity of 91% with 1.92 false positives per automatically scanned volume. The classification model achieved a sensitivity of 87.0%, a specificity of 88.0%, and an accuracy of 87.5%. Conclusions A deep CNN combined with ABUS may be a promising tool for the easy detection and accurate diagnosis of breast nodules.
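The 7:1:2 split described above can be sketched as follows; a minimal sketch, assuming the split is performed at the patient level to avoid leakage (the function, patient count, and seed are illustrative, not the authors' code).

```python
import random

def split_patients(ids, ratios=(7, 1, 2), seed=0):
    """Shuffle patient IDs and split them into training/validation/test
    sets according to the given ratio (here 7:1:2)."""
    ids = list(ids)
    random.Random(seed).shuffle(ids)  # fixed seed for reproducibility
    total = sum(ratios)
    n_train = len(ids) * ratios[0] // total
    n_val = len(ids) * ratios[1] // total
    return ids[:n_train], ids[n_train:n_train + n_val], ids[n_train + n_val:]

# 194 case + 70 control patients = 264 IDs, as in the abstract above.
train, val, test = split_patients(range(264))
print(len(train), len(val), len(test))  # 184 26 54
```

Splitting by patient (rather than by individual volume) ensures that scans from the same person never appear in both training and evaluation sets.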
Affiliation(s)
- Feiqian Wang
- Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Xiaotong Liu
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
- Na Yuan
- Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Buyue Qian
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
- Litao Ruan
- Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Changchang Yin
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
- Ciping Jin
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
21
Lei B, Huang S, Li H, Li R, Bian C, Chou YH, Qin J, Zhou P, Gong X, Cheng JZ. Self-co-attention neural network for anatomy segmentation in whole breast ultrasound. Med Image Anal 2020; 64:101753. [DOI: 10.1016/j.media.2020.101753] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2020] [Revised: 05/27/2020] [Accepted: 06/06/2020] [Indexed: 11/25/2022]
22
Lim HG, Liu HC, Yoon CW, Jung H, Kim MG, Yoon C, Kim HH, Shung KK. Investigation of cell mechanics using single-beam acoustic tweezers as a versatile tool for the diagnosis and treatment of highly invasive breast cancer cell lines: an in vitro study. Microsyst Nanoeng 2020; 6:39. [PMID: 34567652 PMCID: PMC8433385 DOI: 10.1038/s41378-020-0150-6] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/08/2019] [Revised: 02/10/2020] [Accepted: 02/18/2020] [Indexed: 05/27/2023]
Abstract
Advancements in diagnostic systems for metastatic cancer over the last few decades have played a significant role in providing patients with effective treatment by evaluating the characteristics of cancer cells. Despite the progress made in cancer prognosis, we still rely on the visual analysis of tissues or cells by histopathologists, where the subjectivity of traditional manual interpretation persists. This paper presents the development of a dual diagnosis and treatment tool using an in vitro acoustic tweezers platform with a 50 MHz ultrasonic transducer for label-free trapping and bursting of human breast cancer cells. For cancer cell detection and classification, the mechanical properties of a single cancer cell were quantified by single-beam acoustic tweezers (SBAT), a noncontact assessment tool using a focused acoustic beam. Cell-mimicking phantoms and agarose hydrogel spheres (AHSs) served to standardize the biomechanical characteristics of the cells. Based on the analytical comparison of deformability levels between the cells and the AHSs, the mechanical properties of the cells could be indirectly measured by interpolating the Young's moduli of the AHSs. As a result, the calculated Young's moduli, i.e., 1.527 kPa for MDA-MB-231 (highly invasive breast cancer cells), 2.650 kPa for MCF-7 (weakly invasive breast cancer cells), and 2.772 kPa for SKBR-3 (weakly invasive breast cancer cells), indicate that highly invasive cancer cells exhibit a lower Young's modulus than weakly invasive cells, reflecting the higher deformability of highly invasive cancer cells and their correspondingly higher metastasis rate. Single-cell treatment may also be carried out by bursting a highly invasive cell with high-intensity focused ultrasound.
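The indirect modulus estimate, interpolating a cell's Young's modulus from the deformability of calibrated agarose hydrogel spheres (AHSs), can be sketched as follows; the calibration values below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical AHS calibration: deformability measured for spheres of known
# Young's modulus (stiffer spheres deform less; numbers are illustrative only).
ahs_modulus_kpa = np.array([1.0, 2.0, 3.0, 4.0])
ahs_deformability = np.array([0.80, 0.55, 0.40, 0.30])

def modulus_from_deformability(d):
    """Interpolate a cell's Young's modulus (kPa) from its measured deformability.
    np.interp requires increasing x values, so the arrays are reversed."""
    return np.interp(d, ahs_deformability[::-1], ahs_modulus_kpa[::-1])

# A cell matching the deformability of the 2 kPa calibration sphere:
print(round(float(modulus_from_deformability(0.55)), 2))  # 2.0
```

Because deformability decreases monotonically with stiffness, a higher measured deformability maps to a lower interpolated modulus, consistent with the finding that highly invasive cells are softer.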
Affiliation(s)
- Hae Gyun Lim
- Department of Creative IT Engineering, Pohang University of Science and Technology, Pohang, 37673 Republic of Korea
- Hsiao-Chuan Liu
- NIH Resource Center for Medical Ultrasonic Transducer Technology and Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089 USA
- Chi Woo Yoon
- NIH Resource Center for Medical Ultrasonic Transducer Technology and Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089 USA
- Hayong Jung
- NIH Resource Center for Medical Ultrasonic Transducer Technology and Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089 USA
- Min Gon Kim
- NIH Resource Center for Medical Ultrasonic Transducer Technology and Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089 USA
- Changhan Yoon
- Department of Biomedical Engineering, Inje University, Gimhae, Gyeongnam 50834 Republic of Korea
- Hyung Ham Kim
- Department of Creative IT Engineering, Pohang University of Science and Technology, Pohang, 37673 Republic of Korea
- K. Kirk Shung
- NIH Resource Center for Medical Ultrasonic Transducer Technology and Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089 USA
23
Xu C, Xu L, Ohorodnyk P, Roth M, Chen B, Li S. Contrast agent-free synthesis and segmentation of ischemic heart disease images using progressive sequential causal GANs. Med Image Anal 2020; 62:101668. [DOI: 10.1016/j.media.2020.101668] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2019] [Revised: 01/17/2020] [Accepted: 02/21/2020] [Indexed: 10/24/2022]
24
Wang Y, Wang N, Xu M, Yu J, Qin C, Luo X, Yang X, Wang T, Li A, Ni D. Deeply-Supervised Networks With Threshold Loss for Cancer Detection in Automated Breast Ultrasound. IEEE Trans Med Imaging 2020; 39:866-876. [PMID: 31442972 DOI: 10.1109/tmi.2019.2936500] [Citation(s) in RCA: 56] [Impact Index Per Article: 11.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
Automated breast ultrasound (ABUS) is an innovative and promising screening method for breast examination. Compared with common B-mode 2D ultrasound, ABUS attains operator-independent image acquisition and also provides 3D views of the whole breast. Nonetheless, reviewing ABUS images is particularly time-intensive and errors by oversight might occur. In this study, we propose a novel 3D convolutional network for automated cancer detection in ABUS, in order to accelerate reviewing while obtaining high detection sensitivity with low false positives (FPs). Specifically, we propose a densely deep supervision method that greatly augments detection sensitivity by effectively using multi-layer features. Furthermore, we propose a threshold loss that provides a voxel-level adaptive threshold for discerning cancer vs. non-cancer, attaining high sensitivity with low FPs. The efficacy of our network is verified on a collected dataset of 219 patients with 614 ABUS volumes, including 745 cancer regions, and 144 healthy women with a total of 900 volumes without abnormal findings. Extensive experiments demonstrate that our method attains a sensitivity of 95% with 0.84 FPs per volume. The proposed network provides an effective cancer detection scheme for breast examination using ABUS by sustaining high sensitivity with low false positives. The code is publicly available at https://github.com/nawang0226/abus_code.
25
Kim Y, Rim J, Kim SM, Yun BL, Park SY, Ahn HS, Kim B, Jang M. False-negative results on computer-aided detection software in preoperative automated breast ultrasonography of breast cancer patients. Ultrasonography 2020; 40:83-92. [PMID: 32422696 PMCID: PMC7758101 DOI: 10.14366/usg.19076] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2019] [Accepted: 03/24/2020] [Indexed: 01/19/2023] Open
Abstract
Purpose The purpose of this study was to measure the cancer detection rate of computer-aided detection (CAD) software in preoperative automated breast ultrasonography (ABUS) of breast cancer patients and to determine the characteristics associated with false-negative outcomes. Methods A total of 129 index lesions (median size, 1.7 cm; interquartile range, 1.2 to 2.4 cm) from 129 consecutive patients (mean age±standard deviation, 53.4±11.8 years) who underwent preoperative ABUS from December 2017 to February 2018 were assessed. An index lesion was defined as a breast cancer confirmed by ultrasonography (US)-guided core needle biopsy. The detection rate of the index lesions, positive predictive value (PPV), and false-positive rate (FPR) of the CAD software were measured. Subgroup analysis was performed to identify clinical and US findings associated with false-negative outcomes. Results The detection rate of the CAD software was 0.84 (109 of 129; 95% confidence interval, 0.77 to 0.90). The PPV and FPR were 0.41 (221 of 544; 95% CI, 0.36 to 0.45) and 0.45 (174 of 387; 95% CI, 0.40 to 0.50), respectively. False-negative outcomes were more frequent in asymptomatic patients (P<0.001) and were associated with the following US findings: smaller size (P=0.001), depth in the posterior third (P=0.002), angular or indistinct margin (P<0.001), and absence of architectural distortion (P<0.001). Conclusion The CAD software showed a promising detection rate of breast cancer. However, radiologists should judge whether CAD software-marked lesions are true- or false-positive lesions, considering its low PPV and high FPR. Moreover, it would be helpful for radiologists to consider the characteristics associated with false-negative outcomes when reading ABUS with CAD.
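The reported rates follow directly from the counts given in the abstract; as a quick arithmetic check:

```python
def rate(num, den, digits=2):
    """Proportion rounded to the precision used in the abstract."""
    return round(num / den, digits)

# Counts reported in the abstract above.
detection_rate = rate(109, 129)  # detected index lesions / all index lesions
ppv = rate(221, 544)             # true-positive CAD marks / all CAD marks
fpr = rate(174, 387)             # false-positive rate, as reported (174 of 387)
print(detection_rate, ppv, fpr)  # 0.84 0.41 0.45
```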
Affiliation(s)
- Youngjune Kim
- Department of Radiology, Seoul National University Bundang Hospital, Seongnam, Korea; Aerospace Medical Group, Air Force Education and Training Command, Jinju, Korea
- Jiwon Rim
- Department of Radiology, Seoul National University Bundang Hospital, Seongnam, Korea
- Sun Mi Kim
- Department of Radiology, Seoul National University Bundang Hospital, Seongnam, Korea; Department of Radiology, Seoul National University College of Medicine, Seoul, Korea
- Bo La Yun
- Department of Radiology, Seoul National University Bundang Hospital, Seongnam, Korea
- So Yeon Park
- Department of Pathology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam, Korea
- Hye Shin Ahn
- Department of Radiology, Chung-Ang University Hospital, Chung-Ang University College of Medicine, Seoul, Korea
- Bohyoung Kim
- Division of Biomedical Engineering, Hankuk University of Foreign Studies, Yongin, Korea
- Mijung Jang
- Department of Radiology, Seoul National University Bundang Hospital, Seongnam, Korea
26
Lee CY, Chang TF, Chou YH, Yang KC. Fully automated lesion segmentation and visualization in automated whole breast ultrasound (ABUS) images. Quant Imaging Med Surg 2020; 10:568-584. [PMID: 32269918] [DOI: 10.21037/qims.2020.01.12]
Abstract
Background The number of breast cancer patients has increased each year, and the demand for breast cancer detection has grown substantially. There are many common breast cancer diagnostic tools. The latest automated whole breast ultrasound (ABUS) technology captures the complete breast tissue structure, improving breast cancer detection. However, because of the large amount of ABUS image data, manual interpretation is time-consuming and labor-intensive, and when lesions appear in multiple images, some may be missed. In addition, if volume information or the three-dimensional shape of the lesion is needed for therapy, each lesion must be segmented manually, which is inefficient for diagnosis. Therefore, automatic lesion segmentation for ABUS is an important issue for guiding therapy. Methods Because of the amount of speckle noise in an ultrasonic image and the low contrast of the lesion boundary, automatically segmenting the lesion is quite difficult. To address these challenges, this study proposes an automated lesion segmentation algorithm whose architecture can be divided into four parts: (I) volume of interest selection, (II) preprocessing, (III) segmentation, and (IV) visualization. A volume of interest (VOI) is first selected automatically via a three-dimensional level set; the method then uses anisotropic diffusion to suppress the speckle noise and intensity inhomogeneity correction to eliminate shadowing artifacts before the adaptive distance-regularized level set evolution (DRLSE) method conducts segmentation. Finally, the two-dimensional segmented images are reconstructed for visualization in three-dimensional space. Results The ground truth is delineated by two radiologists with more than 10 years of experience in breast sonography. Three performance assessments are carried out to evaluate the effectiveness of the proposed algorithm: a similarity measurement, a comparison of the proposed algorithm with the Chan-Vese level set method, and a volume estimation on phantom cases. In the 2D validation of the first assessment, the area Dice similarity coefficients of real cases A, real cases B and the phantoms are 0.84±0.02, 0.86±0.03 and 0.92±0.02, respectively; the overlap fraction (OF) and overlap value (OV) are 0.84±0.06 and 0.78±0.04 for real cases A, 0.91±0.04 and 0.82±0.05 for real cases B, and 0.95±0.02 and 0.92±0.03 for the phantoms. In the 3D validation, the volume Dice similarity coefficients of real cases A, real cases B and the phantoms are 0.85±0.02, 0.89±0.04 and 0.94±0.02, respectively; the OF and OV are 0.82±0.06 and 0.79±0.04 for real cases A, 0.92±0.04 and 0.85±0.07 for real cases B, and 0.95±0.01 and 0.93±0.04 for the phantoms. The proposed algorithm is therefore highly reliable in most cases. In the second assessment, the Dice of the proposed algorithm in real cases A, real cases B and the phantoms is 0.84±0.02, 0.86±0.03 and 0.92±0.02, respectively, versus 0.65±0.23, 0.69±0.14 and 0.76±0.14 for the Chan-Vese level set method; the difference in segmentation performance is highly significant (P<0.01), showing that the proposed algorithm is more accurate. In the third assessment, the Spearman's correlation coefficient between the segmented volumes and the corresponding ground-truth volumes is ρ=0.929 (P=0.01).
Conclusions In summary, the proposed method can batch-process ABUS images, segment lesions, calculate their volumes and visualize the lesions to facilitate observation by radiologists and physicians.
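The Dice, OF and OV figures above are binary-overlap metrics. A small sketch of how they are typically computed for 3-D masks; the OF and OV definitions below are common conventions assumed here, not restated by the paper:

```python
import numpy as np

def overlap_metrics(seg, ref):
    """Dice, overlap fraction (OF) and overlap value (OV) for binary masks.

    Definitions assumed here:
      Dice = 2|S∩R| / (|S| + |R|)
      OF   = |S∩R| / |R|      (fraction of the reference covered)
      OV   = |S∩R| / |S∪R|    (Jaccard index)
    """
    s = seg.astype(bool)
    r = ref.astype(bool)
    inter = np.logical_and(s, r).sum()
    union = np.logical_or(s, r).sum()
    dice = 2.0 * inter / (s.sum() + r.sum())
    of = inter / r.sum()
    ov = inter / union
    return dice, of, ov

# Toy 3-D example: two overlapping cubes.
a = np.zeros((10, 10, 10), bool); a[2:8, 2:8, 2:8] = True
b = np.zeros((10, 10, 10), bool); b[3:9, 3:9, 3:9] = True
print(overlap_metrics(a, b))
```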
Affiliation(s)
- Chia-Yen Lee
- Department of Electrical Engineering, National United University, Taipei, Taiwan
- Tzu-Fang Chang
- Department of Electrical Engineering, National United University, Taipei, Taiwan
- Yi-Hong Chou
- Department of Medical Imaging and Radiological Technology, Yuanpei University of Medical Technology, Hsinchu, Taiwan; Department of Radiology, Taipei Veterans General Hospital, Taipei, Taiwan; School of Medicine, National Yang Ming University, Taipei, Taiwan
- Kuen-Cheh Yang
- Department of Family Medicine, National Taiwan University Hospital, Bei-Hu Branch, Taipei, Taiwan
27
Zhang L, Bao LY, Tan YJ, Zhu LQ, Xu XJ, Zhu QQ, Shan YN, Zhao J, Xie LS, Liu J. Diagnostic Performance Using Automated Breast Ultrasound System for Breast Cancer in Chinese Women Aged 40 Years or Older: A Comparative Study. Ultrasound Med Biol 2019; 45:3137-3144. [PMID: 31563481] [DOI: 10.1016/j.ultrasmedbio.2019.08.016]
Abstract
The purpose of this study was to investigate the diagnostic performance of the automated breast ultrasound system (ABUS) compared with hand-held ultrasonography (HHUS) and mammography (MG) for breast cancer in women aged 40 y or older. A total of 594 breasts in 385 patients were enrolled in the study, and HHUS, ABUS and MG examinations were performed for all patients. Follow-up and pathologic findings were used as the reference standard: 519 breast units were benign or normal and 75 were malignant. The sensitivity, specificity, accuracy and Youden index were 97.33%, 89.79%, 90.74% and 0.87 for HHUS; 90.67%, 92.49%, 92.26% and 0.83 for ABUS; and 84.00%, 92.87%, 91.75% and 0.77 for MG, respectively. The specificity of ABUS was significantly superior to that of HHUS (p = 0.024). The area under the receiver operating characteristic curve was highest for HHUS (0.936), followed by ABUS (0.916) and MG (0.884), although the differences were not statistically significant (p > 0.05). In conclusion, the diagnostic performance of ABUS for breast cancer was equivalent to that of HHUS and MG, and ABUS can potentially serve as an alternative method for breast cancer diagnosis.
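The Youden indices quoted above follow directly from the paired sensitivity/specificity values via J = sensitivity + specificity - 1, which can be verified in a few lines:

```python
def youden_index(sensitivity, specificity):
    """Youden's J statistic: J = sensitivity + specificity - 1."""
    return sensitivity + specificity - 1

# Operating points reported in the abstract (sensitivity, specificity).
for name, sens, spec in [("HHUS", 0.9733, 0.8979),
                         ("ABUS", 0.9067, 0.9249),
                         ("MG",   0.8400, 0.9287)]:
    print(f"{name}: J = {youden_index(sens, spec):.2f}")
```

This reproduces the reported 0.87, 0.83 and 0.77.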
Affiliation(s)
- Li Zhang
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Ling-Yun Bao
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Yan-Juan Tan
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Luo-Qian Zhu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Xiao-Jing Xu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Qing-Qing Zhu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Yan-Na Shan
- Department of Radiology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Jing Zhao
- Department of Radiology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Le-Si Xie
- Department of Pathology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Jian Liu
- Department of Breast Surgery, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
28
Niu L, Bao L, Zhu L, Tan Y, Xu X, Shan Y, Liu J, Zhu Q, Jiang C, Shen Y. Diagnostic Performance of Automated Breast Ultrasound in Differentiating Benign and Malignant Breast Masses in Asymptomatic Women: A Comparison Study With Handheld Ultrasound. J Ultrasound Med 2019; 38:2871-2880. [PMID: 30912178] [DOI: 10.1002/jum.14991]
Abstract
OBJECTIVES Our aim was to investigate the diagnostic potential of an automated breast ultrasound (ABUS) system in differentiating benign and malignant breast masses compared with handheld ultrasound (HHUS). METHODS Women were randomly and proportionally selected from outpatients and underwent both HHUS and ABUS examinations. Masses with final American College of Radiology Breast Imaging Reporting and Data System categories 2 and 3 were considered benign. Masses with final Breast Imaging Reporting and Data System categories 4 and 5 were considered malignant. The diagnosis was confirmed by pathologic results or at least a 1-year follow-up. Automated breast US and HHUS were compared on the basis of their sensitivity, specificity, positive predictive value, negative predictive value, and accuracy. Diagnostic consistency and areas under the receiver operating characteristic curves were analyzed. The maximum diameters of masses were compared among HHUS, ABUS, and pathologic results. RESULTS A total of 599 masses in 398 women were confirmed by pathologic results or at least a 1-year follow-up; 103 of 599 masses were malignant, and 496 were benign. There were no significant differences between ABUS and HHUS in terms of diagnostic accuracy (80.1% versus 80.6%), specificity (77.62% versus 80.24%), positive predictive value (46.12% versus 46.46%), and negative predictive value (97.96% versus 95.67%). There were significant differences in sensitivity (92.23% versus 82.52%; P < .01) and areas under the curve (0.85 versus 0.81; P < .05) between ABUS and HHUS. The correlation of the maximum diameter was slightly higher between ABUS and pathologic results (r = 0.885) than between HHUS and pathologic results (r = 0.855), but the difference was not significant (P > .05). CONCLUSIONS Automated breast US is better than HHUS in differentiating benign and malignant breast masses, especially with respect to sensitivity.
Affiliation(s)
- Lin Niu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Lingyun Bao
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Luoqian Zhu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Yanjuan Tan
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Xiaojing Xu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Yanna Shan
- Department of Radiology, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Jian Liu
- Department of Breast Surgery, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Qingqing Zhu
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Chenxiang Jiang
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
- Yingzhao Shen
- Department of Ultrasound, Affiliated Hangzhou First People's Hospital, Zhejiang University School of Medicine, Hangzhou, China
29
Singhvi A, Boyle KC, Fallahpour M, Khuri-Yakub BT, Arbabian A. A Microwave-Induced Thermoacoustic Imaging System With Non-Contact Ultrasound Detection. IEEE Trans Ultrason Ferroelectr Freq Control 2019; 66:1587-1599. [PMID: 31251184] [DOI: 10.1109/tuffc.2019.2925592]
Abstract
Portable and easy-to-use imaging systems are in high demand for medical, security screening, nondestructive testing, and sensing applications. We present a new microwave-induced thermoacoustic imaging system with non-contact, airborne ultrasound (US) detection. In this system, a 2.7 GHz microwave excitation causes differential heating at interfaces with dielectric contrast, and the resulting US signal, generated via the thermoacoustic effect, travels out of the sample to a detector in air at a standoff. The 65 dB interface loss due to the impedance mismatch at the air-sample boundary is overcome with high-sensitivity capacitive micromachined ultrasonic transducers with minimum detectable pressures (MDPs) as low as 278 µPa rms; we explore two designs, one operating at a center frequency of 71 kHz and another at a center frequency of 910 kHz. We further demonstrate that the air-sample interface trades attenuation for improved resolution: the change in wave velocity at the interface creates a strong focusing effect, yielding axial resolutions more than 10× smaller than predicted by the traditional speed/bandwidth limit. A piecewise synthetic aperture radar (SAR) algorithm, modified for US imaging and enhanced with signal processing techniques, is used for image reconstruction, resulting in mm-scale lateral and axial image resolution. Finally, measurements are conducted to verify simulations and demonstrate successful system performance.
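The paper's piecewise SAR reconstruction is more elaborate, but the core idea of coherently summing delayed receiver traces can be sketched with a naive delay-and-sum beamformer. The geometry, pulse shape, one-way-delay assumption and sound speed below are illustrative only, not taken from the paper:

```python
import numpy as np

def delay_and_sum(signals, t, rx_positions, grid, c=343.0):
    """Naive one-way delay-and-sum reconstruction for airborne US detection.

    signals: (n_rx, n_t) received time traces
    t: (n_t,) sample times; rx_positions: (n_rx, 2) receiver coordinates [m]
    grid: (n_pix, 2) image-pixel coordinates [m]; c: speed of sound in air [m/s]
    Returns (n_pix,) image amplitudes.
    """
    dt = t[1] - t[0]
    image = np.zeros(len(grid))
    for p, xy in enumerate(grid):
        # Propagation delay from this pixel to each receiver.
        delays = np.linalg.norm(rx_positions - xy, axis=1) / c
        idx = np.clip(np.round((delays - t[0]) / dt).astype(int), 0, len(t) - 1)
        # Coherent sum of the delayed samples.
        image[p] = signals[np.arange(len(rx_positions)), idx].sum()
    return image

# Toy scenario: a point scatterer at (0, 0.05) m, eight airborne receivers on the x axis.
rx = np.stack([np.linspace(-0.05, 0.05, 8), np.zeros(8)], axis=1)
t = np.arange(0, 1e-3, 1e-6)
arrival = np.linalg.norm(rx - np.array([0.0, 0.05]), axis=1) / 343.0
signals = np.exp(-((t[None, :] - arrival[:, None]) / 5e-6) ** 2)  # Gaussian pulses
grid = np.array([[0.0, y] for y in (0.03, 0.04, 0.05, 0.06, 0.07)])
img = delay_and_sum(signals, t, rx, grid)
print(np.argmax(img))  # brightest pixel should sit at y = 0.05
```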
30
31
Yang H, Shan C, Pourtaherian A, Kolen AF, de With PHN. Catheter segmentation in three-dimensional ultrasound images by feature fusion and model fitting. J Med Imaging (Bellingham) 2019; 6:015001. [PMID: 30662926] [DOI: 10.1117/1.jmi.6.1.015001]
Abstract
Ultrasound (US) is increasingly used during interventions such as cardiac catheterization. Accurately identifying the catheter inside US images requires extra training for physicians and sonographers, so automated segmentation of the catheter in US images, together with optimized viewing for the physician, can improve the efficiency, safety, and outcome of interventions. For cardiac catheterization, three-dimensional (3-D) US is particularly attractive because it involves no ionizing radiation and provides richer spatial information. However, because of the limited spatial resolution of 3-D cardiac US and the complex anatomical structures inside the heart, image-based catheter segmentation is challenging. We propose a catheter segmentation method for 3-D cardiac US data based on image processing techniques. The method first applies voxel-based classification with newly designed multiscale and multidefinition features, which provide robust catheter-voxel segmentation in 3-D US. Second, a modified catheter model fitting is applied to segment the curved catheter in 3-D US images. The proposed method is validated in extensive experiments on different in-vitro, ex-vivo, and in-vivo datasets, and segments the catheter with an average tip-point error smaller than the catheter diameter (1.9 mm) in the volumetric images. With automated catheter segmentation combined with optimal viewing, physicians do not have to interpret US images and can focus on the procedure itself, improving the quality of cardiac intervention.
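The model-fitting stage can be illustrated with a generic stand-in: project the classified catheter voxels onto their principal axis to obtain a 1-D parameter, then least-squares fit a polynomial per coordinate. This is not the paper's exact model; it only shows the shape of such a fitting step:

```python
import numpy as np

def fit_catheter_curve(voxels, degree=3):
    """Fit a smooth 3-D curve through candidate catheter voxels.

    voxels: (n, 3) array of candidate voxel coordinates.
    Returns curve(u), mapping parameters in [0, 1] to 3-D points.
    """
    pts = np.asarray(voxels, float)
    center = pts.mean(axis=0)
    # Principal direction of the centered point cloud via SVD.
    _, _, vt = np.linalg.svd(pts - center)
    s = (pts - center) @ vt[0]
    s = (s - s.min()) / (s.max() - s.min())  # normalize parameter to [0, 1]
    coeffs = [np.polyfit(s, pts[:, k], degree) for k in range(3)]

    def curve(u):
        u = np.atleast_1d(u)
        return np.stack([np.polyval(c, u) for c in coeffs], axis=1)

    return curve

# Noise-free check: voxels along a straight line are recovered exactly.
pts = np.array([[i, 2.0 * i, 3.0 * i] for i in range(10)])
curve = fit_catheter_curve(pts)
print(curve(0.0), curve(1.0))  # the two catheter endpoints (in either order)
```

The SVD sign is arbitrary, so the parameterization may run in either direction along the catheter.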
Affiliation(s)
- Hongxu Yang
- Eindhoven University of Technology, VCA Research Group, Eindhoven, The Netherlands
- Caifeng Shan
- Philips Research, In-Body Systems, Eindhoven, The Netherlands
- Arash Pourtaherian
- Eindhoven University of Technology, VCA Research Group, Eindhoven, The Netherlands
- Peter H N de With
- Eindhoven University of Technology, VCA Research Group, Eindhoven, The Netherlands
32
Chiang TC, Huang YS, Chen RT, Huang CS, Chang RF. Tumor Detection in Automated Breast Ultrasound Using 3-D CNN and Prioritized Candidate Aggregation. IEEE Trans Med Imaging 2019; 38:240-249. [PMID: 30059297] [DOI: 10.1109/tmi.2018.2860257]
Abstract
Automated whole breast ultrasound (ABUS) has been widely used as a screening modality for examination of breast abnormalities. Reviewing the hundreds of slices produced by ABUS, however, is time consuming. Therefore, in this paper, a fast and effective computer-aided detection system based on 3-D convolutional neural networks (CNNs) and prioritized candidate aggregation is proposed to accelerate this review. First, an efficient sliding-window method is used to extract volumes of interest (VOIs). Then, the tumor probability of each VOI is estimated with a 3-D CNN, and VOIs with higher estimated probability are selected as tumor candidates. Since the candidates may overlap each other, a novel scheme is designed to aggregate the overlapping candidates. During aggregation, candidates are prioritized by estimated tumor probability to alleviate the over-aggregation issue. The relationship between the sizes of the VOI and the target tumor is exploited to effectively perform each stage of the detection algorithm. On evaluation with a test set of 171 tumors, the method achieved sensitivities of 95% (162/171), 90% (154/171), 85% (145/171), and 80% (137/171) with 14.03, 6.92, 4.91, and 3.62 false positives per patient (with six passes), respectively. In summary, the method is more general and much faster than preliminary works and demonstrates promising results.
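The prioritized aggregation step is close in spirit to non-maximum suppression over 3-D boxes: visit candidates in order of decreasing tumor probability and resolve overlaps in favor of the higher-priority one. A simplified sketch (the paper merges overlapping candidates rather than merely suppressing them, and the IoU threshold here is an arbitrary illustrative choice):

```python
import numpy as np

def aggregate_candidates(boxes, probs, iou_thresh=0.3):
    """Prioritized resolution of overlapping 3-D candidate VOIs.

    boxes: (n, 6) [z1, y1, x1, z2, y2, x2]; probs: (n,) tumor probabilities.
    Each lower-priority candidate overlapping an accepted one is dropped
    in its favor (a simplified stand-in for the paper's aggregation).
    """
    order = np.argsort(probs)[::-1]  # decreasing probability
    accepted, accepted_p = [], []
    for i in order:
        b = boxes[i]
        overlaps = False
        for k in accepted:
            # Intersection-over-union of two axis-aligned 3-D boxes.
            lo = np.maximum(b[:3], k[:3]); hi = np.minimum(b[3:], k[3:])
            inter = np.prod(np.clip(hi - lo, 0, None))
            vol = lambda x: np.prod(x[3:] - x[:3])
            if inter / (vol(b) + vol(k) - inter) > iou_thresh:
                overlaps = True
                break
        if not overlaps:
            accepted.append(b.copy()); accepted_p.append(float(probs[i]))
    return np.array(accepted), np.array(accepted_p)

boxes = np.array([[0, 0, 0, 10, 10, 10],
                  [1, 1, 1, 11, 11, 11],
                  [20, 20, 20, 30, 30, 30]], float)
probs = np.array([0.9, 0.8, 0.7])
kept, kept_p = aggregate_candidates(boxes, probs)
print(len(kept))  # the two heavily overlapping candidates collapse into one
```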
33
Lei B, Huang S, Li R, Bian C, Li H, Chou YH, Cheng JZ. Segmentation of breast anatomy for automated whole breast ultrasound images with boundary regularized convolutional encoder–decoder network. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.09.043]
34
Rella R, Belli P, Giuliani M, Bufi E, Carlino G, Rinaldi P, Manfredi R. Automated Breast Ultrasonography (ABUS) in the Screening and Diagnostic Setting: Indications and Practical Use. Acad Radiol 2018; 25:1457-1470. [PMID: 29555568] [DOI: 10.1016/j.acra.2018.02.014]
Abstract
Automated breast ultrasonography (ABUS) is a new imaging technology for automatic breast scanning with ultrasound. It was first developed to overcome the operator dependency and the lack of standardization and reproducibility of handheld ultrasound. ABUS provides a three-dimensional representation of breast tissue and allows image reformatting in three planes; the generated coronal plane has been suggested to improve diagnostic accuracy. The technique was first used in the screening setting to improve breast cancer detection, especially in mammographically dense breasts. In recent years, numerous studies have also evaluated its use in the diagnostic setting, showing its suitability for breast cancer staging, evaluation of tumor response to neoadjuvant chemotherapy, and second-look ultrasound after magnetic resonance imaging. The purpose of this article is to provide a comprehensive review of the current literature on the clinical performance of ABUS, summarize available evidence, and identify gaps in knowledge for future research.
35
Zhang J, Dashtbozorg B, Huang F, Tan T, ter Haar Romeny BM. A fully automated pipeline of extracting biomarkers to quantify vascular changes in retina-related diseases. Comput Methods Biomech Biomed Eng Imaging Vis 2018. [DOI: 10.1080/21681163.2018.1519851]
Affiliation(s)
- Jiong Zhang
- Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- Behdad Dashtbozorg
- Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- Fan Huang
- Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- Tao Tan
- Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
- B. M. ter Haar Romeny
- Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, the Netherlands
36
Tan T, Li Z, Liu H, Zanjani FG, Ouyang Q, Tang Y, Hu Z, Li Q. Optimize Transfer Learning for Lung Diseases in Bronchoscopy Using a New Concept: Sequential Fine-Tuning. IEEE J Transl Eng Health Med 2018; 6:1800808. [PMID: 30324036] [PMCID: PMC6175035] [DOI: 10.1109/jtehm.2018.2865787]
Abstract
Bronchoscopy, as a follow-up procedure to radiological imaging, plays a key role in diagnosis and treatment planning for lung disease patients. When performing bronchoscopy, doctors have to decide immediately whether to perform a biopsy. Because biopsies may cause uncontrollable, life-threatening bleeding of the lung tissue, doctors need to be selective with them. In this paper, to help doctors be more selective with biopsies and to provide a second opinion on diagnosis, we propose a computer-aided diagnosis (CAD) system for lung diseases, including cancers and tuberculosis (TB). Building on transfer learning (TL), we propose a novel TL method on top of DenseNet: sequential fine-tuning (SFT). Compared with traditional fine-tuning (FT) methods, our method achieves the best performance. On a data set of 81 normal cases, 76 TB cases and 277 lung cancer cases, SFT provided an overall accuracy of 82%, while other traditional TL methods achieved accuracies of 70% to 74%. The detection accuracies of SFT for cancers, TB, and normal cases were 87%, 54%, and 91%, respectively. This indicates that the CAD system has the potential to improve the accuracy of lung disease diagnosis in bronchoscopy and may help doctors be more selective with biopsies.
Affiliation(s)
- Tao Tan
- Department of Biomedical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; ScreenPoint Medical, 6512 AB Nijmegen, The Netherlands
- Zhang Li
- College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China
- Haixia Liu
- School of Computer Science, University of Nottingham Malaysia Campus, 43500 Semenyih, Malaysia
- Farhad G Zanjani
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
- Quchang Ouyang
- Hunan Cancer Hospital, The Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410000, China
- Yuling Tang
- First Hospital of Changsha City, Changsha 410000, China
- Zheyu Hu
- Hunan Cancer Hospital, The Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410000, China
- Qiang Li
- Department of Respiratory Medicine, Shanghai East Hospital, Tongji University School of Medicine, Shanghai 200120, China
37
Xu X, Bao L, Tan Y, Zhu L, Kong F, Wang W. 1000-Case Reader Study of Radiologists' Performance in Interpretation of Automated Breast Volume Scanner Images with a Computer-Aided Detection System. Ultrasound Med Biol 2018; 44:1694-1702. [PMID: 29853222] [DOI: 10.1016/j.ultrasmedbio.2018.04.020]
Abstract
The objective of our study was to assess, in a reader study, radiologists' performance in interpretation of automated breast volume scanner (ABVS) images with the aid of a computer-aided detection (CADe) system. This retrospective, multiple-reader, multiple-case observer study compared the diagnostic performance of radiologists with and without CADe. The study included 1000 cases selected from ABVS examinations in our institution in 2012: 206 malignant, 486 benign and 308 normal cases. The cancer cases were consecutive; the benign and normal cases were randomly selected. All malignant and benign cases were confirmed by biopsy or surgery, and normal cases were confirmed by 2-y follow-up. Reader performance was compared in terms of area under the receiver operating characteristic curve, sensitivity and specificity. Additionally, the reading time per case for each reader was recorded. Nine radiologists from our institution participated in the study. Three had more than 8 y of ultrasound experience and more than 4 y of ABVS experience (group A); 3 had more than 5 y of ultrasound experience (group B); and 3 had more than 1 y of ultrasound experience (group C). Neither group B nor group C had ABVS experience. The CADe system used was the QVCAD System (QView Medical, Inc., Los Altos, CA, USA), designed to aid radiologists in searching for suspicious areas in ABVS images. CADe results are presented to the reader simultaneously with the ABVS images; that is, the radiologists read the ABVS images concurrently with the CADe results. For each reader, the cases were randomly divided into two equal-size groups, 1 and 2. Initially, the readers read their group 1 cases with the aid of CADe and their group 2 cases without CADe. After a 1-mo washout period, they re-read their group 1 cases without CADe and their group 2 cases with CADe. The areas under the receiver operating characteristic curves over all readers were 0.784 for reading with CADe and 0.747 without CADe. The areas under the curve with and without CADe were 0.833 and 0.829 for group A, 0.757 and 0.696 for group B, and 0.759 and 0.718 for group C. All differences in areas under the curve were statistically significant (p < 0.05), except that for group A. The average reading time was 9.3% shorter with CADe (p < 0.05) for all readers. In summary, CADe improves radiologist performance with respect to both accuracy and reading time for the detection of breast cancer using the ABVS, with the greater benefit for those inexperienced with ABVS.
Affiliation(s)
- Xiaojing Xu
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, 261 Huansha Road, Hangzhou 310006, China
- Lingyun Bao
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, 261 Huansha Road, Hangzhou 310006, China
- Yanjuan Tan
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, 261 Huansha Road, Hangzhou 310006, China
- Luoxi Zhu
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, 261 Huansha Road, Hangzhou 310006, China
- Fanlei Kong
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, 261 Huansha Road, Hangzhou 310006, China
- Wei Wang
- Department of Ultrasound, First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, 261 Huansha Road, Hangzhou 310006, China
38
Kozegar E, Soryani M, Behnam H, Salamati M, Tan T. Mass Segmentation in Automated 3-D Breast Ultrasound Using Adaptive Region Growing and Supervised Edge-Based Deformable Model. IEEE Trans Med Imaging 2018; 37:918-928. [PMID: 29610071] [DOI: 10.1109/tmi.2017.2787685]
Abstract
Automated 3-D breast ultrasound has been proposed as a complementary modality to mammography for early detection of breast cancers. To facilitate the interpretation of these images, computer-aided detection systems are being developed, in which mass segmentation is an essential component for feature extraction and temporal comparisons. However, automated segmentation of masses is challenging because of the large variety in shape, size, and texture of these 3-D objects. In this paper, the authors aim to develop a computerized segmentation system that uses a seed position as the only prior information. A two-stage segmentation approach is proposed that incorporates shape information from training masses. In the first stage, a new adaptive region growing algorithm gives a rough estimate of the mass boundary; the similarity threshold of the algorithm is determined using a Gaussian mixture model over the volume and circularity of the training masses. In the second stage, a novel geometric edge-based deformable model is introduced, using the result of the first stage as the initial contour. In a data set of 50 masses, including 38 malignant and 12 benign lesions, the proposed segmentation method achieved a mean Dice of 0.74 ± 0.19, outperforming adaptive region growing alone with a mean Dice of 0.65 ± 0.2 (p-value < 0.02). Moreover, the resulting mean Dice was significantly (p-value < 0.001) better than that of the distance regularized level set evolution method (0.52 ± 0.27). The supervised method presented in this paper achieved accurate mass segmentation results in terms of the Dice measure. The suggested segmentation method can be utilized in two ways: 1) to automatically measure the change in volume of breast lesions over time and 2) to extract features for a computer-aided detection or diagnosis system.
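The first stage can be sketched as a seeded, 6-connected flood fill with an intensity-similarity criterion. The paper's contribution is adapting the threshold per lesion via a Gaussian mixture model over volume and circularity; that adaptation is omitted here, and the threshold is simply passed in:

```python
from collections import deque
import numpy as np

def region_grow(volume, seed, similarity_thresh):
    """Seeded 3-D region growing with a fixed similarity threshold.

    A voxel joins the region if its intensity is within
    similarity_thresh of the seed intensity (6-connectivity).
    """
    vol = np.asarray(volume, float)
    mask = np.zeros(vol.shape, bool)
    seed = tuple(seed)
    seed_val = vol[seed]
    queue = deque([seed])
    mask[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[k] < vol.shape[k] for k in range(3)) and not mask[n]:
                if abs(vol[n] - seed_val) <= similarity_thresh:
                    mask[n] = True
                    queue.append(n)
    return mask

# A bright 3x3x3 cube inside a dark volume is recovered from a central seed.
vol = np.zeros((5, 5, 5))
vol[1:4, 1:4, 1:4] = 1.0
mask = region_grow(vol, (2, 2, 2), similarity_thresh=0.5)
print(mask.sum())  # 27 voxels: exactly the bright cube
```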
Collapse
|
39
|
Meiburger KM, Acharya UR, Molinari F. Automated localization and segmentation techniques for B-mode ultrasound images: A review. Comput Biol Med 2017; 92:210-235. [PMID: 29247890 DOI: 10.1016/j.compbiomed.2017.11.018] [Citation(s) in RCA: 66] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2017] [Revised: 11/30/2017] [Accepted: 11/30/2017] [Indexed: 12/14/2022]
Abstract
B-mode ultrasound imaging is used extensively in medicine. Hence, there is a need for efficient segmentation tools to aid in computer-aided diagnosis, image-guided interventions, and therapy. This paper presents a comprehensive review of automated localization and segmentation techniques for B-mode ultrasound images. The paper first describes the general characteristics of B-mode ultrasound images. Then, insight into the localization and segmentation of tissues is provided, both in the case in which the organ/tissue localization provides the final segmentation and in the case in which a two-step segmentation process is needed, due to the desired boundaries being too fine to locate from within the entire ultrasound frame. Subsequently, examples of some main techniques found in the literature are shown, including but not limited to shape priors, superpixel and classification, local pixel statistics, active contours, edge-tracking, dynamic programming, and data mining. Ten selected applications (abdomen/kidney, breast, cardiology, thyroid, liver, vascular, musculoskeletal, obstetrics, gynecology, prostate) are then investigated in depth, and the performances of a few specific applications are compared. In conclusion, future perspectives for B-mode based segmentation, such as the integration of RF information, the employment of higher frequency probes when possible, the focus on completely automatic algorithms, and the increase in available data, are discussed.
Collapse
Affiliation(s)
- Kristen M Meiburger
- Biolab, Department of Electronics and Telecommunications, Politecnico di Torino, Torino, Italy
| | - U Rajendra Acharya
- Department of Electronic & Computer Engineering, Ngee Ann Polytechnic, Singapore; Department of Biomedical Engineering, School of Science and Technology, SUSS University, Singapore; Department of Biomedical Imaging, Faculty of Medicine, University of Malaya, Kuala Lumpur, Malaysia
| | - Filippo Molinari
- Biolab, Department of Electronics and Telecommunications, Politecnico di Torino, Torino, Italy.
| |
Collapse
|
40
|
Choi JH, Kang BJ, Baek JE, Lee HS, Kim SH. Application of computer-aided diagnosis in breast ultrasound interpretation: improvements in diagnostic performance according to reader experience. Ultrasonography 2017; 37:217-225. [PMID: 28992680 PMCID: PMC6044219 DOI: 10.14366/usg.17046] [Citation(s) in RCA: 53] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Accepted: 08/14/2017] [Indexed: 02/04/2023] Open
Abstract
Purpose The purpose of this study was to evaluate the usefulness of applying computer-aided diagnosis (CAD) to breast ultrasound (US), depending on the reader's experience with breast imaging. Methods Between October 2015 and January 2016, two experienced readers obtained and analyzed the grayscale US images of 200 cases according to the Breast Imaging Reporting and Data System (BI-RADS) lexicon and categories. They additionally applied CAD (S-Detect) to analyze the lesions and made a subjective diagnostic decision based on grayscale US with CAD. For the same cases, two inexperienced readers analyzed the grayscale US images using the BI-RADS lexicon and categories, added CAD, and came to a subjective diagnostic conclusion. We then compared the diagnostic performance according to the reader's experience with breast imaging. Results The sensitivity values for the experienced readers, inexperienced readers, and CAD (for experienced and inexperienced readers) were 91.7%, 75.0%, 75.0%, and 66.7%, respectively. The specificity values for the experienced readers, inexperienced readers, and CAD (for experienced and inexperienced readers) were 76.6%, 71.8%, 78.2%, and 76.1%, respectively. When diagnoses were made subjectively in combination with CAD, the specificity significantly improved (76.6% to 80.3%) without a change in the sensitivity (91.7%) for the experienced readers. After subjective combination with CAD, both the sensitivity and the specificity improved for the inexperienced readers (75.0% to 83.3% and 71.8% to 77.1%). In addition, the area under the curve improved for both the experienced and inexperienced readers (0.84 to 0.86 and 0.73 to 0.80) after the addition of CAD. Conclusion CAD is more useful for less experienced readers. Combining CAD with breast US led to improved specificity for both experienced and inexperienced readers.
Collapse
Affiliation(s)
- Ji-Hye Choi
- Department of Radiology, Bucheon St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Bucheon, Korea
| | - Bong Joo Kang
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
| | - Ji Eun Baek
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
| | - Hyun Sil Lee
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
| | - Sung Hun Kim
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
| |
Collapse
|
41
|
Kozegar E, Soryani M, Behnam H, Salamati M, Tan T. Breast cancer detection in automated 3D breast ultrasound using iso-contours and cascaded RUSBoosts. ULTRASONICS 2017; 79:68-80. [PMID: 28448836 DOI: 10.1016/j.ultras.2017.04.008] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/25/2016] [Revised: 03/21/2017] [Accepted: 04/18/2017] [Indexed: 06/07/2023]
Abstract
Automated 3D breast ultrasound (ABUS) is a popular new modality used as an adjunct to mammography for detecting cancers in women with dense breasts. In this paper, a multi-stage computer-aided detection system is proposed to detect cancers in ABUS images. In the first step, an efficient despeckling method called OBNLM is applied to the images to reduce speckle noise. Afterwards, a new algorithm based on isocontours is applied to detect initial candidates, since mass boundaries are hypoechoic. To reduce falsely generated isocontours, features such as hypoechoicity, roundness, area, and contour strength are used. The resulting candidates are then further processed by a cascade classifier whose base classifiers are Random Under-Sampling Boosting (RUSBoost) classifiers, introduced to deal with imbalanced datasets. Each base classifier is trained on a group of features such as Gabor, LBP, and GLCM features. Performance of the proposed system was evaluated using 104 volumes from 74 patients, including 112 malignant lesions. According to Free-Response Operating Characteristic (FROC) analysis, the proposed system achieved region-based and case-based sensitivities of 68% and 76%, respectively, at one false positive per image.
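The FROC operating point quoted here (sensitivity at one false positive per image) can be illustrated with a generic sketch. The candidate list and the simplifying assumption that each lesion is hit by at most one candidate are hypothetical, not the paper's evaluation code:

```python
def froc_point(detections, n_lesions, n_images, max_fppi=1.0):
    """Sensitivity at a given false-positives-per-image (FPPI) rate.

    detections: list of (score, is_true_positive) candidate pairs.
    Candidates are consumed in descending score order until the
    false-positive budget (max_fppi * n_images) would be exceeded;
    sensitivity is then TP / n_lesions. Assumes at most one candidate
    matches each lesion.
    """
    fp_budget = max_fppi * n_images
    tp = fp = 0
    for score, is_tp in sorted(detections, reverse=True):
        if is_tp:
            tp += 1
        else:
            fp += 1
            if fp > fp_budget:
                fp -= 1
                break
    return tp / n_lesions

# Toy example: 5 candidates, 4 true lesions, 2 images
cands = [(0.9, True), (0.8, False), (0.7, True), (0.6, False), (0.5, True)]
print(froc_point(cands, n_lesions=4, n_images=2))  # 3 TPs within budget -> 0.75
```

Sweeping `max_fppi` over a range of values traces out the full FROC curve from which figures such as "68% at one FP per image" are read off.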
Collapse
Affiliation(s)
- Ehsan Kozegar
- School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran
| | - Mohsen Soryani
- School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran
| | - Hamid Behnam
- School of Electrical Engineering, Iran University of Science and Technology, Tehran, Iran
| | - Masoumeh Salamati
- Department of Reproductive Imaging, Reproductive Biomedicine Research Center, Royan Institute for Reproductive Biomedicine, ACECR, Tehran, Iran
| | - Tao Tan
- Department of Radiology and Nuclear Medicine, Radboud University Medical Center, Nijmegen 6525 GA, The Netherlands.
| |
Collapse
|
42
|
Maier A, Heil J, Lauer A, Harcos A, Schaefgen B, von Au A, Spratte J, Riedel F, Rauch G, Hennigs A, Domschke C, Schott S, Rom J, Schuetz F, Sohn C, Golatta M. Inter-rater reliability and double reading analysis of an automated three-dimensional breast ultrasound system: comparison of two independent examiners. Arch Gynecol Obstet 2017; 296:571-582. [PMID: 28748340 DOI: 10.1007/s00404-017-4473-y] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/08/2017] [Accepted: 07/21/2017] [Indexed: 10/19/2022]
Abstract
PURPOSE Breast ultrasound could be a valuable tool complementary to mammography in breast cancer screening. Automated 3D breast ultrasound (ABUS) addresses challenges of hand-held ultrasound and could allow double reading analysis of ultrasound images. This trial assesses the inter-rater reliability and double reading analysis of an ABUS system. METHODS To assess the reproducibility and diagnostic validity of the ABUS system SomoV™, a blinded double reading analysis was performed in 1019 patients (2038 breasts) by two examiners (examiners A and B) and compared to single reading results, as well as to the reference standard, regarding its diagnostic validity. Cohen's kappa coefficients were calculated to measure the inter-rater reliability and the agreement of the different diagnostic modalities. Patient comfort and time consumption for image acquisition and reading were analyzed descriptively as secondary objectives. RESULTS Analysis of inter-rater reliability yielded agreement in 81.6% (κ = 0.37; p < 0.0001), showing fair agreement. Single reading analysis of SomoV™ exams compared to the reference standard showed good specificity (examiner A: 88.3%; examiner B: 84.5%), fair inter-rater agreement (examiner A: κ = 0.31; examiner B: κ = 0.31), and adequate sensitivity (examiner A: 53.1%; examiner B: 64.2%). Double reading analysis yielded good sensitivity and specificity (73.7 and 77.7%). Mammography (n = 1911) alone detected 160 of 176 carcinomas (sensitivity 90.1%). Adding SomoV™ to mammography would have detected 12 additional carcinomas, resulting in a higher sensitivity of 97.7%. CONCLUSION SomoV™ is a promising technique with good sensitivity, high patient comfort, and fair inter-examiner reliability. It allows double reading analysis that, in combination with mammography, could increase detection rates in breast cancer screening.
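Cohen's kappa, used in this trial to quantify inter-rater reliability, corrects the observed agreement for the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). A minimal sketch for two raters over the same cases follows; the ratings below are made up, not the study's data:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(r1)
    labels = set(r1) | set(r2)
    # Observed agreement: fraction of items both raters labeled identically
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[lab] * c2[lab] for lab in labels) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy binary ratings (1 = suspicious, 0 = not suspicious) for 10 cases
r1 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
r2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(r1, r2), 2))  # p_o=0.7, p_e=0.5 -> kappa = 0.4
```

On the usual interpretation scale, values around 0.2–0.4 count as "fair" agreement, which is how the trial describes its κ = 0.37.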
Collapse
Affiliation(s)
- Anna Maier
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Joerg Heil
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Anna Lauer
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Aba Harcos
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Benedikt Schaefgen
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Alexandra von Au
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Julia Spratte
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Fabian Riedel
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Geraldine Rauch
- Institute of Medical Biometry and Informatics, University of Heidelberg, Heidelberg, Germany.,Institute of Medical Biometry and Epidemiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
| | - André Hennigs
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Christoph Domschke
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Sarah Schott
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Joachim Rom
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Florian Schuetz
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Christof Sohn
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany
| | - Michael Golatta
- University Breast Unit, Department of Gynecology and Obstetrics, University of Heidelberg, Im Neuenheimer Feld 440, 69120, Heidelberg, Germany.
| |
Collapse
|
43
|
Wang X, Guo Y, Wang Y, Yu J. Automatic breast tumor detection in ABVS images based on convolutional neural network and superpixel patterns. Neural Comput Appl 2017. [DOI: 10.1007/s00521-017-3138-x] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
|
44
|
Jalalian A, Mashohor S, Mahmud R, Karasfi B, Saripan MIB, Ramli ARB. Foundation and methodologies in computer-aided diagnosis systems for breast cancer detection. EXCLI JOURNAL 2017; 16:113-137. [PMID: 28435432 PMCID: PMC5379115 DOI: 10.17179/excli2016-701] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 10/12/2016] [Accepted: 01/05/2017] [Indexed: 12/15/2022]
Abstract
Breast cancer is the most prevalent cancer affecting women all over the world. Early detection and treatment of breast cancer could reduce the mortality rate. Issues such as technical factors related to imaging quality, as well as human error, increase the misdiagnosis of breast cancer by radiologists. Computer-aided detection (CAD) systems have been developed to overcome these limitations and have been studied in many imaging modalities for breast cancer detection in recent years. CAD systems improve radiologists' performance in finding and discriminating between normal and abnormal tissues. These systems serve only as a second reader; the final decisions are still made by the radiologist. In this study, recent CAD systems for breast cancer detection in different modalities, such as mammography, ultrasound, MRI, and biopsy histopathological images, are introduced. The foundation of CAD systems generally consists of four stages: pre-processing, segmentation, feature extraction, and classification. The approaches applied to design the different stages of a CAD system are summarized. Advantages and disadvantages of different segmentation, feature extraction, and classification techniques are listed. In addition, the impact of imbalanced datasets on classification outcomes and appropriate methods to address these issues are discussed. Finally, performance evaluation metrics for the various stages of breast cancer detection CAD systems are reviewed.
Collapse
Affiliation(s)
- Afsaneh Jalalian
- Department of Computer and Communication Systems Engineering, Faculty of Engineering, Universiti Putra, Malaysia
| | - Syamsiah Mashohor
- Department of Computer and Communication Systems Engineering, Faculty of Engineering, Universiti Putra, Malaysia
| | - Rozi Mahmud
- Department of Imaging, Faculty of Medicine and Health Science, Universiti Putra, Malaysia
| | - Babak Karasfi
- Department of Computer Engineering, Qazvin Branch, Islamic Azad University, Qazvin, Iran
| | - M. Iqbal B. Saripan
- Department of Computer and Communication Systems Engineering, Faculty of Engineering, Universiti Putra, Malaysia
| | - Abdul Rahman B. Ramli
- Department of Computer and Communication Systems Engineering, Faculty of Engineering, Universiti Putra, Malaysia
| |
Collapse
|
45
|
Improved cancer detection in automated breast ultrasound by radiologists using Computer Aided Detection. Eur J Radiol 2017; 89:54-59. [PMID: 28267549 DOI: 10.1016/j.ejrad.2017.01.021] [Citation(s) in RCA: 36] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2016] [Revised: 11/08/2016] [Accepted: 01/18/2017] [Indexed: 11/24/2022]
Abstract
OBJECTIVE To investigate the effect of dedicated Computer Aided Detection (CAD) software for automated breast ultrasound (ABUS) on the performance of radiologists screening for breast cancer. METHODS 90 ABUS views of 90 patients were randomly selected from a multi-institutional archive of cases collected between 2010 and 2013. This dataset included normal cases (n=40) with >1 year of follow-up, benign lesions (n=30) that were either biopsied or remained stable, and malignant lesions (n=20). Six readers evaluated all cases with and without CAD in two sessions. The CAD software included conventional CAD marks and an intelligent minimum intensity projection of the breast tissue. Readers reported using a likelihood-of-malignancy scale from 0 to 100. Alternative free-response ROC analysis was used to measure performance. RESULTS Without CAD, the average area under the curve (AUC) of the readers was 0.77; with CAD it improved significantly to 0.84 (p=0.001). Sensitivity of all readers improved with CAD (range 5.2-10.6%), but specificity decreased in four out of six readers (range 1.4-5.7%). No significant difference was observed in the AUC between experienced radiologists and residents, either with or without CAD. CONCLUSIONS Dedicated CAD software for ABUS has the potential to improve the cancer detection rates of radiologists screening for breast cancer.
Collapse
|
46
|
Meel-van den Abeelen ASS, Weijers G, van Zelst JCM, Thijssen JM, Mann RM, de Korte CL. 3D quantitative breast ultrasound analysis for differentiating fibroadenomas and carcinomas smaller than 1cm. Eur J Radiol 2017; 88:141-147. [PMID: 28189199 DOI: 10.1016/j.ejrad.2017.01.006] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2016] [Revised: 09/02/2016] [Accepted: 01/05/2017] [Indexed: 11/15/2022]
Abstract
PURPOSE In (3D) ultrasound, accurate discrimination of small solid masses is difficult, resulting in a high frequency of biopsies for benign lesions. In this study, we investigate whether 3D quantitative breast ultrasound (3DQBUS) analysis can be used to improve non-invasive discrimination between benign and malignant lesions. METHODS AND MATERIALS 3D US studies of 112 biopsied solid breast lesions (size <1 cm) were included (34 fibroadenomas and 78 invasive ductal carcinomas). The lesions were manually delineated and, based on sonographic criteria used by radiologists, three regions of interest were defined in 3D for analysis: ROI (ellipsoid covering the inside of the lesion), PER (peritumoural surrounding: 0.5 mm around the lesion), and POS (posterior-tumoural acoustic phenomena: region below the lesion with the same size as delineated for the lesion). After automatic gain correction (AGC), the mean and standard deviation of the echo level within the regions were calculated. For the ROI and POS, the residual attenuation coefficient was also estimated in decibels per centimeter [dB/cm]. The resulting eight features were used for classification of the lesions by logistic regression analysis. The classification accuracy was evaluated by leave-one-out cross-validation. Receiver operating characteristic (ROC) curves were constructed to assess the performance of the classification. All lesions were delineated by two readers, and the results were compared to assess the effect of the manual delineation. RESULTS The area under the ROC curve was 0.86 for both readers. At 100% sensitivity, a specificity of 26% and 50% was achieved for reader 1 and 2, respectively. Inter-reader variability in lesion delineation was marginal and did not affect the accuracy of the technique. An area under the ROC curve of 0.86 was also reached for the second reader when the results of the first reader were used as the training set, yielding a sensitivity of 100% and a specificity of 40%. Consequently, 3DQBUS would have achieved a 40% reduction in biopsies for benign lesions for reader 2, without a decrease in sensitivity. CONCLUSION This study shows that 3DQBUS is a promising technique for classifying suspicious breast lesions as benign, potentially preventing unnecessary biopsies.
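The evaluation pipeline described in this abstract (logistic regression, leave-one-out cross-validation, ROC analysis) can be sketched generically with scikit-learn. The synthetic features and labels below are placeholders, not the study's data, and the class-shift magnitude is an arbitrary assumption:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, d = 60, 8                        # e.g. 60 lesions, 8 quantitative features
y = rng.integers(0, 2, size=n)      # 0 = benign, 1 = malignant (synthetic)
X = rng.normal(size=(n, d)) + y[:, None] * 0.8  # class-shifted toy features

# Leave-one-out: fit on n-1 lesions, score the single held-out lesion
scores = np.empty(n)
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

# Pool the held-out scores into a single ROC curve
auc = roc_auc_score(y, scores)
print(round(auc, 2))
```

Because each lesion is scored by a model that never saw it, the pooled AUC is an (approximately) unbiased estimate of generalization performance, which is what makes leave-one-out attractive for datasets of only ~100 cases.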
Collapse
Affiliation(s)
- A S S Meel-van den Abeelen
- Department of Biomechanical Engineering, MIRA-Institute, University of Twente, P.O. Box 217, 7500 AE Enschede, The Netherlands; Medical UltraSound Imaging Center (MUSIC), department of Radiology and Nuclear Medicine, Radboud University Medical Center, P.O. Box 9101, 6500 HB Nijmegen, The Netherlands.
| | - G Weijers
- Medical UltraSound Imaging Center (MUSIC), department of Radiology and Nuclear Medicine, Radboud University Medical Center, P.O. Box 9101, 6500 HB Nijmegen, The Netherlands
| | - J C M van Zelst
- Radboud University Nijmegen Medical Centre, Department of Radiology and Nuclear Medicine, PO Box 9101, 6500 HB Nijmegen, The Netherlands
| | - J M Thijssen
- Medical UltraSound Imaging Center (MUSIC), department of Radiology and Nuclear Medicine, Radboud University Medical Center, P.O. Box 9101, 6500 HB Nijmegen, The Netherlands
| | - R M Mann
- Radboud University Nijmegen Medical Centre, Department of Radiology and Nuclear Medicine, PO Box 9101, 6500 HB Nijmegen, The Netherlands
| | - C L de Korte
- Medical UltraSound Imaging Center (MUSIC), department of Radiology and Nuclear Medicine, Radboud University Medical Center, P.O. Box 9101, 6500 HB Nijmegen, The Netherlands
| |
Collapse
|
47
|
Burkett BJ, Hanemann CW. A Review of Supplemental Screening Ultrasound for Breast Cancer: Certain Populations of Women with Dense Breast Tissue May Benefit. Acad Radiol 2016; 23:1604-1609. [PMID: 27374700 DOI: 10.1016/j.acra.2016.05.017] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2016] [Revised: 05/15/2016] [Accepted: 05/17/2016] [Indexed: 10/21/2022]
Abstract
Breast density has been shown to be a strong, independent risk factor for breast cancer. Unfortunately, mammography is less accurate on dense breast tissue compared to fattier breast tissue. Multiple studies suggest a solution to this by demonstrating the ability of supplemental screening ultrasound to detect additional malignant lesions in women with dense breast tissue but negative mammography. In particular, supplemental screening ultrasound may be beneficial to women with dense breast tissue and intermediate or average risk for breast cancer, women in specific ethnic populations with greater prevalence of dense breast tissue, and women living in resource-poor healthcare environments. Although magnetic resonance imaging is currently recommended for women with high risk for breast cancer, not all women can access or tolerate a magnetic resonance imaging examination. Notably, ultrasound does not require intravenous gadolinium and may be an alternative for women with socioeconomic or medical restrictions, which limit their access to magnetic resonance imaging. Limitations of supplemental screening ultrasound include a substantial rate of false-positives, increased cost, and limited resource availability, particularly in regard to the time required for image interpretation. Additional clinical experience with this application of ultrasound, improved patient selection criteria, and new technology, such as the promising results seen with automated whole breast ultrasound, may address these limitations. In light of recent legislation in some states that has called for discussing supplemental imaging with patients who have dense breast tissue, the optimal role for supplemental screening ultrasound merits further exploration.
Collapse
|
48
|
Wang S, Cong Y, Fan H, Liu L, Li X, Yang Y, Tang Y, Zhao H, Yu H. Computer-Aided Endoscopic Diagnosis Without Human-Specific Labeling. IEEE Trans Biomed Eng 2016; 63:2347-2358. [DOI: 10.1109/tbme.2016.2530141] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
|
49
|
Lo CM, Chan SW, Yang YW, Chang YC, Huang CS, Jou YS, Chang RF. Feasibility Testing: Three-dimensional Tumor Mapping in Different Orientations of Automated Breast Ultrasound. ULTRASOUND IN MEDICINE & BIOLOGY 2016; 42:1201-1210. [PMID: 26825468 DOI: 10.1016/j.ultrasmedbio.2015.12.006] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/21/2015] [Revised: 11/20/2015] [Accepted: 12/02/2015] [Indexed: 06/05/2023]
Abstract
A tumor-mapping algorithm was proposed to identify the same regions in different passes of automated breast ultrasound (ABUS). A total of 53 abnormal passes with 41 biopsy-proven tumors and 13 normal passes were collected. After computer-aided tumor detection, a mapping pair was composed of a detected region in one pass and another region in another pass. Location criteria, including the radial position as on a clock, the relative distance and the distance to the nipple, were used to extract mapping pairs with close regions. Quantitative intensity, morphology, texture and location features were then combined in a classifier for further classification. The performance of the classifier achieved a mapping rate of 80.39% (41/51), with an error rate of 5.97% (4/67). The trade-offs between the mapping and error rates were evaluated, and Az = 0.9094 was obtained. The proposed tumor-mapping algorithm was capable of automatically providing location correspondence information that would be helpful in reviews of ABUS examinations.
Collapse
Affiliation(s)
- Chung-Ming Lo
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei, Taiwan; Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan
| | - Si-Wa Chan
- Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan; Department of Radiology, Taichung Veterans General Hospital, Taichung, Taiwan
| | - Ya-Wen Yang
- Department of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan
| | - Yeun-Chung Chang
- Department of Medical Imaging, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan
| | - Chiun-Sheng Huang
- Department of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan.
| | - Yi-Sheng Jou
- Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan
| | - Ruey-Feng Chang
- Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan; Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan.
| |
Collapse
|
50
|
Deng Y, Liu W, Jago J. A hierarchical model for automated breast lesion detection from ultrasound 3D data. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2016; 2015:145-8. [PMID: 26736221 DOI: 10.1109/embc.2015.7318321] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Ultrasound imaging plays an important role in breast cancer screening, for which early and accurate lesion detection is crucial in clinical practice. Much research has been performed to support breast lesion detection from ultrasound data. In this paper, a novel hierarchical model is proposed to automatically detect breast lesions from ultrasound 3D data. The model simultaneously considers image information from low level to high level by processing the detection with a joint probability. For each layer of the model, a corresponding algorithm represents the image information at that level. A dynamic programming approach is applied to efficiently obtain the optimal solution. On a preliminary dataset, the superior performance of the proposed model was demonstrated for the automated detection of breast lesions, with 0.375 false positives per case at 91.7% sensitivity.
Collapse
|