1. Manh V, Jia X, Xue W, Xu W, Mei Z, Dong Y, Zhou J, Huang R, Ni D. An efficient framework for lesion segmentation in ultrasound images using global adversarial learning and region-invariant loss. Comput Biol Med 2024; 171:108137. [PMID: 38447499] [DOI: 10.1016/j.compbiomed.2024.108137]
Abstract
Lesion segmentation in ultrasound images is an essential yet challenging step in the early evaluation and diagnosis of cancers. In recent years, many automatic CNN-based methods have been proposed to assist this task. However, most approaches fail to capture long-range dependencies and prior information, making it difficult to identify lesions with variable shapes, sizes, locations, and textures. To address this, we present a novel lesion segmentation framework that guides the model to learn global information about lesion characteristics and invariant features (e.g., morphological features) of lesions to improve segmentation in ultrasound images. Specifically, the segmentation model is guided to learn the characteristics of lesions from global maps using an adversarial learning scheme with a self-attention-based discriminator. We argue that under such a lesion characteristics-based guidance mechanism, the segmentation model receives more clues about the boundaries, shapes, sizes, and positions of lesions and can produce reliable predictions. In addition, as ultrasound lesions have different textures, we embed this prior knowledge into a novel region-invariant loss that constrains the model to focus on invariant features for robust segmentation. We evaluate our method on one in-house breast ultrasound (BUS) dataset and two public datasets (the BUS B breast lesion dataset and the TNSCUI2020 thyroid nodule dataset). Experimental results show that our method is particularly well suited to lesion segmentation in ultrasound images and outperforms state-of-the-art approaches, with Dice scores of 0.931, 0.906, and 0.876, respectively. The proposed method provides richer information about lesion characteristics, especially for lesions with irregular shapes and small sizes, and can help current lesion segmentation models better suit clinical needs.
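The Dice scores reported above measure overlap between a predicted binary mask and the ground-truth mask. A minimal sketch of that metric (the function name, toy masks, and epsilon are illustrative, not from the paper):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity between two binary masks (1 = lesion pixel)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Toy 4x4 masks: the prediction overlaps the true lesion in 3 pixels.
pred = np.array([[0,0,0,0],[0,1,1,0],[0,1,1,0],[0,0,0,0]])
true = np.array([[0,0,0,0],[0,1,1,0],[0,1,0,0],[0,0,0,0]])
score = dice_coefficient(pred, true)  # 2*3 / (4+3) ≈ 0.857
```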
Affiliation(s)
- Van Manh: Medical Ultrasound Image Computing (MUSIC) Lab, School of Biomedical Engineering, Shenzhen University, Shenzhen 518060, China
- Xiaohong Jia: Department of Ultrasound Medicine, Ruijin Hospital, School of Medicine, Shanghai Jiaotong University, Shanghai 200240, China
- Wufeng Xue: Medical Ultrasound Image Computing (MUSIC) Lab, School of Biomedical Engineering, Shenzhen University, Shenzhen 518060, China
- Wenwen Xu: Department of Ultrasound Medicine, Ruijin Hospital, School of Medicine, Shanghai Jiaotong University, Shanghai 200240, China
- Zihan Mei: Department of Ultrasound Medicine, Ruijin Hospital, School of Medicine, Shanghai Jiaotong University, Shanghai 200240, China
- Yijie Dong: Department of Ultrasound Medicine, Ruijin Hospital, School of Medicine, Shanghai Jiaotong University, Shanghai 200240, China
- Jianqiao Zhou: Department of Ultrasound Medicine, Ruijin Hospital, School of Medicine, Shanghai Jiaotong University, Shanghai 200240, China
- Ruobing Huang: Medical Ultrasound Image Computing (MUSIC) Lab, School of Biomedical Engineering, Shenzhen University, Shenzhen 518060, China
- Dong Ni: Medical Ultrasound Image Computing (MUSIC) Lab, School of Biomedical Engineering, Shenzhen University, Shenzhen 518060, China
2. Wang K, Liang S, Zhong S, Feng Q, Ning Z, Zhang Y. Breast ultrasound image segmentation: A coarse-to-fine fusion convolutional neural network. Med Phys 2021; 48:4262-4278. [PMID: 34053092] [DOI: 10.1002/mp.15006]
Abstract
PURPOSE: Breast ultrasound (BUS) image segmentation plays a crucial role in computer-aided diagnosis systems for BUS examination, which help improve the accuracy of breast cancer diagnosis. However, segmentation remains challenging owing to poor image quality and large variations in the sizes, shapes, and locations of breast lesions. In this paper, we propose a new convolutional neural network with coarse-to-fine feature fusion to address these challenges. METHODS: The proposed fusion network consists of an encoder path, a decoder path, and a core fusion stream path (FSP). The encoder path captures context information, and the decoder path is used for localization prediction. The FSP is designed to generate beneficial aggregate feature representations (i.e., various-sized lesion features, aggregated coarse-to-fine information, and high-resolution edge characteristics) from the encoder and decoder paths, which are eventually used for accurate breast lesion segmentation. To better retain boundary information and alleviate the effect of image noise, we input the superpixel image along with the original image into the fusion network. Furthermore, a weighted-balanced loss function was designed to address the problem of lesion regions having different sizes. We conducted exhaustive experiments on three public BUS datasets to evaluate the proposed network. RESULTS: The proposed method outperformed state-of-the-art (SOTA) segmentation methods on the three public BUS datasets, with average Dice similarity coefficients of 84.71 (±1.07), 83.76 (±0.83), and 86.52 (±1.52); average intersection-over-union values of 76.34 (±1.50), 75.70 (±0.98), and 77.86 (±2.07); average sensitivities of 86.66 (±1.82), 85.21 (±1.98), and 87.21 (±2.51); average specificities of 97.92 (±0.46), 98.57 (±0.19), and 99.42 (±0.21); and average accuracies of 95.89 (±0.57), 97.17 (±0.3), and 98.51 (±0.3).
CONCLUSIONS: The proposed fusion network can effectively segment lesions from BUS images, presenting a new feature fusion strategy for this challenging segmentation task while outperforming SOTA segmentation methods. The code is publicly available at https://github.com/mniwk/CF2-NET.
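The weighted-balanced loss above targets the imbalance between small lesion regions and the large background. One common way to realize such a weighting (a sketch under the assumption of inverse-frequency class weights, not the authors' exact formulation) is a weighted binary cross-entropy:

```python
import numpy as np

def balanced_bce(pred_prob, target, eps=1e-7):
    """Binary cross-entropy with inverse-frequency class weights,
    so small lesions are not swamped by the large background."""
    target = np.asarray(target, dtype=float)
    pred_prob = np.clip(np.asarray(pred_prob, dtype=float), eps, 1 - eps)
    n_fg = target.sum()
    n_bg = target.size - n_fg
    w_fg = target.size / (2.0 * max(n_fg, 1.0))   # up-weight rare lesion pixels
    w_bg = target.size / (2.0 * max(n_bg, 1.0))
    loss = -(w_fg * target * np.log(pred_prob)
             + w_bg * (1 - target) * np.log(1 - pred_prob))
    return loss.mean()

# A confident correct prediction scores lower than an uninformative one.
loss_good = balanced_bce([0.99, 0.01, 0.01, 0.01], [1, 0, 0, 0])
loss_flat = balanced_bce([0.5, 0.5, 0.5, 0.5], [1, 0, 0, 0])
```

With these weights, a uniform 0.5 prediction incurs exactly ln 2 regardless of how rare the lesion class is, which is the balancing effect being illustrated.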
Affiliation(s)
- Ke Wang: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515, China
- Shujun Liang: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515, China
- Shengzhou Zhong: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515, China
- Qianjin Feng: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515, China
- Zhenyuan Ning: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515, China
- Yu Zhang: School of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China; Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515, China
3. Agarwal R, Diaz O, Lladó X, Gubern-Mérida A, Vilanova JC, Martí R. Lesion Segmentation in Automated 3D Breast Ultrasound: Volumetric Analysis. Ultrasonic Imaging 2018; 40:97-112. [PMID: 29182056] [DOI: 10.1177/0161734617737733]
Abstract
Mammography is the gold standard screening technique for breast cancer, but it has limitations for women with dense breasts. In such cases, sonography is usually recommended as an additional imaging technique. A traditional sonogram produces a two-dimensional (2D) visualization of the breast and is highly operator dependent. Automated breast ultrasound (ABUS) has been proposed to produce a full 3D scan of the breast automatically with reduced operator dependency, facilitating double reading and comparison with past exams. When using ABUS, lesion segmentation and tracking changes over time are challenging tasks, as the three-dimensional (3D) nature of the images makes the analysis difficult and tedious for radiologists. The goal of this work is to develop a semi-automatic framework for breast lesion segmentation in ABUS volumes based on the watershed algorithm. The effect of different de-noising methods on segmentation is studied, showing a significant impact ([Formula: see text]) on performance using a dataset of 28 temporal pairs, i.e., a total of 56 ABUS volumes. Volumetric analysis is also used to evaluate the performance of the developed framework. A mean Dice similarity coefficient of [Formula: see text] with a mean false positive ratio of [Formula: see text] has been obtained. The Pearson correlation coefficient between the segmented volumes and the corresponding ground truth volumes is [Formula: see text] ([Formula: see text]). A similar analysis, performed on the 28 temporal (prior and current) pairs, resulted in a good correlation coefficient of [Formula: see text] ([Formula: see text]) for prior and [Formula: see text] ([Formula: see text]) for current cases. The developed framework shows promise for helping radiologists assess ABUS lesion volumes and quantify volumetric changes during lesion diagnosis and follow-up.
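The volumetric agreement above is reported as a Pearson correlation between segmented and ground-truth lesion volumes. A minimal sketch of that statistic (the volume values are made up for illustration):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between paired volume measurements."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

# Hypothetical segmented vs. ground-truth lesion volumes (mm^3).
seg = [120.0, 85.0, 230.0, 150.0]
gt  = [118.0, 90.0, 222.0, 160.0]
r = pearson_r(seg, gt)  # close to 1 when the framework tracks true volume
```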
Affiliation(s)
- Richa Agarwal: Computer Vision and Robotics Institute (VICOROB), University of Girona, Girona, Spain
- Oliver Diaz: Computer Vision and Robotics Institute (VICOROB), University of Girona, Girona, Spain
- Xavier Lladó: Computer Vision and Robotics Institute (VICOROB), University of Girona, Girona, Spain
- Albert Gubern-Mérida: Department of Radiology and Nuclear Medicine, Radboud University Medical Center, Nijmegen, The Netherlands
- Robert Martí: Computer Vision and Robotics Institute (VICOROB), University of Girona, Girona, Spain
4. An optimized non-local means filter using automated clustering based preclassification through gap statistics for speckle reduction in breast ultrasound images. Applied Computing and Informatics 2018. [DOI: 10.1016/j.aci.2017.01.002]
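Non-local means, the filter this entry builds on, replaces each pixel with a weighted average of pixels whose surrounding patches look similar. A minimal pure-numpy sketch of the plain filter (the clustering-based preclassification and gap-statistic optimization from the paper are omitted; parameter names are illustrative):

```python
import numpy as np

def nlm_denoise(img, patch=1, search=3, h=0.1):
    """Minimal non-local means: weight each neighbour by patch similarity,
    weights ~ exp(-patch_distance^2 / h^2)."""
    img = np.asarray(img, dtype=float)
    pad = patch + search
    padded = np.pad(img, pad, mode='reflect')
    H, W = img.shape
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            ref = padded[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            weights, vals = [], []
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - patch:ni + patch + 1,
                                  nj - patch:nj + patch + 1]
                    d2 = ((ref - cand) ** 2).mean()  # patch dissimilarity
                    weights.append(np.exp(-d2 / (h * h)))
                    vals.append(padded[ni, nj])
            weights = np.array(weights)
            out[i, j] = (weights * np.array(vals)).sum() / weights.sum()
    return out
```

A flat image passes through unchanged, while averaging over similar patches suppresses speckle-like noise; production code would use an optimized implementation rather than this per-pixel loop.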
5. Xie X, Shi F, Niu J, Tang X. Breast Ultrasound Image Classification and Segmentation Using Convolutional Neural Networks. Advances in Multimedia Information Processing – PCM 2018. [DOI: 10.1007/978-3-030-00764-5_19]
6. Lal M, Kaur L, Gupta S. Automatic segmentation of tumors in B-mode breast ultrasound images using information gain based neutrosophic clustering. Journal of X-Ray Science and Technology 2018; 26:209-225. [PMID: 29154313] [DOI: 10.3233/xst-17313]
Abstract
BACKGROUND: Breast ultrasound images have low contrast and contain inherent noise and shadowing effects due to the imaging process, so segmentation of breast tumors in ultrasound images is a challenging task. A robust breast ultrasound image segmentation technique is therefore needed. OBJECTIVE: To develop an automatic lesion segmentation technique for breast ultrasound images. METHODS: First, the technique automatically detects the suspicious tumor region of interest and discards the unwanted complex background regions. Next, based on the concept of information gain, the technique applies an existing neutrosophic clustering method to the detected region to segment the desired tumor area. The proposed technique computes information gain values from the local neighbourhood of each pixel, which are then used to update the membership values and the cluster centers in the neutrosophic clustering process. Integrating the concepts of entropy and neutrosophic logic into the technique yields better segmentation results. RESULTS: Results of the proposed method were compared both qualitatively and quantitatively with fuzzy c-means, neutrosophic c-means, and neutrosophic ℓ-means clustering methods. The proposed method outperformed the other three methods and yielded the best mean (TP: 94.72, FP: 5.85, SI: 93.75, HD: 8.2, AMED: 2.4) and standard deviation (TP: 3.2, FP: 3.7, SI: 3.8, HD: 2.6, AMED: 1.3) values for the quality metrics on the current set of breast ultrasound images. CONCLUSION: The study demonstrated that the proposed technique is robust to the shadowing effect and produces more accurate segmentation of the tumor region, very similar to that segmented visually by a radiologist.
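The information-gain quantity driving the clustering above is standard entropy reduction: the entropy of a pixel population minus the weighted entropy after a candidate partition. A minimal sketch of that computation (function names and the toy counts are illustrative, not the paper's exact neighbourhood formulation):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (base 2) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return -(p * np.log2(p)).sum()

def information_gain(parent_counts, split_counts):
    """Entropy of the parent minus the weighted entropy of the splits."""
    parent = np.asarray(parent_counts, dtype=float)
    n = parent.sum()
    h_parent = entropy(parent / n)
    h_children = sum((np.sum(c) / n) * entropy(np.asarray(c, float) / np.sum(c))
                     for c in split_counts)
    return h_parent - h_children

# Toy example: 8 pixels, half lesion / half background.
# A perfect split recovers the full parent entropy (1 bit) as gain.
gain = information_gain([4, 4], [[4, 0], [0, 4]])
```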
Affiliation(s)
- Madan Lal: Department of Computer Engineering, Punjabi University, Patiala, India
- Lakhwinder Kaur: Department of Computer Engineering, Punjabi University, Patiala, India
- Savita Gupta: Department of Computer Science and Engineering, University Institute of Engineering and Technology, Panjab University, Chandigarh, India
7. Meiburger KM, Acharya UR, Molinari F. Automated localization and segmentation techniques for B-mode ultrasound images: A review. Comput Biol Med 2017; 92:210-235. [PMID: 29247890] [DOI: 10.1016/j.compbiomed.2017.11.018]
Abstract
B-mode ultrasound imaging is used extensively in medicine. Hence, there is a need for efficient segmentation tools to aid computer-aided diagnosis, image-guided interventions, and therapy. This paper presents a comprehensive review of automated localization and segmentation techniques for B-mode ultrasound images. The paper first describes the general characteristics of B-mode ultrasound images. It then discusses the localization and segmentation of tissues, both in the case in which organ/tissue localization provides the final segmentation and in the case in which a two-step segmentation process is needed because the desired boundaries are too fine to locate from within the entire ultrasound frame. Subsequently, examples of the main techniques found in the literature are shown, including but not limited to shape priors, superpixel and classification, local pixel statistics, active contours, edge-tracking, dynamic programming, and data mining. Ten selected applications (abdomen/kidney, breast, cardiology, thyroid, liver, vascular, musculoskeletal, obstetrics, gynecology, prostate) are then investigated in depth, and the performances of a few specific applications are compared. In conclusion, future perspectives for B-mode-based segmentation are discussed, such as the integration of RF information, the use of higher-frequency probes when possible, the focus on completely automatic algorithms, and the increase in available data.
Affiliation(s)
- Kristen M Meiburger: Biolab, Department of Electronics and Telecommunications, Politecnico di Torino, Torino, Italy
- U Rajendra Acharya: Department of Electronic & Computer Engineering, Ngee Ann Polytechnic, Singapore; Department of Biomedical Engineering, School of Science and Technology, SUSS University, Singapore; Department of Biomedical Imaging, Faculty of Medicine, University of Malaya, Kuala Lumpur, Malaysia
- Filippo Molinari: Biolab, Department of Electronics and Telecommunications, Politecnico di Torino, Torino, Italy
8. Xiong H, Sultan LR, Cary TW, Schultz SM, Bouzghar G, Sehgal CM. The diagnostic performance of leak-plugging automated segmentation versus manual tracing of breast lesions on ultrasound images. Ultrasound: Journal of the British Medical Ultrasound Society 2017; 25:98-106. [PMID: 28567104] [DOI: 10.1177/1742271x17690425]
Abstract
PURPOSE: To assess the diagnostic performance of a leak-plugging segmentation method that we have developed for delineating breast masses on ultrasound images. MATERIALS AND METHODS: Fifty-two biopsy-proven breast lesion images were analyzed by three observers using the leak-plugging and manual segmentation methods. From each segmentation method, grayscale and morphological features were extracted and classified as malignant or benign by logistic regression analysis. The performance of leak-plugging and manual segmentations was compared by size of the lesion, overlap area (Oa) between the margins, and area under the ROC curve (Az). RESULTS: The lesion size from leak-plugging segmentation correlated closely with that from manual tracing (R2 of 0.91). Oa was higher for leak plugging, 0.92 ± 0.01 and 0.86 ± 0.06 for benign and malignant masses, respectively, compared to 0.80 ± 0.04 and 0.73 ± 0.02 for manual tracings. Overall Oa between leak-plugging and manual segmentations was 0.79 ± 0.14 for benign and 0.73 ± 0.14 for malignant lesions. Az for leak plugging was consistently higher (0.910 ± 0.003) compared to 0.888 ± 0.012 for manual tracings. The coefficient of variation of Az between the three observers was 0.29% for leak plugging compared to 1.3% for manual tracings. CONCLUSION: The diagnostic performance, size measurements, and observer variability for automated leak-plugging segmentations were either comparable to or better than those of manual tracings.
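The Az values above are areas under the ROC curve. Az can be computed directly from classifier scores via the Mann-Whitney identity: it equals the probability that a randomly chosen malignant case scores higher than a randomly chosen benign one. A minimal sketch (the toy scores and labels are illustrative):

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve (Az) via the rank-sum identity:
    fraction of (malignant, benign) pairs ranked correctly, ties count half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    pos = scores[labels == 1]          # malignant
    neg = scores[labels == 0]          # benign
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))

# Toy classifier outputs: higher score = more suspicious.
az = roc_auc([0.9, 0.4, 0.25, 0.3, 0.2], [1, 1, 1, 0, 0])  # 5 of 6 pairs correct
```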
Affiliation(s)
- Hui Xiong: Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- Laith R Sultan: Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- Theodore W Cary: Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- Susan M Schultz: Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- Ghizlane Bouzghar: Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- Chandra M Sehgal: Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
9. Breast ultrasound image segmentation: a survey. Int J Comput Assist Radiol Surg 2017; 12:493-507. [DOI: 10.1007/s11548-016-1513-1]
10. Tan T, Gubern-Mérida A, Borelli C, Manniesing R, van Zelst J, Wang L, Zhang W, Platel B, Mann RM, Karssemeijer N. Segmentation of malignant lesions in 3D breast ultrasound using a depth-dependent model. Med Phys 2016; 43:4074. [DOI: 10.1118/1.4953206]
11. Elawady M, Sadek I, Shabayek AER, Pons G, Ganau S. Automatic Nonlinear Filtering and Segmentation for Breast Ultrasound Images. Lecture Notes in Computer Science 2016:206-213. [DOI: 10.1007/978-3-319-41501-7_24]