1
Guo Y, Zhang H, Yuan L, Chen W, Zhao H, Yu QQ, Shi W. Machine learning and new insights for breast cancer diagnosis. J Int Med Res 2024; 52:3000605241237867. [PMID: 38663911] [PMCID: PMC11047257] [DOI: 10.1177/03000605241237867]
Abstract
Breast cancer (BC) is the most common form of cancer among women worldwide. Current methods of BC detection include X-ray mammography, ultrasound, computed tomography, magnetic resonance imaging, positron emission tomography and breast thermography. More recently, machine learning (ML) tools have been increasingly employed in diagnostic medicine owing to their high efficiency in detection and intervention. Imaging features and mathematical analyses can be used to generate ML models that stratify, differentiate and detect benign and malignant breast lesions. Given its marked advantages, radiomics is a frequently used tool in recent research and clinical practice. Artificial neural networks and deep learning (DL) are novel forms of ML that evaluate data using computer simulations of the human brain. DL directly processes unstructured information, such as images, sounds and language, and performs precise clinical image stratification, medical record analysis and tumour diagnosis. This review summarizes prior investigations on the application of medical images for the detection and intervention of BC using radiomics, DL and ML, with the aim of guiding scientists in the use of artificial intelligence and ML in research and the clinic.
Affiliation(s)
- Ya Guo
- Department of Oncology, Jining No.1 People’s Hospital, Shandong First Medical University, Jining, Shandong Province, China
- Heng Zhang
- Department of Laboratory Medicine, Shandong Daizhuang Hospital, Jining, Shandong Province, China
- Leilei Yuan
- Department of Oncology, Jining No.1 People’s Hospital, Shandong First Medical University, Jining, Shandong Province, China
- Weidong Chen
- Department of Oncology, Jining No.1 People’s Hospital, Shandong First Medical University, Jining, Shandong Province, China
- Haibo Zhao
- Department of Oncology, Jining No.1 People’s Hospital, Shandong First Medical University, Jining, Shandong Province, China
- Qing-Qing Yu
- Phase I Clinical Research Centre, Jining No.1 People’s Hospital, Shandong First Medical University, Jining, Shandong Province, China
- Wenjie Shi
- Molecular and Experimental Surgery, University Clinic for General-, Visceral-, Vascular- and Transplantation Surgery, Medical Faculty University Hospital Magdeburg, Otto von Guericke University, Magdeburg, Germany
2
Systematic analysis of changes in radiomics features during dynamic breast-MRI: Evaluation of specific biomarkers. Clin Imaging 2022; 93:93-102. [DOI: 10.1016/j.clinimag.2022.10.013]
3
Wang S, Liu H, Yang T, Huang M, Zheng B, Wu T, Han L, Zhang Y, Ren J. Machine learning based on automated breast volume scanner (ABVS) radiomics for differential diagnosis of benign and malignant BI-RADS 4 lesions. Int J Imaging Syst Technol 2022; 32:1577-1587. [DOI: 10.1002/ima.22724]
Abstract
BI-RADS category 4 represents possibly malignant lesions, and biopsy is recommended to distinguish benign from malignant. However, studies have revealed that up to 67%–78% of BI-RADS 4 lesions prove to be benign yet receive unnecessary biopsies, which may cause unnecessary anxiety and discomfort to patients and increase the burden on the healthcare system. In this prospective study, machine learning (ML) based on an emerging breast ultrasound technology, the automated breast volume scanner (ABVS), was used to distinguish benign and malignant BI-RADS 4 lesions and compared with radiologists of different experience levels. A total of 223 pathologically confirmed BI-RADS 4 lesions were recruited and divided into training and testing cohorts. Radiomics features were extracted from axial, sagittal and coronal ABVS images for each lesion. Seven feature selection methods and 13 ML algorithms were used to construct different ML pipelines, of which the DNN-RFE (a combination of recursive feature elimination and deep neural networks) had the best performance in both training and testing cohorts. The AUC of the DNN-RFE was significantly higher than that of the less experienced radiologist (DeLong's test; 0.954 vs. 0.776, p = 0.004). Additionally, the accuracy, sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the DNN-RFE were 88.9%, 83.3%, 95.2%, 83.3% and 95.2%, respectively, which were also significantly better than those of the less experienced radiologist (McNemar's test; p = 0.043). Therefore, ML based on ABVS radiomics may be a potential method to non-invasively distinguish benign and malignant BI-RADS 4 lesions.
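The feature-selection-plus-classifier design described above can be sketched as follows. This is a minimal illustration only, using synthetic data in place of ABVS radiomics features and a linear surrogate estimator for the RFE step (RFE needs a model that exposes feature weights, which a neural network does not); the study's actual DNN-RFE architecture and hyperparameters are not reproduced here.

```python
# Hedged sketch of an RFE + neural-network pipeline, loosely modelled on
# the "DNN-RFE" idea: select features first, then train a deep classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for radiomics features of 223 BI-RADS 4 lesions.
X, y = make_classification(n_samples=223, n_features=100, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Recursive feature elimination with a linear surrogate that exposes coef_.
selector = RFE(LogisticRegression(max_iter=1000), n_features_to_select=20)
selector.fit(X_train, y_train)
X_train_sel, X_test_sel = selector.transform(X_train), selector.transform(X_test)

# Small multilayer perceptron on the selected features.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
clf.fit(X_train_sel, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test_sel)[:, 1])
print(f"test AUC: {auc:.3f}")
```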
Affiliation(s)
- Shi‐jie Wang
- Department of Medical Ultrasonics, The Third Affiliated Hospital of Sun Yat‐sen University, Guangzhou, China
- Hua‐qing Liu
- Artificial Intelligence Innovation Center, Research Institute of Tsinghua, Guangzhou, China
- Tao Yang
- Department of Ultrasound, The Affiliated Hospital of Southwest Medical University, Sichuan, China
- Ming‐quan Huang
- Department of Breast Surgery, The Affiliated Hospital of Southwest Medical University, Sichuan, China
- Bo‐wen Zheng
- Department of Medical Ultrasonics, The Third Affiliated Hospital of Sun Yat‐sen University, Guangzhou, China
- Tao Wu
- Department of Medical Ultrasonics, The Third Affiliated Hospital of Sun Yat‐sen University, Guangzhou, China
- Lan‐qing Han
- Artificial Intelligence Innovation Center, Research Institute of Tsinghua, Guangzhou, China
- Yong Zhang
- Department of Nuclear Medicine, The Third Affiliated Hospital of Sun Yat‐sen University, Guangzhou, China
- Jie Ren
- Department of Medical Ultrasonics, The Third Affiliated Hospital of Sun Yat‐sen University, Guangzhou, China
4
Lan Z, Peng Y. Artificial intelligence diagnosis based on breast ultrasound imaging. Zhong Nan Da Xue Xue Bao Yi Xue Ban (J Cent South Univ Med Sci) 2022; 47:1009-1015. [PMID: 36097768] [PMCID: PMC10950100] [DOI: 10.11817/j.issn.1672-7347.2022.220110]
Abstract
Breast cancer has become the leading cancer in women. The development of breast ultrasound artificial intelligence (AI) diagnostic technology is conducive to promoting the precise diagnosis and treatment of breast cancer and to alleviating the heavy medical burden caused by unbalanced regional development in China. In recent years, beyond improving diagnostic efficiency, AI technology has been continuously combined with various clinical application scenarios, thereby providing more comprehensive and reliable evidence-based suggestions for clinical decision-making. Although AI diagnostic technologies based on conventional breast ultrasound grey-scale images and cutting-edge techniques such as three-dimensional (3D) imaging and elastography have developed to some extent, technical pain points, difficulties in dissemination and ethical dilemmas remain in the development of AI diagnostic technologies for breast ultrasound.
Affiliation(s)
- Zihan Lan
- Department of Ultrasound, West China Hospital, Sichuan University, Chengdu 610000, China.
- Yulan Peng
- Department of Ultrasound, West China Hospital, Sichuan University, Chengdu 610000, China.
5
Landsmann A, Ruppert C, Wieler J, Hejduk P, Ciritsis A, Borkowski K, Wurnig MC, Rossi C, Boss A. Radiomics in photon-counting dedicated breast CT: potential of texture analysis for breast density classification. Eur Radiol Exp 2022; 6:30. [PMID: 35854186] [PMCID: PMC9296720] [DOI: 10.1186/s41747-022-00285-x]
Abstract
Background: We investigated whether features derived from texture analysis (TA) can distinguish breast density (BD) in spiral photon-counting breast computed tomography (PC-BCT). Methods: In this retrospective single-centre study, we analysed 10,000 images from 400 PC-BCT examinations of 200 patients. Images were categorised into a four-level density scale (a–d) using Breast Imaging Reporting and Data System (BI-RADS)-like criteria. After manual definition of representative regions of interest, 19 texture features (TFs) were calculated to analyse the voxel grey-level distribution in the included image area. ANOVA, cluster analysis and multinomial logistic regression statistics were used. A human readout was then performed on a subset of 60 images to evaluate the reliability of the proposed feature set. Results: Of the 19 TFs, 4 first-order features and 7 second-order features showed significant correlation with BD and were selected for further analysis. Multinomial logistic regression revealed an overall accuracy of 80% for BD assessment. The majority of TFs systematically increased or decreased with BD. Skewness (rho -0.81), as a first-order feature, and grey-level nonuniformity (GLN, -0.59), as a second-order feature, showed the strongest correlation with BD, independently of other TFs. Mean skewness and GLN decreased linearly from density a to d. Run-length nonuniformity (RLN), as a second-order feature, showed moderate correlation with BD but was redundant, being correlated with GLN. All other TFs showed only weak correlation with BD (range -0.49 to 0.49, p < 0.001) and were neglected. Conclusion: TA of PC-BCT images might be a useful approach to assess BD and may serve as an observer-independent tool.
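The distinction between first-order and second-order texture features used above can be illustrated with a small sketch. First-order features (such as skewness) describe the grey-level histogram alone; second-order features are derived from spatial relationships between pixel pairs, here via a grey-level co-occurrence matrix. The ROI is a random stand-in for an image region, and GLCM energy stands in for the co-occurrence-based features of the study (the paper's GLN feature comes from a run-length matrix, not shown here).

```python
# Minimal sketch of first-order vs. second-order texture features on an ROI.
import numpy as np

def first_order_skewness(roi):
    """First-order feature: skewness of the ROI grey-level histogram."""
    x = roi.astype(float).ravel()
    mu, sigma = x.mean(), x.std()
    return float(((x - mu) ** 3).mean() / sigma ** 3)

def glcm_energy(roi, levels=16):
    """Second-order feature: energy of a horizontal grey-level
    co-occurrence matrix built from neighbouring pixel pairs."""
    q = (roi.astype(int) * levels // 256).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    # Count co-occurrences of horizontally adjacent quantised grey levels.
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    glcm /= glcm.sum()  # normalise to a joint probability distribution
    return float((glcm ** 2).sum())

rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64))  # stand-in for a PC-BCT ROI
print(first_order_skewness(roi), glcm_energy(roi))
```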
Affiliation(s)
- Anna Landsmann
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland.
- Carlotta Ruppert
- Institute of Computational Physics, Zurich University of Applied Sciences, Zurich, Switzerland
- Jann Wieler
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland
- Patryk Hejduk
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland
- Alexander Ciritsis
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland
- Karol Borkowski
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland
- Moritz C Wurnig
- Institute of Diagnostic Radiology, Hospital Lachen AG, Lachen, Switzerland
- Cristina Rossi
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland
- Andreas Boss
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Zurich, Switzerland
6
Abstract
Machine learning (ML) methods are pervading an increasing number of fields of application because of their capacity to effectively solve a wide variety of challenging problems. The employment of ML techniques in ultrasound imaging applications started several years ago but the scientific interest in this issue has increased exponentially in the last few years. The present work reviews the most recent (2019 onwards) implementations of machine learning techniques for two of the most popular ultrasound imaging fields, medical diagnostics and non-destructive evaluation. The former, which covers the major part of the review, was analyzed by classifying studies according to the human organ investigated and the methodology (e.g., detection, segmentation, and/or classification) adopted, while for the latter, some solutions to the detection/classification of material defects or particular patterns are reported. Finally, the main merits of machine learning that emerged from the study analysis are summarized and discussed.
7
Hejduk P, Marcon M, Unkelbach J, Ciritsis A, Rossi C, Borkowski K, Boss A. Fully automatic classification of automated breast ultrasound (ABUS) imaging according to BI-RADS using a deep convolutional neural network. Eur Radiol 2022; 32:4868-4878. [PMID: 35147776] [PMCID: PMC9213284] [DOI: 10.1007/s00330-022-08558-0]
Abstract
Purpose: The aim of this study was to develop and test a post-processing technique for detection and classification of lesions according to the BI-RADS atlas in automated breast ultrasound (ABUS) based on deep convolutional neural networks (dCNNs). Methods and materials: In this retrospective study, 645 ABUS datasets from 113 patients were included; 55 patients had lesions classified as high malignancy probability. Lesions were categorized as BI-RADS 2 (no suspicion of malignancy), BI-RADS 3 (probability of malignancy < 3%), or BI-RADS 4/5 (probability of malignancy > 3%). A deep convolutional neural network was trained after data augmentation with images of lesions and normal breast tissue, and a sliding-window approach for lesion detection was implemented. The algorithm was applied to a test dataset containing 128 images, and performance was compared with readings of 2 experienced radiologists. Results: Calculations performed on single images showed an accuracy of 79.7% and an AUC of 0.91 [95% CI: 0.85–0.96] for categorization according to BI-RADS. Moderate agreement between the dCNN and ground truth was achieved (κ: 0.57 [95% CI: 0.50–0.64]), which is comparable with human readers. Analysis of the whole dataset improved categorization accuracy to 90.9% with an AUC of 0.91 [95% CI: 0.77–1.00], achieving almost perfect agreement with ground truth (κ: 0.82 [95% CI: 0.69–0.95]) and performing on par with human readers. Furthermore, the object localization technique allowed slice-wise detection of lesion position. Conclusions: Our results show that a dCNN can be trained to detect and distinguish lesions in ABUS according to the BI-RADS classification with accuracy similar to that of experienced radiologists. Key Points: • A deep convolutional neural network (dCNN) was trained for classification of ABUS lesions according to the BI-RADS atlas. • A sliding-window approach allows accurate automatic detection and classification of lesions in ABUS examinations.
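The sliding-window detection step mentioned in the abstract can be sketched generically: overlapping patches are cut from the image and each is scored by a classifier, with the highest-scoring window localizing the lesion. Everything here is a toy stand-in; in particular, the mean-intensity `score_fn` merely substitutes for the study's trained dCNN, and the window and stride sizes are arbitrary.

```python
# Hedged sketch of sliding-window lesion detection with a plug-in scorer.
import numpy as np

def sliding_window_detect(image, score_fn, window=32, stride=16):
    """Scan the image with overlapping windows and score each patch.
    Returns a list of (row, col, score) for every window position."""
    hits = []
    for r in range(0, image.shape[0] - window + 1, stride):
        for c in range(0, image.shape[1] - window + 1, stride):
            patch = image[r:r + window, c:c + window]
            hits.append((r, c, score_fn(patch)))
    return hits

rng = np.random.default_rng(0)
img = rng.random((128, 128))
img[40:72, 40:72] += 1.0  # bright blob standing in for a lesion

# Toy scorer: mean intensity in place of a dCNN malignancy probability.
hits = sliding_window_detect(img, score_fn=lambda p: float(p.mean()))
best = max(hits, key=lambda h: h[2])
print("top-scoring window at", best[:2])
```

In the study's setting, the per-window scores across consecutive ABUS slices would additionally give the slice-wise lesion position.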
Affiliation(s)
- Patryk Hejduk
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
- Magda Marcon
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Jan Unkelbach
- Department of Radiation Oncology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Alexander Ciritsis
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Cristina Rossi
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Karol Borkowski
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Andreas Boss
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
8
Dietzel M, Clauser P, Kapetas P, Schulz-Wendtland R, Baltzer PAT. Images Are Data: A Breast Imaging Perspective on a Contemporary Paradigm. Rofo 2021; 193:898-908. [PMID: 33535260] [DOI: 10.1055/a-1346-0095]
Abstract
Background: Considering radiological examinations not as mere images but as a source of data has become the key paradigm in the diagnostic imaging field. This change of perspective is particularly popular in breast imaging. It allows breast radiologists to apply algorithms derived from computer science, to realize innovative clinical applications and to refine already established methods. In this context, the terms "imaging biomarker", "radiomics" and "artificial intelligence" are of pivotal importance. These methods promise noninvasive, low-cost (e.g., in comparison to multigene arrays) and workflow-friendly (automated, only one examination, instantaneous results, etc.) delivery of clinically relevant information. Methods and Results: This paper is designed as a narrative review on the previously mentioned paradigm. The focus is on key concepts in breast imaging, and important buzzwords are explained. For all areas of breast imaging, exemplary studies and potential clinical use cases are discussed. Conclusion: Considering the radiological examination as a source of data may optimize patient management by guiding individualized breast cancer diagnosis and oncologic treatment in the age of precision medicine. Key Points: • In conventional breast imaging, examinations are interpreted based on patterns perceivable by visual inspection. • The radiomics paradigm treats breast images as a source of data, containing information beyond what is visible to our eyes. • This results in radiomic signatures that may be considered imaging biomarkers, as they provide diagnostic, predictive and prognostic information. • Radiomics-derived imaging biomarkers may be used to individualize breast cancer treatment in the era of precision medicine. • The concept and key research of radiomics in the field of breast imaging are discussed in this narrative review. Citation Format: Dietzel M, Clauser P, Kapetas P et al. Images Are Data: A Breast Imaging Perspective on a Contemporary Paradigm. Fortschr Röntgenstr 2021; 193: 898-908.
Affiliation(s)
- Paola Clauser
- Department of Biomedical Imaging and Image-Guided Therapy, Division of Molecular and Gender Imaging, Medical University Vienna, Vienna, Austria
- Panagiotis Kapetas
- Department of Biomedical Imaging and Image-Guided Therapy, Division of Molecular and Gender Imaging, Medical University Vienna, Vienna, Austria
- Pascal Andreas Thomas Baltzer
- Department of Biomedical Imaging and Image-Guided Therapy, Division of Molecular and Gender Imaging, Medical University Vienna, Vienna, Austria
9
Application of ultrasound artificial intelligence in the differential diagnosis between benign and malignant breast lesions of BI-RADS 4A. BMC Cancer 2020; 20:959. [PMID: 33008320] [PMCID: PMC7532640] [DOI: 10.1186/s12885-020-07413-z]
Abstract
Background: The classification of Breast Imaging Reporting and Data System 4A (BI-RADS 4A) lesions is mostly based on the personal experience of doctors and lacks specific and clear classification standards. The development of artificial intelligence (AI) provides a new method for BI-RADS categorisation. We analysed the ultrasonic morphological and texture characteristics of benign and malignant BI-RADS 4A lesions using AI, and compared these characteristics to examine the value of AI in the differential diagnosis of BI-RADS 4A lesions. Methods: A total of 206 BI-RADS 4A lesions examined using ultrasonography were analysed retrospectively, including 174 benign lesions and 32 malignant lesions. All lesions were contoured manually, and ultrasonic morphological and texture features of the lesions, such as circularity, height-to-width ratio, margin spicules, margin coarseness, margin indistinctness, margin lobulation, energy, entropy, grey mean, internal calcification and the angle between the long axis of the lesion and skin (ALS), were calculated using grey-level gradient co-occurrence matrix analysis. Differences between benign and malignant BI-RADS 4A lesions were analysed. Results: Significant differences in margin lobulation, entropy, internal calcification and ALS were noted between the benign and malignant groups (P = 0.013, 0.045, 0.045 and 0.002, respectively). The malignant group had more margin lobulations and lower entropy than the benign group, and the benign group had more internal calcifications and a greater ALS than the malignant group. No significant differences in circularity, height-to-width ratio, margin spicules, margin coarseness, margin indistinctness, energy or grey mean were noted between benign and malignant lesions.
Conclusions: Compared with the naked eye, AI can reveal more subtle differences between benign and malignant BI-RADS 4A lesions. These results suggest that careful observation of the margin and internal echo is of great significance. With the help of morphological and texture information provided by AI, doctors can make more accurate judgments on such atypical benign and malignant lesions.
10
Schawkat K, Ciritsis A, von Ulmenstein S, Honcharova-Biletska H, Jüngst C, Weber A, Gubler C, Mertens J, Reiner CS. Diagnostic accuracy of texture analysis and machine learning for quantification of liver fibrosis in MRI: correlation with MR elastography and histopathology. Eur Radiol 2020; 30:4675-4685. [PMID: 32270315] [DOI: 10.1007/s00330-020-06831-8]
Abstract
Objectives: To compare the diagnostic accuracy of texture analysis (TA)-derived parameters combined with machine learning (ML) of non-contrast-enhanced T1w and T2w fat-saturated (fs) images with MR elastography (MRE) for liver fibrosis quantification. Methods: In this IRB-approved prospective study, liver MRIs of participants with suspected chronic liver disease who underwent liver biopsy between August 2015 and May 2018 were analyzed. Two readers blinded to clinical and histopathological findings performed TA. The participants were categorized into no or low-stage (0-2) and high-stage (3-4) fibrosis groups. Confusion matrices were calculated using a support vector machine combined with principal component analysis. The diagnostic accuracy of ML-based TA of liver fibrosis and of MRE was assessed by area under the receiver operating characteristic curve (AUC). Histopathology served as the reference standard. Results: A total of 62 consecutive participants (40 men; mean age ± standard deviation, 48 ± 13 years) were included. For classification of liver fibrosis into low-stage and high-stage, the accuracy of TA and ML was 85.7% (95% confidence interval [CI] 63.7-97.0) on T1w and 61.9% (95% CI 38.4-81.9) on T2w fs. The AUC for TA on T1w was similar to MRE (0.82 [95% CI 0.59-0.95] vs. 0.92 [95% CI 0.71-0.99], p = 0.41), while the AUC for T2w fs was significantly lower than for MRE (0.57 [95% CI 0.34-0.78] vs. 0.92 [95% CI 0.71-0.99], p = 0.008). Conclusion: Our results suggest that liver fibrosis can be quantified with TA-derived parameters of T1w images combined with an ML algorithm, with accuracy similar to MRE. Key Points: • Liver fibrosis can be categorized into low-stage fibrosis (0-2) and high-stage fibrosis (3-4) using texture analysis-derived parameters of T1-weighted images with a machine learning approach. • For the differentiation of low-stage and high-stage fibrosis, the diagnostic accuracy of texture analysis on T1-weighted images combined with a machine learning algorithm is similar to that of MR elastography.
Collapse
Affiliation(s)
- Khoschy Schawkat
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland; Division of Abdominal Imaging, Department of Radiology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA; University of Zurich, Zurich, Switzerland
- Alexander Ciritsis
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland; University of Zurich, Zurich, Switzerland
- Sophie von Ulmenstein
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland; University of Zurich, Zurich, Switzerland
- Hanna Honcharova-Biletska
- University of Zurich, Zurich, Switzerland; Institute of Pathology and Molecular Pathology, University Hospital Zurich, Zurich, Switzerland
- Christoph Jüngst
- University of Zurich, Zurich, Switzerland; Department of Gastroenterology and Hepatology, University Hospital Zurich, Zurich, Switzerland
- Achim Weber
- University of Zurich, Zurich, Switzerland; Institute of Pathology and Molecular Pathology, University Hospital Zurich, Zurich, Switzerland
- Christoph Gubler
- University of Zurich, Zurich, Switzerland; Department of Gastroenterology and Hepatology, University Hospital Zurich, Zurich, Switzerland
- Joachim Mertens
- University of Zurich, Zurich, Switzerland; Department of Gastroenterology and Hepatology, University Hospital Zurich, Zurich, Switzerland
- Caecilia S Reiner
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Raemistrasse 100, 8091, Zurich, Switzerland; University of Zurich, Zurich, Switzerland