1
Li J, Fu T, Song H, Fan J, Xiao D, Lin Y, Gu Y, Yang J. Embedding-Alignment Fusion-Based Graph Convolution Network With Mixed Learning Strategy for 4D Medical Image Reconstruction. IEEE J Biomed Health Inform 2024; 28:2916-2929. [PMID: 38437146] [DOI: 10.1109/jbhi.2024.3365203]
Abstract
In recent years, 4D medical images, which capture both the structure and motion of tissue, have attracted increasing attention. The key to 4D image reconstruction is stacking the 2D slices by matching their aligned motion states. In this study, the distribution of 2D slices across different motion states is modeled as a manifold graph, and the reconstruction is cast as a graph-alignment problem. An embedding-alignment fusion-based graph convolution network (GCN) with a mixed learning strategy is proposed to align the graphs. Here, the embedding and alignment processes interact with each other to achieve a precise alignment while retaining the manifold distribution. The mixed strategy of self- and semi-supervised learning makes the alignment sparse, avoiding mismatches caused by outliers in the graph. In the experiments, the proposed 4D reconstruction approach is validated on different modalities, including Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and Ultrasound (US). We evaluate the reconstruction accuracy and compare it with that of state-of-the-art methods. The results demonstrate that our approach reconstructs a more accurate 4D image.
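The core operation of a graph convolution network like the one described can be sketched with the standard propagation rule H' = ReLU(D^(-1/2) Â D^(-1/2) H W). This is a minimal NumPy illustration of that generic rule, not the authors' architecture; the chain graph and all values are made up:

```python
import numpy as np

def gcn_layer(adj, feats, weights):
    """One graph-convolution layer: symmetrically normalized adjacency
    (with self-loops) times node features times a learnable projection.

    adj:     (n, n) binary adjacency of the slice-manifold graph
    feats:   (n, d) node embeddings (one row per 2D slice)
    weights: (d, k) projection matrix
    """
    a_hat = adj + np.eye(adj.shape[0])                    # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    propagated = d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ weights
    return np.maximum(propagated, 0.0)                    # ReLU

# Tiny example: 3 slices in a chain graph, 2-d features, identity weights.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
feats = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = gcn_layer(adj, feats, np.eye(2))
```

Each output row mixes a slice's embedding with those of its graph neighbors, which is what lets embedding and alignment interact over the manifold structure.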
2
Bengs M, Sprenger J, Gerlach S, Neidhardt M, Schlaefer A. Real-Time Motion Analysis With 4D Deep Learning for Ultrasound-Guided Radiotherapy. IEEE Trans Biomed Eng 2023; 70:2690-2699. [PMID: 37030809] [DOI: 10.1109/tbme.2023.3262422]
Abstract
Motion compensation in radiation therapy is a challenging scenario that requires estimating and forecasting the motion of tissue structures to deliver the target dose. Ultrasound offers direct, real-time imaging of tissue and is being considered for image guidance in radiation therapy. Recently, fast volumetric ultrasound has gained traction, but motion analysis with such high-dimensional data remains difficult. While deep learning could bring many advantages, such as fast data processing and high performance, it remains unclear how to process sequences of hundreds of image volumes efficiently and effectively. We present a 4D deep learning approach for real-time motion estimation and forecasting using long-term 4D ultrasound data. Using motion traces acquired during radiation therapy combined with various tissue types, our results demonstrate that long-term motion estimation can be performed without markers, with a tracking error of 0.35±0.2 mm and an inference time of less than 5 ms. We also demonstrate forecasting directly from the image data up to 900 ms into the future. Overall, our findings highlight that 4D deep learning is a promising approach for motion analysis during radiotherapy.
3
Dai X, Lei Y, Roper J, Chen Y, Bradley JD, Curran WJ, Liu T, Yang X. Deep learning-based motion tracking using ultrasound images. Med Phys 2021; 48:7747-7756. [PMID: 34724712] [PMCID: PMC11742242] [DOI: 10.1002/mp.15321]
Abstract
PURPOSE Ultrasound (US) imaging is an established imaging modality capable of offering video-rate volumetric images without ionizing radiation. It has the potential for intra-fraction motion tracking in radiation therapy. In this study, a deep learning-based method has been developed to tackle the challenges of motion tracking using US imaging. METHODS We present a Markov-like network, implemented via generative adversarial networks, to extract features from sequential US frames (one tracked frame followed by untracked frames) and thereby estimate a set of deformation vector fields (DVFs) through registration of the tracked frame with the untracked frames. The positions of the landmarks in the untracked frames are determined by shifting the landmarks in the tracked frame according to the estimated DVFs. The performance of the proposed method was evaluated on the testing dataset by calculating the tracking error (TE) between the predicted and ground-truth landmarks on each frame. RESULTS The proposed method was evaluated using the MICCAI CLUST 2015 dataset, collected with seven US scanners and eight types of transducers, and the Cardiac Acquisitions for Multi-structure Ultrasound Segmentation (CAMUS) dataset, acquired with GE Vivid E95 ultrasound scanners. The CLUST dataset contains 63 2D and 22 3D US image sequences from 42 and 18 subjects, respectively, and the CAMUS dataset includes 2D US images from 450 patients. On the CLUST dataset, our method achieved a mean tracking error of 0.70 ± 0.38 mm for the 2D sequences and 1.71 ± 0.84 mm for the 3D sequences on the publicly available annotations. On the CAMUS dataset, a mean tracking error of 0.54 ± 1.24 mm was achieved for the landmarks in the left atrium. CONCLUSIONS A novel motion tracking algorithm using US images based on modern deep learning techniques has been demonstrated in this study.
The proposed method can offer millimeter-level tumor motion prediction in real time, which has the potential to be adopted into routine tumor motion management in radiation therapy.
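The evaluation scheme above (shift landmarks by the estimated DVF, then measure the Euclidean distance to ground truth) can be sketched as follows; the landmark coordinates and DVF samples are illustrative, not values from the paper:

```python
import numpy as np

def shift_landmarks(landmarks, dvf_at_landmarks):
    """Propagate landmarks from the tracked frame using the estimated
    deformation vector field sampled at each landmark position."""
    return landmarks + dvf_at_landmarks

def tracking_error(pred, gt):
    """Per-landmark Euclidean tracking error (TE) in mm.

    pred, gt: (n_landmarks, dim) arrays of positions."""
    return np.linalg.norm(pred - gt, axis=1)

# Illustrative numbers only:
tracked = np.array([[10.0, 20.0], [30.0, 40.0]])        # tracked-frame landmarks
dvf_at_landmarks = np.array([[0.5, -0.2], [0.1, 0.3]])  # estimated displacements
ground_truth = np.array([[10.4, 19.9], [30.1, 40.3]])   # annotated landmarks

pred = shift_landmarks(tracked, dvf_at_landmarks)
te = tracking_error(pred, ground_truth)
mean_te, std_te = te.mean(), te.std()
```

Aggregating `mean_te ± std_te` over all frames and sequences yields summary figures of the 0.70 ± 0.38 mm kind reported above.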
Affiliation(s)
- Xianjin Dai
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Yang Lei
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Justin Roper
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Yue Chen
- The Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University School of Medicine, Atlanta, Georgia, USA
- Jeffrey D. Bradley
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Walter J. Curran
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Tian Liu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, Georgia, USA
- The Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University School of Medicine, Atlanta, Georgia, USA
4
Sánchez-Margallo JA, Tas L, Moelker A, van den Dobbelsteen JJ, Sánchez-Margallo FM, Langø T, van Walsum T, van de Berg NJ. Block-matching-based registration to evaluate ultrasound visibility of percutaneous needles in liver-mimicking phantoms. Med Phys 2021; 48:7602-7612. [PMID: 34665885] [PMCID: PMC9298012] [DOI: 10.1002/mp.15305]
Abstract
Purpose To present a novel methodical approach to compare visibility of percutaneous needles in ultrasound images. Methods A motor‐driven rotation platform was used to gradually change the needle angle while capturing image data. Data analysis was automated using block‐matching‐based registration, with a tracking and refinement step. Every 25 frames, a Hough transform was used to improve needle alignments after large rotations. The method was demonstrated by comparing three commercial needles (14G radiofrequency ablation, RFA; 18G Trocar; 22G Chiba) and six prototype needles with different sizes, materials, and surface conditions (polished, sand‐blasted, and kerfed), within polyvinyl alcohol phantom tissue and ex vivo bovine liver models. For each needle and angle, a contrast‐to‐noise ratio (CNR) was determined to quantify visibility. CNR values are presented as a function of needle type and insertion angle. In addition, the normalized area under the (CNR‐angle) curve was used as a summary metric to compare needles. Results In phantom tissue, the first kerfed needle design had the largest normalized area of visibility and the polished 1 mm diameter stainless steel needle the smallest (0.704 ± 0.199 vs. 0.154 ± 0.027, p < 0.01). In the ex vivo model, the second kerfed needle design had the largest normalized area of visibility, and the sand‐blasted stainless steel needle the smallest (0.470 ± 0.190 vs. 0.127 ± 0.047, p < 0.001). As expected, the analysis showed needle visibility peaks at orthogonal insertion angles. For acute or obtuse angles, needle visibility was similar or reduced. Overall, the variability in needle visibility was considerably higher in livers. Conclusion The best overall visibility was found with kerfed needles and the commercial RFA needle. 
The presented methodical approach to quantify ultrasound visibility allows comparisons of (echogenic) needles, as well as other technological innovations aiming to improve ultrasound visibility of percutaneous needles, such as coatings, material treatments, and beam steering approaches.
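The two visibility metrics used above, CNR per angle and the normalized area under the CNR-angle curve, are straightforward to compute. This is a minimal sketch under assumed definitions (CNR as absolute mean difference over background standard deviation; normalization by a reference `cnr_max` over the swept angle range); the pixel values and angles are illustrative:

```python
import numpy as np

def cnr(needle_pixels, background_pixels):
    """Contrast-to-noise ratio of a needle region against background speckle."""
    return abs(needle_pixels.mean() - background_pixels.mean()) / background_pixels.std()

def normalized_visibility_area(angles_deg, cnr_values, cnr_max):
    """Trapezoidal area under the CNR-angle curve, normalized so a needle
    that reaches cnr_max at every angle scores 1.0."""
    area = np.sum((cnr_values[1:] + cnr_values[:-1]) / 2.0 * np.diff(angles_deg))
    return area / (cnr_max * (angles_deg[-1] - angles_deg[0]))

# Illustrative data: visibility peaking near orthogonal (90 degree) insonation.
needle = np.array([200.0, 210.0, 190.0])
background = np.array([50.0, 60.0, 40.0])
c = cnr(needle, background)

angles = np.array([20.0, 55.0, 90.0])
cnrs = np.array([1.0, 2.0, 4.0])
score = normalized_visibility_area(angles, cnrs, cnr_max=4.0)
```

The normalized area collapses a whole CNR-angle curve into one number per needle, which is what makes the pairwise comparisons (e.g. kerfed vs. polished) possible.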
Affiliation(s)
- Juan A Sánchez-Margallo
- Bioengineering and Health Technologies Unit, Jesús Usón Minimally Invasive Surgery Centre, Cáceres, Spain
- Lisette Tas
- Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
- Adriaan Moelker
- Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Theo van Walsum
- Biomedical Imaging Group Rotterdam, Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, The Netherlands
- Nick J van de Berg
- Department of Radiology & Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
5
Huang P, Su L, Chen S, Cao K, Song Q, Kazanzides P, Iordachita I, Lediju Bell MA, Wong JW, Li D, Ding K. 2D ultrasound imaging based intra-fraction respiratory motion tracking for abdominal radiation therapy using machine learning. Phys Med Biol 2019; 64:185006. [PMID: 31323649] [DOI: 10.1088/1361-6560/ab33db]
Abstract
We have previously developed a robotic ultrasound imaging system for motion monitoring in abdominal radiation therapy. Owing to the slow speed of ultrasound image processing, our previous system could only track abdominal motion under breath-hold. To overcome this limitation, a novel 2D image processing method for tracking intra-fraction respiratory motion is proposed. Fifty-seven different anatomical features acquired from 27 sets of 2D ultrasound sequences were used in this study. Three 2D ultrasound sequences were acquired with the robotic ultrasound system from three healthy volunteers; the remaining datasets were provided by the 2015 MICCAI Challenge on Liver Ultrasound Tracking. All datasets were preprocessed to extract the feature point, and a patient-specific motion pattern was extracted by principal component analysis and slow feature analysis (SFA). Tracking then finds the most similar training frame (the indexed frame) via a k-dimensional-tree-based nearest-neighbor search to estimate the tracked object's location. A template image is updated dynamically from the indexed frame to perform fast template matching (TM) within a learned, smaller search region on the incoming frame. The mean tracking error between manually annotated landmarks and the location extracted from the indexed training frame is 1.80 ± 1.42 mm. Adding a fast TM step within a small search region reduces the mean tracking error to 1.14 ± 1.16 mm. The tracking time per frame is 15 ms, well below the frame acquisition time. Furthermore, anatomical reproducibility was measured by analyzing the anatomical landmark's location relative to the probe; the position-controlled probe has better reproducibility and yields a smaller mean error across all three volunteer cases than the force-controlled probe (2.69 versus 11.20 mm in the superior-inferior direction and 1.19 versus 8.21 mm in the anterior-posterior direction).
Our method reduces the processing time for tracking respiratory motion significantly, which can reduce the delivery uncertainty.
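The indexed-frame lookup described above is essentially a nearest-neighbor query in a learned low-dimensional feature space. A minimal sketch using SciPy's k-d tree, with random stand-in features in place of the PCA/SFA descriptors (all data here is synthetic and illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

# Training stage: a low-dimensional feature vector (e.g. after PCA/SFA)
# and an annotated landmark location for each training frame.
rng = np.random.default_rng(0)
train_features = rng.normal(size=(200, 3))             # 200 frames, 3-d features
train_landmarks = rng.uniform(0, 100, size=(200, 2))   # landmark (x, y) per frame

tree = cKDTree(train_features)

def track(incoming_features):
    """Return the landmark of the most similar training frame (indexed frame)."""
    _, idx = tree.query(incoming_features)
    return train_landmarks[idx], idx

# An incoming frame whose features nearly duplicate training frame 42:
query = train_features[42] + 1e-6
landmark, idx = track(query)
```

In the full method this coarse estimate seeds a template-matching refinement in a small search window around `landmark`, which is what brings the error from 1.80 mm down to 1.14 mm.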
Affiliation(s)
- Pu Huang
- Shandong Key Laboratory of Medical Physics and Image Processing, School of Physics and Electronics, Shandong Normal University, Jinan, Shandong, People's Republic of China. Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins School of Medicine, Baltimore, MD, United States of America. Authors contributed equally to this work
6
Banerjee J, Sun Y, Klink C, Gahrmann R, Niessen WJ, Moelker A, van Walsum T. Multiple-correlation similarity for block-matching based fast CT to ultrasound registration in liver interventions. Med Image Anal 2019; 53:132-141. [PMID: 30772666] [DOI: 10.1016/j.media.2019.02.003]
Abstract
In this work we present a fast approach for registering computed tomography to ultrasound volumes in image-guided intervention applications. The method combines block-matching with outlier rejection. The block-matching uses a correlation-based multimodal similarity metric, in which the intensity and gradient of the computed tomography images, together with the ultrasound volumes, serve as inputs for finding correspondences between blocks in the two modalities. A variance- and octree-based feature point-set selection method selects distinct and evenly spread point locations for block-matching. Geometric consistency and smoothness criteria are imposed in an outlier rejection step to refine the block-matching results, which are then used to determine the affine transformation between the computed tomography and ultrasound volumes. Various experiments assess the optimal performance and the influence of parameters on the accuracy and computational time of the registration. A leave-one-patient-out cross-validation registration error of 3.6 mm is achieved over 29 datasets acquired from 17 patients.
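The block-matching step itself reduces to sliding a fixed block over a search region and keeping the offset with the highest similarity. A minimal 2D sketch using plain normalized cross-correlation (the paper's actual metric is a multimodal multiple-correlation similarity over intensity and gradient channels; this single-channel NCC and the synthetic images are simplifications for illustration):

```python
import numpy as np

def ncc(block_a, block_b):
    """Normalized cross-correlation between two equally sized image blocks."""
    a = block_a - block_a.mean()
    b = block_b - block_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def best_match(block, search_img, step=1):
    """Exhaustively slide `block` over `search_img`; return (score, offset)."""
    bh, bw = block.shape
    best = (-2.0, (0, 0))
    for y in range(0, search_img.shape[0] - bh + 1, step):
        for x in range(0, search_img.shape[1] - bw + 1, step):
            score = ncc(block, search_img[y:y + bh, x:x + bw])
            if score > best[0]:
                best = (score, (y, x))
    return best

# Synthetic check: the block is an exact copy taken from offset (5, 7).
rng = np.random.default_rng(1)
search = rng.uniform(size=(32, 32))
block = search[5:13, 7:15].copy()
score, (dy, dx) = best_match(block, search)
```

In the full method such matches are computed at octree-selected feature points, filtered by the geometric-consistency outlier rejection, and fed to an affine fit.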
Affiliation(s)
- Jyotirmoy Banerjee
- Biomedical Imaging Group Rotterdam, Departments of Radiology & Nuclear Medicine and Medical Informatics, Erasmus MC - University Medical Center Rotterdam, The Netherlands
- Yuanyuan Sun
- Biomedical Imaging Group Rotterdam, Departments of Radiology & Nuclear Medicine and Medical Informatics, Erasmus MC - University Medical Center Rotterdam, The Netherlands
- Camiel Klink
- Department of Radiology & Nuclear Medicine, Erasmus MC - University Medical Center Rotterdam, The Netherlands
- Renske Gahrmann
- Department of Radiology & Nuclear Medicine, Erasmus MC - University Medical Center Rotterdam, The Netherlands
- Wiro J Niessen
- Biomedical Imaging Group Rotterdam, Departments of Radiology & Nuclear Medicine and Medical Informatics, Erasmus MC - University Medical Center Rotterdam, The Netherlands; Quantitative Imaging Group, Faculty of Technical Physics, Delft University of Technology, The Netherlands
- Adriaan Moelker
- Department of Radiology & Nuclear Medicine, Erasmus MC - University Medical Center Rotterdam, The Netherlands
- Theo van Walsum
- Biomedical Imaging Group Rotterdam, Departments of Radiology & Nuclear Medicine and Medical Informatics, Erasmus MC - University Medical Center Rotterdam, The Netherlands
7
De Luca V, Banerjee J, Hallack A, Kondo S, Makhinya M, Nouri D, Royer L, Cifor A, Dardenne G, Goksel O, Gooding MJ, Klink C, Krupa A, Le Bras A, Marchal M, Moelker A, Niessen WJ, Papiez BW, Rothberg A, Schnabel J, van Walsum T, Harris E, Lediju Bell MA, Tanner C. Evaluation of 2D and 3D ultrasound tracking algorithms and impact on ultrasound-guided liver radiotherapy margins. Med Phys 2018; 45:4986-5003. [PMID: 30168159] [DOI: 10.1002/mp.13152]
Abstract
PURPOSE Compensation for respiratory motion is important during abdominal cancer treatments. In this work we report the results of the 2015 MICCAI Challenge on Liver Ultrasound Tracking and extend the 2D results to relate them to clinical relevance in the form of reduced treatment margins, and hence sparing of healthy tissue, while maintaining a full duty cycle. METHODS We describe methodologies for estimating and temporally predicting respiratory liver motion from continuous ultrasound imaging, as used during ultrasound-guided radiation therapy. Furthermore, we investigated the trade-off between tracking accuracy and runtime in combination with temporal prediction strategies and their impact on treatment margins. RESULTS Based on 2D ultrasound sequences from 39 volunteers, a mean tracking accuracy of 0.9 mm was achieved when combining the results from the 4 challenge submissions (1.2 to 3.3 mm). The two submissions for the 3D sequences from 14 volunteers provided mean accuracies of 1.7 and 1.8 mm. In combination with temporal prediction, using the faster (41 vs 228 ms) but less accurate (1.4 vs 0.9 mm) tracking method resulted in substantially greater margin reduction (70% vs 39%) relative to mid-ventilation margins, as it kept the treatment system latency low (150 vs 400 ms) and thereby avoided non-linear temporal prediction. Accelerating the best tracking method would improve the margin reduction to 75%. CONCLUSIONS Liver motion estimation and prediction during free breathing from 2D ultrasound images can substantially reduce in-plane motion uncertainty and hence treatment margins. Employing an accurate tracking method while avoiding non-linear temporal prediction would be favorable. This approach has the potential to shorten treatment time compared to breath-hold and gated approaches, and to increase treatment efficiency and safety.
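The temporal-prediction idea above, extrapolating the tracked position ahead by the system latency so the beam targets where the tissue will be, can be sketched with the simplest (linear) predictor. The straight-line fit and all numbers are illustrative, not one of the challenge's actual prediction strategies:

```python
import numpy as np

def predict_linear(times_ms, positions_mm, latency_ms):
    """Extrapolate the target position `latency_ms` ahead of the newest
    sample using a straight-line fit to the recent tracking history."""
    slope, intercept = np.polyfit(times_ms, positions_mm, deg=1)
    return slope * (times_ms[-1] + latency_ms) + intercept

# Illustrative: target drifting at 0.01 mm/ms, 150 ms total system latency.
t = np.array([0.0, 50.0, 100.0, 150.0])
p = 0.01 * t + 2.0
predicted = predict_linear(t, p, latency_ms=150.0)  # position at t = 300 ms
```

A linear predictor suffices only at low latency; at the 400 ms latency of the slower tracker, respiratory motion is no longer locally linear, which is why the challenge results favor keeping latency low over raw tracking accuracy.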
Affiliation(s)
- Valeria De Luca
- Computer Vision Laboratory, ETH Zurich, Zürich, Switzerland
- Novartis Institutes for Biomedical Research, Basel, Switzerland
- Andre Hallack
- Institute of Biomedical Engineering, University of Oxford, Oxford, UK
- Maxim Makhinya
- Computer Vision Laboratory, ETH Zurich, Zürich, Switzerland
- Lucas Royer
- Institut de Recherche Technologique b-com, Rennes, France
- Orcun Goksel
- Computer Vision Laboratory, ETH Zurich, Zürich, Switzerland
- Camiel Klink
- Department of Radiology, Erasmus MC, Rotterdam, The Netherlands
- Maud Marchal
- Institut de Recherche Technologique b-com, Rennes, France
- Adriaan Moelker
- Department of Radiology, Erasmus MC, Rotterdam, The Netherlands
- Wiro J Niessen
- Department of Radiology, Erasmus MC, Rotterdam, The Netherlands
- Julia Schnabel
- School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
- Theo van Walsum
- Department of Radiology, Erasmus MC, Rotterdam, The Netherlands
- Muyinatu A Lediju Bell
- Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, USA
8
Ipsen S, Bruder R, Kuhlemann I, Jauer P, Motisi L, Cremers F, Ernst F, Schweikard A. A visual probe positioning tool for 4D ultrasound-guided radiotherapy. Annu Int Conf IEEE Eng Med Biol Soc 2018; 2018:883-886. [PMID: 30440532] [DOI: 10.1109/embc.2018.8512390]
Abstract
Ultrasound (US) guidance is a rapidly growing area in image-guided radiotherapy. For motion compensation, the therapy target needs to be visualized with the US probe to continuously determine its position and adapt for shifts. While US has obvious benefits such as real-time capability and proven safety, one of its main drawbacks to date is user dependency: high-quality results require years of clinical experience. To provide positioning assistance for the setup of US equipment by non-experts, we developed a visual guidance tool combining real-time US volume and CT visualization in a geometrically calibrated setup. Using a 4D US station with real-time data access and an optical tracking system, we achieved a calibration accuracy of 1.2 mm and a mean 2D contour distance of 1.7 mm between organ boundaries identified in US and CT. Given this low calibration error and the good visual alignment of the structures, the developed probe positioning tool could be a valuable aid for ultrasound-guided radiotherapy and other interventions, guiding the user to a suitable acoustic window while potentially improving setup reproducibility.
9
Sun Y, Moelker A, Niessen WJ, van Walsum T. Towards Robust CT-Ultrasound Registration Using Deep Learning Methods. Understanding and Interpreting Machine Learning in Medical Image Computing Applications 2018. [DOI: 10.1007/978-3-030-02628-8_5]
10
Arif M, Moelker A, van Walsum T. Needle Tip Visibility in 3D Ultrasound Images. Cardiovasc Intervent Radiol 2017; 41:145-152. [PMID: 28929215] [PMCID: PMC5735203] [DOI: 10.1007/s00270-017-1798-7]
Abstract
Aim Needle visibility is crucial for effective and safe ultrasound-guided interventional procedures. Several studies have investigated needle visibility in 2D ultrasound imaging, but less information is available for 3D ultrasound imaging, a modality with great potential for image-guided interventions. We performed a prospective study to quantitatively compare the echogenicity of various commercially available needles, as used in clinical practice, in 3D ultrasound images under freehand needle introduction. Materials and Methods A set of seven needles, comprising biopsy needles, a TIPS needle, an ablation needle, and a puncture needle, was included in the study. A liver-mimicking phantom and a cow liver were punctured with each needle. 3D sweeps and real-time 3D data were acquired at three different angles (20°, 55° and 90°). Needle visibility was quantified by calculating the contrast-to-noise ratio. Results In the liver-mimicking phantom, all needles showed better visibility than in the cow liver. At large angles, contrast-to-noise ratio and needle visibility were almost identical in both cases, but at lower angles differences in visibility were observed between needle types. Conclusion The contrast-to-noise ratio increased with the angle of insonation, and the differences in visibility between needles were most pronounced at a 20° angle. The echogenic properties of inhomogeneous cow liver tissue make needle visibility worse than in a homogeneous phantom, and visibility is worse in real-time 3D data than in 3D ultrasound sweeps.
Affiliation(s)
- Muhammad Arif
- Department of Medical Informatics, Erasmus MC, University Medical Center Rotterdam, Wytemaweg 80, Room Na 2506 Erasmus MC, 3015 CN, Rotterdam, The Netherlands.
- Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands.
- Adriaan Moelker
- Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- Theo van Walsum
- Department of Medical Informatics, Erasmus MC, University Medical Center Rotterdam, Wytemaweg 80, Room Na 2506 Erasmus MC, 3015 CN, Rotterdam, The Netherlands
- Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands