1. Huang Z, Wu Z, Yan H. A convex-hull based method with manifold projections for detecting cell protrusions. Comput Biol Med 2024; 173:108350. PMID: 38555705. DOI: 10.1016/j.compbiomed.2024.108350.
Abstract
Cell protrusions play an important role in a variety of cell physiological processes. In this paper, we propose a convex-hull based method, combined with manifold projections, to detect cell protrusions. A convex hull is generated based on the cell surface. We consider the cell surface and the boundary of its convex hull as two manifolds, which are diffeomorphic, and define a depth function based on the distance between the cell surface and its convex hull boundary. The extreme points of the depth function represent the positions of cell protrusions. To find the extreme points easily, we project the points on the cell surface onto the boundary of the convex hull and expand them in spherical polar coordinates. We conducted experiments on three types of cell protrusions. The proposed method achieved the average precision of 98.9%, 95.6%, and 94.7% on blebs, filopodia, and lamellipodia, respectively. Experiments on three datasets show that the proposed method has a robust performance.
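As a rough illustration of the depth function described in this abstract (not the authors' implementation), the sketch below computes, for each point sampled from a cell surface, its distance to the boundary of the surface's convex hull using SciPy; the `surface_points` array and the downstream spherical-projection and extremum-detection steps are assumptions left out for brevity.

```python
# Hedged sketch of a convex-hull depth map over a cell-surface point cloud.
# `surface_points` is an assumed (N, 3) array of points sampled from a segmented cell surface;
# the spherical-coordinate projection and extremum detection from the paper are not reproduced.
import numpy as np
from scipy.spatial import ConvexHull

def hull_depth(surface_points: np.ndarray) -> np.ndarray:
    """Distance from each surface point to the boundary of its convex hull."""
    hull = ConvexHull(surface_points)
    # hull.equations rows are [nx, ny, nz, d] with n.x + d <= 0 for points inside the hull.
    normals, offsets = hull.equations[:, :3], hull.equations[:, 3]
    # For a point inside a convex polytope, the distance to the boundary equals the
    # smallest perpendicular distance to any facet plane.
    plane_dist = -(surface_points @ normals.T + offsets)  # shape (N, n_facets), non-negative
    return plane_dist.min(axis=1)

if __name__ == "__main__":
    pts = np.random.rand(500, 3)  # stand-in for real surface samples
    depth = hull_depth(pts)
    print(depth.shape, float(depth.max()))
```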
Affiliation(s)
- Zhaoke Huang, Department of Electrical Engineering, City University of Hong Kong, Hong Kong Special Administrative Region of China
- Zihan Wu, Department of Electrical Engineering, City University of Hong Kong, Hong Kong Special Administrative Region of China
- Hong Yan, Department of Electrical Engineering, City University of Hong Kong, Hong Kong Special Administrative Region of China

2. Hradecka L, Wiesner D, Sumbal J, Koledova ZS, Maska M. Segmentation and Tracking of Mammary Epithelial Organoids in Brightfield Microscopy. IEEE Trans Med Imaging 2023; 42:281-290. PMID: 36170389. DOI: 10.1109/tmi.2022.3210714.
Abstract
We present an automated and deep-learning-based workflow to quantitatively analyze the spatiotemporal development of mammary epithelial organoids in two-dimensional time-lapse (2D+t) sequences acquired using a brightfield microscope at high resolution. It involves a convolutional neural network (U-Net), purposely trained using computer-generated bioimage data created by a conditional generative adversarial network (pix2pixHD), to infer semantic segmentation, adaptive morphological filtering to identify organoid instances, and a shape-similarity-constrained, instance-segmentation-correcting tracking procedure to reliably cherry-pick the organoid instances of interest in time. By validating it using real 2D+t sequences of mouse mammary epithelial organoids of morphologically different phenotypes, we clearly demonstrate that the workflow achieves reliable segmentation and tracking performance, providing a reproducible and laborless alternative to manual analyses of the acquired bioimage data.
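As a simplified, hedged stand-in for the tracking step sketched in this abstract (not the authors' shape-similarity-constrained procedure), the snippet below links labelled organoid masks between consecutive frames by greedy mask-overlap (IoU) matching; the label images and the IoU threshold are assumptions for illustration.

```python
# Hedged sketch: greedy frame-to-frame linking of labelled instance masks by IoU overlap.
# Not the authors' tracking procedure; `labels_t` and `labels_t1` are assumed integer
# label images from two consecutive frames (0 = background).
import numpy as np

def link_by_iou(labels_t: np.ndarray, labels_t1: np.ndarray, min_iou: float = 0.3) -> dict:
    links = {}
    for lbl in np.unique(labels_t):
        if lbl == 0:
            continue
        mask = labels_t == lbl
        best_iou, best_match = 0.0, None
        for cand in np.unique(labels_t1[mask]):  # only candidates that overlap at all
            if cand == 0:
                continue
            cand_mask = labels_t1 == cand
            iou = np.logical_and(mask, cand_mask).sum() / np.logical_or(mask, cand_mask).sum()
            if iou > best_iou:
                best_iou, best_match = iou, cand
        if best_match is not None and best_iou >= min_iou:
            links[int(lbl)] = int(best_match)  # object lbl at frame t continues as best_match at t+1
    return links
```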

3. Sugawara K, Çevrim Ç, Averof M. Tracking cell lineages in 3D by incremental deep learning. eLife 2022; 11:e69380. PMID: 34989675. PMCID: PMC8741210. DOI: 10.7554/elife.69380.
Abstract
Deep learning is emerging as a powerful approach for bioimage analysis. Its use in cell tracking is limited by the scarcity of annotated data for the training of deep-learning models. Moreover, annotation, training, prediction, and proofreading currently lack a unified user interface. We present ELEPHANT, an interactive platform for 3D cell tracking that addresses these challenges by taking an incremental approach to deep learning. ELEPHANT provides an interface that seamlessly integrates cell track annotation, deep learning, prediction, and proofreading. This enables users to implement cycles of incremental learning starting from a few annotated nuclei. Successive prediction-validation cycles enrich the training data, leading to rapid improvements in tracking performance. We test the software's performance against state-of-the-art methods and track lineages spanning the entire course of leg regeneration in a crustacean over 1 week (504 timepoints). ELEPHANT yields accurate, fully-validated cell lineages with a modest investment in time and effort.
Affiliation(s)
- Ko Sugawara, Institut de Génomique Fonctionnelle de Lyon (IGFL), École Normale Supérieure de Lyon, Lyon, France; Centre National de la Recherche Scientifique (CNRS), Paris, France
- Çağrı Çevrim, Institut de Génomique Fonctionnelle de Lyon (IGFL), École Normale Supérieure de Lyon, Lyon, France; Centre National de la Recherche Scientifique (CNRS), Paris, France
- Michalis Averof, Institut de Génomique Fonctionnelle de Lyon (IGFL), École Normale Supérieure de Lyon, Lyon, France; Centre National de la Recherche Scientifique (CNRS), Paris, France

4. Bao R, Al-Shakarji NM, Bunyak F, Palaniappan K. DMNet: Dual-Stream Marker Guided Deep Network for Dense Cell Segmentation and Lineage Tracking. Proc IEEE International Conference on Computer Vision Workshops (ICCVW) 2021:3354-3363. PMID: 35386855. PMCID: PMC8982054. DOI: 10.1109/iccvw54120.2021.00375.
Abstract
Accurate segmentation and tracking of cells in microscopy image sequences is extremely beneficial in clinical diagnostic applications and biomedical research. A continuing challenge is the segmentation of dense touching cells and deforming cells with indistinct boundaries, in low signal-to-noise-ratio images. In this paper, we present a dual-stream marker-guided network (DMNet) for segmentation of touching cells in microscopy videos of many cell types. DMNet uses an explicit cell marker-detection stream, with a separate mask-prediction stream using a distance map penalty function, which enables supervised training to focus attention on touching and nearby cells. For multi-object cell tracking we use the M2Track tracking-by-detection approach with multi-step data association. Our M2Track with mask overlap includes short-term track-to-cell association followed by track-to-track association to re-link tracklets with missing segmentation masks over a short sequence of frames. Our combined detection, segmentation and tracking algorithm has proven its potential on the IEEE ISBI 2021 6th Cell Tracking Challenge (CTC-6), where we achieved multiple top-three rankings for diverse cell types. Our team name is MU-Ba-US, and the implementation of DMNet is available at http://celltrackingchallenge.net/participants/MU-Ba-US/.
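The distance-map penalty mentioned above can be illustrated, in a generic and heavily simplified form, by weighting a pixel-wise loss more strongly near object boundaries. The weighting formula below is an assumption for illustration (in the spirit of U-Net-style weight maps), not the DMNet loss itself.

```python
# Hedged sketch of a distance-transform-based pixel weighting for a segmentation loss,
# emphasising regions where cells touch or nearly touch. Not the DMNet formulation;
# the Gaussian weighting below is an illustrative assumption.
import numpy as np
from scipy.ndimage import distance_transform_edt

def boundary_weights(mask: np.ndarray, w0: float = 5.0, sigma: float = 5.0) -> np.ndarray:
    dist_bg = distance_transform_edt(mask == 0)   # background pixels: distance to nearest cell
    dist_fg = distance_transform_edt(mask != 0)   # cell pixels: distance to nearest background
    dist = np.where(mask != 0, dist_fg, dist_bg)  # distance to the nearest object boundary
    return 1.0 + w0 * np.exp(-(dist ** 2) / (2.0 * sigma ** 2))

def weighted_bce(pred: np.ndarray, target: np.ndarray, weights: np.ndarray, eps: float = 1e-7) -> float:
    pred = np.clip(pred, eps, 1.0 - eps)
    ce = -(target * np.log(pred) + (1.0 - target) * np.log(1.0 - pred))
    return float((weights * ce).mean())
```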
Affiliation(s)
- Rina Bao, University of Missouri-Columbia, MO 65211, USA

5. Lefevre JG, Koh YWH, Wall AA, Condon ND, Stow JL, Hamilton NA. LLAMA: a robust and scalable machine learning pipeline for analysis of large scale 4D microscopy data: analysis of cell ruffles and filopodia. BMC Bioinformatics 2021; 22:410. PMID: 34412593. PMCID: PMC8375126. DOI: 10.1186/s12859-021-04324-z.
Abstract
Background: With recent advances in microscopy, recordings of cell behaviour can result in terabyte-size datasets. The lattice light sheet microscope (LLSM) images cells at high speed and high 3D resolution, accumulating data at 100 frames/second over hours, presenting a major challenge for interrogating these datasets. The surfaces of vertebrate cells can rapidly deform to create projections that interact with the microenvironment. Such surface projections include spike-like filopodia and wave-like ruffles on the surface of macrophages as they engage in immune surveillance. LLSM imaging has provided new insights into the complex surface behaviours of immune cells, including revealing new types of ruffles. However, full use of these data requires systematic and quantitative analysis of thousands of projections over hundreds of time steps, and an effective system for analysis of individual structures at this scale requires efficient and robust methods with minimal user intervention.
Results: We present LLAMA, a platform to enable systematic analysis of terabyte-scale 4D microscopy datasets. We use a machine learning method for semantic segmentation, followed by a robust and configurable object separation and tracking algorithm, generating detailed object level statistics. Our system is designed to run on high-performance computing to achieve high throughput, with outputs suitable for visualisation and statistical analysis. Advanced visualisation is a key element of LLAMA: we provide a specialised tool which supports interactive quality control, optimisation, and output visualisation processes to complement the processing pipeline. LLAMA is demonstrated in an analysis of macrophage surface projections, in which it is used to i) discriminate ruffles induced by lipopolysaccharide (LPS) and macrophage colony stimulating factor (CSF-1) and ii) determine the autonomy of ruffle morphologies.
Conclusions: LLAMA provides an effective open source tool for running a cell microscopy analysis pipeline based on semantic segmentation, object analysis and tracking. Detailed numerical and visual outputs enable effective statistical analysis, identifying distinct patterns of increased activity under the two interventions considered in our example analysis. Our system provides the capacity to screen large datasets for specific structural configurations. LLAMA identified distinct features of LPS and CSF-1 induced ruffles and it identified a continuity of behaviour between tent pole ruffling, wave-like ruffling and filopodia deployment.
Supplementary information: The online version contains supplementary material available at 10.1186/s12859-021-04324-z.
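As a generic illustration of the "object separation plus object-level statistics" stage of such a pipeline (not the LLAMA implementation), the sketch below labels connected components in a binary semantic segmentation and tabulates per-object measurements with scikit-image; the `voxel_volume` scaling is an assumed calibration parameter.

```python
# Hedged sketch: connected-component "object separation" followed by object-level statistics,
# using scikit-image. A generic illustration only, not the LLAMA pipeline.
import numpy as np
from skimage.measure import label, regionprops_table

def object_statistics(semantic_mask: np.ndarray, voxel_volume: float = 1.0):
    instances = label(semantic_mask > 0)  # separate the semantic mask into labelled objects
    stats = regionprops_table(instances, properties=("label", "area", "centroid", "bbox"))
    stats["volume"] = stats["area"] * voxel_volume  # 'area' is a voxel count for 3D label images
    return instances, stats
```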
Affiliation(s)
- James G Lefevre, Institute for Molecular Bioscience, The University of Queensland, Brisbane, QLD, Australia
- Yvette W H Koh, Institute for Molecular Bioscience, The University of Queensland, Brisbane, QLD, Australia
- Adam A Wall, Institute for Molecular Bioscience, The University of Queensland, Brisbane, QLD, Australia
- Nicholas D Condon, Institute for Molecular Bioscience, The University of Queensland, Brisbane, QLD, Australia
- Jennifer L Stow, Institute for Molecular Bioscience, The University of Queensland, Brisbane, QLD, Australia
- Nicholas A Hamilton, Institute for Molecular Bioscience, The University of Queensland, Brisbane, QLD, Australia; Research Computing Centre, The University of Queensland, Brisbane, QLD, Australia

6. Lutton EJ, Collier S, Bretschneider T. A Curvature-Enhanced Random Walker Segmentation Method for Detailed Capture of 3D Cell Surface Membranes. IEEE Trans Med Imaging 2021; 40:514-526. PMID: 33052849. DOI: 10.1109/tmi.2020.3031029.
Abstract
High-resolution 3D microscopy is a fast-advancing field and requires new techniques in image analysis to handle these new datasets. In this work, we focus on detailed 3D segmentation of Dictyostelium cells undergoing macropinocytosis, captured on an iSPIM microscope. We propose a novel random-walker-based method with a curvature-based enhancement term, with the aim of capturing fine protrusions, such as filopodia, and deep invaginations, such as macropinocytotic cups, on the cell surface. We tested our method on both real and synthetic 3D image volumes, demonstrating that the inclusion of the curvature enhancement term can improve the segmentation of the aforementioned features. We show that our method performs better than other state-of-the-art segmentation methods in 3D images of Dictyostelium cells, and performs competitively against CNN-based methods in two Cell Tracking Challenge datasets, demonstrating the ability to obtain accurate segmentations without the requirement of large training datasets. We also present an automated seeding method for microscopy data, which, combined with the curvature-enhanced random walker method, enables the segmentation of large time series with minimal input from the experimenter.
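A baseline seeded random-walker segmentation of a 3D volume can be set up with scikit-image as shown below; this is only a starting-point sketch and does not include the curvature-enhancement term or the automated seeding that are the paper's contributions. The volume and seed positions are placeholders.

```python
# Hedged sketch: plain seeded random-walker segmentation of a 3D volume with scikit-image.
# The curvature-enhancement term and automated seeding described in the paper are NOT included;
# `volume` and the seed voxels below are placeholders.
import numpy as np
from skimage.segmentation import random_walker

volume = np.random.rand(64, 64, 64).astype(np.float32)  # stand-in for a 3D microscopy volume

seeds = np.zeros(volume.shape, dtype=np.int32)
seeds[32, 32, 32] = 1  # label 1: a seed inside the cell
seeds[0, 0, 0] = 2     # label 2: a seed in the background

# beta controls how strongly the walker is discouraged from crossing intensity edges.
labels = random_walker(volume, seeds, beta=130)
print(labels.shape, np.unique(labels))
```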

7. Isensee F, Jaeger PF, Kohl SAA, Petersen J, Maier-Hein KH. nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nat Methods 2021; 18:203-211. PMID: 33288961. DOI: 10.1038/s41592-020-01008-z.
Abstract
Biomedical imaging is a driver of scientific discovery and a core component of medical care and is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, the design of respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions. We developed nnU-Net, a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training and post-processing for any new task. The key design choices in this process are modeled as a set of fixed parameters, interdependent rules and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions on 23 public datasets used in international biomedical segmentation competitions. We make nnU-Net publicly available as an out-of-the-box tool, rendering state-of-the-art segmentation accessible to a broad audience by requiring neither expert knowledge nor computing resources beyond standard network training.
Affiliation(s)
- Fabian Isensee, Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany; Faculty of Biosciences, University of Heidelberg, Heidelberg, Germany
- Paul F Jaeger, Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany
- Simon A A Kohl, Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany; DeepMind, London, UK
- Jens Petersen, Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany; Faculty of Physics & Astronomy, University of Heidelberg, Heidelberg, Germany
- Klaus H Maier-Hein, Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany; Pattern Analysis and Learning Group, Department of Radiation Oncology, Heidelberg University Hospital, Heidelberg, Germany

8. Wiesner D, Svoboda D, Maška M, Kozubek M. CytoPacq: a web-interface for simulating multi-dimensional cell imaging. Bioinformatics 2019; 35:4531-4533. PMID: 31114843. PMCID: PMC6821329. DOI: 10.1093/bioinformatics/btz417.
Abstract
Motivation: Objective assessment of bioimage analysis methods is an essential step towards understanding their robustness and parameter sensitivity, calling for the availability of heterogeneous bioimage datasets accompanied by their reference annotations. Because manual annotations are known to be arduous, highly subjective and barely reproducible, numerous simulators have emerged over past decades, generating synthetic bioimage datasets complemented with inherent reference annotations. However, the installation and configuration of these tools generally constitutes a barrier to their widespread use.
Results: We present a modern, modular web-interface, CytoPacq, to facilitate the generation of synthetic benchmark datasets relevant for multi-dimensional cell imaging. CytoPacq poses a user-friendly graphical interface with contextual tooltips and currently allows a comfortable access to various cell simulation systems of fluorescence microscopy, which have already been recognized and used by the scientific community, in a straightforward and self-contained form.
Availability and implementation: CytoPacq is a publicly available online service running at https://cbia.fi.muni.cz/simulator. More information about it as well as examples of generated bioimage datasets are available directly through the web-interface.
Supplementary information: Supplementary data are available at Bioinformatics online.
Affiliation(s)
- David Wiesner, Centre for Biomedical Image Analysis, Masaryk University, Brno CZ-60200, Czech Republic
- David Svoboda, Centre for Biomedical Image Analysis, Masaryk University, Brno CZ-60200, Czech Republic
- Martin Maška, Centre for Biomedical Image Analysis, Masaryk University, Brno CZ-60200, Czech Republic
- Michal Kozubek, Centre for Biomedical Image Analysis, Masaryk University, Brno CZ-60200, Czech Republic