1. Schienstock D, Hor JL, Devi S, Mueller SN. Cecelia: a multifunctional image analysis toolbox for decoding spatial cellular interactions and behaviour. Nat Commun 2025; 16:1931. PMID: 39994207; PMCID: PMC11850795; DOI: 10.1038/s41467-025-57193-y. Received 13 September 2024; accepted 14 February 2025. Open access.
Abstract
With the ever-increasing complexity of microscopy modalities, it is imperative to have computational workflows that enable researchers to process and perform in-depth quantitative analysis of the resulting images. However, workflows that allow flexible, interactive and intuitive analysis from raw images to analysed data are lacking for many experimental use cases. Notably, integrated software solutions for the analysis of complex 3D and live-cell images are sorely needed. To address this, we present Cecelia, a toolbox that integrates various open-source packages into a coherent data management suite to make quantitative multidimensional image analysis accessible to non-specialists. We describe the application of Cecelia to several immunologically relevant scenarios and the development of an unbiased approach to distinguish dynamic cell behaviours from live imaging data. Cecelia is available as a software package with a Shiny app interface (https://github.com/schienstockd/cecelia). We envision that this framework and its approaches will be of broad use to biological researchers.
Affiliation(s)
- Dominik Schienstock
  - Department of Microbiology and Immunology, The University of Melbourne, The Peter Doherty Institute for Infection and Immunity, Melbourne, VIC, Australia
- Jyh Liang Hor
  - Department of Microbiology and Immunology, The University of Melbourne, The Peter Doherty Institute for Infection and Immunity, Melbourne, VIC, Australia
  - Lymphocyte Biology Section, Laboratory of Immune System Biology, NIAID, NIH, Bethesda, MD, USA
- Sapna Devi
  - Department of Microbiology and Immunology, The University of Melbourne, The Peter Doherty Institute for Infection and Immunity, Melbourne, VIC, Australia
- Scott N Mueller
  - Department of Microbiology and Immunology, The University of Melbourne, The Peter Doherty Institute for Infection and Immunity, Melbourne, VIC, Australia
2. Lange D, Judson-Torres R, Zangle TA, Lex A. Aardvark: Composite Visualizations of Trees, Time-Series, and Images. IEEE Transactions on Visualization and Computer Graphics 2025; 31:1290-1300. PMID: 39255114; DOI: 10.1109/tvcg.2024.3456193.
Abstract
How do cancer cells grow, divide, proliferate, and die? How do drugs influence these processes? These are difficult questions that we can attempt to answer with a combination of time-series microscopy experiments, classification algorithms, and data visualization. However, collecting this type of data and applying algorithms to segment and track cells and construct lineages of proliferation is error-prone, and identifying the errors can be challenging since it often requires cross-checking multiple data types. Similarly, analyzing and communicating the results necessitates synthesizing different data types into a single narrative. State-of-the-art visualization methods for such data use independent line charts, tree diagrams, and images in separate views. However, this spatial separation requires the viewer to combine the relevant pieces of data in memory. To simplify this challenging task, we describe design principles for weaving cell images, time-series data, and tree data into a cohesive visualization. Our design principles are based on choosing a primary data type that drives the layout and integrating the other data types into that layout. We then introduce Aardvark, a system that uses these principles to implement novel visualization techniques. Using Aardvark, we demonstrate the utility of each of these approaches for discovery, communication, and data debugging in a series of case studies.
3. Morth E, Sidak K, Maliga Z, Moller T, Gehlenborg N, Sorger P, Pfister H, Beyer J, Kruger R. Cell2Cell: Explorative Cell Interaction Analysis in Multi-Volumetric Tissue Data. IEEE Transactions on Visualization and Computer Graphics 2025; 31:569-579. PMID: 39255170; PMCID: PMC11875984; DOI: 10.1109/tvcg.2024.3456406.
Abstract
We present Cell2Cell, a novel visual analytics approach for quantifying and visualizing networks of cell-cell interactions in three-dimensional (3D) multi-channel cancerous tissue data. By analyzing cellular interactions, biomedical experts can gain a more accurate understanding of the intricate relationships between cancer and immune cells. Recent methods have focused on inferring interactions from the proximity of cells in low-resolution 2D multi-channel imaging data. By contrast, we analyze cell interactions by quantifying the presence and levels of specific proteins within a tissue sample (protein expressions) extracted from high-resolution 3D multi-channel volume data. Such analyses have a strong exploratory nature and require a tight integration of domain experts in the analysis loop to leverage their deep knowledge. We propose two complementary semi-automated approaches to cope with the increasing size and complexity of the data interactively: on the one hand, we interpret cell-to-cell interactions as edges in a cell graph and analyze the image signal (protein expressions) along those edges, using spatial as well as abstract visualizations. As a complement, we propose a cell-centered approach that enables scientists to visually analyze polarized distributions of proteins in three dimensions, which also captures neighboring cells with biochemical and cell-biological consequences. We evaluate our application in three case studies, where biologists and medical experts use Cell2Cell to investigate tumor micro-environments and to identify and quantify T-cell activation in human tissue data. We confirmed that our tool can fully address these use cases and enables a streamlined and detailed analysis of cell-cell interactions.
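The edge-based analysis the abstract describes can be illustrated with a minimal sketch (my own simplification, not the authors' implementation; the grid, coordinates, and function name are hypothetical): a protein channel is sampled at evenly spaced points along the segment connecting two cell centroids.

```python
# Illustrative sketch: sampling a 2D protein-expression grid along the
# edge between two cell centroids (nearest-pixel lookup, for brevity).

def sample_along_edge(channel, a, b, n_samples=10):
    """Sample a 2D intensity grid at n_samples points from centroid a to b."""
    samples = []
    for i in range(n_samples):
        t = i / (n_samples - 1)              # interpolation parameter in [0, 1]
        x = a[0] + t * (b[0] - a[0])
        y = a[1] + t * (b[1] - a[1])
        samples.append(channel[round(y)][round(x)])  # nearest-pixel lookup
    return samples

# Toy "protein expression" grid with a bright spot between two cells.
grid = [[0, 0, 0, 0, 0],
        [0, 1, 5, 1, 0],
        [0, 0, 0, 0, 0]]
profile = sample_along_edge(grid, (0, 1), (4, 1), n_samples=5)
# profile peaks where the signal sits between the two centroids
```

A real system would interpolate sub-pixel samples in 3D and aggregate profiles across many edges; this only shows the shape of the per-edge computation.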
4. Warchol S, Troidl J, Muhlich J, Krueger R, Hoffer J, Lin T, Beyer J, Glassman E, Sorger PK, Pfister H. psudo: Exploring Multi-Channel Biomedical Image Data with Spatially and Perceptually Optimized Pseudocoloring. bioRxiv 2024:2024.04.11.589087. Preprint. PMID: 38659870; PMCID: PMC11042212; DOI: 10.1101/2024.04.11.589087.
Abstract
Over the past century, multi-channel fluorescence imaging has been pivotal in myriad scientific breakthroughs by enabling the spatial visualization of proteins within a biological sample. With the shift to digital methods and visualization software, experts can now flexibly pseudocolor and combine image channels, each corresponding to a different protein, to explore their spatial relationships. We thus propose psudo, an interactive system that allows users to create optimal color palettes for multi-channel spatial data. In psudo, a novel optimization method generates palettes that maximize the perceptual differences between channels while mitigating confusing color blending in overlapping channels. We integrate this method into a system that allows users to explore multi-channel image data and to compare and evaluate color palettes for their data. An interactive lensing approach provides on-demand feedback on channel overlap and a color-confusion metric while giving context to the underlying channel values. Color palettes can be applied globally or, using the lens, to local regions of interest. We evaluate our palette-optimization approach using three graphical perception tasks in a crowdsourced user study with 150 participants, showing that users discern and compare the underlying data more accurately with our approach. Additionally, we showcase psudo in a case study exploring complex immune responses in cancer tissue data with a biologist.
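The core palette-optimization idea (maximizing differences between channel colors) can be sketched as a greedy farthest-point selection. This is not psudo's actual algorithm: plain Euclidean distance in RGB stands in for a proper perceptual metric (a real system would use a CIELAB-style color difference and would also account for channel overlap), and the candidate pool is hypothetical.

```python
# Hedged sketch: greedily build a k-color palette by repeatedly picking the
# candidate color farthest (in min-distance terms) from those already chosen.
import itertools
import math

def color_dist(c1, c2):
    # Stand-in metric; a perceptual difference (e.g. CIELAB-based) would
    # replace this in a real pseudocoloring system.
    return math.dist(c1, c2)

def greedy_palette(candidates, k):
    palette = [candidates[0]]                    # seed with the first candidate
    while len(palette) < k:
        best = max(
            (c for c in candidates if c not in palette),
            key=lambda c: min(color_dist(c, p) for p in palette),
        )
        palette.append(best)
    return palette

# Candidate pool: the eight corners of the RGB cube.
pool = [tuple(255 * v for v in bits) for bits in itertools.product([0, 1], repeat=3)]
chosen = greedy_palette(pool, 4)
```

Greedy farthest-point selection gives well-separated colors cheaply; the paper's joint optimization over separation and blending is a harder problem than this sketch suggests.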
Affiliation(s)
- Simon Warchol
  - Harvard John A. Paulson School of Engineering and Applied Sciences
  - Visual Computing Group, Harvard University
  - Laboratory of Systems Pharmacology, Harvard Medical School
- Jakob Troidl
  - Harvard John A. Paulson School of Engineering and Applied Sciences
  - Visual Computing Group, Harvard University
- Jeremy Muhlich
  - Department of Systems Biology, Harvard Medical School
  - Visual Computing Group, Harvard University
- Robert Krueger
  - Laboratory of Systems Pharmacology, Harvard Medical School
- John Hoffer
  - Department of Systems Biology, Harvard Medical School
  - Laboratory of Systems Pharmacology, Harvard Medical School
- Tica Lin
  - Harvard John A. Paulson School of Engineering and Applied Sciences
  - Visual Computing Group, Harvard University
- Johanna Beyer
  - Harvard John A. Paulson School of Engineering and Applied Sciences
  - Visual Computing Group, Harvard University
- Elena Glassman
  - Harvard John A. Paulson School of Engineering and Applied Sciences
- Peter K Sorger
  - Department of Systems Biology, Harvard Medical School
  - Laboratory of Systems Pharmacology, Harvard Medical School
- Hanspeter Pfister
  - Harvard John A. Paulson School of Engineering and Applied Sciences
  - Visual Computing Group, Harvard University
  - Laboratory of Systems Pharmacology, Harvard Medical School
5. Herzberger L, Hadwiger M, Kruger R, Sorger P, Pfister H, Groller E, Beyer J. Residency Octree: A Hybrid Approach for Scalable Web-Based Multi-Volume Rendering. IEEE Transactions on Visualization and Computer Graphics 2024; 30:1380-1390. PMID: 37889813; PMCID: PMC10840607; DOI: 10.1109/tvcg.2023.3327193.
Abstract
We present a hybrid multi-volume rendering approach based on a novel Residency Octree that combines the advantages of out-of-core volume rendering using page tables with those of standard octrees. Octree approaches work by performing hierarchical tree traversal. However, in octree volume rendering, tree traversal and the selection of data resolution are intrinsically coupled. This makes fine-grained empty-space skipping costly. Page tables, on the other hand, allow access to any cached brick from any resolution. However, they do not offer a clear and efficient strategy for substituting missing high-resolution data with lower-resolution data. We enable flexible mixed-resolution out-of-core multi-volume rendering by decoupling the cache residency of multi-resolution data from a resolution-independent spatial subdivision determined by the tree. Instead of one-to-one node-to-brick correspondences, each residency octree node is mapped to a set of bricks from different resolution levels. This makes it possible to efficiently and adaptively choose and mix resolutions, adapt sampling rates, and compensate for cache misses. At the same time, residency octrees support fine-grained empty-space skipping, independent of the data subdivision used for caching. Finally, to facilitate collaboration and outreach, and to eliminate local data storage, our implementation is a web-based, pure client-side renderer using WebGPU and WebAssembly. Our method is faster than prior approaches and efficient for many data channels with a flexible and adaptive choice of data resolution.
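The abstract's central data-structure idea, that each residency-octree node maps to a set of bricks from different resolution levels rather than to one brick, can be sketched schematically (all names and structures here are illustrative, not the paper's implementation): rendering resolves a node to the finest brick currently resident in the cache and falls back to coarser data on a miss.

```python
# Schematic sketch of the node-to-brick-set mapping and the
# finest-resident-data fallback the abstract describes.

class ResidencyNode:
    def __init__(self, brick_ids_by_level):
        # level -> brick id, finest level first when sorted; one node can
        # reference bricks from multiple resolution levels.
        self.brick_ids_by_level = brick_ids_by_level

    def resolve(self, cache):
        """Return the finest cached brick for this node, or None on a full miss."""
        for level in sorted(self.brick_ids_by_level):
            brick = cache.get(self.brick_ids_by_level[level])
            if brick is not None:
                return brick           # finest resident data wins
        return None                    # full miss: caller must fetch data

# Hypothetical cache holding only a coarse brick for this region.
cache = {"brick_L2_region7": "coarse-data"}
node = ResidencyNode({0: "brick_L0_region7", 2: "brick_L2_region7"})
result = node.resolve(cache)           # falls back to the coarse brick
```

This decoupling is what lets the actual renderer mix resolutions per node and compensate for cache misses without re-traversing a resolution-coupled octree.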
6. Wentzel A, Floricel C, Canahuate G, Naser MA, Mohamed AS, Fuller CD, van Dijk L, Marai GE. DASS Good: Explainable Data Mining of Spatial Cohort Data. Computer Graphics Forum 2023; 42:283-295. PMID: 37854026; PMCID: PMC10583718; DOI: 10.1111/cgf.14830.
Abstract
Developing applicable clinical machine learning models is a difficult task when the data includes spatial information, for example, radiation dose distributions across adjacent organs at risk. We describe the co-design of a modeling system, DASS, to support the hybrid human-machine development and validation of predictive models for estimating long-term toxicities related to radiotherapy doses in head and neck cancer patients. Developed in collaboration with domain experts in oncology and data mining, DASS incorporates human-in-the-loop visual steering, spatial data, and explainable AI to augment domain knowledge with automatic data mining. We demonstrate DASS with the development of two practical clinical stratification models and report feedback from domain experts. Finally, we describe the design lessons learned from this collaborative experience.
Affiliation(s)
- A Wentzel
  - University of Illinois Chicago, Electronic Visualization Lab
- C Floricel
  - University of Illinois Chicago, Electronic Visualization Lab
- G Canahuate
- M A Naser
  - University of Texas MD Anderson Cancer Center
- A S Mohamed
  - University of Texas MD Anderson Cancer Center
- C D Fuller
  - University of Texas MD Anderson Cancer Center
- L van Dijk
  - University of Texas MD Anderson Cancer Center
- G E Marai
  - University of Illinois Chicago, Electronic Visualization Lab
7. Warchol S, Krueger R, Nirmal AJ, Gaglia G, Jessup J, Ritch CC, Hoffer J, Muhlich J, Burger ML, Jacks T, Santagata S, Sorger PK, Pfister H. Visinity: Visual Spatial Neighborhood Analysis for Multiplexed Tissue Imaging Data. IEEE Transactions on Visualization and Computer Graphics 2023; 29:106-116. PMID: 36170403; PMCID: PMC10043053; DOI: 10.1109/tvcg.2022.3209378.
Abstract
New highly multiplexed imaging technologies have enabled the study of tissues in unprecedented detail. These methods are increasingly being applied to understand how cancer cells and immune responses change during tumor development, progression, and metastasis, as well as following treatment. Yet, existing analysis approaches focus on investigating small tissue samples on a per-cell basis, not taking into account the spatial proximity of cells, which indicates cell-cell interactions and specific biological processes in the larger cancer microenvironment. We present Visinity, a scalable visual analytics system for analyzing cell-interaction patterns across cohorts of whole-slide multiplexed tissue images. Our approach is based on a fast regional neighborhood computation, leveraging unsupervised learning to quantify, compare, and group cells by their surrounding cellular neighborhood. These neighborhoods can be visually analyzed in an exploratory and confirmatory workflow. Users can explore spatial patterns present across tissues through a scalable image viewer and coordinated views highlighting the neighborhood composition and spatial arrangement of cells. To verify or refine existing hypotheses, users can query for specific patterns to determine their presence and statistical significance. Findings can be interactively annotated, ranked, and compared in the form of small multiples. In two case studies with biomedical experts, we demonstrate that Visinity can identify common biological processes within a human tonsil and uncover novel white blood cell networks and immune-tumor interactions.
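The regional-neighborhood computation the abstract describes can be illustrated with a toy sketch (my own simplification, not Visinity's scalable implementation; the radius, cell types, and function name are hypothetical): each cell is summarized by the cell-type composition of its neighbors within a fixed radius, and these vectors could then feed unsupervised grouping.

```python
# Toy sketch: one normalized neighborhood-composition vector per cell,
# built by brute-force distance checks (a real system would use a
# spatial index and run over millions of cells).
import math

def neighborhood_compositions(cells, radius, types):
    """cells: list of (x, y, cell_type). Returns one normalized
    type-composition vector per cell, ordered like `types`."""
    vectors = []
    for x, y, _ in cells:
        counts = {t: 0 for t in types}
        for nx, ny, ntype in cells:
            if (nx, ny) != (x, y) and math.dist((x, y), (nx, ny)) <= radius:
                counts[ntype] += 1
        total = sum(counts.values()) or 1      # isolated cell: avoid div by zero
        vectors.append([counts[t] / total for t in types])
    return vectors

# Hypothetical micro-example: a tumor cell flanked by two T cells,
# plus one distant B cell with no neighbors.
cells = [(0, 0, "T"), (1, 0, "tumor"), (2, 0, "T"), (10, 10, "B")]
vecs = neighborhood_compositions(cells, radius=1.5, types=["T", "tumor", "B"])
```

Clustering these vectors groups cells by microenvironment rather than by their own phenotype, which is the essence of the neighborhood-based comparison across tissues.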
8. Singer DS. A new phase of the Cancer Moonshot to end cancer as we know it. Nat Med 2022; 28:1345-1347. PMID: 35760861; PMCID: PMC9244436; DOI: 10.1038/s41591-022-01881-5.