1. Prabhakaran S, Yapp C, Baker GJ, Beyer J, Chang YH, Creason AL, Krueger R, Muhlich J, Patterson NH, Sidak K, Sudar D, Taylor AJ, Ternes L, Troidl J, Xie Y, Sokolov A, Tyson DR. Addressing persistent challenges in digital image analysis of cancerous tissues. bioRxiv 2023:2023.07.21.548450. [PMID: 37547011 PMCID: PMC10401923 DOI: 10.1101/2023.07.21.548450]
Abstract
The National Cancer Institute (NCI) supports many research programs and consortia, many of which use imaging as a major modality for characterizing cancerous tissue. A trans-consortia Image Analysis Working Group (IAWG) was established in 2019 with a mission to disseminate imaging-related work and foster collaborations. In 2022, the IAWG held a virtual hackathon focused on addressing challenges of analyzing high dimensional datasets from fixed cancerous tissues. Standard image processing techniques have automated feature extraction, but the next generation of imaging data requires more advanced methods to fully utilize the available information. In this perspective, we discuss current limitations of the automated analysis of multiplexed tissue images, the first steps toward deeper understanding of these limitations, what possible solutions have been developed, any new or refined approaches that were developed during the Image Analysis Hackathon 2022, and where further effort is required. The outstanding problems addressed in the hackathon fell into three main themes: 1) challenges to cell type classification and assessment, 2) translation and visual representation of spatial aspects of high dimensional data, and 3) scaling digital image analyses to large (multi-TB) datasets. We describe the rationale for each specific challenge and the progress made toward addressing it during the hackathon. We also suggest areas that would benefit from more focus and offer insight into broader challenges that the community will need to address as new technologies are developed and integrated into the broad range of image-based modalities and analytical resources already in use within the cancer research community.
2. Homeyer A, Geißler C, Schwen LO, Zakrzewski F, Evans T, Strohmenger K, Westphal M, Bülow RD, Kargl M, Karjauv A, Munné-Bertran I, Retzlaff CO, Romero-López A, Sołtysiński T, Plass M, Carvalho R, Steinbach P, Lan YC, Bouteldja N, Haber D, Rojas-Carulla M, Vafaei Sadr A, Kraft M, Krüger D, Fick R, Lang T, Boor P, Müller H, Hufnagl P, Zerbe N. Recommendations on compiling test datasets for evaluating artificial intelligence solutions in pathology. Mod Pathol 2022; 35:1759-1769. [PMID: 36088478 PMCID: PMC9708586 DOI: 10.1038/s41379-022-01147-y]
Abstract
Artificial intelligence (AI) solutions that automatically extract information from digital histology images have shown great promise for improving pathological diagnosis. Prior to routine use, it is important to evaluate their predictive performance and obtain regulatory approval. This assessment requires appropriate test datasets. However, compiling such datasets is challenging and specific recommendations are missing. A committee of various stakeholders, including commercial AI developers, pathologists, and researchers, discussed key aspects and conducted extensive literature reviews on test datasets in pathology. Here, we summarize the results and derive general recommendations on compiling test datasets. We address several questions: Which and how many images are needed? How to deal with low-prevalence subsets? How can potential bias be detected? How should datasets be reported? What are the regulatory requirements in different countries? The recommendations are intended to help AI developers demonstrate the utility of their products and to help pathologists and regulatory agencies verify reported performance measures. Further research is needed to formulate criteria for sufficiently representative test datasets so that AI solutions can operate with less user intervention and better support diagnostic workflows in the future.
Affiliation(s)
- André Homeyer
- Fraunhofer Institute for Digital Medicine MEVIS, Max-von-Laue-Straße 2, 28359, Bremen, Germany
- Christian Geißler
- Technische Universität Berlin, DAI-Labor, Ernst-Reuter-Platz 7, 10587, Berlin, Germany
- Lars Ole Schwen
- Fraunhofer Institute for Digital Medicine MEVIS, Max-von-Laue-Straße 2, 28359, Bremen, Germany
- Falk Zakrzewski
- Institute of Pathology, Carl Gustav Carus University Hospital Dresden (UKD), TU Dresden (TUD), Fetscherstrasse 74, 01307, Dresden, Germany
- Theodore Evans
- Technische Universität Berlin, DAI-Labor, Ernst-Reuter-Platz 7, 10587, Berlin, Germany
- Klaus Strohmenger
- Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute of Pathology, Charitéplatz 1, 10117, Berlin, Germany
- Max Westphal
- Fraunhofer Institute for Digital Medicine MEVIS, Max-von-Laue-Straße 2, 28359, Bremen, Germany
- Roman David Bülow
- Institute of Pathology, University Hospital RWTH Aachen, Pauwelsstraße 30, 52074, Aachen, Germany
- Michaela Kargl
- Medical University of Graz, Diagnostic and Research Center for Molecular BioMedicine, Diagnostic & Research Institute of Pathology, Neue Stiftingtalstrasse 6, 8010, Graz, Austria
- Aray Karjauv
- Technische Universität Berlin, DAI-Labor, Ernst-Reuter-Platz 7, 10587, Berlin, Germany
- Isidre Munné-Bertran
- MoticEurope, S.L.U., C. Les Corts, 12 Poligono Industrial, 08349, Barcelona, Spain
- Carl Orge Retzlaff
- Technische Universität Berlin, DAI-Labor, Ernst-Reuter-Platz 7, 10587, Berlin, Germany
- Markus Plass
- Medical University of Graz, Diagnostic and Research Center for Molecular BioMedicine, Diagnostic & Research Institute of Pathology, Neue Stiftingtalstrasse 6, 8010, Graz, Austria
- Rita Carvalho
- Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute of Pathology, Charitéplatz 1, 10117, Berlin, Germany
- Peter Steinbach
- Helmholtz-Zentrum Dresden Rossendorf, Bautzner Landstraße 400, 01328, Dresden, Germany
- Yu-Chia Lan
- Institute of Pathology, University Hospital RWTH Aachen, Pauwelsstraße 30, 52074, Aachen, Germany
- Nassim Bouteldja
- Institute of Pathology, University Hospital RWTH Aachen, Pauwelsstraße 30, 52074, Aachen, Germany
- David Haber
- Lakera AI AG, Zelgstrasse 7, 8003, Zürich, Switzerland
- Alireza Vafaei Sadr
- Institute of Pathology, University Hospital RWTH Aachen, Pauwelsstraße 30, 52074, Aachen, Germany
- Daniel Krüger
- Olympus Soft Imaging Solutions GmbH, Johann-Krane-Weg 39, 48149, Münster, Germany
- Rutger Fick
- Tribun Health, 2 Rue du Capitaine Scott, 75015, Paris, France
- Tobias Lang
- Mindpeak GmbH, Zirkusweg 2, 20359, Hamburg, Germany
- Peter Boor
- Institute of Pathology, University Hospital RWTH Aachen, Pauwelsstraße 30, 52074, Aachen, Germany
- Heimo Müller
- Medical University of Graz, Diagnostic and Research Center for Molecular BioMedicine, Diagnostic & Research Institute of Pathology, Neue Stiftingtalstrasse 6, 8010, Graz, Austria
- Peter Hufnagl
- Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute of Pathology, Charitéplatz 1, 10117, Berlin, Germany
- Norman Zerbe
- Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute of Pathology, Charitéplatz 1, 10117, Berlin, Germany
3. Wright AI, Dunn CM, Hale M, Hutchins GGA, Treanor DE. The Effect of Quality Control on Accuracy of Digital Pathology Image Analysis. IEEE J Biomed Health Inform 2021; 25:307-314. [PMID: 33347418 DOI: 10.1109/jbhi.2020.3046094]
Abstract
Digital slide images produced from routine diagnostic histopathological preparations suffer from variation arising at every step of the processing pipeline. Typically, pathologists compensate for such variation using expert knowledge and experience, which is difficult to replicate in automated solutions. This work explores the extent to which such inconsistencies affect image analysis, examining in detail the results of a previously published algorithm that automates generation of the tumor:stroma ratio (TSR) in colorectal clinical trial datasets. One dataset, consisting of 2,211 cases and 106,268 expert-labelled images, is used to identify quality issues by visually inspecting the cases where algorithm-pathologist agreement is lowest. Twelve categories of quality issue are identified and used to analyze pathologist-algorithm agreement in relation to these categories. Of the 2,211 cases, 701 were found to be free from any image quality issue. Algorithm performance was then assessed by comparing pathologist agreement against image quality classification. Agreement was lowest on poorly differentiated tissue, with a mean TSR difference of 0.25 (sd = 0.24). Removing images that contained quality issues increased accuracy from 80% to 83%, at the expense of reducing the dataset to 33,736 images (32%). Training the algorithm on the optimized dataset before testing on all images decreased accuracy by 4%, indicating that the optimized dataset did not contain enough variation to produce a fully representative model. The results provide an in-depth perspective on image quality and highlight the importance of its effects on downstream image analysis.
4. Chen Y, Zee J, Smith A, Jayapandian C, Hodgin J, Howell D, Palmer M, Thomas D, Cassol C, Farris AB, Perkinson K, Madabhushi A, Barisoni L, Janowczyk A. Assessment of a computerized quantitative quality control tool for whole slide images of kidney biopsies. J Pathol 2021; 253:268-278. [PMID: 33197281 DOI: 10.1002/path.5590]
Abstract
Inconsistencies in the preparation of histology slides and whole-slide images (WSIs) may lead to challenges with subsequent image analysis and machine learning approaches for interrogating the WSI. These variabilities are especially pronounced in multicenter cohorts, where batch effects (i.e. systematic technical artifacts unrelated to biological variability) may introduce biases into machine learning algorithms. To date, manual quality control (QC) has been the de facto standard for dataset curation, but it remains highly subjective and is too laborious in light of the increasing scale of tissue slide digitization efforts. This study aimed to evaluate a computer-aided QC pipeline for facilitating a reproducible QC process of WSI datasets. An open-source tool, HistoQC, was applied to the Nephrotic Syndrome Study Network (NEPTUNE) digital pathology repository to identify image artifacts and compute quantitative metrics describing visual attributes of the WSIs. Inter-reader concordance between HistoQC-aided and unaided curation was compared to quantify improvements in curation reproducibility. HistoQC metrics were additionally employed to quantify the presence of batch effects within NEPTUNE WSIs. Of the 1814 WSIs (458 H&E, 470 PAS, 438 silver, 448 trichrome) from n = 512 cases considered in this study, approximately 9% (163) were identified as unsuitable for subsequent computational analysis. Concordance in the identification of these WSIs among computational pathologists rose from moderate (Gwet's AC1 range 0.43 to 0.59 across stains) to excellent (Gwet's AC1 range 0.79 to 0.93 across stains) agreement when aided by HistoQC. Furthermore, statistically significant batch effects (p < 0.001) were discovered in the NEPTUNE WSI dataset. Taken together, these findings strongly suggest that quantitative QC is a necessary step in the curation of digital pathology cohorts.
Affiliation(s)
- Yijiang Chen
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA
- Jarcy Zee
- Arbor Research Collaborative for Health, Ann Arbor, MI, USA
- Abigail Smith
- Arbor Research Collaborative for Health, Ann Arbor, MI, USA
- Catherine Jayapandian
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA
- Jeffrey Hodgin
- Department of Pathology, University of Michigan, Ann Arbor, MI, USA
- David Howell
- Department of Pathology, Duke University, Durham, NC, USA
- Matthew Palmer
- Department of Pathology, University of Pennsylvania, Philadelphia, PA, USA
- David Thomas
- Department of Pathology, Duke University, Durham, NC, USA; Nephrocor, Memphis, TN, USA
- Clarissa Cassol
- Renal Pathology Division, Arkana Laboratories, Little Rock, AR, USA; Department of Pathology - Renal Pathology Division, Ohio State University Medical Center, Columbus, OH, USA
- Alton B Farris
- Department of Pathology and Laboratory Medicine, Emory University, Atlanta, GA, USA
- Anant Madabhushi
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes VA Medical Center, Cleveland, OH, USA
- Laura Barisoni
- Department of Pathology, Duke University, Durham, NC, USA; Department of Medicine, Division of Nephrology, Duke University, Durham, NC, USA
- Andrew Janowczyk
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Precision Oncology Center, University of Lausanne, Lausanne, Switzerland
5. Schömig-Markiefka B, Pryalukhin A, Hulla W, Bychkov A, Fukuoka J, Madabhushi A, Achter V, Nieroda L, Büttner R, Quaas A, Tolkach Y. Quality control stress test for deep learning-based diagnostic model in digital pathology. Mod Pathol 2021; 34:2098-2108. [PMID: 34168282 PMCID: PMC8592835 DOI: 10.1038/s41379-021-00859-x]
Abstract
Digital pathology enables computational analysis of histological slides and automation of routine pathological tasks. Histological slides are highly heterogeneous in staining, section thickness, and artifacts arising during tissue processing, cutting, staining, and digitization. In this study, we digitally reproduce the major types of artifacts. Using six datasets from four different institutions, digitized by different scanner systems, we systematically explore the influence of artifacts on the accuracy of a pre-trained, validated, deep learning-based model for prostate cancer detection in histological slides. We provide evidence that any histological artifact can, depending on its severity, lead to a substantial loss in model performance. Strategies for preventing losses in diagnostic model accuracy in the context of artifacts are warranted. Stress-testing diagnostic models with synthetically generated artifacts might be an essential step during clinical validation of deep learning-based algorithms.
Affiliation(s)
- Birgid Schömig-Markiefka
- Institute of Pathology, University Hospital Cologne, Cologne, Germany
- Alexey Pryalukhin
- Institute of Pathology, Landesklinikum Wiener Neustadt, Wiener Neustadt, Austria
- Wolfgang Hulla
- Institute of Pathology, Landesklinikum Wiener Neustadt, Wiener Neustadt, Austria
- Andrey Bychkov
- Department of Pathology, Nagasaki University Graduate School of Biomedical Sciences, Nagasaki, Japan; Department of Pathology, Kameda Medical Center, Kamogawa, Japan
- Junya Fukuoka
- Department of Pathology, Nagasaki University Graduate School of Biomedical Sciences, Nagasaki, Japan; Department of Pathology, Kameda Medical Center, Kamogawa, Japan
- Anant Madabhushi
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Veterans Administration Medical Center, Cleveland, OH, USA
- Viktor Achter
- Regional Computing Center (RRZK), University of Cologne, Cologne, Germany
- Lech Nieroda
- Regional Computing Center (RRZK), University of Cologne, Cologne, Germany
- Reinhard Büttner
- Institute of Pathology, University Hospital Cologne, Cologne, Germany
- Alexander Quaas
- Institute of Pathology, University Hospital Cologne, Cologne, Germany
- Yuri Tolkach
- Institute of Pathology, University Hospital Cologne, Cologne, Germany
6. Janowczyk A, Zuo R, Gilmore H, Feldman M, Madabhushi A. HistoQC: An Open-Source Quality Control Tool for Digital Pathology Slides. JCO Clin Cancer Inform 2019; 3:1-7. [PMID: 30990737 PMCID: PMC6552675 DOI: 10.1200/cci.18.00157]
Abstract
PURPOSE: Digital pathology (DP), referring to the digitization of tissue slides, is beginning to change the landscape of clinical diagnostic workflows and has engendered active research within the area of computational pathology. One of the challenges in DP is the presence of artefacts and batch effects, unintentionally introduced during both routine slide preparation (eg, staining, tissue folding) and digitization (eg, blurriness, variations in contrast and hue). Manual review of glass and digital slides is laborious, qualitative, and subject to intra- and inter-reader variability. Therefore, there is a critical need for a reproducible automated approach of precisely localizing artefacts to identify slides that need to be reproduced or regions that should be avoided during computational analysis.

METHODS: Here we present HistoQC, a tool for rapidly performing quality control to not only identify and delineate artefacts but also discover cohort-level outliers (eg, slides stained darker or lighter than others in the cohort). This open-source tool employs a combination of image metrics (eg, color histograms, brightness, contrast), features (eg, edge detectors), and supervised classifiers (eg, pen detection) to identify artefact-free regions on digitized slides. These regions and metrics are presented to the user via an interactive graphical user interface, facilitating artefact detection through real-time visualization and filtering. These same metrics afford users the opportunity to explicitly define acceptable tolerances for their workflows.

RESULTS: The output of HistoQC on 450 slides from The Cancer Genome Atlas was reviewed by two pathologists and found to be suitable for computational analysis more than 95% of the time.

CONCLUSION: These results suggest that HistoQC could provide an automated, quantifiable, quality control process for identifying artefacts and measuring slide quality, in turn helping to improve both the repeatability and robustness of DP workflows.
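The cohort-level outlier detection described in this abstract can be sketched in a few lines. The sketch below is purely illustrative and is not HistoQC's actual code or API: all function names, the toy pixel data, and the deviation threshold are hypothetical; the idea shown is simply "compute per-slide brightness/contrast metrics, then flag slides whose metrics deviate strongly from the cohort median."

```python
# Illustrative sketch of cohort-level slide QC (hypothetical names, not HistoQC's API):
# per-slide metrics + median-based outlier flagging.
from statistics import mean, median, pstdev

def slide_metrics(pixels):
    """pixels: flat list of grayscale values in [0, 255]."""
    return {"brightness": mean(pixels), "contrast": pstdev(pixels)}

def flag_outliers(metrics_per_slide, key, tolerance=0.25):
    """Flag slides whose metric deviates from the cohort median by more than
    `tolerance` (as a fraction of the median), e.g. slides stained much darker
    or lighter than the rest of the cohort."""
    values = [m[key] for m in metrics_per_slide.values()]
    med = median(values)
    return sorted(
        sid for sid, m in metrics_per_slide.items()
        if med > 0 and abs(m[key] - med) / med > tolerance
    )

cohort = {
    "slide_a": slide_metrics([120, 130, 125, 128] * 50),
    "slide_b": slide_metrics([118, 132, 127, 126] * 50),
    "slide_c": slide_metrics([40, 45, 42, 44] * 50),   # stained much darker
}
print(flag_outliers(cohort, "brightness"))  # ['slide_c']
```

A real pipeline would compute such metrics from tiled whole-slide images and expose the tolerances as user-configurable thresholds, as the abstract describes.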
Affiliation(s)
- Ren Zuo
- Case Western Reserve University, Cleveland, OH
- Hannah Gilmore
- University Hospitals Cleveland Medical Center, Cleveland, OH
- Michael Feldman
- University of Pennsylvania Perelman School of Medicine, Philadelphia, PA
- Anant Madabhushi
- Case Western Reserve University, Cleveland, OH
- Louis Stokes Cleveland Veterans Administration Medical Center, Cleveland, OH
7. Aeffner F, Zarella MD, Buchbinder N, Bui MM, Goodman MR, Hartman DJ, Lujan GM, Molani MA, Parwani AV, Lillard K, Turner OC, Vemuri VNP, Yuil-Valdes AG, Bowman D. Introduction to Digital Image Analysis in Whole-slide Imaging: A White Paper from the Digital Pathology Association. J Pathol Inform 2019; 10:9. [PMID: 30984469 PMCID: PMC6437786 DOI: 10.4103/jpi.jpi_82_18]
Abstract
The advent of whole-slide imaging in digital pathology has brought about the advancement of computer-aided examination of tissue via digital image analysis. Digitized slides can now be easily annotated and analyzed via a variety of algorithms. This study reviews the fundamentals of tissue image analysis and aims to provide pathologists with basic information regarding the features, applications, and general workflow of these new tools. The review gives an overview of the basic categories of software solutions available, potential analysis strategies, technical considerations, and general algorithm readouts. Advantages and limitations of tissue image analysis are discussed, and emerging concepts, such as artificial intelligence and machine learning, are introduced. Finally, examples of how digital image analysis tools are currently being used in diagnostic laboratories, translational research, and drug development are discussed.
Affiliation(s)
- Famke Aeffner
- Amgen Inc., Amgen Research, Comparative Biology and Safety Sciences, South San Francisco, CA, USA
- Mark D Zarella
- Department of Pathology and Laboratory Medicine, Drexel University College of Medicine, Philadelphia, PA, USA
- Marilyn M Bui
- Department of Pathology, Moffitt Cancer Center, Tampa, FL, USA
- Mariam A Molani
- Department of Pathology and Microbiology, University of Nebraska Medical Center, Omaha, NE, USA
- Anil V Parwani
- The Ohio State University Medical Center, Columbus, OH, USA
- Oliver C Turner
- Novartis, Novartis Institutes for BioMedical Research, Preclinical Safety, East Hanover, NJ, USA
- Ana G Yuil-Valdes
- Department of Pathology and Microbiology, University of Nebraska Medical Center, Omaha, NE, USA
8. Wong M, Frye J, Kim S, Marchevsky AM. The Use of Screencasts with Embedded Whole-Slide Scans and Hyperlinks to Teach Anatomic Pathology in a Supervised Digital Environment. J Pathol Inform 2018; 9:39. [PMID: 30607306 PMCID: PMC6289000 DOI: 10.4103/jpi.jpi_44_18]
Abstract
BACKGROUND: There is increasing interest in using digitized whole-slide imaging (WSI) for routine surgical pathology diagnoses. Screencasts are digital recordings of computer screen output with advanced interactive features that allow for the preparation of videos. Screencasts that include hyperlinks to WSIs could help teach pathology residents how to become familiar with technologies they are likely to use in their future careers.

MATERIALS AND METHODS: Twenty screencasts were prepared with Camtasia 2.0 software (TechSmith, Okemos, MI, USA). They included clinical history; videos of chest X-rays and/or chest computed tomography images; links to WSIs digitized with an Aperio Turbo AT scanner (Leica Biosystems, Buffalo Grove, IL, USA); pre- and posttests; faculty-narrated videos of the WSIs in a manner closely resembling a slide seminar; and other educational materials. Screencasts were saved on a hospital network, Screencast.com, YouTube.com, and Vimeo.com. The screencasts were viewed by 12 pathology residents and fellows, who made diagnoses, answered the quizzes, and took a survey designed to evaluate their perception of the quality of this technology. Quiz results were automatically e-mailed to faculty. Pre- and posttest results were compared using a paired t-test.

RESULTS: Screencasts can be viewed with Windows PC and Mac operating systems and on mobile devices; only videos saved on our network and Screencast.com could be used to generate quizzes. Participants' feedback was very favorable, with average scores ranging from 4.5 to 4.8 (on a scale of 5). Mean posttest scores (87.0% [±21.6%]) were significantly improved over pretest scores (48.5% [±31.2%]) (P < 0.0001).

CONCLUSION: Screencasts with WSIs that allow residents and fellows to diagnose cases using digital microscopy may prove to be a useful technology to enhance pathology education. Future studies with larger numbers of screencasts and participants are needed to optimize various teaching strategies.
Affiliation(s)
- Mary Wong
- Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, CA, USA
- Joseph Frye
- Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, CA, USA
- Stacey Kim
- Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, CA, USA
- Alberto M. Marchevsky
- Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, CA, USA
9. Ung C, Kockx M, Waumans Y. Digital pathology in immuno-oncology – a roadmap for clinical development. Expert Review of Precision Medicine and Drug Development 2017. [DOI: 10.1080/23808993.2017.1281737]
Affiliation(s)
- Mark Kockx
- Department of Histopathology, Imaging and Quantification, HistoGeneX NV, Antwerp, Belgium
- Yannick Waumans
- Department of Histopathology, Imaging and Quantification, HistoGeneX NV, Antwerp, Belgium
10. Aeffner F, Wilson K, Bolon B, Kanaly S, Mahrt CR, Rudmann D, Charles E, Young GD. Commentary. Toxicol Pathol 2016; 44:825-34. [DOI: 10.1177/0192623316653492]
Abstract
Historically, pathologists perform manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and the demand for precision beyond manual evaluation increase, the pathologist’s assessment will increasingly incorporate automated analyses (i.e., “digital pathology”) to improve the accuracy, efficiency, and speed of diagnosis and hypothesis testing, and will serve as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting: the expanding use of digital image analysis is set to broaden pathology roles in research and drug development, with new and growing career opportunities for pathologists.
Affiliation(s)
- Famke Aeffner
- Flagship Biosciences Inc., Westminster, Colorado, USA
- Brad Bolon
- Flagship Biosciences Inc., Westminster, Colorado, USA
- Dan Rudmann
- Flagship Biosciences Inc., Westminster, Colorado, USA
11. Clinical Neuropathology Views - 2/2016: Digital networking in European neuropathology: An initiative to facilitate truly interactive consultations. Clin Neuropathol 2016; 35:53-7. [PMID: 26833552 PMCID: PMC4806406 DOI: 10.5414/np300899]
Abstract
Digital technology is progressively changing our vision of the practice of neuropathology, and several facts support the introduction of digital neuropathology. With the development of whole-slide imaging (WSI) systems, the difficulties involved in implementing a neuropathology network have been solved. A significant difficulty has been image standardization, but an open digital image communication protocol defined by the Digital Imaging and Communications in Medicine (DICOM) standard is already a reality. The neuropathology network should be established in Europe, as that is the expected geographic context for relationships among European neuropathologists. There are several limitations to implementing a digital neuropathology consultancy network, such as financial support, operational costs, legal issues, and technical assistance for clients. All of these items have been considered and should be resolved before implementing the proposal. Finally, the authors conclude that a European digital neuropathology network should be created for patients’ benefit.