1
Ledwaba L, Saidu R, Malila B, Kuhn L, Mutsvangwa TE. Automated analysis of digital medical images in cervical cancer screening: A systematic review. medRxiv 2024:2024.09.27.24314466. [PMID: 39399017] [PMCID: PMC11469345] [DOI: 10.1101/2024.09.27.24314466]
Abstract
Background Cervical cancer screening programs are poorly implemented in LMICs due to a shortage of specialists and expensive diagnostic infrastructure. To address these implementation barriers, researchers have been developing low-cost portable devices and automating image analysis for decision support. However, as the knowledge base is growing rapidly, the implementation status of novel imaging devices and algorithms in cervical cancer screening has become unclear. The aim of this project was to provide a systematic review summarizing the full range of automated technology systems used in cervical cancer screening. Method A search of academic databases was conducted and the search results were screened by two independent reviewers. Study selection was based on inclusion and exclusion criteria outlined using a Population, Intervention, Comparator and Outcome framework. Results 17 studies reported algorithms developed with source images from mobile devices, viz. the Pocket Colposcope, MobileODT EVA Colpo, smartphone cameras, a smartphone-based endoscope system, the Smartscope, the mHRME, and the PiHRME, while 56 studies reported algorithms with source images from conventional/commercial acquisition devices. Most interventions were in the feasibility stage of development, undergoing initial clinical validations. Conclusion Researchers have demonstrated superior prediction performance of computer-aided diagnostics (CAD) in colposcopy (>80% accuracy) versus manual analysis (<70% accuracy). This review also summarized evidence on algorithms being developed for portable devices to circumvent the constraints, such as expensive diagnostic infrastructure, that prohibit wider implementation in LMICs. However, clinical validation of novel devices with CAD has not yet been adequately carried out in LMICs.
Affiliation(s)
- Leshego Ledwaba
- Division of Biomedical Engineering, Department of Human Biology, Faculty of Health Sciences, University of Cape Town, Cape Town, Western Cape, South Africa
- Rakiya Saidu
- Obstetrics and Gynaecology, Groote Schuur Hospital/University of Cape Town, Cape Town, Western Cape, South Africa
- Bessie Malila
- Division of Biomedical Engineering, Department of Human Biology, Faculty of Health Sciences, University of Cape Town, Cape Town, Western Cape, South Africa
- Louise Kuhn
- Gertrude H. Sergievsky Center, Vagelos College of Physicians and Surgeons; and Department of Epidemiology, Mailman School of Public Health, Columbia University Irving Medical Center, New York, New York
- Tinashe E.M. Mutsvangwa
- Division of Biomedical Engineering, Department of Human Biology, Faculty of Health Sciences, University of Cape Town, Cape Town, Western Cape, South Africa
2
Vargas-Cardona HD, Rodriguez-Lopez M, Arrivillaga M, Vergara-Sanchez C, García-Cifuentes JP, Bermúdez PC, Jaramillo-Botero A. Artificial intelligence for cervical cancer screening: Scoping review, 2009-2022. Int J Gynaecol Obstet 2024; 165:566-578. [PMID: 37811597] [DOI: 10.1002/ijgo.15179]
Abstract
BACKGROUND The intersection of artificial intelligence (AI) with cancer research is increasing, and many of the advances have focused on the analysis of cancer images. OBJECTIVES To describe and synthesize the literature on the diagnostic accuracy of AI in early imaging diagnosis of cervical cancer following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR). SEARCH STRATEGY The Arksey and O'Malley methodology was used, and the PubMed, Scopus, and Google Scholar databases were searched using a combination of English and Spanish keywords. SELECTION CRITERIA Identified titles and abstracts were screened to select original reports and cross-checked for overlap of cases. DATA COLLECTION AND ANALYSIS A descriptive summary was organized by the AI algorithm used, total number of images analyzed, data source, clinical comparison criteria, and diagnostic performance. MAIN RESULTS We identified 32 studies published between 2009 and 2022. The primary sources of images were digital colposcopy, cervicography, and mobile devices. The machine learning/deep learning (DL) algorithms applied in the articles included support vector machine (SVM), random forest classifier, k-nearest neighbors, multilayer perceptron, C4.5, Naïve Bayes, AdaBoost, XGBoost, conditional random fields, Bayes classifier, convolutional neural network (CNN; and variations), ResNet (several versions), YOLO+EfficientNetB0, and visual geometry group (VGG; several versions). SVM and DL methods (CNN, ResNet, VGG) showed the best diagnostic performance, with an accuracy of over 97%. CONCLUSION The use of AI for cervical cancer screening has increased over the years, and some results (mainly from DL) are very promising. However, further research is necessary to validate these findings.
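The classical pipeline shared by many of the studies scoped here, hand-crafted image features fed to a conventional classifier such as an SVM, can be sketched as follows. This is an illustrative example on randomly generated stand-in data, not code from any of the reviewed papers; the colour-histogram features, dataset size, and all parameter choices are assumptions.

```python
# Minimal SVM baseline of the kind many screening studies report: simple
# colour-histogram features + an RBF-kernel SVM. Data are synthetic stand-ins
# for pre-processed cervigram crops with binary labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def colour_histogram(image, bins=16):
    """Concatenate per-channel intensity histograms into a feature vector."""
    return np.concatenate(
        [np.histogram(image[..., c], bins=bins, range=(0, 1), density=True)[0]
         for c in range(image.shape[-1])]
    )

# Stand-in dataset: 200 RGB "images" of 64x64 pixels with binary labels
images = rng.random((200, 64, 64, 3))
labels = rng.integers(0, 2, size=200)

features = np.stack([colour_histogram(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0, stratify=labels)

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print("accuracy:", accuracy_score(X := None or y_test, clf.predict(X_test)))
```

The DL methods cited above (CNN, ResNet, VGG) replace the hand-crafted feature step with learned convolutional features but are typically evaluated under the same train/test protocol and accuracy metric.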
Affiliation(s)
- Mérida Rodriguez-Lopez
- Faculty of Health Sciences, Universidad Icesi, Cali, Colombia
- Fundación Valle del Lili, Centro de Investigaciones Clínicas, Cali, Colombia
- Andres Jaramillo-Botero
- OMICAS Research Institute (iOMICAS), Pontificia Universidad Javeriana Cali, Cali, Colombia
- Chemistry and Chemical Engineering, California Institute of Technology, Pasadena, California, USA
3
Chen P, Liu F, Zhang J, Wang B. MFEM-CIN: A Lightweight Architecture Combining CNN and Transformer for the Classification of Pre-Cancerous Lesions of the Cervix. IEEE Open J Eng Med Biol 2024; 5:216-225. [PMID: 38606400] [PMCID: PMC11008799] [DOI: 10.1109/ojemb.2024.3367243]
Abstract
Goal: Cervical cancer is one of the most common cancers in women worldwide, ranking among the top four. It is also the fourth leading cause of cancer-related deaths among women, particularly in developing countries, where incidence and mortality rates are higher than in developed nations. Colposcopy can aid in the early detection of cervical lesions, but its effectiveness is limited in areas with limited medical resources and a lack of specialized physicians. Consequently, many cases are diagnosed at later stages, putting patients at significant risk. Methods: This paper proposes an automated colposcopic image analysis framework to address these challenges. The framework aims to reduce the labor costs associated with cervical precancer screening in underserved regions and to assist doctors in diagnosing patients. The core of the framework is the MFEM-CIN hybrid model, which combines Convolutional Neural Networks (CNN) and Transformer to aggregate the correlation between local and global features. This combined analysis of local and global information is useful in clinical diagnosis. In the model, MSFE and MSFF are utilized to extract and fuse multi-scale semantics, preserving important shallow feature information and allowing it to interact with the deep features, enriching the semantics. Conclusions: The experimental results demonstrate an accuracy of 89.2% in identifying cervical intraepithelial neoplasia while maintaining a lightweight model. This performance exceeds the average accuracy achieved by professional physicians, indicating promising potential for practical application. Utilizing automated colposcopic image analysis and the MFEM-CIN model, this research offers a practical solution to reduce the burden on healthcare providers and improve the efficiency and accuracy of cervical cancer diagnosis in resource-constrained areas.
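As a rough illustration of the CNN-plus-Transformer pattern described above (and not the authors' MFEM-CIN implementation: the module sizes, pooling, and class count here are assumptions), a minimal PyTorch hybrid classifier might look like this:

```python
# Sketch of a CNN + Transformer hybrid classifier: a small convolutional stem
# extracts local features, a Transformer encoder models global context over the
# spatial tokens, and a linear head predicts the CIN class.
import torch
import torch.nn as nn

class CnnTransformerClassifier(nn.Module):
    def __init__(self, num_classes: int = 2, embed_dim: int = 128):
        super().__init__()
        # Convolutional stem: local feature extraction, 224x224 -> 14x14
        self.stem = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Conv2d(64, embed_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Transformer encoder: global interactions between spatial tokens
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4,
                                           dim_feedforward=256, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feat = self.stem(x)                       # (B, C, H, W) local features
        tokens = feat.flatten(2).transpose(1, 2)  # (B, H*W, C) spatial tokens
        tokens = self.encoder(tokens)             # global context
        return self.head(tokens.mean(dim=1))      # pooled tokens -> class logits

logits = CnnTransformerClassifier()(torch.randn(2, 3, 224, 224))
print(logits.shape)  # torch.Size([2, 2])
```

The convolutional stem plays the role of the local feature extractor, while the Transformer encoder lets distant regions of the cervix image interact, which is the local-global aggregation the paper emphasizes.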
Affiliation(s)
- Peng Chen
- National Engineering Research Center for Agro-Ecological Big Data Analysis and Application, Information Materials and Intelligent Sensing Laboratory of Anhui Province, Institutes of Physical Science and Information Technology and School of Internet, Anhui University, Hefei 230601, China
- Fin China-Anhui University Joint Laboratory for Financial Big Data Research, Hefei Financial China Information and Technology Company, Ltd., Hefei 230022, China
- Fobao Liu
- National Engineering Research Center for Agro-Ecological Big Data Analysis and Application, Information Materials and Intelligent Sensing Laboratory of Anhui Province, Institutes of Physical Science and Information Technology and School of Internet, Anhui University, Hefei 230601, China
- Jun Zhang
- National Engineering Research Center for Agro-Ecological Big Data Analysis and Application, Information Materials and Intelligent Sensing Laboratory of Anhui Province, Institutes of Physical Science and Information Technology and School of Internet, Anhui University, Hefei 230601, China
- Bing Wang
- School of Management Science and Engineering, Anhui University of Finance and Economics, Bengbu 233030, China
4
Hou H, Mitbander R, Tang Y, Azimuddin A, Carns J, Schwarz RA, Richards-Kortum RR. Optical imaging technologies for in vivo cancer detection in low-resource settings. Curr Opin Biomed Eng 2023; 28:100495. [PMID: 38406798] [PMCID: PMC10883072] [DOI: 10.1016/j.cobme.2023.100495]
Abstract
Cancer continues to affect underserved populations disproportionately. Novel optical imaging technologies, which can provide rapid, non-invasive, and accurate cancer detection at the point of care, have great potential to improve global cancer care. This article reviews the recent technical innovations and clinical translation of low-cost optical imaging technologies, highlighting the advances in both hardware and software, especially the integration of artificial intelligence, to improve in vivo cancer detection in low-resource settings. Additionally, this article provides an overview of existing challenges and future perspectives of adapting optical imaging technologies into clinical practice, which can potentially contribute to novel insights and programs that effectively improve cancer detection in low-resource settings.
Affiliation(s)
- Huayu Hou
- Department of Bioengineering, Rice University, Houston, TX 77005, USA
- Ruchika Mitbander
- Department of Bioengineering, Rice University, Houston, TX 77005, USA
- Yubo Tang
- Department of Bioengineering, Rice University, Houston, TX 77005, USA
- Ahad Azimuddin
- School of Medicine, Texas A&M University, Houston, TX 77030, USA
- Jennifer Carns
- Department of Bioengineering, Rice University, Houston, TX 77005, USA
- Richard A Schwarz
- Department of Bioengineering, Rice University, Houston, TX 77005, USA
5
Li Z, Zeng CM, Dong YG, Cao Y, Yu LY, Liu HY, Tian X, Tian R, Zhong CY, Zhao TT, Liu JS, Chen Y, Li LF, Huang ZY, Wang YY, Hu Z, Zhang J, Liang JX, Zhou P, Lu YQ. A segmentation model to detect cervical lesions based on machine learning of colposcopic images. Heliyon 2023; 9:e21043. [PMID: 37928028] [PMCID: PMC10623278] [DOI: 10.1016/j.heliyon.2023.e21043]
Abstract
Background Semantic segmentation is crucial in medical image diagnosis. Traditional deep convolutional neural networks excel in image classification and object detection but fall short in segmentation tasks. Enhancing the accuracy and efficiency of detecting high-grade cervical lesions and invasive cancer poses a primary challenge in segmentation model development. Methods Between 2018 and 2022, we retrospectively studied a total of 777 patients, comprising 339 patients with high-grade cervical lesions and 313 patients with microinvasive or invasive cervical cancer. Overall, 1554 colposcopic images were used to train the DeepLabv3+ model. Accuracy, precision, specificity, and mean intersection-over-union (mIoU) were employed to evaluate the performance of the model in the prediction of high-grade cervical lesions and cancer. Results Experiments showed that our segmentation model had better diagnostic efficiency than colposcopic experts and other artificial intelligence models, reaching an accuracy of 93.29%, a precision of 87.2%, a specificity of 90.1%, and an mIoU of 80.27%. Conclusion The DeepLabv3+ model performed well in the segmentation of cervical lesions in post-acetic-acid colposcopic images and can assist colposcopists in improving diagnosis.
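A minimal sketch of how the reported pixel-level metrics can be computed from a predicted mask and a ground-truth mask is shown below. It is an assumed illustration (binary lesion/background masks, NumPy only), not the study's evaluation code.

```python
# Computing accuracy, precision, specificity and mean IoU from a binary
# ground-truth mask and a predicted segmentation mask.
import numpy as np

def segmentation_metrics(pred: np.ndarray, target: np.ndarray) -> dict:
    """pred and target are boolean arrays of the same shape (True = lesion pixel)."""
    tp = np.logical_and(pred, target).sum()
    tn = np.logical_and(~pred, ~target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()

    iou_lesion = tp / (tp + fp + fn)      # IoU of the lesion class
    iou_background = tn / (tn + fp + fn)  # IoU of the background class
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp),
        "specificity": tn / (tn + fp),
        "mIoU": (iou_lesion + iou_background) / 2,  # mean over the two classes
    }

rng = np.random.default_rng(0)
target = rng.random((512, 512)) > 0.7
pred = np.logical_xor(target, rng.random((512, 512)) > 0.95)  # noisy prediction
print(segmentation_metrics(pred, target))
```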
Affiliation(s)
- Zhen Li
- Department of Gynecological Oncology, Zhongnan Hospital of Wuhan University, Wuhan, Hubei, 430071, China
- Chu-Mei Zeng
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Yan-Gang Dong
- Institute for Brain Research and Rehabilitation, the South China Normal University, Guangzhou, Guangdong, 510631, China
- Ying Cao
- Department of Obstetrics and Gynecology, Academician Expert Workstation, The Central Hospital of Wuhan, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei, 430014, China
- Li-Yao Yu
- Department of Obstetrics and Gynecology, Academician Expert Workstation, The Central Hospital of Wuhan, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei, 430014, China
- Hui-Ying Liu
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Xun Tian
- Department of Obstetrics and Gynecology, Academician Expert Workstation, The Central Hospital of Wuhan, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei, 430014, China
- Rui Tian
- The Generulor Company Bio-X Lab, Zhuhai, Guangdong, 519060, China
- Chao-Yue Zhong
- The Generulor Company Bio-X Lab, Zhuhai, Guangdong, 519060, China
- Ting-Ting Zhao
- The Generulor Company Bio-X Lab, Zhuhai, Guangdong, 519060, China
- Jia-Shuo Liu
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Ye Chen
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Li-Fang Li
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Zhe-Ying Huang
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Yu-Yan Wang
- Department of Obstetrics and Gynecology, the First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong, 510062, China
- Zheng Hu
- Department of Gynecological Oncology, Zhongnan Hospital of Wuhan University, Wuhan, Hubei, 430071, China
- Jingjing Zhang
- Department of Gynecological Oncology, Zhongnan Hospital of Wuhan University, Wuhan, Hubei, 430071, China
- Jiu-Xing Liang
- Institute for Brain Research and Rehabilitation, the South China Normal University, Guangzhou, Guangdong, 510631, China
- Ping Zhou
- Department of Gynecology, Dongguan Maternal and Child Hospital, Dongguan, Guangdong, 523057, China
- Yi-Qin Lu
- Department of Gynecology, Dongzhimen Hospital, Beijing University of Chinese Medicine, Beijing, 101121, China
6
Jin E, Noble JA, Gomes M. A Review of Computer-Aided Diagnostic Algorithms for Cervical Neoplasia and an Assessment of Their Applicability to Female Genital Schistosomiasis. Mayo Clin Proc Digit Health 2023; 1:247-257. [PMID: 40206624] [PMCID: PMC11975695] [DOI: 10.1016/j.mcpdig.2023.04.007]
Abstract
Female genital schistosomiasis (FGS) affects an estimated 56 million women and girls in Africa. Nevertheless, this neglected tropical disease remains largely understudied and underdiagnosed. In this literature review, we examine the effectiveness of published computer-aided diagnostic (CAD) algorithms for cervical cancer that use colposcopy images and assess their applicability to the design of an automated image diagnostic algorithm for FGS. We searched 2 databases (Embase and MEDLINE) from database inception to June 10, 2022. We identified 393 studies, of which 13 were relevant for FGS diagnosis. These 13 studies were analyzed for their key image analysis model components and compared with the features that would be beneficial in an FGS diagnostic image analysis system.
Affiliation(s)
- Emily Jin
- Department of Computer Science, University of Oxford, United Kingdom
- J. Alison Noble
- Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, United Kingdom
- Mireille Gomes
- Global Health Institute of Merck, Ares Trading S.A., an affiliate of Merck KGaA, Darmstadt, Germany
7
Kakotkin VV, Semina EV, Zadorkina TG, Agapov MA. Prevention Strategies and Early Diagnosis of Cervical Cancer: Current State and Prospects. Diagnostics (Basel) 2023; 13:610. [PMID: 36832098] [PMCID: PMC9955852] [DOI: 10.3390/diagnostics13040610]
Abstract
Cervical cancer ranks third among all new cancer cases and causes of cancer deaths in females. The paper provides an overview of cervical cancer prevention strategies employed in different regions, with incidence and mortality rates ranging from high to low. It assesses the effectiveness of approaches proposed by national healthcare systems by analysing data published in the National Library of Medicine (PubMed) since 2018 featuring the following keywords: "cervical cancer prevention", "cervical cancer screening", "barriers to cervical cancer prevention", "premalignant cervical lesions" and "current strategies". WHO's 90-70-90 global strategy for cervical cancer prevention and early screening has proven effective in different countries in both mathematical models and clinical practice. The data analysis carried out within this study identified promising approaches to cervical cancer screening and prevention, which can further enhance the effectiveness of the existing WHO strategy and national healthcare systems. One such approach is the application of AI technologies for detecting precancerous cervical lesions and choosing treatment strategies. As such studies show, the use of AI can not only increase detection accuracy but also ease the burden on primary care.
Affiliation(s)
- Viktor V. Kakotkin
- Scientific and Educational Cluster MEDBIO, Immanuel Kant Baltic Federal University, A. Nevskogo St., 14, 236041 Kaliningrad, Russia
- Ekaterina V. Semina
- Scientific and Educational Cluster MEDBIO, Immanuel Kant Baltic Federal University, A. Nevskogo St., 14, 236041 Kaliningrad, Russia
- Tatiana G. Zadorkina
- Kaliningrad Regional Centre for Specialised Medical Care, Barnaulskaia Street, 6, 236006 Kaliningrad, Russia
- Mikhail A. Agapov
- Scientific and Educational Cluster MEDBIO, Immanuel Kant Baltic Federal University, A. Nevskogo St., 14, 236041 Kaliningrad, Russia
- Correspondence: ; Tel.: +7-(4012)-59-55-95
8
Cervical pre-cancerous lesion detection: development of smartphone-based VIA application using artificial intelligence. BMC Res Notes 2022; 15:356. [PMID: 36463193] [PMCID: PMC9719132] [DOI: 10.1186/s13104-022-06250-6]
Abstract
OBJECTIVE Visual inspection of the cervix after acetic acid application (VIA) has been considered an alternative to the Pap smear in resource-limited settings, such as Indonesia. However, VIA results depend mainly on the examiner's experience, and with the lack of comprehensive training of healthcare workers, VIA accuracy keeps declining. We aimed to develop an artificial intelligence (AI)-based Android application that can automatically determine VIA results in real time and may be further developed as a health care support system in cervical cancer screening. RESULT A total of 199 women who underwent VIA testing were studied. Images of the cervix before and after the VIA test were taken with a smartphone, then evaluated and labelled by an experienced oncologist as VIA positive or negative. Our AI model training pipeline consists of three steps: image pre-processing, feature extraction, and classifier development. Of the 199 cases, 134 were used as train-validation data and the remaining 65 were used as test data. The trained AI model achieved a sensitivity of 80%, specificity of 96.4%, accuracy of 93.8%, precision of 80%, and ROC AUC of 0.85 (95% CI 0.66-1.0). The developed AI-based Android application may potentially aid cervical cancer screening, especially in low-resource settings.
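The three-step pipeline and the reported metrics can be illustrated with the hedged sketch below. The random-forest classifier, histogram features, and synthetic data are assumptions standing in for the study's actual pre-processing, feature extraction, and classifier components.

```python
# Pre-processing -> feature extraction -> classification, then the evaluation
# metrics reported above, on synthetic stand-ins for labelled post-VIA images.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             roc_auc_score, confusion_matrix)

rng = np.random.default_rng(1)

def preprocess(image):
    """Step 1: normalise the image to [0, 1] (placeholder)."""
    return image / image.max()

def extract_features(image, bins=32):
    """Step 2: simple intensity-histogram features (placeholder)."""
    return np.histogram(image, bins=bins, range=(0, 1), density=True)[0]

# Synthetic dataset: 134 train-validation cases and 65 test cases, as in the study
X = np.stack([extract_features(preprocess(rng.random((128, 128)) + 1e-6))
              for _ in range(199)])
y = rng.integers(0, 2, size=199)
X_train, y_train, X_test, y_test = X[:134], y[:134], X[134:], y[134:]

# Step 3: classifier development and evaluation
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)
tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print("sensitivity:", recall_score(y_test, pred))
print("specificity:", tn / (tn + fp))
print("accuracy:", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred, zero_division=0))
print("ROC AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```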
9
Coole JB, Brenes D, Possati-Resende JC, Antoniazzi M, Fonseca BDO, Maker Y, Kortum A, Vohra IS, Schwarz RA, Carns J, Borba Souza KC, Vidigal Santana IV, Kreitchmann R, Salcedo MP, Ramanujam N, Schmeler KM, Richards-Kortum R. Development of a multimodal mobile colposcope for real-time cervical cancer detection. Biomed Opt Express 2022; 13:5116-5130. [PMID: 36425643] [PMCID: PMC9664871] [DOI: 10.1364/boe.463253]
Abstract
Cervical cancer remains a leading cause of cancer death among women in low- and middle-income countries. Globally, cervical cancer prevention programs are hampered by a lack of resources, infrastructure, and personnel. We describe a multimodal mobile colposcope (MMC) designed to diagnose precancerous cervical lesions at the point of care without the need for biopsy. The MMC integrates two complementary imaging systems: 1) a commercially available colposcope and 2) a high-speed, high-resolution, fiber-optic microendoscope (HRME). Combining these two imaging modalities allows, for the first time, suspicious cervical lesions to be located using widefield imaging and co-registered high-resolution images to be obtained across an entire lesion. The MMC overcomes the limitations of high-resolution imaging alone: widefield imaging can be used to guide the placement of the high-resolution imaging probe at clinically suspicious regions, and co-registered, mosaicked high-resolution images effectively increase the field of view of high-resolution imaging. Representative data collected from patients referred for colposcopy at Barretos Cancer Hospital in Brazil, including 22,800 high-resolution images and 9,900 colposcope images, illustrate the ability of the MMC to identify abnormal cervical regions, image suspicious areas with subcellular resolution, and distinguish between high-grade and low-grade dysplasia.
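Although the paper centers on hardware, the mosaicking it describes rests on standard pairwise image registration. The generic OpenCV sketch below (assumed, not the MMC's software; file names are placeholders) stitches two overlapping high-resolution frames by feature matching and homography estimation.

```python
# Pairwise mosaicking: ORB feature matching + RANSAC homography, then warping
# the second frame into the first frame's coordinate system.
import cv2
import numpy as np

def mosaic_pair(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Warp img_b into img_a's frame and paste both onto a shared canvas."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_b, des_a), key=lambda m: m.distance)[:200]

    src = np.float32([kp_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = img_a.shape[:2]
    canvas = cv2.warpPerspective(img_b, H, (w * 2, h))  # extra width for overlap
    canvas[:h, :w] = img_a                              # paste the reference frame
    return canvas

# Usage (placeholder file names):
# frames = [cv2.imread(p, 0) for p in ("frame_a.png", "frame_b.png")]
# mosaic = mosaic_pair(*frames)
```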
Affiliation(s)
- Jackson B. Coole
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
- David Brenes
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
- Márcio Antoniazzi
- Barretos Cancer Hospital, Department of Prevention, Barretos, Brazil
- Yajur Maker
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
- Alex Kortum
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
- Imran S. Vohra
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
- Jennifer Carns
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
- Regis Kreitchmann
- Federal University of Health Sciences of Porto Alegre (UFCSPA)/Santa Casa Hospital of Porto Alegre, Department of Obstetrics and Gynecology, Porto Alegre, Brazil
- Mila P. Salcedo
- Federal University of Health Sciences of Porto Alegre (UFCSPA)/Santa Casa Hospital of Porto Alegre, Department of Obstetrics and Gynecology, Porto Alegre, Brazil
- The University of Texas MD Anderson Cancer Center, Department of Gynecologic Oncology and Reproductive Medicine, Houston, TX 77005, USA
- Nirmala Ramanujam
- Duke University, Department of Biomedical Engineering, Durham, NC 27708, USA
- Kathleen M. Schmeler
- The University of Texas MD Anderson Cancer Center, Department of Gynecologic Oncology and Reproductive Medicine, Houston, TX 77005, USA
10
Smartphone-Based Visual Inspection with Acetic Acid: An Innovative Tool to Improve Cervical Cancer Screening in Low-Resource Setting. Healthcare (Basel) 2022; 10:391. [PMID: 35207002] [PMCID: PMC8871553] [DOI: 10.3390/healthcare10020391]
Abstract
Visual inspection with acetic acid (VIA) is recommended by the World Health Organization for primary cervical cancer screening or for triage of human papillomavirus-positive women living in low-resource settings. Nonetheless, traditional naked-eye VIA is associated with large variability in the detection of pre-cancer and with a lack of quality control. Digital VIA (D-VIA), using high-definition cameras, allows magnification and zooming on the transformation zone and suspicious cervical regions, as well as simultaneous comparison of native and post-VIA images in real time. We searched MEDLINE and LILACS between January 2015 and November 2021 for relevant studies conducted in low-resource settings using a smartphone device for D-VIA. The aim of this review was to evaluate the available data on smartphone use in low-resource settings in the context of D-VIA-based cervical cancer screening. The results available to date show that the quality of D-VIA images is satisfactory and enables CIN1/CIN2+ diagnosis, and that the smartphone is a promising tool for cervical cancer screening monitoring and for on- and off-site supervision and training. The use of artificial intelligence algorithms could soon allow automated and accurate cervical lesion detection.
11
Castor D, Saidu R, Boa R, Mbatani N, Mutsvangwa TEM, Moodley J, Denny L, Kuhn L. Assessment of the implementation context in preparation for a clinical study of machine-learning algorithms to automate the classification of digital cervical images for cervical cancer screening in resource-constrained settings. Front Health Serv 2022; 2:1000150. [PMID: 36925850] [PMCID: PMC10012690] [DOI: 10.3389/frhs.2022.1000150]
Abstract
Introduction We assessed the implementation context and image quality in preparation for a clinical study evaluating the effectiveness of automated visual assessment devices within cervical cancer screening of women living with and without HIV. Methods We developed a semi-structured questionnaire based on three Consolidated Framework for Implementation Research (CFIR) domains: intervention characteristics, inner setting, and process, in Cape Town, South Africa. Between December 1, 2020, and August 6, 2021, we evaluated two devices: the MobileODT handheld colposcope and a commercially available cell phone (Samsung A21ST). Colposcopists visually inspected cervical images for technical adequacy. Descriptive analyses were tabulated for quantitative variables, and narrative responses were summarized in the text. Results Two colposcopists described the devices as easy to operate, without data loss. The clinical workspace and gynecological workflow were modified to incorporate the devices and manage images. Providers believed either device would likely perform better than cytology under most circumstances unless the squamocolumnar junction (SCJ) was not visible, in which case cytology was expected to be better. Image quality (N = 75) from the MobileODT device and the cell phone was comparable in terms of achieving good focus (81% vs. 84%), visibility of the squamocolumnar junction (88% vs. 97%), avoiding occlusion (79% vs. 87%), and detection of a lesion whose range includes the upper limit (63% vs. 53%), but differed in taking photographs free of glare (100% vs. 24%). Conclusion Novel application of the CFIR early in the conduct of the clinical study, including assessment of image quality, highlights real-world factors about intervention characteristics, inner clinical setting, and workflow process that may affect both the clinical study findings and the ultimate pace of translation to clinical practice. The application and augmentation of the CFIR in this study context highlighted adaptations needed for the framework to better measure factors relevant to implementing digital interventions.
Affiliation(s)
- Delivette Castor
- Division of Infectious Diseases, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY, United States; Department of Epidemiology, Mailman School of Public Health, Columbia University Irving Medical Center, New York, NY, United States
- Rakiya Saidu
- Department of Obstetrics and Gynaecology, University of Cape Town, Cape Town, South Africa; Groote Schuur Hospital and South African Medical Research Council, Gynaecology Cancer Research Centre, University of Cape Town, Cape Town, South Africa
- Rosalind Boa
- Department of Obstetrics and Gynaecology, University of Cape Town, Cape Town, South Africa; Groote Schuur Hospital and South African Medical Research Council, Gynaecology Cancer Research Centre, University of Cape Town, Cape Town, South Africa
- Nomonde Mbatani
- Department of Obstetrics and Gynaecology, University of Cape Town, Cape Town, South Africa; Groote Schuur Hospital and South African Medical Research Council, Gynaecology Cancer Research Centre, University of Cape Town, Cape Town, South Africa
- Tinashe E M Mutsvangwa
- Division of Biomedical Engineering, Department of Human Biology, University of Cape Town, Cape Town, South Africa
- Jennifer Moodley
- Groote Schuur Hospital and South African Medical Research Council, Gynaecology Cancer Research Centre, University of Cape Town, Cape Town, South Africa; Women's Health Research Unit, School of Public Health and Family Medicine, University of Cape Town, Cape Town, South Africa
- Lynette Denny
- Department of Obstetrics and Gynaecology, University of Cape Town, Cape Town, South Africa; Groote Schuur Hospital and South African Medical Research Council, Gynaecology Cancer Research Centre, University of Cape Town, Cape Town, South Africa
- Louise Kuhn
- Department of Epidemiology, Mailman School of Public Health, Columbia University Irving Medical Center, New York, NY, United States; Gertrude H. Sergievsky Center, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY, United States
12
Allanson ER, Phoolcharoen N, Salcedo MP, Fellman B, Schmeler KM. Accuracy of Smartphone Images of the Cervix After Acetic Acid Application for Diagnosing Cervical Intraepithelial Neoplasia Grade 2 or Greater in Women With Positive Cervical Screening: A Systematic Review and Meta-Analysis. JCO Glob Oncol 2021; 7:1711-1721. [PMID: 34936374] [PMCID: PMC8710337] [DOI: 10.1200/go.21.00168]
Abstract
PURPOSE Smartphones are used in cervical screening at the time of visual inspection with acetic acid or Lugol's iodine (VIA/VILI) to capture and share images, with the goal of improving the sensitivity and interobserver variability of VIA/VILI. We undertook a systematic review and meta-analysis assessing the diagnostic accuracy of smartphone images of the cervix taken at the time of VIA/VILI (termed S-VIA) in the detection of precancerous lesions in women undergoing cervical screening. METHODS This systematic review was conducted in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Studies from January 1, 2010, to June 30, 2020, were assessed. MEDLINE/PubMed, Embase, CINAHL, Cochrane, and LILACS were searched. Cohort and cross-sectional studies were considered. S-VIA was compared with the reference standard of histopathology. We excluded studies where additional technology was added to the smartphone, including artificial intelligence, enhanced visual assessment, and other algorithms that automatically diagnose precancerous lesions. The primary outcome was the accuracy of S-VIA for the diagnosis of cervical intraepithelial neoplasia grade 2 or greater (CIN 2+). Data were extracted, and we plotted the sensitivity, specificity, negative predictive value, and positive predictive value of S-VIA using forest plots. This study was prospectively registered with the International Prospective Register of Systematic Reviews (CRD42020204024). RESULTS A total of 6003 studies were screened, 71 full texts were assessed, and eight studies met the criteria for inclusion, with six included in the final meta-analysis. The sensitivity of S-VIA for the diagnosis of CIN 2+ was 74.56% (95% CI, 70.16 to 78.95; I² 61.3%), specificity was 61.75% (95% CI, 56.35 to 67.15; I² 95.0%), negative predictive value was 93.71% (95% CI, 92.81 to 94.61; I² 0%), and positive predictive value was 26.97% (95% CI, 24.13 to 29.81; I² 61.3%). CONCLUSION Our results suggest that S-VIA is accurate in the detection of CIN 2+ and may provide additional support to health care providers delivering care in low-resource settings.
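The per-study inputs to a meta-analysis like this are 2x2 tables of S-VIA results against histopathology. The sketch below (with made-up counts and a simple Wald interval, not the review's pooling model) shows how the reported sensitivity, specificity, PPV, and NPV are derived from such a table.

```python
# Per-study diagnostic accuracy from a 2x2 table, with Wald 95% confidence
# intervals; the counts here are hypothetical.
import math

def proportion_ci(successes: int, total: int, z: float = 1.96):
    """Point estimate and Wald 95% CI for a proportion."""
    p = successes / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": proportion_ci(tp, tp + fn),
        "specificity": proportion_ci(tn, tn + fp),
        "ppv": proportion_ci(tp, tp + fp),
        "npv": proportion_ci(tn, tn + fn),
    }

# Hypothetical study: 60 TP, 110 FP, 20 FN, 250 TN
for name, (est, lo, hi) in diagnostic_accuracy(60, 110, 20, 250).items():
    print(f"{name}: {est:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

A random-effects model would then pool these per-study estimates and quantify between-study heterogeneity, which is what the I² values above summarize.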
Affiliation(s)
- Emma R. Allanson
- Department of Gynecologic Oncology and Reproductive Medicine, The University of Texas MD Anderson Cancer Center, Houston, TX
- Natacha Phoolcharoen
- Department of Gynecologic Oncology and Reproductive Medicine, The University of Texas MD Anderson Cancer Center, Houston, TX
- Department of Obstetrics and Gynecology, King Chulalongkorn Memorial Hospital, Faculty of Medicine, Chulalongkorn University, Bangkok, Thailand
- The Obstetrics and Gynecology Department, Federal University of Health Sciences of Porto Alegre/Santa Casa Hospital of Porto Alegre, Porto Alegre, Brazil
- Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, TX
- Mila P. Salcedo
- Department of Gynecologic Oncology and Reproductive Medicine, The University of Texas MD Anderson Cancer Center, Houston, TX
- The Obstetrics and Gynecology Department, Federal University of Health Sciences of Porto Alegre/Santa Casa Hospital of Porto Alegre, Porto Alegre, Brazil
- Bryan Fellman
- Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, TX
- Kathleen M. Schmeler
- Department of Gynecologic Oncology and Reproductive Medicine, The University of Texas MD Anderson Cancer Center, Houston, TX
13
Cavalcanti TC, Lew HM, Lee K, Lee SY, Park MK, Hwang JY. Intelligent smartphone-based multimode imaging otoscope for the mobile diagnosis of otitis media. Biomed Opt Express 2021; 12:7765-7779. [PMID: 35003865] [PMCID: PMC8713661] [DOI: 10.1364/boe.441590]
Abstract
Otitis media (OM) is one of the most common ear diseases in children and a common reason for outpatient visits to medical doctors in primary care practices. Adhesive OM (AdOM) is recognized as a sequela of OM with effusion (OME) and often requires surgical intervention. OME and AdOM exhibit similar symptoms, and it is difficult to distinguish between them using a conventional otoscope in a primary care unit; the accuracy of the diagnosis is highly dependent on the experience of the examiner. The development of an advanced otoscope with less examiner-dependent variation in diagnostic accuracy is crucial for a more accurate diagnosis. We therefore developed an intelligent smartphone-based multimode imaging otoscope for better diagnosis of OM, even in mobile environments. The system offers spectral and autofluorescence imaging of the tympanic membrane using a smartphone attached to the developed multimode imaging module. Moreover, it is capable of intelligent analysis for distinguishing between normal, OME, and AdOM ears using a machine learning algorithm. Using the developed system, we examined the ears of 69 patients to assess its performance in distinguishing between normal, OME, and AdOM ears. In the classification of ear diseases, the multimode system based on machine learning analysis performed better in terms of accuracy and F1 score than single RGB image analysis, RGB/fluorescence image analysis, and analysis of spectral image cubes alone. These results demonstrate that the intelligent multimode diagnostic capability of an otoscope would be beneficial for better diagnosis and management of OM.
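The comparison the authors report, multimode features versus RGB-only features scored with accuracy and F1, can be illustrated with the sketch below on synthetic features; the logistic-regression classifier and feature dimensions are assumptions, not the paper's machine learning pipeline.

```python
# Comparing a classifier trained on RGB-only features against one trained on
# concatenated multimode features (RGB + fluorescence + spectral), scored with
# accuracy and macro F1 over the three classes (normal, OME, AdOM).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 300
labels = rng.integers(0, 3, size=n)  # 0 = normal, 1 = OME, 2 = AdOM
rgb = rng.random((n, 12)) + 0.1 * labels[:, None]                 # weakly informative
multimode = np.hstack([rgb, rng.random((n, 24)) + 0.3 * labels[:, None]])

for name, X in [("RGB only", rgb), ("multimode", multimode)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                              random_state=0, stratify=labels)
    pred = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict(X_te)
    print(name, "accuracy:", round(accuracy_score(y_te, pred), 3),
          "macro F1:", round(f1_score(y_te, pred, average="macro"), 3))
```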
Affiliation(s)
- Thiago C Cavalcanti
- Department of Information and Communication Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, Republic of Korea
- Hah Min Lew
- Department of Information and Communication Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, Republic of Korea
- Kyungsu Lee
- Department of Information and Communication Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, Republic of Korea
- Sang-Yeon Lee
- Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University Hospital, Seoul, Republic of Korea
- Moo Kyun Park
- Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University Hospital, Seoul, Republic of Korea
- co-first authors
- Jae Youn Hwang
- Department of Information and Communication Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, Republic of Korea
- co-first authors
14
Abstract
Cervical cancer is one of the commonest cancers afflicting women in low- and middle-income countries; however, both primary prevention with human papillomavirus vaccination and secondary prevention with screening programs and treatment of preinvasive disease are possible. A coordinated approach to eliminating cervical cancer, as has been called for by the World Health Organization, requires a complex series of steps at all levels of a health system. This article outlines the current state of cervical cancer prevention in low- and middle-income countries, the innovations being employed to improve outcomes, and the next steps needed as we move towards global elimination.
Affiliation(s)
- Emma R Allanson
- Department of Gynecologic Oncology and Reproductive Medicine, MD Anderson Cancer Center, Houston, Texas
15
Hunt B, Ruiz AJ, Pogue BW. Smartphone-based imaging systems for medical applications: a critical review. J Biomed Opt 2021; 26:040902. [PMID: 33860648] [PMCID: PMC8047775] [DOI: 10.1117/1.jbo.26.4.040902]
Abstract
SIGNIFICANCE Smartphones come with an enormous array of functionality and are being more widely utilized with specialized attachments in a range of healthcare applications. A review of key developments and uses, with an assessment of strengths/limitations in various clinical workflows, was completed. AIM Our review studies how smartphone-based imaging (SBI) systems are designed and tested for specialized applications in medicine and healthcare. An evaluation of current research studies is used to provide guidelines for improving the impact of these research advances. APPROACH First, the established and emerging smartphone capabilities that can be leveraged for biomedical imaging are detailed. Then, methods and materials for fabrication of optical, mechanical, and electrical interface components are summarized. Recent systems were categorized into four groups based on their intended application and clinical workflow: ex vivo diagnostic, in vivo diagnostic, monitoring, and treatment guidance. Lastly, strengths and limitations of current SBI systems within these various applications are discussed. RESULTS The native smartphone capabilities for biomedical imaging applications include cameras, touchscreens, networking, computation, 3D sensing, audio, and motion, in addition to commercial wearable peripheral devices. Through user-centered design of custom hardware and software interfaces, these capabilities have the potential to enable portable, easy-to-use, point-of-care biomedical imaging systems. However, due to barriers in programming of custom software and on-board image analysis pipelines, many research prototypes fail to achieve a prospective clinical evaluation as intended. Effective clinical use cases appear to be those in which handheld, noninvasive image guidance is needed and accommodated by the clinical workflow. Handheld systems for in vivo, multispectral, and quantitative fluorescence imaging are a promising development for diagnostic and treatment guidance applications. CONCLUSIONS A holistic assessment of SBI systems must include interpretation of their value for intended clinical settings and how their implementations enable better workflow. A set of six guidelines are proposed to evaluate appropriateness of smartphone utilization in terms of clinical context, completeness, compactness, connectivity, cost, and claims. Ongoing work should prioritize realistic clinical assessments with quantitative and qualitative comparison to non-smartphone systems to clearly demonstrate the value of smartphone-based systems. Improved hardware design to accommodate the rapidly changing smartphone ecosystem, creation of open-source image acquisition and analysis pipelines, and adoption of robust calibration techniques to address phone-to-phone variability are three high priority areas to move SBI research forward.
Affiliation(s)
- Brady Hunt
- Dartmouth College, Thayer School of Engineering, Hanover, New Hampshire, United States
- Address all correspondence to Brady Hunt,
- Alberto J. Ruiz
- Dartmouth College, Thayer School of Engineering, Hanover, New Hampshire, United States
- Brian W. Pogue
- Dartmouth College, Thayer School of Engineering, Hanover, New Hampshire, United States