1
Talathi MA, Dabhadkar S, Doke PP, Singh V. Accuracy of the AI-Based Smart Scope® Test as a Point-of-Care Screening and Triage Tool Compared to Colposcopy: A Pilot Study. Cureus 2025; 17:e81212. [PMID: 40291220] [PMCID: PMC12022722] [DOI: 10.7759/cureus.81212]
Abstract
Objectives The primary objective of this study was to compare the screening accuracy of AI assessment with colposcopy. Secondary objectives included comparing the triaging accuracy of AI and colposcopy assessments against histopathology. Methodology This prospective, single-arm screening test assessment study was conducted at the obstetrics and gynecology department of Bharati Vidyapeeth (Deemed to be University) Medical College in Pune, India. The study included sexually active, nonpregnant women aged 25-65 years visiting the outpatient department for per-speculum examination. Women with a clinically unhealthy cervix detected during the examination were counseled, and those who provided consent were enrolled. Patients with a history of prior cervical cancer treatment or hysterectomy were excluded. A total of 130 women were enrolled. Each participant underwent colposcopy, Smart Scope®-AI (SS-AI) assisted visual inspection with acetic acid (VIA), and visual inspection with Lugol's iodine during the same visit. Positive findings from any test led to a biopsy, with samples sent for histopathological analysis. Results Of the 130 women enrolled, 30 were referred for biopsy. Histopathology results were obtained for 18 consenting women. Using colposcopy as the reference standard (N = 130), the accuracy of SS-AI was 76.53%. When compared to histopathology (N = 18) as the gold standard, the accuracy of colposcopy and SS-AI was 63.67% and 83.33%, respectively. The sensitivity and specificity of SS-AI were both 83.33%, while colposcopy had a sensitivity of 83.33% and a specificity of 50%. Likelihood ratios for SS-AI were superior to those of colposcopy. These findings suggest that the SS-AI-assisted test, a digital VIA test, accurately detects positive and negative cervical lesions. Conclusions The SS-AI system demonstrated effectiveness comparable to colposcopy and has the potential to be used as a point-of-care screening and triaging tool in primary healthcare centers that lack colposcopy equipment.
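The screening metrics quoted above (sensitivity, specificity, accuracy, likelihood ratios) all derive from a 2 x 2 table of test results against the reference standard. The short Python sketch below shows the standard formulas; the counts in the example are illustrative placeholders, not the study's data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 table of counts
    (test result vs. reference standard)."""
    sens = tp / (tp + fn)                       # true positive rate
    spec = tn / (tn + fp)                       # true negative rate
    acc = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    lr_neg = (1 - sens) / spec if spec > 0 else float("inf")
    return {"sensitivity": sens, "specificity": spec, "accuracy": acc,
            "LR+": lr_pos, "LR-": lr_neg}

# Illustrative counts only (not taken from the study):
print(screening_metrics(tp=10, fp=2, fn=2, tn=4))
```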
Affiliation(s)
- Manju A Talathi: Obstetrics and Gynecology, Bharati Vidyapeeth (Deemed to be University) Medical College, Pune, IND
- Suchita Dabhadkar: Obstetrics and Gynecology, Bharati Vidyapeeth (Deemed to be University) Medical College, Pune, IND
- Prakash P Doke: Community Medicine, Bharati Vidyapeeth (Deemed to be University) Medical College, Pune, IND
- Varsha Singh: Clinical Research, Periwinkle Technologies Pvt. Ltd., Pune, IND

2
Liu L, Liu J, Su Q, Chu Y, Xia H, Xu R. Performance of artificial intelligence for diagnosing cervical intraepithelial neoplasia and cervical cancer: a systematic review and meta-analysis. EClinicalMedicine 2025; 80:102992. [PMID: 39834510] [PMCID: PMC11743870] [DOI: 10.1016/j.eclinm.2024.102992]
Abstract
Background Cervical cytology screening and colposcopy play crucial roles in cervical intraepithelial neoplasia (CIN) and cervical cancer prevention. Previous studies have provided evidence that artificial intelligence (AI) has remarkable diagnostic accuracy in these procedures. With this systematic review and meta-analysis, we aimed to examine the pooled accuracy, sensitivity, and specificity of AI-assisted cervical cytology screening and colposcopy for cervical intraepithelial neoplasia and cervical cancer screening. Methods In this systematic review and meta-analysis, we searched the PubMed, Embase, and Cochrane Library databases for studies published between January 1, 1986 and August 31, 2024. Studies investigating the sensitivity and specificity of AI-assisted cervical cytology screening and colposcopy for histologically verified cervical intraepithelial neoplasia and cervical cancer and a minimum of five cases were included. The performance of AI and experienced colposcopists was assessed via the area under the receiver operating characteristic curve (AUROC), sensitivity, specificity, accuracy, positive predictive value (PPV), and negative predictive value (NPV) through random effect models. Additionally, subgroup analyses of multiple diagnostic performance metrics in developed and developing countries were conducted. This study was registered with PROSPERO (CRD42024534049). Findings Seventy-seven studies met the eligibility criteria for inclusion in this study. The pooled diagnostic parameters of AI-assisted cervical cytology via Papanicolaou (Pap) smears were as follows: accuracy, 94% (95% CI 92-96); sensitivity, 95% (95% CI 91-98); specificity, 94% (95% CI 89-97); PPV, 88% (95% CI 78-96); and NPV, 95% (95% CI 89-99). The pooled accuracy, sensitivity, specificity, PPV, and NPV of AI-assisted cervical cytology via ThinPrep cytologic test (TCT) were 90% (95% CI 85-94), 97% (95% CI 95-99), 94% (95% CI 85-98), 84% (95% CI 64-98), and 96% (95% CI 94-98), respectively. Subgroup analysis revealed that, for AI-assisted cervical cytology diagnosis, certain performance indicators were superior in developed countries compared to developing countries. Compared with experienced colposcopists, AI demonstrated superior accuracy in colposcopic examinations (odds ratio (OR) 1.75; 95% CI 1.33-2.31; P < 0.0001; I2 = 93%). Interpretation These results underscore the potential and practical value of AI in preventing and enabling early diagnosis of cervical cancer. Further research should support the development of AI for cervical cancer screening, including in low- and middle-income countries with limited resources. Funding This study was supported by the National Natural Science Foundation of China (No. 81901493) and the Shanghai Pujiang Program (No. 21PJD006).
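For context on how per-study estimates such as sensitivity are combined, the sketch below shows a generic DerSimonian-Laird random-effects pooling of proportions on the logit scale. It is a simplified illustration of the class of model the review describes, not the authors' exact analysis; the example inputs are invented.

```python
import numpy as np
from scipy.special import logit, expit

def pooled_proportion_dl(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions:
    logit transform, inverse-variance weights, method-of-moments tau^2."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    y = logit(events / totals)
    v = 1.0 / events + 1.0 / (totals - events)   # approx. variance of logit(p)
    w = 1.0 / v
    y_fe = np.sum(w * y) / np.sum(w)             # fixed-effect estimate
    q = np.sum(w * (y - y_fe) ** 2)              # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                      # random-effects weights
    return expit(np.sum(w_re * y) / np.sum(w_re))

# Invented per-study true positives and diseased counts:
print(pooled_proportion_dl(events=[45, 90, 38], totals=[50, 95, 40]))
```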
Affiliation(s)
- Lei Liu: Department of Gynecology, Obstetrics and Gynecology Hospital of Fudan University, Shanghai, 200011, China
- Jiangang Liu: Department of Obstetrics and Gynecology, Puren Hospital Affiliated to Wuhan University of Science and Technology, Wuhan, 430080, China
- Qing Su: Department of Obstetrics and Gynecology, The Fourth Hospital of Changsha, Changsha, 410006, China
- Yuening Chu: Department of Obstetrics and Gynecology, Shanghai First Maternity and Infant Hospital, Tongji University School of Medicine, Shanghai, 201204, China
- Hexia Xia: Department of Gynecology, Obstetrics and Gynecology Hospital of Fudan University, Shanghai, 200011, China
- Ran Xu: Department of Obstetrics and Gynecology, Affiliated Zhejiang Hospital, Zhejiang University School of Medicine, Hangzhou, 310013, China; Heidelberg University, Heidelberg, 69120, Germany

3
Ekem L, Skerrett E, Huchko MJ, Ramanujam N. Automated Image Clarity Detection for the Improvement of Colposcopy Imaging with Multiple Devices. Biomed Signal Process Control 2025; 100:106948. [PMID: 39669100] [PMCID: PMC11633643] [DOI: 10.1016/j.bspc.2024.106948]
Abstract
The proportion of women dying from cervical cancer in middle- and low-income countries is over 60%, twice that of their high-income counterparts. A primary screening strategy to eliminate this burden is cervix visualization and application of 3-5% acetic acid, inducing contrast in potential lesions. Recently, machine learning tools have emerged to aid visual diagnosis. As low-cost visualization tools expand, it is important to maximize the quality of images captured at the time of the exam and of images used in algorithms. OBJECTIVE We present the use of an object detection algorithm, the YOLOv5 model, to localize the cervix and describe blur within a multi-device image database. METHODS We took advantage of the Fourier domain to provide pseudo-labeling of training and testing images. A YOLOv5 model was trained using Pocket Colposcope, Mobile ODT EVA, and standard-of-care digital colposcope images. RESULTS When tested on all devices, this model achieved a mean average precision score, sensitivity, and specificity of 0.9, 0.89, and 0.89, respectively. Mobile ODT EVA and Pocket Colposcope held-out sets yielded mAP scores of 0.81 and 0.83, respectively, reflecting the generalizability of the algorithm. Compared to physician annotation, it yielded an accuracy of 0.72. CONCLUSION This method provides an informed, quantitative, generalizable analysis of captured images that is highly concordant with expert annotation. SIGNIFICANCE This quality control framework can assist in the standardization of colposcopy workflow, data acquisition, and image analysis, and in doing so increase the availability of usable positive images for the development of deep learning algorithms.
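The abstract notes that blur pseudo-labels were derived in the Fourier domain but does not give the exact metric. A common choice, sketched below, is the fraction of spectral energy above a radial frequency cutoff, since sharp images retain more high-frequency energy; the cutoff and threshold here are assumptions made only for illustration.

```python
import numpy as np

def high_frequency_ratio(gray, cutoff=0.1):
    """Fraction of 2-D spectral power beyond a normalized radial frequency
    `cutoff`; low values suggest a blurry image."""
    f = np.fft.fftshift(np.fft.fft2(gray.astype(float)))
    power = np.abs(f) ** 2
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    return power[r > cutoff].sum() / power.sum()

# Pseudo-label as "blurry" below an assumed threshold:
img = np.random.rand(256, 256)          # stand-in for a grayscale frame
print("blurry" if high_frequency_ratio(img) < 0.05 else "sharp")
```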
Affiliation(s)
- Lillian Ekem, Erica Skerrett: Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Megan J. Huchko: Center for Global Reproductive Health, Duke Global Institute, Durham, NC, USA; Department of Obstetrics and Gynecology, Duke University School of Medicine, Durham, NC, USA
- Nimmi Ramanujam: Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA; Department of Pharmacology and Cancer Biology, Duke University Medical Center, Durham, NC 27708, USA

4
Sone K, Taguchi A, Miyamoto Y, Uchino-Mori M, Iriyama T, Hirota Y, Osuga Y. Clinical Prospects for Artificial Intelligence in Obstetrics and Gynecology. JMA J 2025; 8:113-120. [PMID: 39926075] [PMCID: PMC11799576] [DOI: 10.31662/jmaj.2024-0197]
Abstract
In recent years, artificial intelligence (AI) research in the medical field has been actively conducted owing to the evolution of algorithms, such as deep learning, and advances in hardware, such as graphics processing units, and some AI-based medical devices are already used in clinics. AI research in obstetrics and gynecology has also increased. This review discusses the latest studies in each field. In the perinatal field, there are reports on cardiotocography, studies on the diagnosis of fetal abnormalities using ultrasound scans, and studies on placenta previa using magnetic resonance imaging (MRI). In the reproduction field, numerous studies have been conducted on the efficiency of assisted reproductive technology as well as the selection of suitable oocytes and good-quality embryos. As regards gynecologic cancers, there are many reports on diagnosis using MRI and prognosis prediction using histopathology in cervical cancer, diagnosis using hysteroscopy and prediction of molecular subtypes based on histopathology in endometrial cancer, and diagnosis using MRI and ultrasound as well as prediction of anticancer drug efficacy in ovarian cancer. However, concerns related to AI research include the handling of personal information, the lack of governing laws, and transparency. These must be addressed to facilitate advanced AI research.
Affiliation(s)
- Kenbun Sone, Ayumi Taguchi, Yuichiro Miyamoto, Mayuyo Uchino-Mori, Takayuki Iriyama, Yasushi Hirota, Yutaka Osuga: Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo, Japan

5
Dellino M, Cerbone M, d'Amati A, Bochicchio M, Laganà AS, Etrusco A, Malvasi A, Vitagliano A, Pinto V, Cicinelli E, Cazzato G, Cascardi E. Artificial Intelligence in Cervical Cancer Screening: Opportunities and Challenges. AI 2024; 5:2984-3000. [DOI: 10.3390/ai5040144]
Abstract
Among gynecological pathologies, cervical cancer has always represented a health problem with great social impact. The giant strides made as a result of both the screening programs perfected and implemented over the years and the use of new and accurate technological equipment have in fact significantly improved our clinical approach in the management and personalized diagnosis of precancerous lesions of the cervix. In this context, the advent of artificial intelligence and digital algorithms could represent new directions available to gynecologists and pathologists for the following: (i) the standardization of screening procedures, (ii) the identification of increasingly early lesions, and (iii) heightening the diagnostic accuracy of targeted biopsies and prognostic analysis of cervical cancer. The purpose of our review was to evaluate to what extent artificial intelligence can be integrated into current protocols, to identify the strengths and/or weaknesses of this method, and, above all, determine what we should expect in the future to develop increasingly safer solutions, as well as increasingly targeted and personalized screening programs for these patients. Furthermore, in an innovative way, and through a multidisciplinary vision (gynecologists, pathologists, and computer scientists), with this manuscript, we highlight a key role that AI could have in the management of HPV-positive patients. In our vision, AI will move from being a simple diagnostic device to being used as a tool for performing risk analyses of HPV-related disease progression. This is thanks to the ability of new software not only to analyze clinical and histopathological images but also to evaluate and integrate clinical elements such as vaccines, the composition of the microbiota, and the immune status of patients. In fact, the single-factor evaluation of high-risk HPV strains represents a limitation that must be overcome. Therefore, AI, through multifactorial analysis, will be able to generate a risk score that will better stratify patients and will support clinicians in choosing highly personalized treatments overall. Our study remains an innovative proposal and idea, as the literature to date presents a limitation in that this topic is considered niche, but we believe that the union of common efforts can overcome this limitation.
Affiliation(s)
- Miriam Dellino, Marco Cerbone, Antonio Malvasi, Amerigo Vitagliano, Vincenzo Pinto, Ettore Cicinelli: 1st Unit of Obstetrics and Gynecology, Department of Interdisciplinary Medicine (DIM), University of Bari, 70124 Bari, Italy
- Antonio d'Amati, Gerardo Cazzato, Eliano Cascardi: Pathology Unit, Department of Precision and Regenerative Medicine and Ionian Area (DiMePRe-J), University of Bari, Piazza Giulio Cesare 11, 70124 Bari, Italy
- Mario Bochicchio: Department of Computer Science, University of Bari, 70121 Bari, Italy
- Antonio Simone Laganà, Andrea Etrusco: Unit of Obstetrics and Gynecology, "Paolo Giaccone" Hospital, Department of Health Promotion, Mother and Child Care, Internal Medicine and Medical Specialties (PROMISE), University of Palermo, 90127 Palermo, Italy

6
Poli UR, Gudlavalleti AG, Bharadwaj Y J, Pant HB, Agiwal V, Murthy GVS. Development and Clinical Validation of Visual Inspection With Acetic Acid Application-Artificial Intelligence Tool Using Cervical Images in Screen-and-Treat Visual Screening for Cervical Cancer in South India: A Pilot Study. JCO Glob Oncol 2024; 10:e2400146. [PMID: 39666915] [DOI: 10.1200/go.24.00146]
Abstract
PURPOSE The burden of cervical cancer in India is enormous, with more than 60,000 deaths reported in 2020. The key intervention in the WHO's global strategy for the elimination of cervical cancer is to aim for the treatment and care of 90% of women diagnosed with cervical lesions. The current screen-and-treat approach is an option for resource-limited health care systems, in which screening of the cervix with visual inspection with acetic acid application (VIA) is followed by immediate ablative treatment by nurses in the case of a positive test. This approach often results in overtreatment, owing to the subjective nature of the test. Unnecessary treatments can be diminished with the use of emerging computer-assisted visual evaluation technology, using an artificial intelligence (AI) tool to triage VIA-positive women. The aims of this study were (1) to develop a VIA-AI tool using cervical images to identify and categorize the VIA-screen-positive areas for eligibility and suitability for ablative treatment, and (2) to understand the efficacy of the VIA-AI tool in guiding nurses to decide on treatment eligibility in the screen-and-treat cervical screening program. METHODS This was an exploratory, interventional study. The VIA-AI tool was developed using deep-learning AI from the image bank collected in our previously conducted screening programs. This VIA-AI tool was then pilot-tested in an ongoing nurse-led VIA screening program. RESULTS A comparative assessment of the cervical features performed in all women using the VIA-AI tool showed a clinical accuracy of 76%. The perceived challenge rate for false positives was 20%. CONCLUSION This novel cervical image-based VIA-AI algorithm showed promising results in real-life settings and could help minimize overtreatment in single-visit VIA screening and treatment programs in resource-constrained situations.
Affiliation(s)
- Hira B Pant: Public Health Foundation of India, Hyderabad, India
- Varun Agiwal: Indian Institute of Public Health, Hyderabad, India
- G V S Murthy: Public Health Foundation of India, Hyderabad, India

7
Tamang P, Gupta M, Thatal A. Digital colposcopy image analysis techniques requirements and their role in clinical diagnosis: a systematic review. Expert Rev Med Devices 2024; 21:955-969. [PMID: 39370601] [DOI: 10.1080/17434440.2024.2407549]
Abstract
INTRODUCTION Colposcopy is a medical procedure for detecting cervical lesions. Access to devices required for colposcopy procedures is limited in low- and middle-income countries. However, various existing digital imaging techniques based on artificial intelligence offer solutions to analyze colposcopy images and address accessibility challenges. METHODS We systematically searched PubMed, the National Library of Medicine, and Crossref for studies that met our inclusion criteria. Various methods and research gaps are addressed, including how variability in images and sample size affects the accuracy of the methods. The quality and risk of bias of each study were assessed following the QUADAS-2 guidelines. RESULTS The development of image analysis and compression algorithms and their efficiency are analyzed. Most of the studied algorithms attained specificity, sensitivity, and accuracy ranging from 86% to 95%, 75% to 100%, and 100%, respectively, and these results were validated by clinicians, helping them analyze images quickly and thus minimizing bias among clinicians. CONCLUSION This systematic review provides a comprehensive study of colposcopy image analysis stages and the advantages of utilizing digital imaging techniques to enhance image analysis and diagnostic procedures and ensure prompt consultations. Furthermore, compression techniques can be applied to transmit medical images for further analysis among peripheral hospitals.
Affiliation(s)
- Parimala Tamang, Mousumi Gupta: Department of Computer Applications, Sikkim Manipal Institute of Technology, Sikkim Manipal University, Sikkim, India
- Annet Thatal: Department of Obstetrics and Gynecology, Al-Falah University, Faridabad, India

8
Ledwaba L, Saidu R, Malila B, Kuhn L, Mutsvangwa TE. Automated analysis of digital medical images in cervical cancer screening: A systematic review. medRxiv [Preprint] 2024:2024.09.27.24314466. [PMID: 39399017] [PMCID: PMC11469345] [DOI: 10.1101/2024.09.27.24314466]
Abstract
Background Cervical cancer screening programs are poorly implemented in LMICs due to a shortage of specialists and expensive diagnostic infrastructure. To address these implementation barriers, researchers have been developing low-cost portable devices and automating image analysis for decision support. However, as the knowledge base is growing rapidly, progress on the implementation status of novel imaging devices and algorithms in cervical cancer screening has become unclear. The aim of this project was to provide a systematic review summarizing the full range of automated technology systems used in cervical cancer screening. Method A search of academic databases was conducted and the search results were screened by two independent reviewers. Study selection was based on eligibility against inclusion and exclusion criteria outlined using a Population, Intervention, Comparator and Outcome framework. Results 17 studies reported algorithms developed with source images from mobile devices, viz. Pocket Colposcope, MobileODT EVA Colpo, Smartphone Camera, Smartphone-based Endoscope System, Smartscope, mHRME, and PiHRME, while 56 studies reported algorithms with source images from conventional/commercial acquisition devices. Most interventions were in the feasibility stage of development, undergoing initial clinical validations. Conclusion Researchers have demonstrated superior prediction performance of computer-aided diagnostics (CAD) in colposcopy (>80% accuracies) versus manual analysis (<70.0% accuracies). Furthermore, this review summarized evidence on algorithms being created with portable devices to circumvent constraints prohibiting wider implementation in LMICs (such as expensive diagnostic infrastructure). However, clinical validation of novel devices with CAD is not yet implemented adequately in LMICs.
Affiliation(s)
- Leshego Ledwaba, Bessie Malila, Tinashe E.M. Mutsvangwa: Division of Biomedical Engineering, Department of Human Biology, Faculty of Health Sciences, University of Cape Town, Cape Town, Western Cape, South Africa
- Rakiya Saidu: Obstetrics and Gynaecology, Groote Schuur Hospital/University of Cape Town, Cape Town, Western Cape, South Africa
- Louise Kuhn: Gertrude H. Sergievsky Center, Vagelos College of Physicians and Surgeons, and Department of Epidemiology, Mailman School of Public Health, Columbia University Irving Medical Center, New York, New York

9
Brenes D, Salcedo MP, Coole JB, Maker Y, Kortum A, Schwarz RA, Carns J, Vohra IS, Possati-Resende JC, Antoniazzi M, de Oliveira Fonseca B, Souza KCB, Santana IVV, Barbin FF, Kreitchmann R, Ramanujam N, Schmeler KM, Richards-Kortum R. Multiscale Optical Imaging Fusion for Cervical Precancer Diagnosis: Integrating Widefield Colposcopy and High-Resolution Endomicroscopy. IEEE Trans Biomed Eng 2024; 71:2547-2556. [PMID: 38507389] [PMCID: PMC11441333] [DOI: 10.1109/tbme.2024.3379898]
Abstract
OBJECTIVE Early detection and treatment of cervical precancers can prevent disease progression. However, in low-resource communities with a high incidence of cervical cancer, high equipment costs and a shortage of specialists hinder preventative strategies. This manuscript presents a low-cost multiscale in vivo optical imaging system coupled with a computer-aided diagnostic system that could enable accurate, real-time diagnosis of high-grade cervical precancers. METHODS The system combines portable colposcopy and high-resolution endomicroscopy (HRME) to acquire spatially registered widefield and microscopy videos. A multiscale imaging fusion network (MSFN) was developed to identify cervical intraepithelial neoplasia grade 2 or more severe (CIN 2+). The MSFN automatically identifies and segments the ectocervix and lesions from colposcopy images, extracts nuclear morphology features from HRME videos, and integrates the colposcopy and HRME information. RESULTS With a threshold value set to achieve sensitivity equal to clinical impression (0.98 [p = 1.0]), the MSFN achieved a significantly higher specificity than clinical impression (0.75 vs. 0.43, p = 0.000006). CONCLUSION Our findings show that multiscale optical imaging of the cervix allows the highly sensitive and specific detection of high-grade precancers. SIGNIFICANCE The multiscale imaging system and MSFN could facilitate the accurate, real-time diagnosis of cervical precancers in low-resource settings.
10
Karthika J, Anantharaju A, Koodi D, Pandya HJ, Pal UM. Label-free assessment of the transformation zone using multispectral diffuse optical imaging toward early detection of cervical cancer. J Biophotonics 2024:e202400114. [PMID: 39032125] [DOI: 10.1002/jbio.202400114]
Abstract
The assessment of the transformation zone is a critical step toward the diagnosis of cervical cancer. This work involves the development of a portable, label-free transvaginal multispectral diffuse optical imaging (MDOI) probe to estimate the transformation zone. The images were acquired from N = 5 patients (N = 1 normal, N = 2 premalignant, and N = 2 malignant). Key parameters such as the spectral contrast ratio (ρ) at 545 and 450 nm were higher in the premalignant patients (0.29, 0.25 for 450 nm and 0.30, 0.17 for 545 nm) than in the normal patient (0.13 and 0.14 for 450 and 545 nm, respectively). Thresholds on the spectral intensity ratios R610/R450 and R610/R545 can also be used as markers correlating with the new and original squamocolumnar junction (SCJ), respectively. The pilot study highlights the use of new markers such as spectral contrast ratio (ρ) and spectral intensity ratio (R610/R450 and R610/R545) images.
Affiliation(s)
- J Karthika: Department of Sciences and Humanities, Indian Institute of Information Technology, Design and Manufacturing, Kancheepuram, Chennai, Tamil Nadu, India
- Arpitha Anantharaju: Department of Gynecology and Obstetrics, Jawaharlal Institute of Postgraduate Medical Education & Research, Puducherry, India
- Dhanush Koodi: Department of Electronics and Communication Engineering, Sri Sairam Engineering College, Chennai, Tamil Nadu, India
- Hardik J Pandya: Department of Electronic Systems Engineering, Indian Institute of Science, Bangalore, Karnataka, India
- Uttam M Pal: Department of Electronics and Communications, Indian Institute of Information Technology, Design and Manufacturing, Kancheepuram, Chennai, Tamil Nadu, India

11
Hong Z, Xiong J, Yang H, Mo YK. Lightweight Low-Rank Adaptation Vision Transformer Framework for Cervical Cancer Detection and Cervix Type Classification. Bioengineering (Basel) 2024; 11:468. [PMID: 38790335] [PMCID: PMC11118906] [DOI: 10.3390/bioengineering11050468]
Abstract
Cervical cancer is a major health concern worldwide, highlighting the urgent need for better early detection methods to improve outcomes for patients. In this study, we present a novel digital pathology classification approach that combines Low-Rank Adaptation (LoRA) with the Vision Transformer (ViT) model. This method aims to make cervix type classification more efficient through a deep learning classifier that requires less data. The key innovation is the use of LoRA, which allows for effective training of the model with smaller datasets, making the most of the ability of ViT to represent visual information. This approach outperforms traditional Convolutional Neural Network (CNN) models, including Residual Networks (ResNets), particularly in accuracy and in the ability to generalize in situations where data are limited. Through thorough experiments and analysis on various dataset sizes, we found that our more streamlined classifier is highly accurate in spotting various cervical anomalies across several cases. This work advances the development of sophisticated computer-aided diagnostic systems, facilitating more rapid and accurate detection of cervical cancer, thereby significantly enhancing patient care outcomes.
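As background on the adaptation technique named in the title, the sketch below shows the core LoRA idea on a single linear layer: freeze the pretrained weight and learn a low-rank update scaled by alpha/r. Applying this to a ViT's attention projections is what keeps the trainable parameter count small; the hyperparameters and wiring here are illustrative, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update (B @ A)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                 # keep pretrained weights fixed
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * ((x @ self.A.t()) @ self.B.t())

# Example: wrap a ViT-style query projection (dimensions are illustrative).
q_proj = nn.Linear(768, 768)
lora_q = LoRALinear(q_proj)
print(lora_q(torch.randn(2, 197, 768)).shape)       # torch.Size([2, 197, 768])
```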
Affiliation(s)
- Zhenchen Hong: Department of Physics and Astronomy, University of California, Riverside, CA 92521, USA
- Jingwei Xiong: Graduate Group in Biostatistics, University of California, Davis, CA 95616, USA
- Han Yang: Department of Chemistry, Columbia University, New York, NY 10027, USA
- Yu K. Mo: Department of Computer Science, Indiana University, Bloomington, IN 47405, USA; Department of Biology, Indiana University, Bloomington, IN 47405, USA

12
Boonya-ananta T, Gonzalez M, Ajmal A, Du Le VN, DeHoog E, Paidas MJ, Jayakumar A, Ramella-Roman JC. Speculum-free portable preterm imaging system. J Biomed Opt 2024; 29:052918. [PMID: 38282917] [PMCID: PMC10821769] [DOI: 10.1117/1.jbo.29.5.052918]
Abstract
Significance Preterm birth is defined as a birth before 37 weeks of gestation and is one of the leading contributors to infant mortality rates globally. Premature birth can lead to life-long developmental impairment for the child. Unfortunately, there is a significant lack of tools to diagnose preterm birth risk, which limits patient care and the development of new therapies. Aim To develop a speculum-free, portable preterm imaging system (PPRIM) for cervical imaging; to test the PPRIM's ability to resolve the polarization properties of birefringent samples; and to test the PPRIM under an IRB-approved protocol on healthy, non-pregnant volunteers for visualization and polarization analysis of cervical images. Approach The PPRIM can perform 4 × 3 Mueller-matrix imaging to characterize the remodeling of the uterine cervix during pregnancy. The PPRIM is built with a polarized imaging probe and a flexible insertable sheath made with a compatible flexible rubber-like material to maximize comfort and ease of use. Results The PPRIM device was developed to meet specific design specifications as a speculum-free, portable, and comfortable imaging system with polarized imaging capabilities. This system comprises a main imaging component and a flexible silicone inserter. The inserter is designed to maximize comfort and usability for the patient. The PPRIM shows high-resolution imaging capabilities at a 20 mm working distance and a 25 mm circular field of view. The PPRIM demonstrates the ability to resolve birefringent sample orientation and full-field capture of a healthy, non-pregnant cervix. Conclusion The development of the PPRIM aims to improve access to the standard of care for women's reproductive health using polarized Mueller-matrix imaging of the cervix, reduce infant and maternal mortality rates, and improve quality of life.
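For reference, Mueller-matrix imaging relates the Stokes vector of the detected light to that of the illumination. A full acquisition recovers all 16 matrix elements; a reduced 4 × 3 acquisition such as the one described recovers only a subset, commonly because one polarization state is not generated or analyzed (an assumption here, since the abstract does not specify which elements are omitted). The standard relation, in LaTeX notation:

```latex
S_{\mathrm{out}} = M \, S_{\mathrm{in}}, \qquad
\begin{pmatrix} S_0' \\ S_1' \\ S_2' \\ S_3' \end{pmatrix}
=
\begin{pmatrix}
m_{00} & m_{01} & m_{02} & m_{03} \\
m_{10} & m_{11} & m_{12} & m_{13} \\
m_{20} & m_{21} & m_{22} & m_{23} \\
m_{30} & m_{31} & m_{32} & m_{33}
\end{pmatrix}
\begin{pmatrix} S_0 \\ S_1 \\ S_2 \\ S_3 \end{pmatrix}
```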
Affiliation(s)
- Tananant Boonya-ananta, Mariacarla Gonzalez, Ajmal Ajmal, Vinh Nguyen Du Le: Florida International University, Department of Biomedical Engineering, Miami, Florida, United States
- Edward DeHoog: Optical Design and Engineering, Long Beach, California, United States
- Michael J. Paidas, Arumugam Jayakumar: Miller School of Medicine, Department of Obstetrics, Gynecology and Reproductive Sciences, Miami, Florida, United States
- Jessica C. Ramella-Roman: Florida International University, Department of Biomedical Engineering, Miami, Florida, United States; Florida International University, Herbert Wertheim College of Medicine, Miami, Florida, United States

13
Vargas-Cardona HD, Rodriguez-Lopez M, Arrivillaga M, Vergara-Sanchez C, García-Cifuentes JP, Bermúdez PC, Jaramillo-Botero A. Artificial intelligence for cervical cancer screening: Scoping review, 2009-2022. Int J Gynaecol Obstet 2024; 165:566-578. [PMID: 37811597] [DOI: 10.1002/ijgo.15179]
Abstract
BACKGROUND The intersection of artificial intelligence (AI) with cancer research is increasing, and many of the advances have focused on the analysis of cancer images. OBJECTIVES To describe and synthesize the literature on the diagnostic accuracy of AI in early imaging diagnosis of cervical cancer following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR). SEARCH STRATEGY The Arksey and O'Malley methodology was used, and the PubMed, Scopus, and Google Scholar databases were searched using a combination of English and Spanish keywords. SELECTION CRITERIA Identified titles and abstracts were screened to select original reports and cross-checked for overlap of cases. DATA COLLECTION AND ANALYSIS A descriptive summary was organized by the AI algorithm used, total number of images analyzed, data source, clinical comparison criteria, and diagnostic performance. MAIN RESULTS We identified 32 studies published between 2009 and 2022. The primary sources of images were digital colposcopy, cervicography, and mobile devices. The machine learning/deep learning (DL) algorithms applied in the articles included support vector machine (SVM), random forest classifier, k-nearest neighbors, multilayer perceptron, C4.5, Naïve Bayes, AdaBoost, XGBoost, conditional random fields, Bayes classifier, convolutional neural network (CNN; and variations), ResNet (several versions), YOLO+EfficientNetB0, and visual geometry group (VGG; several versions). SVM and DL methods (CNN, ResNet, VGG) showed the best diagnostic performances, with an accuracy of over 97%. CONCLUSION We concluded that the use of AI for cervical cancer screening has increased over the years, and some results (mainly from DL) are very promising. However, further research is necessary to validate these findings.
Affiliation(s)
- Mérida Rodriguez-Lopez: Faculty of Health Sciences, Universidad Icesi, Cali, Colombia; Fundación Valle del Lili, Centro de Investigaciones Clínicas, Cali, Colombia
- Andres Jaramillo-Botero: OMICAS Research Institute (iOMICAS), Pontificia Universidad Javeriana Cali, Cali, Colombia; Chemistry and Chemical Engineering, California Institute of Technology, Pasadena, California, USA

14
Chen P, Liu F, Zhang J, Wang B. MFEM-CIN: A Lightweight Architecture Combining CNN and Transformer for the Classification of Pre-Cancerous Lesions of the Cervix. IEEE Open J Eng Med Biol 2024; 5:216-225. [PMID: 38606400] [PMCID: PMC11008799] [DOI: 10.1109/ojemb.2024.3367243]
Abstract
Goal: Cervical cancer is one of the most common cancers in women worldwide, ranking among the top four. Unfortunately, it is also the fourth leading cause of cancer-related deaths among women, particularly in developing countries where incidence and mortality rates are higher compared to developed nations. Colposcopy can aid in the early detection of cervical lesions, but its effectiveness is limited in areas with limited medical resources and a lack of specialized physicians. Consequently, many cases are diagnosed at later stages, putting patients at significant risk. Methods: This paper proposes an automated colposcopic image analysis framework to address these challenges. The framework aims to reduce the labor costs associated with cervical precancer screening in underserved regions and assist doctors in diagnosing patients. The core of the framework is the MFEM-CIN hybrid model, which combines Convolutional Neural Networks (CNN) and Transformer to aggregate the correlation between local and global features. This combined analysis of local and global information is scientifically useful in clinical diagnosis. In the model, MSFE and MSFF are utilized to extract and fuse multi-scale semantics. This preserves important shallow feature information and allows it to interact with the deep features, enriching the semantics to some extent. Conclusions: The experimental results demonstrate an accuracy rate of 89.2% in identifying cervical intraepithelial neoplasia while maintaining a lightweight model. This performance exceeds the average accuracy achieved by professional physicians, indicating promising potential for practical application. Utilizing automated colposcopic image analysis and the MFEM-CIN model, this research offers a practical solution to reduce the burden on healthcare providers and improve the efficiency and accuracy of cervical cancer diagnosis in resource-constrained areas.
Affiliation(s)
- Peng Chen: National Engineering Research Center for Agro-Ecological Big Data Analysis and Application, Information Materials and Intelligent Sensing Laboratory of Anhui Province, Institutes of Physical Science and Information Technology and School of Internet, Anhui University, Hefei 230601, China; Fin China-Anhui University Joint Laboratory for Financial Big Data Research, Hefei Financial China Information and Technology Company, Ltd., Hefei 230022, China
- Fobao Liu, Jun Zhang: National Engineering Research Center for Agro-Ecological Big Data Analysis and Application, Information Materials and Intelligent Sensing Laboratory of Anhui Province, Institutes of Physical Science and Information Technology and School of Internet, Anhui University, Hefei 230601, China
- Bing Wang: School of Management Science and Engineering, Anhui University of Finance and Economics, Bengbu 233030, China

15
Brandão M, Mendes F, Martins M, Cardoso P, Macedo G, Mascarenhas T, Mascarenhas Saraiva M. Revolutionizing Women's Health: A Comprehensive Review of Artificial Intelligence Advancements in Gynecology. J Clin Med 2024; 13:1061. [PMID: 38398374] [PMCID: PMC10889757] [DOI: 10.3390/jcm13041061]
Abstract
Artificial intelligence has yielded remarkably promising results in several medical fields, namely those with a strong imaging component. Gynecology relies heavily on imaging since it offers useful visual data on the female reproductive system, leading to a deeper understanding of pathophysiological concepts. The applicability of artificial intelligence technologies has not been as noticeable in gynecologic imaging as in other medical fields so far. However, due to growing interest in this area, some studies have been performed with exciting results. From urogynecology to oncology, artificial intelligence algorithms, particularly machine learning and deep learning, have shown huge potential to revolutionize the overall healthcare experience for women's reproductive health. In this review, we aim to establish the current status of AI in gynecology, the upcoming developments in this area, and discuss the challenges facing its clinical implementation, namely the technological and ethical concerns for technology development, implementation, and accountability.
Affiliation(s)
- Marta Brandão: Faculty of Medicine, University of Porto, Alameda Professor Hernâni Monteiro, 4200-427 Porto, Portugal
- Francisco Mendes, Miguel Martins: Department of Gastroenterology, São João University Hospital, Alameda Professor Hernâni Monteiro, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-427 Porto, Portugal
- Pedro Cardoso, Guilherme Macedo, Miguel Mascarenhas Saraiva: Faculty of Medicine, University of Porto, Alameda Professor Hernâni Monteiro, 4200-427 Porto, Portugal; Department of Gastroenterology, São João University Hospital, Alameda Professor Hernâni Monteiro, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-427 Porto, Portugal
- Teresa Mascarenhas: Faculty of Medicine, University of Porto, Alameda Professor Hernâni Monteiro, 4200-427 Porto, Portugal; Department of Obstetrics and Gynecology, São João University Hospital, Alameda Professor Hernâni Monteiro, 4200-427 Porto, Portugal

16
Nakisige C, de Fouw M, Kabukye J, Sultanov M, Nazrui N, Rahman A, de Zeeuw J, Koot J, Rao AP, Prasad K, Shyamala G, Siddharta P, Stekelenburg J, Beltman JJ. Artificial intelligence and visual inspection in cervical cancer screening. Int J Gynecol Cancer 2023; 33:1515-1521. [PMID: 37666527] [PMCID: PMC10579490] [DOI: 10.1136/ijgc-2023-004397]
Abstract
INTRODUCTION Visual inspection with acetic acid is limited by subjectivity and a lack of skilled human resources. A decision support system based on artificial intelligence could address these limitations. We conducted a diagnostic study to assess the diagnostic performance of healthcare workers, experts, and an artificial intelligence algorithm using visual inspection with acetic acid under magnification. METHODS A total of 22 healthcare workers, 9 gynecologists/experts in visual inspection with acetic acid, and the algorithm assessed a set of 83 images from existing datasets with expert consensus as the reference. Their diagnostic performance was determined by analyzing sensitivity, specificity, and area under the curve, and intra- and inter-observer agreement was measured using Fleiss kappa values. RESULTS Sensitivity, specificity, and area under the curve were, respectively, 80.4%, 80.5%, and 0.80 (95% CI 0.70 to 0.90) for the healthcare workers, 81.6%, 93.5%, and 0.93 (95% CI 0.87 to 1.00) for the experts, and 80.0%, 83.3%, and 0.84 (95% CI 0.75 to 0.93) for the algorithm. Kappa values for the healthcare workers, experts, and algorithm were 0.45, 0.68, and 0.63, respectively. CONCLUSION This study enabled simultaneous assessment and demonstrated that expert consensus can be an alternative to histopathology to establish a reference standard for further training of healthcare workers and the artificial intelligence algorithm to improve diagnostic accuracy.
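Fleiss' kappa, the agreement statistic reported above, can be computed directly from a subjects-by-categories count table. A minimal sketch follows; the example ratings are invented, not the study's data.

```python
import numpy as np

def fleiss_kappa(counts):
    """counts: (n_subjects, n_categories) array; each row holds how many
    raters assigned that subject to each category (rows sum to n raters)."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                      # raters per subject
    p_cat = counts.sum(axis=0) / counts.sum()      # overall category proportions
    p_subj = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
    p_bar, p_e = p_subj.mean(), np.square(p_cat).sum()
    return (p_bar - p_e) / (1 - p_e)

# 4 images rated positive/negative by 5 raters (invented counts):
print(round(fleiss_kappa([[5, 0], [4, 1], [1, 4], [0, 5]]), 2))   # 0.6
```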
Affiliation(s)
- Marlieke de Fouw: Department of Gynecology, Leiden University Medical Center, Leiden, Zuid-Holland, Netherlands
- Marat Sultanov, Janine de Zeeuw, Jaap Koot, Jelle Stekelenburg: University Medical Center Groningen, University of Groningen, Groningen, Netherlands
- Aminur Rahman: ICDDRB Public Health Sciences Division, Dhaka, Dhaka District, Bangladesh
- Arathi P Rao: Prasanna School of Public Health, Manipal Academy of Higher Education, Manipal, India
- Keerthana Prasad: Manipal Academy of Higher Education School of Life Sciences, Manipal, Karnataka, India
- Guruvare Shyamala: Manipal Academy of Higher Education - Mangalore Campus, Mangalore, Karnataka, India
- Premalatha Siddharta: Gynecological Oncology, St John's National Academy of Health Sciences, Bangalore, Karnataka, India

17
Shamsunder S, Mishra A, Kumar A, Kolte S. Automated Assessment of Digital Images of Uterine Cervix Captured Using Transvaginal Device-A Pilot Study. Diagnostics (Basel) 2023; 13:3085. [PMID: 37835828] [PMCID: PMC10573017] [DOI: 10.3390/diagnostics13193085]
Abstract
In low-resource settings, a point-of-care test for cervical cancer screening that can give an immediate result to guide management is urgently needed. A transvaginal digital device, "Smart Scope®" (SS), with an artificial intelligence-enabled auto-image-assessment (SS-AI) feature, was developed. In a single-arm observational study, eligible consenting women underwent a Smart Scope®-aided VIA-VILI test. Images of the cervix were captured using SS and categorized by SS-AI in four groups (green, amber, high-risk amber (HRA), red) based on risk assessment. Green and amber were classified as SS-AI negative while HRA and red were classified as SS-AI positive. The SS-AI-positive women were advised colposcopy and guided biopsy. The cervix images of SS-AI-negative cases were evaluated by an expert colposcopist (SS-M); those suspected of being positive were also recommended colposcopy and guided biopsy. Histopathology was considered a gold standard. Data on 877 SS-AI, 485 colposcopy, and 213 histopathology were available for analysis. The SS-AI showed high sensitivity (90.3%), specificity (75.3%), accuracy (84.04%), and correlation coefficient (0.670, p = 0.0) in comparison with histology at the CINI+ cutoff. In conclusion, the AI-enabled Smart Scope® test is a good alternative to the existing screening tests as it gives a real-time accurate assessment of cervical health and an opportunity for immediate triaging with visual evidence.
Affiliation(s)
- Saritha Shamsunder, Archana Mishra, Anita Kumar: Gynecology Department, Safdarjung Hospital, New Delhi 110029, India
- Sachin Kolte: Department of Pathology, VMMC and Safdarjung Hospital, New Delhi 110029, India

18
Darwish M, Altabel MZ, Abiyev RH. Enhancing Cervical Pre-Cancerous Classification Using Advanced Vision Transformer. Diagnostics (Basel) 2023; 13:2884. [PMID: 37761252] [PMCID: PMC10529431] [DOI: 10.3390/diagnostics13182884]
Abstract
Cervical cancer is one of the most common types of cancer among women. Incidence and fatality rates are steadily rising, particularly in developing nations, due to a lack of screening facilities, experienced specialists, and public awareness. Screening for cervical cancer is performed by visual inspection after the application of acetic acid (VIA), histopathology testing, the Papanicolaou (Pap) test, and the human papillomavirus (HPV) test. The goal of this research is to employ a vision transformer (ViT) enhanced with shifted patch tokenization (SPT) techniques to create an integrated and robust system for automatic cervix-type identification. A vision transformer enhanced with shifted patch tokenization is used in this work to learn the distinct features between the three different cervical pre-cancerous types. The model was trained and tested on 8215 colposcopy images of the three types, obtained from the publicly available mobile-ODT dataset. The model was tested on 30% of the whole dataset and showed a good generalization capability of 91% accuracy. A comparison with the state of the art indicated that our model outperforms existing approaches. The experimental results show that the suggested system can be employed as a decision support tool in the detection of the cervical pre-cancer transformation zone, particularly in low-resource settings with limited experience and resources.
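Shifted patch tokenization, the ViT modification named above, concatenates the image with diagonally shifted copies of itself before patch embedding so each token sees context beyond its own patch. A minimal sketch is below; it uses circular shifts via torch.roll and omits the layer normalization of the published SPT formulation, so treat it as an illustration rather than the paper's implementation.

```python
import torch
import torch.nn as nn

class ShiftedPatchTokenizer(nn.Module):
    """Patch embedding over the image stacked with 4 diagonally shifted copies."""
    def __init__(self, in_ch=3, patch=16, dim=256):
        super().__init__()
        self.patch = patch
        self.proj = nn.Linear(5 * in_ch * patch * patch, dim)

    def forward(self, x):                          # x: (B, C, H, W), H and W divisible by patch
        s = self.patch // 2
        copies = [x] + [torch.roll(x, shifts=(dy, dx), dims=(2, 3))
                        for dy, dx in ((-s, -s), (-s, s), (s, -s), (s, s))]
        x = torch.cat(copies, dim=1)               # (B, 5C, H, W)
        p = self.patch
        x = x.unfold(2, p, p).unfold(3, p, p)      # (B, 5C, H/p, W/p, p, p)
        b, c, nh, nw, _, _ = x.shape
        x = x.permute(0, 2, 3, 1, 4, 5).reshape(b, nh * nw, c * p * p)
        return self.proj(x)                        # (B, num_patches, dim)

tok = ShiftedPatchTokenizer()
print(tok(torch.randn(1, 3, 224, 224)).shape)      # torch.Size([1, 196, 256])
```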
Affiliation(s)
- Rahib H. Abiyev: Department of Computer Engineering, Applied Artificial Intelligence Research Centre, Near East University, Mersin 10, 99138 Nicosia, Turkey

19
Rahaman A, Anantharaju A, Jeyachandran K, Manideep R, Pal UM. Optical imaging for early detection of cervical cancer: state of the art and perspectives. J Biomed Opt 2023; 28:080902. [PMID: 37564164] [PMCID: PMC10411916] [DOI: 10.1117/1.jbo.28.8.080902]
Abstract
Significance Cervical cancer is one of the major causes of death in females worldwide. HPV infection is the key cause of uncontrolled cell growth leading to cervical cancer. About 90% of cervical cancer is preventable because of the slow progression of the disease, giving a window of about 10 years for the precancerous lesion to be recognized and treated. Aim The present challenges for cervical cancer diagnosis are interobserver variation in clinicians' interpretation of visual inspection with acetic acid/visual inspection with Lugol's iodine, the cost of cytology-based screening, and a lack of skilled clinicians. Optical modalities can assist in qualitatively and quantitatively analyzing the tissue to differentiate between cancerous and surrounding normal tissues. Approach This work reviews recent advances in optical techniques for cervical cancer diagnosis, which promise to overcome the above-listed challenges faced by present screening techniques. Results The optical modalities provide substantial measurable information in addition to conventional colposcopy and the Pap smear test to clinically aid the diagnosis. Conclusions Recent optical modalities based on fluorescence, multispectral imaging, polarization-sensitive imaging, microendoscopy, and Raman spectroscopy, especially in portable designs assisted by artificial intelligence, have significant scope for the diagnosis of premalignant cervical lesions in the future.
Collapse
Affiliation(s)
- Alisha Rahaman
- Savitribai Phule Pune University, Department of Microbiology, Pune, Maharashtra, India
| | - Arpitha Anantharaju
- Jawaharlal Institute of Postgraduate Medical Education and Research, Department of Obstetrics and Gynaecology, Puducherry, India
| | - Karthika Jeyachandran
- Indian Institute of Information Technology, Design and Manufacturing, Kancheepuram, Department of Electronics and Communication Engineering, Chennai, Tamil Nadu, India
| | - Repala Manideep
- Indian Institute of Information Technology, Design and Manufacturing, Kancheepuram, Department of Electronics and Communication Engineering, Chennai, Tamil Nadu, India
| | - Uttam M. Pal
- Indian Institute of Information Technology, Design and Manufacturing, Kancheepuram, Department of Electronics and Communication Engineering, Chennai, Tamil Nadu, India
| |
Collapse
|
20
|
Mustafa WA, Ismail S, Mokhtar FS, Alquran H, Al-Issa Y. Cervical Cancer Detection Techniques: A Chronological Review. Diagnostics (Basel) 2023; 13:diagnostics13101763. [PMID: 37238248 DOI: 10.3390/diagnostics13101763] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2023] [Revised: 05/12/2023] [Accepted: 05/15/2023] [Indexed: 05/28/2023] Open
Abstract
Cervical cancer is known as a major health problem globally, with high mortality as well as incidence rates. Over the years, there have been significant advancements in cervical cancer detection techniques, leading to improved accuracy, sensitivity, and specificity. This article provides a chronological review of cervical cancer detection techniques, from the traditional Pap smear test to the latest computer-aided detection (CAD) systems. The traditional method for cervical cancer screening is the Pap smear test. It consists of examining cervical cells under a microscope for abnormalities. However, this method is subjective and may miss precancerous lesions, leading to false negatives and a delayed diagnosis. Therefore, growing interest has been shown in developing CAD methods to enhance cervical cancer screening. However, the effectiveness and reliability of CAD systems are still being evaluated. A systematic review of the literature was performed using the Scopus database to identify relevant studies on cervical cancer detection techniques published between 1996 and 2022. The search terms used included "(cervix OR cervical) AND (cancer OR tumor) AND (detect* OR diagnosis)". Studies were included if they reported on the development or evaluation of cervical cancer detection techniques, including traditional methods and CAD systems. The results of the review showed that CAD technology for cervical cancer detection has come a long way since it was introduced in the 1990s. Early CAD systems utilized image processing and pattern recognition techniques to analyze digital images of cervical cells, with limited success due to low sensitivity and specificity. In the early 2000s, machine learning (ML) algorithms were introduced to the CAD field for cervical cancer detection, allowing for more accurate and automated analysis of digital images of cervical cells. ML-based CAD systems have shown promise in several studies, with improved sensitivity and specificity reported compared to traditional screening methods. In summary, this chronological review of cervical cancer detection techniques highlights the significant advancements made in this field over the past few decades. ML-based CAD systems have shown promise for improving the accuracy and sensitivity of cervical cancer detection. The Hybrid Intelligent System for Cervical Cancer Diagnosis (HISCCD) and the Automated Cervical Screening System (ACSS) are two of the most promising CAD systems. Still, deeper validation and research are required before they are broadly accepted. Continued innovation and collaboration in this field may help enhance cervical cancer detection as well as ultimately reduce the disease's burden on women worldwide.
Collapse
Affiliation(s)
- Wan Azani Mustafa
- Faculty of Electrical Engineering Technology, Campus Pauh Putra, Universiti Malaysia Perlis, Arau 02600, Perlis, Malaysia
- Advanced Computing (AdvComp), Centre of Excellence (CoE), Universiti Malaysia Perlis, Arau 02600, Perlis, Malaysia
| | - Shahrina Ismail
- Faculty of Science and Technology, Universiti Sains Islam Malaysia (USIM), Bandar Baru Nilai 71800, Negeri Sembilan, Malaysia
| | - Fahirah Syaliza Mokhtar
- Faculty of Business, Economy and Social Development, Universiti Malaysia Terengganu, Kuala Nerus 21300, Terengganu, Malaysia
| | - Hiam Alquran
- Department of Biomedical Systems and Informatics Engineering, Yarmouk University, 556, Irbid 21163, Jordan
| | - Yazan Al-Issa
- Department of Computer Engineering, Yarmouk University, Irbid 22110, Jordan
| |
Collapse
|
21
|
Shinohara T, Murakami K, Matsumura N. Diagnosis Assistance in Colposcopy by Segmenting Acetowhite Epithelium Using U-Net with Images before and after Acetic Acid Solution Application. Diagnostics (Basel) 2023; 13:diagnostics13091596. [PMID: 37174987 PMCID: PMC10178183 DOI: 10.3390/diagnostics13091596] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2023] [Revised: 04/20/2023] [Accepted: 04/27/2023] [Indexed: 05/15/2023] Open
Abstract
Colposcopy is an essential examination tool to identify cervical intraepithelial neoplasia (CIN), a precancerous lesion of the uterine cervix, and to sample its tissues for histological examination. In colposcopy, gynecologists visually identify the lesion highlighted by applying an acetic acid solution to the cervix using a magnifying glass. This paper proposes a deep learning method to aid the colposcopic diagnosis of CIN by segmenting lesions. In this method, to segment the lesion effectively, the colposcopic images taken before acetic acid application were fed into the U-Net deep learning network together with the images taken after acetic acid application. We conducted experiments using 30 actual colposcopic images of acetowhite epithelium, one of the representative types of CIN. As a result, it was confirmed that accuracy, precision, and F1 scores, which were 0.894, 0.837, and 0.834, respectively, were significantly better when images taken before and after acetic acid solution application were used than when only images taken after acetic acid solution application were used (0.882, 0.823, and 0.823, respectively). This result indicates that the image taken before acetic acid solution application is helpful for accurately segmenting CIN with deep learning.
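A straightforward way to realize the described input scheme is to stack the pre- and post-acetic-acid images along the channel dimension and train a U-Net on the resulting six-channel input. The sketch below uses the segmentation_models_pytorch package for brevity; the library choice, encoder, and image size are assumptions for illustration, not the authors' code.

```python
# Sketch: feeding pre- and post-acetic-acid colposcopy images to a U-Net as a
# single 6-channel input. Library, encoder, and sizes are illustrative assumptions.
import torch
import segmentation_models_pytorch as smp

model = smp.Unet(
    encoder_name="resnet34",
    encoder_weights=None,   # train from scratch; pretrained weights expect 3 channels
    in_channels=6,          # pre-acetic (3) + post-acetic (3) RGB channels
    classes=1,              # binary acetowhite-epithelium mask
)

pre_img = torch.rand(1, 3, 256, 256)    # image before acetic acid application
post_img = torch.rand(1, 3, 256, 256)   # image after acetic acid application
x = torch.cat([pre_img, post_img], dim=1)

logits = model(x)                        # (1, 1, 256, 256)
mask = torch.sigmoid(logits) > 0.5       # predicted lesion mask
```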
Collapse
Affiliation(s)
- Toshihiro Shinohara
- Department of Computational Systems Biology, Faculty of Biology-Oriented Science and Technology, Kindai University, Kinokawa 649-6493, Wakayama, Japan
| | - Kosuke Murakami
- Department of Obstetrics and Gynecology, Faculty of Medicine, Kindai University, Osakasayama 589-8511, Osaka, Japan
| | - Noriomi Matsumura
- Department of Obstetrics and Gynecology, Faculty of Medicine, Kindai University, Osakasayama 589-8511, Osaka, Japan
| |
Collapse
|
22
|
Wu A, Xue P, Abulizi G, Tuerxun D, Rezhake R, Qiao Y. Artificial intelligence in colposcopic examination: A promising tool to assist junior colposcopists. Front Med (Lausanne) 2023; 10:1060451. [PMID: 37056736 PMCID: PMC10088560 DOI: 10.3389/fmed.2023.1060451] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2022] [Accepted: 02/08/2023] [Indexed: 03/17/2023] Open
Abstract
Introduction Well-trained colposcopists are in huge shortage worldwide, especially in low-resource areas. Here, we aimed to evaluate the Colposcopic Artificial Intelligence Auxiliary Diagnostic System (CAIADS) to detect abnormalities based on digital colposcopy images, especially focusing on its role in assisting junior colposcopists to correctly identify the lesion areas where biopsy should be performed. Materials and methods This is a hospital-based retrospective study, which recruited women who visited colposcopy clinics between September 2021 and January 2022. A total of 366 of 1,146 women with complete medical information recorded by a senior colposcopist and valid histology results were included. Anonymized colposcopy images were reviewed by CAIADS and a junior colposcopist separately, and the junior colposcopist reviewed the colposcopy images with CAIADS results (named CAIADS-Junior). The diagnostic accuracy and biopsy efficiency of CAIADS and CAIADS-Junior were assessed in detecting cervical intraepithelial neoplasia grade 2 or worse (CIN2+), CIN3+, and cancer in comparison with the senior and junior colposcopists. The factors influencing the accuracy of CAIADS were explored. Results For CIN2 + and CIN3 + detection, CAIADS showed a sensitivity at ~80%, which was not significantly lower than the sensitivity achieved by the senior colposcopist (for CIN2 +: 80.6 vs. 91.3%, p = 0.061 and for CIN3 +: 80.0 vs. 90.0%, p = 0.189). The sensitivity of the junior colposcopist was increased significantly with the assistance of CAIADS (for CIN2 +: 95.1 vs. 79.6%, p = 0.002 and for CIN3 +: 97.1 vs. 85.7%, p = 0.039) and was comparable to those of the senior colposcopists (for CIN2 +: 95.1 vs. 91.3%, p = 0.388 and for CIN3 +: 97.1 vs. 90.0%, p = 0.125). In detecting cervical cancer, CAIADS achieved the highest sensitivity at 100%. For all endpoints, CAIADS showed the highest specificity (55-64%) and positive predictive values compared to both senior and junior colposcopists. When CIN grades became higher, the average biopsy numbers decreased for the subspecialists and CAIADS required a minimum number of biopsies to detect per case (2.2-2.6 cut-points). Meanwhile, the biopsy sensitivity of the junior colposcopist was the lowest, but the CAIADS-assisted junior colposcopist achieved a higher biopsy sensitivity. Conclusion Colposcopic Artificial Intelligence Auxiliary Diagnostic System could assist junior colposcopists to improve diagnostic accuracy and biopsy efficiency, which might be a promising solution to improve the quality of cervical cancer screening in low-resource settings.
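The paired sensitivity comparisons reported here (reader vs. reader on the same histology-confirmed cases) are typically assessed with McNemar's test. The sketch below shows one common way to compute sensitivity/specificity and a paired sensitivity comparison; the toy arrays and the restriction of the test to disease-positive cases are illustrative assumptions, not the study's analysis code.

```python
# Sketch: sensitivity/specificity against histology and a paired comparison of two
# readers (e.g., assisted vs. unassisted) with McNemar's test on disease-positive
# cases. The toy arrays are illustrative, not study data.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

y_true   = np.array([1, 1, 1, 0, 0, 1, 0, 1, 0, 0])   # histology: CIN2+ (1) or <CIN2 (0)
reader_a = np.array([1, 1, 1, 0, 1, 1, 0, 1, 0, 0])   # assisted reader calls
reader_b = np.array([1, 0, 1, 0, 1, 0, 0, 1, 0, 0])   # unassisted reader calls

def sens_spec(y, pred):
    tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
    tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
    return tp / (tp + fn), tn / (tn + fp)

print("reader A sens/spec:", sens_spec(y_true, reader_a))
print("reader B sens/spec:", sens_spec(y_true, reader_b))

# Paired sensitivity comparison: McNemar's test restricted to histology-positive cases.
pos = y_true == 1
a, b = reader_a[pos].astype(bool), reader_b[pos].astype(bool)
table = np.array([[np.sum(a & b), np.sum(a & ~b)],
                  [np.sum(~a & b), np.sum(~a & ~b)]])
print("McNemar p-value:", mcnemar(table, exact=True).pvalue)
```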
Collapse
Affiliation(s)
- Aiyuan Wu
- The Affiliated Cancer Hospital of Xinjiang Medical University, Urumqi, China
| | - Peng Xue
- School of Population Medicine and Public Health, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing, China
| | - Guzhalinuer Abulizi
- The Affiliated Cancer Hospital of Xinjiang Medical University, Urumqi, China
| | - Dilinuer Tuerxun
- The Affiliated Cancer Hospital of Xinjiang Medical University, Urumqi, China
| | - Remila Rezhake
- The Affiliated Cancer Hospital of Xinjiang Medical University, Urumqi, China
| | - Youlin Qiao
- The Affiliated Cancer Hospital of Xinjiang Medical University, Urumqi, China
- School of Population Medicine and Public Health, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing, China
| |
Collapse
|
23
|
ColpoClassifier: A Hybrid Framework for Classification of the Cervigrams. Diagnostics (Basel) 2023; 13:diagnostics13061103. [PMID: 36980411 PMCID: PMC10047578 DOI: 10.3390/diagnostics13061103] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/06/2023] [Revised: 03/03/2023] [Accepted: 03/06/2023] [Indexed: 03/17/2023] Open
Abstract
Colposcopy plays a vital role in detecting cervical cancer. Artificial intelligence-based methods have been implemented in the literature for the classification of colposcopy images. However, there is a need for a more effective method that can accurately classify cervigrams. In this paper, ColpoClassifier, a hybrid framework for the classification of cervigrams, is proposed, which consists of feature extraction followed by classification. This paper uses a Gray-level co-occurrence matrix (GLCM), a Gray-level run length matrix (GLRLM), and a histogram of gradients (HOG) for feature extraction. These features are combined to form a feature fusion vector of the form GLCM + GLRLM + HOG. The different machine learning classifiers are used for classification by using individual feature vectors as well as feature fusion vectors. The dataset used in this paper is compiled by downloading images from the WHO website. Two variants of this dataset are created, Dataset-I contains images of the aceto-whitening effect, green filter, iodine application, and raw cervigram while Dataset-II only contains images of the aceto-whitening effect. This paper presents the classification performance on all kinds of images with the individual as well as hybrid feature fusion vector and concludes that hybrid feature fusion vectors on aceto-whitening images have given the best results.
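The feature-fusion idea can be illustrated with standard scikit-image descriptors: GLCM statistics and HOG features are concatenated into one vector and passed to a classical classifier. GLRLM is omitted below because scikit-image has no built-in implementation; the parameters, image size, and classifier are illustrative assumptions.

```python
# Sketch of the feature-fusion idea: GLCM texture statistics and HOG features
# concatenated and fed to a classical classifier. GLRLM would need a separate
# implementation and is omitted; all parameters here are illustrative.
import numpy as np
from skimage.feature import graycomatrix, graycoprops, hog
from sklearn.svm import SVC

def extract_features(gray_u8):
    """gray_u8: 2-D uint8 cervigram (already cropped/resized)."""
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    glcm_feats = np.hstack([graycoprops(glcm, p).ravel()
                            for p in ("contrast", "homogeneity",
                                      "energy", "correlation")])
    hog_feats = hog(gray_u8, orientations=9, pixels_per_cell=(16, 16),
                    cells_per_block=(2, 2))
    return np.hstack([glcm_feats, hog_feats])   # fused feature vector

# Toy usage with random arrays standing in for cervigrams.
rng = np.random.default_rng(0)
X = np.stack([extract_features(rng.integers(0, 256, (128, 128), dtype=np.uint8))
              for _ in range(20)])
y = rng.integers(0, 2, 20)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))
```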
Collapse
|
24
|
CTIFI: Clinical-experience-guided three-vision images features integration for diagnosis of cervical lesions. Biomed Signal Process Control 2023. [DOI: 10.1016/j.bspc.2022.104235] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
|
25
|
Cervical pre-cancerous lesion detection: development of smartphone-based VIA application using artificial intelligence. BMC Res Notes 2022; 15:356. [PMID: 36463193 PMCID: PMC9719132 DOI: 10.1186/s13104-022-06250-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/07/2022] [Accepted: 11/18/2022] [Indexed: 12/04/2022] Open
Abstract
OBJECTIVE Visual inspection of the cervix after acetic acid application (VIA) has been considered an alternative to Pap smear in resource-limited settings, like Indonesia. However, VIA results mainly depend on the examiner's experience, and with the lack of comprehensive training of healthcare workers, VIA accuracy keeps declining. We aimed to develop an artificial intelligence (AI)-based Android application that can automatically determine VIA results in real time and may be further developed as a health care support system in cervical cancer screening. RESULT A total of 199 women who underwent the VIA test were studied. Images of the cervix before and after the VIA test were taken with a smartphone, then evaluated and labelled by an experienced oncologist as VIA positive or negative. Our AI model training pipeline consists of 3 steps: image pre-processing, feature extraction, and classifier development. Of the 199 cases, 134 were used as training-validation data and the remaining 65 as test data. The trained AI model generated a sensitivity of 80%, specificity of 96.4%, accuracy of 93.8%, precision of 80%, and ROC/AUC of 0.85 (95% CI 0.66-1.0). The developed AI-based Android application may potentially aid cervical cancer screening, especially in low-resource settings.
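The three-step pipeline described (pre-processing, feature extraction, classifier development) can be sketched with a pretrained backbone used as a fixed feature extractor feeding a lightweight classifier. The backbone choice (MobileNetV2), input size, and classifier below are assumptions for illustration and are not the authors' implementation.

```python
# Sketch of a three-step pipeline (pre-processing, feature extraction, classifier):
# a pretrained MobileNetV2 backbone as a fixed feature extractor feeding a simple
# classifier. All choices are illustrative, not the authors' implementation.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.linear_model import LogisticRegression

backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
backbone.classifier = torch.nn.Identity()   # expose the 1280-dim pooled features
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(pil_images):
    batch = torch.stack([preprocess(im) for im in pil_images])
    return backbone(batch).numpy()           # (N, 1280) feature vectors

# With labelled cervix photographs (PIL images) and 0/1 VIA labels:
# clf = LogisticRegression(max_iter=1000).fit(embed(train_imgs), train_labels)
# via_positive_prob = clf.predict_proba(embed(test_imgs))[:, 1]
```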
Collapse
|
26
|
Allahqoli L, Laganà AS, Mazidimoradi A, Salehiniya H, Günther V, Chiantera V, Karimi Goghari S, Ghiasvand MM, Rahmani A, Momenimovahed Z, Alkatout I. Diagnosis of Cervical Cancer and Pre-Cancerous Lesions by Artificial Intelligence: A Systematic Review. Diagnostics (Basel) 2022; 12:2771. [PMID: 36428831 PMCID: PMC9689914 DOI: 10.3390/diagnostics12112771] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/11/2022] [Revised: 11/06/2022] [Accepted: 11/10/2022] [Indexed: 11/16/2022] Open
Abstract
OBJECTIVE The likelihood of timely treatment for cervical cancer increases with timely detection of abnormal cervical cells. Automated methods of detecting abnormal cervical cells were established because manual identification requires skilled pathologists and is time consuming and prone to error. The purpose of this systematic review is to evaluate the diagnostic performance of artificial intelligence (AI) technologies for the prediction, screening, and diagnosis of cervical cancer and pre-cancerous lesions. MATERIALS AND METHODS Comprehensive searches were performed on three databases: Medline, Web of Science Core Collection (Indexes = SCI-EXPANDED, SSCI, A & HCI Timespan) and Scopus to find papers published until July 2022. Articles that applied any AI technique for the prediction, screening, and diagnosis of cervical cancer were included in the review. No time restriction was applied. Articles were searched, screened, incorporated, and analyzed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-analyses guidelines. RESULTS The primary search yielded 2538 articles. After screening and evaluation of eligibility, 117 studies were incorporated in the review. AI techniques were found to play a significant role in screening systems for pre-cancerous and cancerous cervical lesions. The accuracy of the algorithms in predicting cervical cancer varied from 70% to 100%. AI techniques make a distinction between cancerous and normal Pap smears with 80-100% accuracy. AI is expected to serve as a practical tool for doctors in making accurate clinical diagnoses. The reported sensitivity and specificity of AI in colposcopy for the detection of CIN2+ were 71.9-98.22% and 51.8-96.2%, respectively. CONCLUSION The present review highlights the acceptable performance of AI systems in the prediction, screening, or detection of cervical cancer and pre-cancerous lesions, especially when faced with a paucity of specialized centers or medical resources. In combination with human evaluation, AI could serve as a helpful tool in the interpretation of cervical smears or images.
Collapse
Affiliation(s)
- Leila Allahqoli
- Midwifery Department, Ministry of Health and Medical Education, Tehran 1467664961, Iran
| | - Antonio Simone Laganà
- Unit of Gynecologic Oncology, ARNAS “Civico-Di Cristina-Benfratelli”, Department of Health Promotion, Mother and Child Care, Internal Medicine and Medical Specialties (PROMISE), University of Palermo, 90127 Palermo, Italy
| | - Afrooz Mazidimoradi
- Neyriz Public Health Clinic, Shiraz University of Medical Sciences, Shiraz 7134814336, Iran
| | - Hamid Salehiniya
- Social Determinants of Health Research Center, Birjand University of Medical Sciences, Birjand 9717853577, Iran
| | - Veronika Günther
- University Hospitals Schleswig-Holstein, Campus Kiel, Kiel School of Gynaecological Endoscopy, Arnold-Heller-Str. 3, Haus 24, 24105 Kiel, Germany
| | - Vito Chiantera
- Unit of Gynecologic Oncology, ARNAS “Civico-Di Cristina-Benfratelli”, Department of Health Promotion, Mother and Child Care, Internal Medicine and Medical Specialties (PROMISE), University of Palermo, 90127 Palermo, Italy
| | - Shirin Karimi Goghari
- School of Industrial and Systems Engineering, Tarbiat Modares University (TMU), Tehran 1411713114, Iran
| | - Mohammad Matin Ghiasvand
- Department of Computer Engineering, Amirkabir University of Technology (AUT), Tehran 1591634311, Iran
| | - Azam Rahmani
- Nursing and Midwifery Care Research Centre, School of Nursing and Midwifery, Tehran University of Medical Sciences, Tehran 141973317, Iran
| | - Zohre Momenimovahed
- Reproductive Health Department, Qom University of Medical Sciences, Qom 3716993456, Iran
| | - Ibrahim Alkatout
- University Hospitals Schleswig-Holstein, Campus Kiel, Kiel School of Gynaecological Endoscopy, Arnold-Heller-Str. 3, Haus 24, 24105 Kiel, Germany
| |
Collapse
|
27
|
Coole JB, Brenes D, Possati-Resende JC, Antoniazzi M, Fonseca BDO, Maker Y, Kortum A, Vohra IS, Schwarz RA, Carns J, Borba Souza KC, Vidigal Santana IV, Kreitchmann R, Salcedo MP, Ramanujam N, Schmeler KM, Richards-Kortum R. Development of a multimodal mobile colposcope for real-time cervical cancer detection. BIOMEDICAL OPTICS EXPRESS 2022; 13:5116-5130. [PMID: 36425643 PMCID: PMC9664871 DOI: 10.1364/boe.463253] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/10/2022] [Revised: 08/19/2022] [Accepted: 08/23/2022] [Indexed: 06/16/2023]
Abstract
Cervical cancer remains a leading cause of cancer death among women in low- and middle-income countries. Globally, cervical cancer prevention programs are hampered by a lack of resources, infrastructure, and personnel. We describe a multimodal mobile colposcope (MMC) designed to diagnose precancerous cervical lesions at the point-of-care without the need for biopsy. The MMC integrates two complementary imaging systems: 1) a commercially available colposcope and 2) a high-speed, high-resolution, fiber-optic microendoscope (HRME). Combining these two imaging modalities makes it possible, for the first time, to locate suspicious cervical lesions using widefield imaging and then to obtain co-registered high-resolution images across an entire lesion. The MMC overcomes limitations of high-resolution imaging alone; widefield imaging can be used to guide the placement of the high-resolution imaging probe at clinically suspicious regions and co-registered, mosaicked high-resolution images effectively increase the field of view of high-resolution imaging. Representative data collected from patients referred for colposcopy at Barretos Cancer Hospital in Brazil, including 22,800 high-resolution images and 9,900 colposcope images, illustrate the ability of the MMC to identify abnormal cervical regions, image suspicious areas with subcellular resolution, and distinguish between high-grade and low-grade dysplasia.
Collapse
Affiliation(s)
- Jackson B. Coole
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
| | - David Brenes
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
| | | | - Márcio Antoniazzi
- Barretos Cancer Hospital, Department of Prevention, Barretos, Brazil
| | | | - Yajur Maker
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
| | - Alex Kortum
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
| | - Imran S. Vohra
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
| | | | - Jennifer Carns
- Rice University, Department of Bioengineering, Houston, TX 77005, USA
| | | | | | - Regis Kreitchmann
- Federal University of Health Sciences of Porto Alegre (UFCSPA)/Santa Casa Hospital of Porto Alegre, Department of Obstetrics and Gynecology, Porto Alegre, Brazil
| | - Mila P. Salcedo
- Federal University of Health Sciences of Porto Alegre (UFCSPA)/Santa Casa Hospital of Porto Alegre, Department of Obstetrics and Gynecology, Porto Alegre, Brazil
- The University of Texas MD Anderson Cancer Center, Department of Gynecologic Oncology and Reproductive Medicine, Houston, TX 77005, USA
| | - Nirmala Ramanujam
- Duke University, Department of Biomedical Engineering, Durham, NC 27708, USA
| | - Kathleen M. Schmeler
- The University of Texas MD Anderson Cancer Center, Department of Gynecologic Oncology and Reproductive Medicine, Houston, TX 77005, USA
| | | |
Collapse
|
28
|
Thai PL, Merry Geisa J. Classification of microscopic cervical blood cells using inception ResNet V2 with modified activation function. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS 2022. [DOI: 10.3233/jifs-220511] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Cervical cancer is the most frequent and fatal malignancy among women worldwide. If this tumor is detected and treated early enough, the complications it causes can be minimized. Deep learning has demonstrated significant promise when applied to biomedical problems such as medical image processing and disease prognostication. Therefore, in this paper, an automatic cervical cell classification approach named IR-PapNet is developed based on Inception-ResNet, which is an optimized version of Inception. The learning model's conventional ReLU activation is replaced with the parametric rectified linear unit (PReLU) to overcome the nullification of negative values and the dying ReLU problem. Finally, the model loss function is minimized with the SGD optimizer by modifying the attributes of the neural network. Furthermore, we present a simple but efficient noise removal technique, the 2D discrete wavelet transform (2D-DWT) algorithm, for enhancing image quality. Experimental results show that this model can achieve a top-1 average identification accuracy of 99.8% on the Herlev cervical Pap smear dataset, which verifies its satisfactory performance. The restructured Inception-ResNet network model can obtain significant improvements over most of the state-of-the-art models in 2-class classification, and it achieves a high learning rate without experiencing dead nodes.
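The activation swap is mechanical to express in code: traverse the backbone and replace every ReLU module with PReLU, then train with SGD. The sketch below uses the timm implementation of Inception-ResNet-V2 as a stand-in backbone; the model source, class count, and hyperparameters are assumptions, not the paper's configuration.

```python
# Sketch of the activation swap described: replace every ReLU in an
# Inception-ResNet backbone with PReLU and train with SGD. The timm model name
# and the hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn
import timm

def relu_to_prelu(module: nn.Module):
    """Recursively replace nn.ReLU modules with nn.PReLU."""
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, nn.PReLU())
        else:
            relu_to_prelu(child)

model = timm.create_model("inception_resnet_v2", pretrained=False, num_classes=2)
relu_to_prelu(model)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()

x = torch.randn(2, 3, 299, 299)            # Inception-style input size
loss = criterion(model(x), torch.tensor([0, 1]))
loss.backward()
optimizer.step()
```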
Collapse
Affiliation(s)
- Pon L.T. Thai
- Department of Computer Science and Engineering, Arunachala College of Engineering for Women, Nagercoil, Tamil Nadu, India
| | - J. Merry Geisa
- Department of Electrical and Electronics Engineering, St. Xavier’s Catholic College of Engineering, Nagercoil, Tamil Nadu, India
| |
Collapse
|
29
|
Yu H, Fan Y, Ma H, Zhang H, Cao C, Yu X, Sun J, Cao Y, Liu Y. Segmentation of the cervical lesion region in colposcopic images based on deep learning. Front Oncol 2022; 12:952847. [PMID: 35992860 PMCID: PMC9385196 DOI: 10.3389/fonc.2022.952847] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2022] [Accepted: 07/04/2022] [Indexed: 11/13/2022] Open
Abstract
Background Colposcopy is an important method in the diagnosis of cervical lesions. However, experienced colposcopists are lacking at present, and the training cycle is long. Therefore, the artificial intelligence-based colposcopy-assisted examination has great prospects. In this paper, a cervical lesion segmentation model (CLS-Model) was proposed for cervical lesion region segmentation from colposcopic post-acetic-acid images and accurate segmentation results could provide a good foundation for further research on the classification of the lesion and the selection of biopsy site. Methods First, the improved Faster Region-convolutional neural network (R-CNN) was used to obtain the cervical region without interference from other tissues or instruments. Afterward, a deep convolutional neural network (CLS-Net) was proposed, which used EfficientNet-B3 to extract the features of the cervical region and used the redesigned atrous spatial pyramid pooling (ASPP) module according to the size of the lesion region and the feature map after subsampling to capture multiscale features. We also used cross-layer feature fusion to achieve fine segmentation of the lesion region. Finally, the segmentation result was mapped to the original image. Results Experiments showed that on 5455 LSIL+ (including cervical intraepithelial neoplasia and cervical cancer) colposcopic post-acetic-acid images, the accuracy, specificity, sensitivity, and dice coefficient of the proposed model were 93.04%, 96.00%, 74.78%, and 73.71%, respectively, which were all higher than those of the mainstream segmentation model. Conclusion The CLS-Model proposed in this paper has good performance in the segmentation of cervical lesions in colposcopic post-acetic-acid images and can better assist colposcopists in improving the diagnostic level.
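The atrous spatial pyramid pooling block at the heart of such a segmentation head can be written compactly: parallel dilated convolutions plus image-level pooling, concatenated and projected by a 1x1 convolution. The dilation rates and channel widths below are illustrative; the paper redesigns the module for its own feature-map and lesion sizes.

```python
# Sketch of an atrous spatial pyramid pooling (ASPP) block of the kind CLS-Net
# redesigns: parallel dilated convolutions plus image-level pooling, fused by a
# 1x1 convolution. Dilation rates and channel widths are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ASPP(nn.Module):
    def __init__(self, in_ch, out_ch, rates=(6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, 1)] +
            [nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r) for r in rates]
        )
        self.image_pool = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(in_ch, out_ch, 1)
        )
        self.project = nn.Conv2d(out_ch * (len(rates) + 2), out_ch, 1)

    def forward(self, x):
        h, w = x.shape[2:]
        feats = [b(x) for b in self.branches]
        pooled = F.interpolate(self.image_pool(x), size=(h, w),
                               mode="bilinear", align_corners=False)
        return self.project(torch.cat(feats + [pooled], dim=1))

y = ASPP(in_ch=384, out_ch=256)(torch.randn(1, 384, 16, 16))
print(y.shape)  # torch.Size([1, 256, 16, 16])
```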
Collapse
Affiliation(s)
- Hui Yu
- Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin, China
- School of Precision Instrument and Optoelectronics Engineering, Tianjin University, Tianjin, China
| | - Yinuo Fan
- Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin, China
| | - Huizhan Ma
- School of Precision Instrument and Optoelectronics Engineering, Tianjin University, Tianjin, China
| | - Haifeng Zhang
- Obstetrics and Gynecology, Affiliated Hospital of Weifang Medical University, Weifang, China
| | - Chengcheng Cao
- Obstetrics and Gynecology, Affiliated Hospital of Weifang Medical University, Weifang, China
| | - Xuyao Yu
- Tianjin Medical University Cancer Institute and Hospital, National Clinical Research Center for Cancer, Tianjin, China
| | - Jinglai Sun
- School of Precision Instrument and Optoelectronics Engineering, Tianjin University, Tianjin, China
| | - Yuzhen Cao
- School of Precision Instrument and Optoelectronics Engineering, Tianjin University, Tianjin, China
| | - Yuzhen Liu
- Obstetrics and Gynecology, Affiliated Hospital of Weifang Medical University, Weifang, China
| |
Collapse
|
30
|
Agustiansyah P, Nurmaini S, Nuranna L, Irfannuddin I, Sanif R, Legiran L, Rachmatullah MN, Florina GO, Sapitri AI, Darmawahyuni A. Automated Precancerous Lesion Screening Using an Instance Segmentation Technique for Improving Accuracy. SENSORS (BASEL, SWITZERLAND) 2022; 22:5489. [PMID: 35897993 PMCID: PMC9332449 DOI: 10.3390/s22155489] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 05/30/2022] [Revised: 07/13/2022] [Accepted: 07/17/2022] [Indexed: 06/15/2023]
Abstract
Precancerous screening using visual inspection with acetic acid (VIA) is suggested by the World Health Organization (WHO) for low-middle-income countries (LMICs). However, because of the limited number of gynecological oncologists in LMICs, VIA screening is primarily performed by general clinicians, nurses, or midwives (called medical workers). Medical workers who are unable to recognize the significant pathophysiology of human papillomavirus (HPV) infection in terms of the columnar epithelial-cell, squamous epithelial-cell, and white-spot regions with abnormal blood vessels contribute to the wide range of sensitivity (49-98%) and specificity (75-91%) achieved by VIA screening; this might lead to false results and high interobserver variance. Hence, the automated detection of the columnar area (CA), subepithelial region of the squamocolumnar junction (SCJ), and acetowhite (AW) lesions is needed to support an accurate diagnosis. This study proposes a mask-RCNN architecture to simultaneously segment, classify, and detect CA and AW lesions. We conducted several experiments using 262 images of VIA+ cervicograms and 222 images of VIA- cervicograms. The proposed model provided a satisfactory intersection over union performance for the CA of about 63.60%, and AW lesions of about 73.98%. The dice similarity coefficient performance was about 75.67% for the CA and about 80.49% for the AW lesion. It also performed well in cervical-cancer precursor-lesion detection, with a mean average precision of about 86.90% for the CA and about 100% for the AW lesion, while also achieving 100% sensitivity and 92% specificity. Our proposed model with the instance segmentation approach can segment, detect, and classify cervical-cancer precursor lesions with satisfactory performance from a VIA cervicogram alone.
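An instance-segmentation setup of this kind can be configured from torchvision's Mask R-CNN by replacing its box and mask heads for the two target structures plus background. This follows the standard torchvision fine-tuning recipe and is shown only as a sketch; it is not the authors' exact architecture or training code.

```python
# Sketch: configuring torchvision's Mask R-CNN to segment and classify two
# cervicogram structures (columnar area and acetowhite lesion) plus background.
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

num_classes = 3  # background, columnar area (CA), acetowhite lesion (AW)

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box classification head for the three classes.
in_feats = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, num_classes)

# Replace the mask prediction head for the three classes.
in_feats_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_feats_mask, 256, num_classes)

# Training: model(images, targets) returns a dict of losses, where each target
# holds "boxes", "labels", and "masks" for the CA/AW annotations of one image.
```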
Collapse
Affiliation(s)
- Patiyus Agustiansyah
- Doctoral Program, Biology Science, Faculty of Medicine, Universitas Sriwijaya, Palembang 30139, Indonesia;
- Division of Oncology-Gynecology, Department of Obstetrics and Gynecology, Mohammad Hoesin General Hospital, Palembang 30126, Indonesia
| | - Siti Nurmaini
- Intelligent System Research Group, Faculty of Computer Science, Universitas Sriwijaya, Palembang 30139, Indonesia; (M.N.R.); (G.O.F.); (A.I.S.); (A.D.)
| | - Laila Nuranna
- Obstetrics & Gynecology Department, Faculty of Medicine, University of Indonesia, Jakarta 10430, Indonesia;
| | - Irfannuddin Irfannuddin
- Obstetrics & Gynecology Department, Faculty of Medicine, Universitas Sriwijaya, Palembang 30139, Indonesia; (I.I.); (R.S.); (L.L.)
| | - Rizal Sanif
- Obstetrics & Gynecology Department, Faculty of Medicine, Universitas Sriwijaya, Palembang 30139, Indonesia; (I.I.); (R.S.); (L.L.)
| | - Legiran Legiran
- Obstetrics & Gynecology Department, Faculty of Medicine, Universitas Sriwijaya, Palembang 30139, Indonesia; (I.I.); (R.S.); (L.L.)
| | - Muhammad Naufal Rachmatullah
- Intelligent System Research Group, Faculty of Computer Science, Universitas Sriwijaya, Palembang 30139, Indonesia; (M.N.R.); (G.O.F.); (A.I.S.); (A.D.)
| | - Gavira Olipa Florina
- Intelligent System Research Group, Faculty of Computer Science, Universitas Sriwijaya, Palembang 30139, Indonesia; (M.N.R.); (G.O.F.); (A.I.S.); (A.D.)
| | - Ade Iriani Sapitri
- Intelligent System Research Group, Faculty of Computer Science, Universitas Sriwijaya, Palembang 30139, Indonesia; (M.N.R.); (G.O.F.); (A.I.S.); (A.D.)
| | - Annisa Darmawahyuni
- Intelligent System Research Group, Faculty of Computer Science, Universitas Sriwijaya, Palembang 30139, Indonesia; (M.N.R.); (G.O.F.); (A.I.S.); (A.D.)
| |
Collapse
|
31
|
Li P, Wang X, Liu P, Xu T, Sun P, Dong B, Xue H. Cervical Lesion Classification Method Based on Cross-Validation Decision Fusion Method of Vision Transformer and DenseNet. JOURNAL OF HEALTHCARE ENGINEERING 2022; 2022:3241422. [PMID: 35607393 PMCID: PMC9124126 DOI: 10.1155/2022/3241422] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/31/2022] [Revised: 04/24/2022] [Accepted: 04/28/2022] [Indexed: 11/17/2022]
Abstract
Objective In order to better adapt to clinical applications, this paper proposes a cross-validation decision-making fusion method combining a Vision Transformer and DenseNet161. Methods The dataset consists of acetic acid images, the most critical images for clinical diagnosis, and the SR areas are processed by a specific method. Then, the Vision Transformer and DenseNet161 models are trained with fivefold cross-validation, and the fivefold prediction results of the two models are fused with different weights. Finally, the five fused results are averaged to obtain the category with the highest probability. Results The results show that the fusion method in this paper reaches an accuracy of 68% for the four-class classification of cervical lesions. Conclusions The method is more suitable for clinical environments, effectively reducing the missed detection rate and ensuring the life and health of patients.
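The decision-fusion step itself reduces to a weighted average of per-fold class probabilities from the two models, followed by averaging the five fused results and taking the arg-max class. A minimal NumPy sketch follows; the weight value and array shapes are illustrative.

```python
# Sketch of the decision-fusion step described: per-fold class probabilities from
# the two models are combined with a weight, the five fused results are averaged,
# and the highest-probability class is returned. The weight is illustrative.
import numpy as np

def fuse_predictions(vit_probs, dense_probs, w_vit=0.5):
    """
    vit_probs, dense_probs: arrays of shape (n_folds, n_samples, n_classes)
    holding softmax outputs of the ViT and DenseNet161 models per fold.
    """
    fused_per_fold = w_vit * vit_probs + (1.0 - w_vit) * dense_probs
    fused = fused_per_fold.mean(axis=0)      # average the five fused results
    return fused.argmax(axis=1)              # class with the highest probability

rng = np.random.default_rng(0)
vit = rng.dirichlet(np.ones(4), size=(5, 8))     # 5 folds, 8 samples, 4 classes
dense = rng.dirichlet(np.ones(4), size=(5, 8))
print(fuse_predictions(vit, dense, w_vit=0.6))
```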
Collapse
Affiliation(s)
- Ping Li
- Department of Gynecology and Obstetrics, Quanzhou First Hospital Affiliated to Fujian Medical University, Quanzhou 362000, Fujian, China
| | - Xiaoxia Wang
- School of Medicine, Huaqiao University, Quanzhou 362000, Fujian, China
| | - Peizhong Liu
- School of Medicine, Huaqiao University, Quanzhou 362000, Fujian, China
- College of Engineering, Huaqiao University, Quanzhou 362000, Fujian, China
| | - Tianxiang Xu
- College of Engineering, Huaqiao University, Quanzhou 362000, Fujian, China
| | - Pengming Sun
- Fujian Maternity and Child Health Hospital, Affiliated Hospital of Fujian Medical University, Fuzhou 350001, Fujian, China
| | - Binhua Dong
- Fujian Maternity and Child Health Hospital, Affiliated Hospital of Fujian Medical University, Fuzhou 350001, Fujian, China
| | - Huifeng Xue
- Fujian Maternity and Child Health Hospital, Affiliated Hospital of Fujian Medical University, Fuzhou 350001, Fujian, China
| |
Collapse
|
32
|
Brenes D, Barberan CJ, Hunt B, Parra SG, Salcedo MP, Possati-Resende JC, Cremer ML, Castle PE, Fregnani JHTG, Maza M, Schmeler KM, Baraniuk R, Richards-Kortum R. Multi-task network for automated analysis of high-resolution endomicroscopy images to detect cervical precancer and cancer. Comput Med Imaging Graph 2022; 97:102052. [PMID: 35299096 PMCID: PMC9250128 DOI: 10.1016/j.compmedimag.2022.102052] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2021] [Revised: 02/04/2022] [Accepted: 02/10/2022] [Indexed: 10/19/2022]
Abstract
Cervical cancer is a public health emergency in low- and middle-income countries where resource limitations hamper standard-of-care prevention strategies. The high-resolution endomicroscope (HRME) is a low-cost, point-of-care device with which care providers can image the nuclear morphology of cervical lesions. Here, we propose a deep learning framework to diagnose cervical intraepithelial neoplasia grade 2 or more severe from HRME images. The proposed multi-task convolutional neural network uses nuclear segmentation to learn a diagnostically relevant representation. Nuclear segmentation was trained via proxy labels to circumvent the need for expensive, manually annotated nuclear masks. A dataset of images from over 1600 patients was used to train, validate, and test our algorithm; data from 20% of patients were reserved for testing. An external evaluation set with images from 508 patients was used to further validate our findings. The proposed method consistently outperformed other state-of-the-art architectures achieving a test per patient area under the receiver operating characteristic curve (AUC-ROC) of 0.87. Performance was comparable to expert colposcopy with a test sensitivity and specificity of 0.94 (p = 0.3) and 0.58 (p = 1.0), respectively. Patients with recurrent human papillomavirus (HPV) infections are at a higher risk of developing cervical cancer. Thus, we sought to incorporate HPV DNA test results as a feature to inform prediction. We found that incorporating patient HPV status improved test specificity to 0.71 at a sensitivity of 0.94.
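The multi-task idea of pairing a segmentation head (trained on proxy nuclear masks) with a diagnostic classification head over a shared encoder can be sketched as a weighted sum of two losses. The tiny encoder, loss weight, and tensor shapes below are illustrative assumptions, not the published network.

```python
# Sketch of a multi-task objective of the kind described: a shared encoder with a
# nuclear-segmentation head and a diagnostic-classification head, trained with a
# weighted sum of the two losses. Architecture and weights are illustrative.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.seg_head = nn.Conv2d(32, 1, 1)                  # nuclear mask logits
        self.cls_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(32, 2))      # CIN2+ vs. <CIN2

    def forward(self, x):
        feats = self.encoder(x)
        return self.seg_head(feats), self.cls_head(feats)

model = MultiTaskNet()
img = torch.randn(4, 1, 128, 128)                      # grayscale HRME patches
mask = torch.randint(0, 2, (4, 1, 128, 128)).float()   # proxy nuclear masks
label = torch.randint(0, 2, (4,))

seg_logits, cls_logits = model(img)
loss = (nn.functional.binary_cross_entropy_with_logits(seg_logits, mask)
        + 0.5 * nn.functional.cross_entropy(cls_logits, label))
loss.backward()
```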
Collapse
Affiliation(s)
| | | | - Brady Hunt
- Rice University, Houston, TX 77005, USA.
| | | | - Mila P Salcedo
- University of Texas MD Anderson Cancer Center, Houston, TX 77030, USA.
| | | | | | | | | | - Mauricio Maza
- Basic Health International, San Salvador, El Salvador.
| | | | | | | |
Collapse
|
33
|
Ben M'Barek I, Jauvion G, Ceccaldi PF. [Artificial Intelligence in medicine: What about gynecology-obstetric?]. GYNECOLOGIE, OBSTETRIQUE, FERTILITE & SENOLOGIE 2022; 50:340-343. [PMID: 35183787 DOI: 10.1016/j.gofs.2022.02.075] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/06/2021] [Revised: 01/17/2022] [Accepted: 02/10/2022] [Indexed: 06/14/2023]
Affiliation(s)
- I Ben M'Barek
- Service de gynécologie obstétrique, Assistance publique-Hôpitaux de Paris-Beaujon, 100, boulevard du Général-Leclerc, Clichy, France; Université de Paris, 75006 Paris, France; Département de simulation en Santé, Université de Paris, Paris, France.
| | | | - P-F Ceccaldi
- Service de gynécologie obstétrique, Assistance publique-Hôpitaux de Paris-Beaujon, 100, boulevard du Général-Leclerc, Clichy, France; Université de Paris, 75006 Paris, France; Département de simulation en Santé, Université de Paris, Paris, France
| |
Collapse
|
34
|
Elakkiya R, Subramaniyaswamy V, Vijayakumar V, Mahanti A. Cervical Cancer Diagnostics Healthcare System Using Hybrid Object Detection Adversarial Networks. IEEE J Biomed Health Inform 2022; 26:1464-1471. [PMID: 34214045 DOI: 10.1109/jbhi.2021.3094311] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022]
Abstract
Cervical cancer is one of the common cancers among women, and it causes significant mortality in many developing countries. Diagnosis of cervical lesions is done using the Pap smear test or visual inspection with acetic acid (staining). Digital colposcopy, an inexpensive methodology, provides painless and efficient screening results. Therefore, automating cervical cancer screening using colposcopy images will be highly useful in saving many lives. Nowadays, many automation techniques using computer vision and machine learning in cervical screening have gained attention, paving the way for diagnosing cervical cancer. However, most of the methods rely entirely on the annotation of cervical spotting and segmentation. This paper aims to introduce the Faster Small-Object Detection Neural Networks (FSOD-GAN) to address the cervical screening and diagnosis of cervical cancer and the type of cancer using digital colposcopy images. The proposed approach automatically detects the cervical spot using Faster Region-Based Convolutional Neural Network (FR-CNN) and performs the hierarchical multiclass classification of three types of cervical cancer lesions. Experimentation was done with colposcopy data collected from available open sources consisting of 1,993 patients with three cervical categories, and the proposed approach shows 99% accuracy in diagnosing the stages of cervical cancer.
Collapse
|
35
|
Chen K, Wang Q, Ma Y. Cervical optical coherence tomography image classification based on contrastive self-supervised texture learning. Med Phys 2022; 49:3638-3653. [PMID: 35342956 DOI: 10.1002/mp.15630] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2021] [Revised: 02/26/2022] [Accepted: 03/16/2022] [Indexed: 12/24/2022] Open
Abstract
BACKGROUND Cervical cancer seriously affects the health of the female reproductive system. Optical coherence tomography (OCT) has emerged as a non-invasive, high-resolution imaging technology for cervical disease detection. However, OCT image annotation is knowledge-intensive and time-consuming, which impedes the training process of deep-learning-based classification models. PURPOSE This study aims to develop a computer-aided diagnosis (CADx) approach to classifying in-vivo cervical OCT images based on self-supervised learning. METHODS In addition to high-level semantic features extracted by a convolutional neural network (CNN), the proposed CADx approach designs a contrastive texture learning (CTL) strategy to leverage unlabeled cervical OCT images' texture features. We conducted ten-fold cross-validation on the OCT image dataset from a multi-center clinical study on 733 patients from China. RESULTS In a binary classification task for detecting high-risk diseases, including high-grade squamous intraepithelial lesion and cervical cancer, our method achieved an area-under-the-curve value of 0.9798 ± 0.0157 with a sensitivity of 91.17 ± 4.99% and a specificity of 93.96 ± 4.72% for OCT image patches; also, it outperformed two out of four medical experts on the test set. Furthermore, our method achieved a 91.53% sensitivity and 97.37% specificity on an external validation dataset containing 287 3D OCT volumes from 118 Chinese patients in a new hospital using a cross-shaped threshold voting strategy. CONCLUSIONS The proposed contrastive-learning-based CADx method outperformed the end-to-end CNN models and provided better interpretability based on texture features, which holds great potential to be used in the clinical protocol of "see-and-treat."
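Contrastive self-supervision of this kind is usually built around a normalized temperature-scaled cross-entropy (NT-Xent) objective over two augmented views of each unlabeled image. The sketch below shows that standard loss; the authors' CTL strategy may differ in its texture-specific details.

```python
# Sketch of a SimCLR-style contrastive (NT-Xent) loss, the standard objective behind
# contrastive self-supervised learning. Two augmented "views" of each unlabeled OCT
# patch form a positive pair; all other samples in the batch act as negatives.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """z1, z2: (N, D) embeddings of two augmented views of the same N patches."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, D)
    sim = z @ z.t() / temperature                              # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))                      # ignore self-similarity
    # The positive for sample i is its other view: index (i + n) mod 2N.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

loss = nt_xent_loss(torch.randn(8, 128), torch.randn(8, 128))
print(loss.item())
```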
Collapse
Affiliation(s)
- Kaiyi Chen
- School of Computer Science, Wuhan University, Wuhan, 430072, China
| | - Qingbin Wang
- School of Computer Science, Wuhan University, Wuhan, 430072, China
| | - Yutao Ma
- School of Computer Science, Wuhan University, Wuhan, 430072, China
| |
Collapse
|
36
|
Hou X, Shen G, Zhou L, Li Y, Wang T, Ma X. Artificial Intelligence in Cervical Cancer Screening and Diagnosis. Front Oncol 2022; 12:851367. [PMID: 35359358 PMCID: PMC8963491 DOI: 10.3389/fonc.2022.851367] [Citation(s) in RCA: 54] [Impact Index Per Article: 18.0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2022] [Accepted: 02/10/2022] [Indexed: 12/11/2022] Open
Abstract
Cervical cancer remains a leading cause of cancer death in women, seriously threatening their physical and mental health. It is an easily preventable cancer with early screening and diagnosis. Although technical advancements have significantly improved the early diagnosis of cervical cancer, accurate diagnosis remains difficult owing to various factors. In recent years, artificial intelligence (AI)-based medical diagnostic applications have been on the rise and have excellent applicability in the screening and diagnosis of cervical cancer. Their benefits include reduced time consumption, reduced need for professional and technical personnel, and no bias owing to subjective factors. We, thus, aimed to discuss how AI can be used in cervical cancer screening and diagnosis, particularly to improve the accuracy of early diagnosis. The application and challenges of using AI in the diagnosis and treatment of cervical cancer are also discussed.
Collapse
Affiliation(s)
- Xin Hou
- Department of Obstetrics and Gynecology, Tongji Medical College, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, China
| | - Guangyang Shen
- Department of Obstetrics and Gynecology, Tongji Medical College, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, China
| | - Liqiang Zhou
- Cancer Centre and Center of Reproduction, Development and Aging, Faculty of Health Sciences, University of Macau, Macau, Macau SAR, China
| | - Yinuo Li
- Department of Obstetrics and Gynecology, Tongji Medical College, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, China
| | - Tian Wang
- Department of Obstetrics and Gynecology, Tongji Medical College, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, China
| | - Xiangyi Ma
- Department of Obstetrics and Gynecology, Tongji Medical College, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, China
- *Correspondence: Xiangyi Ma,
| |
Collapse
|
37
|
Takahashi T, Matsuoka H, Sakurai R, Akatsuka J, Kobayashi Y, Nakamura M, Iwata T, Banno K, Matsuzaki M, Takayama J, Aoki D, Yamamoto Y, Tamiya G. Development of a prognostic prediction support system for cervical intraepithelial neoplasia using artificial intelligence-based diagnosis. J Gynecol Oncol 2022; 33:e57. [PMID: 35712970 PMCID: PMC9428307 DOI: 10.3802/jgo.2022.33.e57] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2022] [Revised: 04/07/2022] [Accepted: 04/29/2022] [Indexed: 11/30/2022] Open
Abstract
Objective Human papillomavirus subtypes are predictive indicators of cervical intraepithelial neoplasia (CIN) progression. While colposcopy is also an essential part of cervical cancer prevention, its accuracy and reproducibility are limited because of subjective evaluation. This study aimed to develop an artificial intelligence (AI) algorithm that can accurately detect the optimal lesion associated with prognosis using colposcopic images of CIN2 patients by utilizing objective AI diagnosis. Methods We identified colposcopic findings associated with the prognosis of patients with CIN2. We developed a convolutional neural network that can automatically detect the rate of high-grade lesions in the uterovaginal area in 12 segments. We finally evaluated the detection accuracy of our AI algorithm compared with the scores by multiple gynecologic oncologists. Results High-grade lesion occupancy in the uterovaginal area detected by senior colposcopists was significantly correlated with the prognosis of patients with CIN2. The detection rate for high-grade lesions in 12 segments of the uterovaginal area by the AI system was 62.1% for recall, and the overall correct response rate was 89.7%. Moreover, the percentage of high-grade lesions detected by the AI system was significantly correlated with the rate detected by multiple gynecologic senior oncologists (r=0.61). Conclusion Our novel AI algorithm can accurately determine high-grade lesions associated with prognosis on colposcopic images, and these results provide an insight into the additional utility of colposcopy for the management of patients with CIN2. High-grade lesion occupancy in the uterovaginal area was significantly correlated with CIN2 patients’ prognosis. The number of high-grade lesions in 12 segments detected by an artificial intelligence (AI)-based system was comparable to that detected by senior colposcopists. The overall correct response rate of the AI algorithm for detecting high-grade lesions was 89.7%.
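The occupancy measure and its agreement with expert readings can be computed directly: the fraction of the 12 uterovaginal segments flagged high-grade per patient, correlated across patients. The sketch below uses toy data and Pearson correlation purely to illustrate that calculation.

```python
# Sketch: turning per-segment detections into a high-grade lesion occupancy score
# over the 12 uterovaginal segments and correlating it with expert readings, in the
# spirit of the reported r = 0.61. Toy data are used for illustration only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_patients = 30
ai_segments = rng.integers(0, 2, size=(n_patients, 12))        # AI flag per segment
expert_segments = np.clip(ai_segments + rng.integers(-1, 2, (n_patients, 12)), 0, 1)

ai_occupancy = ai_segments.mean(axis=1)        # fraction of 12 segments flagged
expert_occupancy = expert_segments.mean(axis=1)

r, p = pearsonr(ai_occupancy, expert_occupancy)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```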
Collapse
Affiliation(s)
- Takayuki Takahashi
- Statistical Genetics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
- Department of Obstetrics and Gynecology, Keio University School of Medicine, Tokyo, Japan
| | - Hikaru Matsuoka
- Statistical Genetics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
| | - Rieko Sakurai
- Statistical Genetics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
| | - Jun Akatsuka
- Pathology Informatics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
- Department of Urology, Nippon Medical School Hospital, Tokyo, Japan
| | - Yusuke Kobayashi
- Department of Obstetrics and Gynecology, Keio University School of Medicine, Tokyo, Japan
| | - Masaru Nakamura
- Department of Obstetrics and Gynecology, Keio University School of Medicine, Tokyo, Japan
| | - Takashi Iwata
- Department of Obstetrics and Gynecology, Keio University School of Medicine, Tokyo, Japan
| | - Kouji Banno
- Department of Obstetrics and Gynecology, Keio University School of Medicine, Tokyo, Japan
| | - Motomichi Matsuzaki
- Statistical Genetics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
| | - Jun Takayama
- Statistical Genetics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
- Tohoku University Graduate School of Medicine, Miyagi, Japan
| | - Daisuke Aoki
- Department of Obstetrics and Gynecology, Keio University School of Medicine, Tokyo, Japan
| | - Yoichiro Yamamoto
- Pathology Informatics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
| | - Gen Tamiya
- Statistical Genetics Team, RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
- Tohoku University Graduate School of Medicine, Miyagi, Japan
| |
Collapse
|
38
|
HLDnet: Novel deep learning based Artificial Intelligence tool fuses acetic acid and Lugol’s iodine cervicograms for accurate pre-cancer screening. Biomed Signal Process Control 2022. [DOI: 10.1016/j.bspc.2021.103163] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022]
|
39
|
Mitchell EM, Doede AL, McLean Estrada M, Granera OB, Maldonado F, Dunn B, Banks S, Marks-Symeonides I, Morrone D, Pitt C, Dillingham RA. Feasibility and Acceptability of Tele-Colposcopy on the Caribbean Coast of Nicaragua: A Descriptive Mixed-Methods Study. TELEMEDICINE REPORTS 2021; 2:264-272. [PMID: 35720751 PMCID: PMC9049806 DOI: 10.1089/tmr.2021.0024] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Accepted: 10/04/2021] [Indexed: 06/15/2023]
Abstract
Background: Cervical cancer, a preventable cancer of disparities, is the primary cause of cancer death for women in Nicaragua. Clinics and personnel in rural and remote Nicaragua may not be accessible to perform recommended screening or follow-up services. Objective: To assess acceptability and feasibility of integrating innovations for high-quality screening and treatment follow-up (tele-colposcopy) into existing pathways on Nicaragua's Caribbean Coast within the context of the National Cervical Cancer Control Program. Methods: Provider focus groups, key informant interviews, and environmental scans were conducted for 13 clinics on the Caribbean Coast of Nicaragua. Topics discussed included a smartphone-based mobile colposcope (MobileODT hardware and mobile platform), mobile connectivity capacity, clinic resources, provider acceptability, and current diagnostic and clinical protocols. We tested device connectivity through image upload availability and real-time video connection and simulated clinical encounters utilizing MobileODT and a low-cost cervical simulator. We developed a database of colposcopic images to establish feasibility of integrating this database and clinical characteristics into the cervical cancer registry. Results: Provider acceptability of integrating tele-colposcopy into existing cancer control efforts was high. Image upload connectivity varied by location (mean = 1 h 9 min). Most clinics had running water (84.6%) and consistent electricity (92.3%), but some did not have access to landline telephones (53.8%). Conclusions: As faster connectivity becomes available in remote settings, Mobile Health tools such as tele-colposcopy will be increasingly feasible to provide access to high-quality cervical cancer follow-up. World Health Organization guidance on integrating technology into existing programs will remain important to ensure programmatic efficacy, local relevance, and sustainability.
Collapse
Affiliation(s)
| | - Aubrey L. Doede
- University of Virginia School of Nursing, Charlottesville, Virginia, USA
| | | | | | | | - Brian Dunn
- University of Virginia Karen S. Rheuban Center for Telehealth, Charlottesville, Virginia, USA
| | | | | | - Danielle Morrone
- University of Virginia Health System, Charlottesville, Virginia, USA
| | - Charlotte Pitt
- University of Virginia Health System, Charlottesville, Virginia, USA
| | - Rebecca A. Dillingham
- University of Virginia School of Medicine, University of Virginia Center for Global Health, Charlottesville, Virginia, USA
| |
Collapse
|
40
|
Kumar Y, Gupta S, Singla R, Hu YC. A Systematic Review of Artificial Intelligence Techniques in Cancer Prediction and Diagnosis. ARCHIVES OF COMPUTATIONAL METHODS IN ENGINEERING : STATE OF THE ART REVIEWS 2021; 29:2043-2070. [PMID: 34602811 PMCID: PMC8475374 DOI: 10.1007/s11831-021-09648-w] [Citation(s) in RCA: 28] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/23/2021] [Accepted: 09/11/2021] [Indexed: 05/05/2023]
Abstract
Artificial intelligence has aided in the advancement of healthcare research. The availability of open-source healthcare statistics has prompted researchers to create applications that aid cancer detection and prognosis. Deep learning and machine learning models provide a reliable, rapid, and effective solution to deal with such challenging diseases in these circumstances. PRISMA guidelines were used to select articles published in Web of Science, EBSCO, and EMBASE between 2009 and 2021. In this study, we performed an efficient search and included the research articles that employed AI-based learning approaches for cancer prediction. A total of 185 papers are considered impactful for cancer prediction using conventional machine and deep learning-based classifications. In addition, the survey also discussed the work done by different researchers, highlighted the limitations of the existing literature, and performed comparisons using various parameters such as prediction rate, accuracy, sensitivity, specificity, dice score, detection rate, area under the curve, precision, recall, and F1-score. Five investigations were designed, and solutions to them were explored. Although multiple techniques recommended in the literature have achieved great prediction results, cancer mortality has still not been reduced. Thus, more extensive research to deal with the challenges in the area of cancer prediction is required.
Collapse
Affiliation(s)
- Yogesh Kumar
- Department of Computer Engineering, Indus Institute of Technology & Engineering, Indus University, Rancharda, Via: Shilaj, Ahmedabad, Gujarat 382115 India
| | - Surbhi Gupta
- School of Computer Science and Engineering, Model Institute of Engineering and Technology, Kot bhalwal, Jammu, J&K 181122 India
| | - Ruchi Singla
- Department of Research, Innovations, Sponsored Projects and Entrepreneurship, Chandigarh Group of Colleges, Landran, Mohali India
| | - Yu-Chen Hu
- Department of Computer Science and Information Management, Providence University, Taichung City, Taiwan, ROC
| |
Collapse
|
41
|
Yue Z, Ding S, Li X, Yang S, Zhang Y. Automatic Acetowhite Lesion Segmentation via Specular Reflection Removal and Deep Attention Network. IEEE J Biomed Health Inform 2021; 25:3529-3540. [PMID: 33684051 DOI: 10.1109/jbhi.2021.3064366] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022]
Abstract
Automatic acetowhite lesion segmentation in colposcopy images (cervigrams) is essential for assisting gynecologists in diagnosing cervical intraepithelial neoplasia grades and cervical cancer. It can also help gynecologists determine the correct lesion areas for further pathological examination. Existing computer-aided diagnosis algorithms show poor segmentation performance because of specular reflections, insufficient training data, and the inability to focus on semantically meaningful lesion parts. In this paper, a novel computer-aided diagnosis algorithm is proposed to segment acetowhite lesions in cervigrams automatically. To reduce the interference of specularities on segmentation performance, a specular reflection removal mechanism is presented to detect and inpaint these areas with precision. Moreover, we design a cervigram image classification network to classify pathology results and generate lesion attention maps, which are subsequently leveraged to guide a more accurate lesion segmentation task by the proposed lesion-aware convolutional neural network. We conducted comprehensive experiments to evaluate the proposed approaches on 3045 clinical cervigrams. Our results show that our method outperforms state-of-the-art approaches and achieves better Dice similarity coefficient and Hausdorff distance values in acetowhite lesion segmentation.
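As a rough illustration of the specular-reflection-removal step described above (not the authors' exact mechanism), a common baseline is to mask bright, low-saturation pixels and fill them by inpainting. The sketch below uses OpenCV; the threshold values and the input file name are assumptions.

```python
# Simplified baseline: detect saturated, desaturated pixels as specular
# highlights and fill them with Telea inpainting.
import cv2
import numpy as np

def remove_specular_reflections(bgr, value_thresh=230, sat_thresh=40, radius=5):
    """Mask bright, low-saturation pixels and inpaint them."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    sat, val = hsv[..., 1], hsv[..., 2]
    mask = ((val > value_thresh) & (sat < sat_thresh)).astype(np.uint8) * 255
    # Dilate slightly so inpainting also covers highlight borders.
    mask = cv2.dilate(mask, np.ones((5, 5), np.uint8), iterations=1)
    return cv2.inpaint(bgr, mask, radius, cv2.INPAINT_TELEA)

cervigram = cv2.imread("cervigram.jpg")            # hypothetical input image
clean = remove_specular_reflections(cervigram)
cv2.imwrite("cervigram_inpainted.jpg", clean)
```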
Collapse
|
42
|
Liu L, Wang Y, Liu X, Han S, Jia L, Meng L, Yang Z, Chen W, Zhang Y, Qiao X. Computer-aided diagnostic system based on deep learning for classifying colposcopy images. ANNALS OF TRANSLATIONAL MEDICINE 2021; 9:1045. [PMID: 34422957 PMCID: PMC8339824 DOI: 10.21037/atm-21-885] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/23/2021] [Accepted: 05/23/2021] [Indexed: 12/24/2022]
Abstract
Background Colposcopy is widely used to detect cervical cancer, but developing countries lack the experienced colposcopists needed for accurate diagnosis. Artificial intelligence (AI) is being widely used in computer-aided diagnosis (CAD) systems. In this study, we developed and validated a CAD model based on deep learning to classify cervical lesions on colposcopy images. Methods Patient data, including clinical information, colposcopy images, and pathological results, were collected from Qilu Hospital. The study included 15,276 images from 7,530 patients. We performed two classification tasks: normal cervix (NC) vs. low-grade squamous intraepithelial lesion or worse (LSIL+), and lesions less severe than high-grade squamous intraepithelial lesion (HSIL-) vs. HSIL+. A residual neural network (ResNet) model was used to calculate a per-patient probability reflecting the likelihood of lesions. Next, a combination model was constructed by incorporating the ResNet probability and clinical features. We divided the dataset into a training set, validation set, and testing set at a ratio of 7:1:2. Finally, we randomly selected 300 patients from the testing set and compared the results with the diagnoses of a senior colposcopist and a junior colposcopist. Results The model combining the ResNet probability with clinical features performed better than ResNet alone. In the classification of NC and LSIL+, the area under the receiver operating characteristic curve (AUC), accuracy, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were 0.953, 0.886, 0.932, 0.846, 0.838, and 0.936, respectively. In the classification of HSIL- and HSIL+, the AUC, accuracy, sensitivity, specificity, PPV, and NPV were 0.900, 0.807, 0.823, 0.800, 0.618, and 0.920, respectively. In both classification tasks, the model's diagnostic performance was comparable to that of the senior colposcopist and stronger than that of the junior colposcopist. Conclusions The CAD system for cervical lesion diagnosis based on deep learning performs well in the classification of cervical lesions and can provide an objective diagnostic basis for colposcopists.
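The combination-model idea (merging a per-patient CNN probability with clinical covariates) can be sketched as follows. This is an illustrative approximation using synthetic data and a logistic-regression combiner, not the authors' implementation; the covariate names are assumptions.

```python
# Sketch: combine a CNN-derived lesion probability with clinical features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
resnet_prob = rng.uniform(0, 1, n)                 # per-patient CNN probability
age = rng.integers(25, 65, n)                      # hypothetical clinical features
hpv_positive = rng.integers(0, 2, n)
cytology_abnormal = rng.integers(0, 2, n)
X = np.column_stack([resnet_prob, age, hpv_positive, cytology_abnormal])
# Synthetic labels loosely tied to the features, for demonstration only.
y = (resnet_prob + 0.2 * hpv_positive + rng.normal(0, 0.2, n) > 0.7).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X[:400], y[:400])
print("held-out AUC:", roc_auc_score(y[400:], clf.predict_proba(X[400:])[:, 1]))
```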
Collapse
Affiliation(s)
- Lu Liu
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Ying Wang
- Department of Obstetrics and Gynecology, Yidu Central Hospital of Weifang, Weifang, China
| | - Xiaoli Liu
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Sai Han
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Lin Jia
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Lihua Meng
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Ziyan Yang
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Wei Chen
- School and Hospital of Stomatology, Cheeloo College of Medicine, Shandong University & Shandong Key Laboratory of Oral Tissue Regeneration & Shandong Engineering Laboratory for Dental Materials and Oral Tissue Regeneration, Jinan, China
| | - Youzhong Zhang
- Department of Obstetrics and Gynecology, Qilu Hospital, Cheeloo College of Medicine, Shandong University, Jinan, China
| | - Xu Qiao
- School of Control Science and Engineering, Shandong University, Jinan, China
| |
Collapse
|
43
|
Cellphone enabled point-of-care assessment of breast tumor cytology and molecular HER2 expression from fine-needle aspirates. NPJ Breast Cancer 2021; 7:85. [PMID: 34215753 PMCID: PMC8253731 DOI: 10.1038/s41523-021-00290-0] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/11/2020] [Accepted: 06/03/2021] [Indexed: 12/13/2022] Open
Abstract
Management of breast cancer in limited-resource settings is hindered by a lack of low-cost, logistically sustainable approaches toward the molecular and cellular diagnostic pathology services needed to guide therapy. To address these limitations, we have developed a multimodal cellphone-based platform—the EpiView-D4—that can evaluate both cellular morphology and molecular expression of clinically relevant biomarkers directly from fine-needle aspiration (FNA) of breast tissue specimens within 1 h. The EpiView-D4 comprises two components: (1) an immunodiagnostic chip built upon a “non-fouling” polymer brush-coating (the “D4”), which quantifies expression of protein biomarkers directly from crude cell lysates, and (2) a custom cellphone-based optical microscope (“EpiView”) designed for imaging cytology preparations and D4 assay readout. As a proof-of-concept, we used the EpiView-D4 for assessment of human epidermal growth factor receptor-2 (HER2) expression and validated its performance using cancer cell lines, animal models, and human tissue specimens. We found that FNA cytology specimens (prepared in less than 5 min with rapid staining kits) imaged by the EpiView-D4 were adequate for assessment of lesional cellularity and tumor content. We also found the device could reliably distinguish between HER2 expression levels across multiple cell lines and animal xenografts. In a pilot study with human tissue (n = 19), we were able to accurately categorize HER2-negative and HER2-positive tumors from FNA specimens. Taken together, the EpiView-D4 offers a promising alternative to invasive—and often unavailable—pathology services and may enable the democratization of effective breast cancer management in limited-resource settings.
Collapse
|
44
|
Yan L, Li S, Guo Y, Ren P, Song H, Yang J, Shen X. Multi-state colposcopy image fusion for cervical precancerous lesion diagnosis using BF-CNN. Biomed Signal Process Control 2021. [DOI: 10.1016/j.bspc.2021.102700] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
|
45
|
Using Dynamic Features for Automatic Cervical Precancer Detection. Diagnostics (Basel) 2021; 11:diagnostics11040716. [PMID: 33920732 PMCID: PMC8073487 DOI: 10.3390/diagnostics11040716] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Revised: 04/07/2021] [Accepted: 04/15/2021] [Indexed: 11/17/2022] Open
Abstract
Cervical cancer remains a major public health concern in developing countries due to financial and human resource constraints. Visual inspection with acetic acid (VIA) of the cervix has been widely promoted and is routinely used as a low-cost primary screening test in low- and middle-income countries. It can be performed by a variety of health workers, and the result is immediate. VIA produces a transient whitening effect that appears and disappears differently in precancerous and cancerous lesions than in benign conditions. Colposcopes are often used during VIA to magnify the view of the cervix and allow clinicians to visually assess it. However, this assessment is generally subjective and unreliable, even for experienced clinicians. Computer-aided techniques may improve the accuracy of VIA diagnosis and be an important determinant in the promotion of cervical cancer screening. This work proposes a smartphone-based solution that automatically detects cervical precancer from dynamic features extracted from videos taken during VIA. The proposed solution achieves a sensitivity and specificity of 0.90 and 0.87, respectively, and could support screening in countries that lack expensive tools such as colposcopes and well-trained clinicians.
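One simple form of "dynamic feature" is the temporal acetowhitening curve of a cervical region across the VIA video. The sketch below is an assumption-laden illustration rather than the paper's pipeline: it tracks mean region brightness per frame with OpenCV and derives two summary features; the file name and region box are placeholders.

```python
# Sketch: extract a per-frame whitening curve from a VIA video.
import cv2
import numpy as np

def whitening_curve(video_path, roi=(100, 100, 300, 300)):
    """Return mean grayscale intensity of a fixed ROI for every frame."""
    cap = cv2.VideoCapture(video_path)   # assumes the video exists and is readable
    curve = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        x, y, w, h = roi
        gray = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        curve.append(float(gray.mean()))
    cap.release()
    return np.array(curve)

curve = whitening_curve("via_exam.mp4")  # hypothetical file
# Simple summary features a classifier could use: peak whitening and time to peak.
features = {"peak": curve.max(), "time_to_peak_frames": int(curve.argmax())}
```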
Collapse
|
46
|
Asiedu MN, Skerrett E, Sapiro G, Ramanujam N. Combining multiple contrasts for improving machine learning-based classification of cervical cancers with a low-cost point-of-care Pocket colposcope. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2020; 2020:1148-1151. [PMID: 33018190 DOI: 10.1109/embc44109.2020.9175858] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/31/2023]
Abstract
We apply feature-extraction and machine learning methods to multiple sources of contrast (acetic acid, Lugol's iodine, and green light) from the white Pocket Colposcope, a low-cost point-of-care colposcope for cervical cancer screening. We combine features from the sources of contrast and analyze the diagnostic improvement with the addition of each contrast. We find that the overall AUC increases with additional contrast agents compared to using only one source.
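A minimal sketch of the contrast-fusion idea, under the assumption of simple per-contrast feature vectors and a random-forest classifier (placeholders, not the authors' features or model): concatenate the acetic-acid, Lugol's iodine, and green-light features and compare the AUC against a single-contrast baseline.

```python
# Sketch: compare single-contrast vs. fused multi-contrast classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, d = 300, 16
acetic = rng.normal(size=(n, d))        # placeholder colour/texture descriptors
lugol = rng.normal(size=(n, d))
green = rng.normal(size=(n, d))
y = (acetic[:, 0] + 0.5 * lugol[:, 0] + rng.normal(0, 1, n) > 0).astype(int)

def auc_for(X):
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
    return roc_auc_score(yte, model.predict_proba(Xte)[:, 1])

print("acetic only  :", auc_for(acetic))
print("all contrasts:", auc_for(np.hstack([acetic, lugol, green])))
```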
Collapse
|
47
|
Asiedu MN, Agudogo JS, Dotson ME, Skerrett E, Krieger MS, Lam CT, Agyei D, Amewu J, Asah-Opoku K, Huchko M, Schmitt JW, Samba A, Srofenyoh E, Ramanujam N. A novel speculum-free imaging strategy for visualization of the internal female lower reproductive system. Sci Rep 2020; 10:16570. [PMID: 33024146 PMCID: PMC7538883 DOI: 10.1038/s41598-020-72219-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2020] [Accepted: 08/25/2020] [Indexed: 12/16/2022] Open
Abstract
Fear of the speculum and feelings of vulnerability during the gynecologic exam are two of the biggest barriers to cervical cancer screening for women. To address these barriers, we have developed a novel, low-cost tool called the Callascope to reimagine the gynecological exam, enabling clinician and self-imaging of the cervix without the need for a speculum. The Callascope contains a 2-megapixel camera and a contrast agent spray mechanism housed within a form factor designed to eliminate the need for a speculum during contrast agent administration and image capture. Preliminary bench testing comparing the Callascope camera to a $20,000 high-end colposcope demonstrated that the Callascope camera meets visual requirements for cervical imaging. Bench testing of the spray mechanism demonstrated that the contrast agent delivery enables satisfactory administration and cervix coverage. Clinical studies performed at Duke University Medical Center, Durham, USA, and at Greater Accra Regional Hospital, Accra, Ghana, assessed (1) the Callascope's ability to visualize the cervix compared to the standard-of-care speculum exam, (2) the feasibility and willingness of women to use the Callascope for self-exams, and (3) the feasibility and willingness of clinicians and their patients to use the Callascope for clinician-based examinations. Cervix visualization was comparable between the Callascope and speculum (83% or 44/53 women vs. 100%) when performed by a clinician. Visualization was achieved in 95% (21/22) of women who used the Callascope for self-imaging. Post-exam surveys indicated that participants preferred the Callascope to a speculum-based exam. Our results indicate the Callascope is a viable option for clinician-based and self-exam speculum-free cervical imaging. Clinical study registration: ClinicalTrials.gov https://clinicaltrials.gov/ct2/show/record/NCT00900575, Pan African Clinical Trial Registry (PACTR) https://www.pactr.org/ PACTR201905806116817.
Collapse
Affiliation(s)
- Mercy N. Asiedu
- Department of Biomedical Engineering, Duke University, Gross Hall Rm 370, Durham, NC 27713 USA
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
- Duke Global Health Institute, Duke University, Durham, NC USA
| | - Júlia S. Agudogo
- Department of Biomedical Engineering, Duke University, Gross Hall Rm 370, Durham, NC 27713 USA
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
| | - Mary E. Dotson
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
| | - Erica Skerrett
- Department of Biomedical Engineering, Duke University, Gross Hall Rm 370, Durham, NC 27713 USA
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
- Duke Global Health Institute, Duke University, Durham, NC USA
| | - Marlee S. Krieger
- Department of Biomedical Engineering, Duke University, Gross Hall Rm 370, Durham, NC 27713 USA
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
| | - Christopher T. Lam
- Department of Biomedical Engineering, Duke University, Gross Hall Rm 370, Durham, NC 27713 USA
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
- Duke Global Health Institute, Duke University, Durham, NC USA
| | - Doris Agyei
- Family Planning and Reproductive Health Unit, Greater Accra Regional Hospital, Accra, Ghana
| | - Juliet Amewu
- Family Planning and Reproductive Health Unit, Greater Accra Regional Hospital, Accra, Ghana
| | - Kwaku Asah-Opoku
- Department of Obstetrics and Gynecology, Korle Bu Teaching Hospital, Accra, Ghana
- The University of Ghana Medical School, Accra, Ghana
| | - Megan Huchko
- Duke Global Health Institute, Duke University, Durham, NC USA
- Department of Obstetrics and Gynecology, Duke Medical Center, Durham, NC USA
| | - John W. Schmitt
- Duke Global Health Institute, Duke University, Durham, NC USA
- Department of Obstetrics and Gynecology, Duke Medical Center, Durham, NC USA
| | - Ali Samba
- Department of Obstetrics and Gynecology, Korle Bu Teaching Hospital, Accra, Ghana
- The University of Ghana Medical School, Accra, Ghana
| | - Emmanuel Srofenyoh
- Family Planning and Reproductive Health Unit, Greater Accra Regional Hospital, Accra, Ghana
| | - Nirmala Ramanujam
- Department of Biomedical Engineering, Duke University, Gross Hall Rm 370, Durham, NC 27713 USA
- Center for Global Women’s Health Technologies, Duke University, Durham, NC USA
- Duke Global Health Institute, Duke University, Durham, NC USA
| |
Collapse
|
48
|
Classification of cervical neoplasms on colposcopic photography using deep learning. Sci Rep 2020; 10:13652. [PMID: 32788635 PMCID: PMC7423899 DOI: 10.1038/s41598-020-70490-4] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2020] [Accepted: 06/17/2020] [Indexed: 01/07/2023] Open
Abstract
Colposcopy is widely used to detect cervical cancers, but experienced physicians, who are needed for an accurate diagnosis, are lacking in developing countries. Artificial intelligence (AI) has recently been used in computer-aided diagnosis, showing remarkable promise. In this study, we developed and validated deep learning models to automatically classify cervical neoplasms on colposcopic photographs. Pre-trained convolutional neural networks were fine-tuned for two grading systems: the cervical intraepithelial neoplasia (CIN) system and the lower anogenital squamous terminology (LAST) system. The multi-class classification accuracies of the networks for the CIN system in the test dataset were 48.6 ± 1.3% for Inception-ResNet-v2 and 51.7 ± 5.2% for ResNet-152. The accuracies for the LAST system were 71.8 ± 1.8% and 74.7 ± 1.8%, respectively. The area under the curve (AUC) for discriminating high-risk lesions from low-risk lesions with ResNet-152 was 0.781 ± 0.020 for the CIN system and 0.708 ± 0.024 for the LAST system. Lesions requiring biopsy were also detected efficiently (AUC, 0.947 ± 0.030 with ResNet-152) and presented meaningfully on attention maps. These results indicate the potential of AI for automated reading of colposcopic photographs.
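The general fine-tuning recipe described above (a pre-trained CNN with its classification head replaced for colposcopic classes) looks roughly like the following PyTorch sketch. The dataset path, class count, and hyperparameters are assumptions, not the authors' configuration.

```python
# Sketch: fine-tune a pre-trained ResNet-152 for colposcopic image classification.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

num_classes = 5                                   # e.g. CIN grading categories (assumed)
model = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, num_classes)   # replace the head

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("colposcopy/train", transform=tfm)  # hypothetical path
loader = DataLoader(train_ds, batch_size=16, shuffle=True)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
model.train()
for images, labels in loader:                     # one epoch shown for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```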
Collapse
|
49
|
Yuan C, Yao Y, Cheng B, Cheng Y, Li Y, Li Y, Liu X, Cheng X, Xie X, Wu J, Wang X, Lu W. The application of deep learning based diagnostic system to cervical squamous intraepithelial lesions recognition in colposcopy images. Sci Rep 2020; 10:11639. [PMID: 32669565 PMCID: PMC7363819 DOI: 10.1038/s41598-020-68252-3] [Citation(s) in RCA: 47] [Impact Index Per Article: 9.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2020] [Accepted: 06/16/2020] [Indexed: 12/12/2022] Open
Abstract
Background Deep learning has shown considerable potential and is gaining importance in computer-assisted diagnosis. As the gold standard for pathologically diagnosing cervical intraepithelial lesions and invasive cervical cancer, colposcopy-guided biopsy faces challenges in improving accuracy and efficiency worldwide, especially in developing countries. To ease the heavy burden of cervical cancer screening, it is urgent to establish a scientific, accurate, and efficient method for assisting diagnosis and biopsy. Methods Data were collected to establish three deep-learning-based models. For every case, one saline image, one acetic image, one iodine image, and the corresponding clinical information, including age, the results of human papillomavirus testing and cytology, type of transformation zone, and pathologic diagnosis, were collected. The dataset was proportionally divided into three subsets, the training set, the test set, and the validation set, at a ratio of 8:1:2. The validation set was used to evaluate model performance. After model establishment, an independent dataset of high-definition images was collected to further evaluate model performance. In addition, a comparison of diagnostic accuracy between colposcopists and the models was performed. Results The sensitivity, specificity, and accuracy of the classification model in differentiating negative cases from positive cases were 85.38%, 82.62%, and 84.10%, respectively, with an AUC of 0.93. The recall and Dice coefficient of the segmentation model in segmenting suspicious lesions in acetic images were 84.73% and 61.64%, with an average accuracy of 95.59%. Furthermore, 84.67% of high-grade lesions were detected by the acetic detection model. Compared to colposcopists, the diagnostic system performed better on ordinary colposcopy images but slightly worse on high-definition images. Implications The deep learning-based diagnostic system could assist colposcopy diagnosis and biopsy for high-grade squamous intraepithelial lesions (HSILs).
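For reference, the segmentation metrics quoted above (recall and the Dice coefficient) are computed from overlapping binary lesion masks. A minimal sketch with toy masks, included purely for illustration:

```python
# Sketch: Dice coefficient and recall for binary segmentation masks.
import numpy as np

def dice_and_recall(pred_mask, true_mask):
    """Return (Dice, recall) for two binary masks of the same shape."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    tp = np.logical_and(pred, true).sum()
    dice = 2 * tp / (pred.sum() + true.sum())
    recall = tp / true.sum()
    return dice, recall

pred = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])   # toy predicted lesion mask
true = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]])   # toy ground-truth mask
print(dice_and_recall(pred, true))                   # -> (0.666..., 0.666...)
```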
Collapse
Affiliation(s)
- Chunnv Yuan
- Women's Reproductive Health Laboratory of Zhejiang Province, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, Zhejiang, China
| | - Yeli Yao
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Bei Cheng
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Yifan Cheng
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Ying Li
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Yang Li
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Xuechen Liu
- College of Computer Science and Technology, Zhejiang University, Hangzhou, 310027, China
| | - Xiaodong Cheng
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Xing Xie
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
| | - Jian Wu
- College of Computer Science and Technology, Zhejiang University, Hangzhou, 310027, China
| | - Xinyu Wang
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China
- Center for Uterine Cancer Diagnosis & Therapy Research of Zhejiang Province, Hangzhou, 310006, China
| | - Weiguo Lu
- Department of Gynecologic Oncology, Women's Hospital, School of Medicine, Zhejiang University, Hangzhou, 310006, China.
- Center for Uterine Cancer Diagnosis & Therapy Research of Zhejiang Province, Hangzhou, 310006, China.
| |
Collapse
|
50
|
The challenges of colposcopy for cervical cancer screening in LMICs and solutions by artificial intelligence. BMC Med 2020; 18:169. [PMID: 32493320 PMCID: PMC7271416 DOI: 10.1186/s12916-020-01613-x] [Citation(s) in RCA: 75] [Impact Index Per Article: 15.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/12/2020] [Accepted: 04/30/2020] [Indexed: 02/07/2023] Open
Abstract
BACKGROUND The World Health Organization (WHO) called for global action towards the elimination of cervical cancer. One of the main strategies is to screen 70% of women between the ages of 35 and 45 years and to manage 90% of women with identified disease appropriately by 2030. Approximately 85% of cervical cancers occur in low- and middle-income countries (LMICs). Colposcopy-guided biopsy is crucial for detecting cervical intraepithelial neoplasia (CIN) and has become the main bottleneck limiting screening performance. Unprecedented advances in artificial intelligence (AI) enable the synergy of deep learning and digital colposcopy, which offers opportunities for automatic image-based diagnosis. To this end, we discuss the main challenges of traditional colposcopy and solutions that apply AI-guided digital colposcopy as an auxiliary diagnostic tool in LMICs. MAIN BODY Existing challenges for the application of colposcopy in LMICs include strong dependence on the subjective experience of operators, substantial inter- and intra-operator variability, a shortage of experienced colposcopists, a lack of comprehensive colposcopy training courses, and uniform diagnostic standards and strict quality control that are difficult for colposcopists with limited diagnostic ability to follow, resulting in discrepant reporting and documentation of colposcopy impressions. Organized colposcopy training courses are an effective way to enhance the diagnostic ability of colposcopists, but implementing them in practice may not be feasible for improving overall diagnostic performance in a short period of time. Fortunately, AI has the potential to address this colposcopic bottleneck by assisting colposcopists in colposcopy image interpretation, detection of underlying CIN, and guidance of biopsy sites. An automated colposcopy examination workflow could create a novel cervical cancer screening model, reduce potential false negatives and false positives, and improve the accuracy of colposcopy diagnosis and cervical biopsy. CONCLUSION We believe that practical and accurate AI-guided digital colposcopy has the potential to strengthen diagnostic ability in guiding cervical biopsy, thereby improving cervical cancer screening performance in LMICs and ultimately accelerating global cervical cancer elimination.
Collapse
|