1
Siddiqui S, Akram T, Ashraf I, Raza M, Khan MA, Damaševičius R. CG-Net: A novel CNN framework for gastrointestinal tract diseases classification. Int J Imaging Syst Technol 2024; 34. [DOI: 10.1002/ima.23081] [Received: 11/30/2023] [Accepted: 03/31/2024] [Indexed: 09/23/2024]
Abstract
The classification of medical images has had a significant influence on diagnostic techniques and therapeutic interventions. Conventional disease diagnosis procedures require substantial time and effort to reach an accurate diagnosis. According to global statistics, gastrointestinal cancer is a major contributor to cancer-related deaths. Diagnosing gastrointestinal tract (GIT) ailments is complicated by the need for elaborate methods to pinpoint the exact location of the problem, so doctors frequently use wireless capsule endoscopy to diagnose and treat GIT problems. This research aims to develop a robust deep learning framework to effectively classify GIT diseases for therapeutic purposes. A CNN-based framework, in conjunction with a feature selection method, is proposed to improve the classification rate. The proposed framework has been evaluated using various performance measures, including accuracy, recall, precision, F1 measure, mean absolute error, and mean squared error.
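The evaluation measures listed in this abstract can be sketched in a few lines. A minimal illustration (not the authors' code; the class names and labels are invented) of precision, recall, F1, and accuracy for a multi-class GIT classifier:

```python
# Hedged sketch: standard per-class metrics for a multi-class classifier,
# with one class treated as "positive". Labels below are made up.

def confusion_counts(y_true, y_pred, positive):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp, fp, fn

def precision_recall_f1(y_true, y_pred, positive):
    tp, fp, fn = confusion_counts(y_true, y_pred, positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if precision + recall else 0.0
    return precision, recall, f1

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical predictions for six capsule-endoscopy frames:
y_true = ["ulcer", "polyp", "polyp", "bleeding", "polyp", "ulcer"]
y_pred = ["ulcer", "polyp", "ulcer", "bleeding", "polyp", "ulcer"]
print(accuracy(y_true, y_pred))                       # 5 of 6 correct ≈ 0.833
print(precision_recall_f1(y_true, y_pred, "polyp"))   # precision 1.0, recall ≈ 0.667, F1 ≈ 0.8
```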
Affiliation(s)
- Samra Siddiqui
- Department of Computer Science, HITEC University, Taxila, Pakistan
- Department of Computer Science, COMSATS University Islamabad, Wah Campus, Pakistan
- Tallha Akram
- Department of Information Systems, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj, Saudi Arabia
- Department of Machine Learning, Convex Solutions Pvt (Ltd), Islamabad, Pakistan
- Imran Ashraf
- Department of Computer Science, NUCES (FAST), Islamabad, Pakistan
- Muddassar Raza
- Department of Computer Science, HITEC University, Taxila, Pakistan
2
Kim BS, Cho M, Chung GE, Lee J, Kang HY, Yoon D, Cho WS, Lee JC, Bae JH, Kong HJ, Kim S. Density clustering-based automatic anatomical section recognition in colonoscopy video using deep learning. Sci Rep 2024; 14:872. [PMID: 38195632] [PMCID: PMC10776865] [DOI: 10.1038/s41598-023-51056-6] [Received: 10/06/2023] [Accepted: 12/29/2023] [Indexed: 01/11/2024]
Abstract
Recognizing anatomical sections during colonoscopy is crucial for diagnosing colonic diseases and generating accurate reports. While recent studies have endeavored to identify anatomical regions of the colon using deep learning, the deformable anatomy of the colon makes it challenging to establish a reliable localization system. This study presents a system, built on 100 colonoscopy videos, that combines density clustering and deep learning. Cascaded CNN models sequentially estimate the appendix orifice (AO), the hepatic flexure (HF), the splenic flexure (SF), and "outside of the body." The DBSCAN algorithm is then applied to identify anatomical sections, and the clustering-based analysis integrates clinical knowledge and anatomical context into the model. Challenges posed by colonoscopy images are addressed by preprocessing that removes non-informative frames. The image data were labeled by clinicians, and the system deduces section correspondence stochastically. The model categorizes the colon into three sections: right (cecum and ascending colon), middle (transverse colon), and left (descending colon, sigmoid colon, rectum). The appearance time of anatomical boundaries was estimated with an average error of 6.31 s for the AO, 9.79 s for the HF, 27.69 s for the SF, and 3.26 s for outside of the body. The proposed method can facilitate future advancements toward AI-based automatic reporting, offering time savings and standardization.
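The density-clustering step can be illustrated roughly as follows. This is a simplified one-dimensional stand-in for DBSCAN, not the authors' implementation; the parameter values and detection times are invented. Noisy frame-level landmark detections are grouped by time so that isolated false detections are discarded and the densest cluster yields the boundary's appearance time:

```python
# Hedged sketch of density clustering over detection timestamps (seconds).

def cluster_1d(timestamps, eps=2.0, min_samples=3):
    """Group sorted timestamps into clusters where consecutive points are at
    most `eps` seconds apart; clusters smaller than `min_samples` are treated
    as noise and dropped (the core DBSCAN idea, specialised to 1-D)."""
    ts = sorted(timestamps)
    clusters, current = [], [ts[0]]
    for t in ts[1:]:
        if t - current[-1] <= eps:
            current.append(t)
        else:
            if len(current) >= min_samples:
                clusters.append(current)
            current = [t]
    if len(current) >= min_samples:
        clusters.append(current)
    return clusters

def landmark_time(timestamps, eps=2.0, min_samples=3):
    """Return the mean time of the largest cluster, or None if all noise."""
    clusters = cluster_1d(timestamps, eps, min_samples)
    if not clusters:
        return None
    biggest = max(clusters, key=len)
    return sum(biggest) / len(biggest)

# Per-frame detections of one landmark: a dense burst near 120 s plus outliers.
detections = [3.0, 118.5, 119.0, 119.8, 120.4, 121.0, 300.2]
print(landmark_time(detections))  # mean of the dense cluster, ≈ 119.74 s
```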
Grants
- 1711179421, RS-2021-KD000006 the Korea Medical Device Development Fund grant funded by the Korean government (the Ministry of Science and ICT, the Ministry of Trade, Industry and Energy, the Ministry of Health and Welfare, and the Ministry of Food and Drug Safety)
- IITP-2023-2018-0-01833 the Ministry of Science and ICT, Korea under the Information Technology Research Center (ITRC) support program
Affiliation(s)
- Byeong Soo Kim
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, 08826, Korea
- Minwoo Cho
- Innovative Medical Technology Research Institute, Seoul National University Hospital, Seoul, 03080, Korea
- Department of Transdisciplinary Medicine, Seoul National University Hospital, Seoul, 03080, Korea
- Department of Medicine, Seoul National University College of Medicine, Seoul, 03080, Korea
- Goh Eun Chung
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
- Jooyoung Lee
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
- Hae Yeon Kang
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
- Dan Yoon
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, 08826, Korea
- Woo Sang Cho
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, 08826, Korea
- Jung Chan Lee
- Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, 03080, Korea
- Institute of Bioengineering, Seoul National University, Seoul, 08826, Korea
- Institute of Medical and Biological Engineering, Medical Research Center, Seoul National University, Seoul, 03080, Korea
- Jung Ho Bae
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
- Hyoun-Joong Kong
- Innovative Medical Technology Research Institute, Seoul National University Hospital, Seoul, 03080, Korea
- Department of Transdisciplinary Medicine, Seoul National University Hospital, Seoul, 03080, Korea
- Department of Medicine, Seoul National University College of Medicine, Seoul, 03080, Korea
- Medical Big Data Research Center, Seoul National University College of Medicine, Seoul, 03087, Korea
- Sungwan Kim
- Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, 03080, Korea
- Institute of Bioengineering, Seoul National University, Seoul, 08826, Korea
- Artificial Intelligence Institute, Seoul National University, Research Park Building 942, 2 Fl., Seoul, 08826, Korea
3
Kader R, Cid-Mejias A, Brandao P, Islam S, Hebbar S, Puyal JG, Ahmad OF, Hussein M, Toth D, Mountney P, Seward E, Vega R, Stoyanov D, Lovat LB. Polyp characterization using deep learning and a publicly accessible polyp video database. Dig Endosc 2023; 35:645-655. [PMID: 36527309] [PMCID: PMC10570984] [DOI: 10.1111/den.14500] [Received: 07/14/2022] [Accepted: 12/13/2022] [Indexed: 01/20/2023]
Abstract
OBJECTIVES Convolutional neural networks (CNN) for computer-aided diagnosis of polyps are often trained using high-quality still images in a single chromoendoscopy imaging modality, with sessile serrated lesions (SSLs) often excluded. This study developed a CNN from videos to classify polyps as adenomatous or nonadenomatous using standard narrow-band imaging (NBI) and NBI-near focus (NBI-NF), and created a publicly accessible polyp video database. METHODS We trained a CNN with 16,832 high- and moderate-quality frames from 229 polyp videos (56 SSLs). It was evaluated with 222 polyp videos (36 SSLs) across two test-sets. Test-set I consists of 14,320 frames (157 polyps, 111 diminutive). Test-set II, which is publicly accessible, consists of 3317 video frames (65 polyps, 41 diminutive) and was benchmarked with three expert and three nonexpert endoscopists. RESULTS Sensitivity for adenoma characterization was 91.6% in test-set I and 89.7% in test-set II; specificity was 91.9% and 88.5%, respectively. Sensitivity for diminutive polyps was 89.9% and 87.5%; specificity, 90.5% and 88.2%. In NBI-NF, sensitivity was 89.4% and 89.5%, with specificity of 94.7% and 83.3%. In NBI, sensitivity was 85.3% and 91.7%, with specificity of 87.5% and 90.0%, respectively. The CNN achieved the preservation and incorporation of valuable endoscopic innovations (PIVI)-1 and PIVI-2 thresholds for each test-set. In the benchmarking of test-set II, the CNN was significantly more accurate than nonexperts (13.8% difference [95% confidence interval 3.2-23.6], P = 0.01), with no significant difference from experts. CONCLUSIONS A single CNN can differentiate adenomas from SSLs and hyperplastic polyps in both NBI and NBI-NF. A publicly accessible NBI polyp video database was created and benchmarked.
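For reference, sensitivity and specificity figures like those reported above follow directly from confusion counts, with adenoma as the positive class. A minimal sketch; the counts below are invented, chosen only so that the output happens to match the test-set I percentages:

```python
# Hedged sketch: per-polyp sensitivity/specificity from confusion counts.
# The counts are hypothetical, not taken from the study.

def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # adenomas correctly called adenoma
    specificity = tn / (tn + fp)   # non-adenomas correctly called non-adenoma
    return sensitivity, specificity

# e.g. 87 of 95 adenomas and 57 of 62 non-adenomatous polyps classified correctly:
sens, spec = sensitivity_specificity(tp=87, fn=8, tn=57, fp=5)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 91.6%, 91.9%
```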
Affiliation(s)
- Rawen Kader
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Shahraz Islam
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Juana González-Bueno Puyal
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Odin Vision Ltd, London, UK
- Omer F. Ahmad
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Mohamed Hussein
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Ed Seward
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Roser Vega
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Laurence B. Lovat
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
4
Houwen BBSL, Nass KJ, Vleugels JLA, Fockens P, Hazewinkel Y, Dekker E. Comprehensive review of publicly available colonoscopic imaging databases for artificial intelligence research: availability, accessibility, and usability. Gastrointest Endosc 2023; 97:184-199.e16. [PMID: 36084720] [DOI: 10.1016/j.gie.2022.08.043] [Received: 06/14/2022] [Revised: 08/24/2022] [Accepted: 08/30/2022] [Indexed: 01/28/2023]
Abstract
BACKGROUND AND AIMS Publicly available databases containing colonoscopic imaging data are valuable resources for artificial intelligence (AI) research. Currently, little is known regarding the number and content of these databases. This review aimed to describe the availability, accessibility, and usability of publicly available colonoscopic imaging databases, focusing on polyp detection, polyp characterization, and quality of colonoscopy. METHODS A systematic literature search was performed in MEDLINE and Embase to identify AI studies describing publicly available colonoscopic imaging databases published after 2010. Second, a targeted search using Google's Dataset Search, Google Search, GitHub, and Figshare was done to identify databases directly. Databases were included if they contained data about polyp detection, polyp characterization, or quality of colonoscopy. To assess accessibility of databases, the following categories were defined: open access, open access with barriers, and regulated access. To assess the potential usability of the included databases, essential details of each database were extracted using a checklist derived from the Checklist for Artificial Intelligence in Medical Imaging. RESULTS We identified 22 databases with open access, 3 databases with open access with barriers, and 15 databases with regulated access. The 22 open access databases contained 19,463 images and 952 videos. Nineteen of these databases focused on polyp detection, localization, and/or segmentation; 6 on polyp characterization; and 3 on quality of colonoscopy. Only half of these databases have been used by other researchers to develop, train, or benchmark their AI systems. Although technical details were in general well reported, important details such as polyp and patient demographics and the annotation process were under-reported in almost all databases. CONCLUSIONS This review provides greater insight into the public availability of colonoscopic imaging databases for AI research. Incomplete reporting of important details limits the ability of researchers to assess the usability of current databases.
Affiliation(s)
- Britt B S L Houwen
- Department of Gastroenterology and Hepatology, Amsterdam Gastroenterology Endocrinology Metabolism, Amsterdam University Medical Centres, location Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands
- Karlijn J Nass
- Department of Gastroenterology and Hepatology, Amsterdam Gastroenterology Endocrinology Metabolism, Amsterdam University Medical Centres, location Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands
- Jasper L A Vleugels
- Department of Gastroenterology and Hepatology, Amsterdam Gastroenterology Endocrinology Metabolism, Amsterdam University Medical Centres, location Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands
- Paul Fockens
- Department of Gastroenterology and Hepatology, Amsterdam Gastroenterology Endocrinology Metabolism, Amsterdam University Medical Centres, location Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands
- Yark Hazewinkel
- Department of Gastroenterology and Hepatology, Radboud University Nijmegen Medical Center, Radboud University of Nijmegen, Nijmegen, the Netherlands
- Evelien Dekker
- Department of Gastroenterology and Hepatology, Amsterdam Gastroenterology Endocrinology Metabolism, Amsterdam University Medical Centres, location Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands
5
González-Bueno Puyal J, Brandao P, Ahmad OF, Bhatia KK, Toth D, Kader R, Lovat L, Mountney P, Stoyanov D. Polyp detection on video colonoscopy using a hybrid 2D/3D CNN. Med Image Anal 2022; 82:102625. [PMID: 36209637] [DOI: 10.1016/j.media.2022.102625] [Received: 06/17/2021] [Revised: 08/22/2022] [Accepted: 09/10/2022] [Indexed: 12/15/2022]
Abstract
Colonoscopy is the gold standard for early diagnosis and pre-emptive treatment of colorectal cancer by detecting and removing colonic polyps. Deep learning approaches to polyp detection have shown potential for enhancing polyp detection rates. However, the majority of these systems are developed and evaluated on static images from colonoscopies, whilst in clinical practice the procedure is performed on a real-time video feed. Non-curated video data remains a challenge, as it contains low-quality frames when compared to still, selected images often obtained from diagnostic records. Nevertheless, it also embeds temporal information that can be exploited to increase prediction stability. A hybrid 2D/3D convolutional neural network architecture for polyp segmentation is presented in this paper. The network is used to improve polyp detection by encompassing spatial and temporal correlation of the predictions while preserving real-time detections. Extensive experiments show that the hybrid method outperforms a 2D baseline. The proposed architecture is validated on videos from 46 patients and on the publicly available SUN polyp database. A higher performance and increased generalisability indicate that real-world clinical implementations of automated polyp detection can benefit from the hybrid algorithm and the inclusion of temporal information.
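The temporal idea motivating the hybrid network, that aggregating predictions across neighbouring frames stabilises detections, can be illustrated far more simply than the paper's 2D/3D architecture, for instance with a causal moving average over per-frame polyp probabilities. All values and the threshold below are hypothetical:

```python
# Hedged sketch: temporal smoothing of per-frame detection probabilities.
# This is NOT the paper's method, only an illustration of why temporal
# context suppresses single-frame noise while keeping sustained detections.

def smooth(probs, window=5):
    """Causal moving average over the last `window` frames, so the scheme
    could still run on a live video feed (no future frames needed)."""
    out = []
    for i in range(len(probs)):
        lo = max(0, i - window + 1)
        chunk = probs[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A single-frame spike (likely noise) is damped; a sustained detection survives.
raw = [0.1, 0.9, 0.1, 0.1, 0.8, 0.85, 0.9, 0.9]
smoothed = smooth(raw)
flags = [p > 0.5 for p in smoothed]  # the spike at index 1 no longer fires
```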
Affiliation(s)
- Juana González-Bueno Puyal
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London, W1W 7TY, UK
- Odin Vision, London, W1W 7TY, UK
- Omer F Ahmad
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London, W1W 7TY, UK
- Rawen Kader
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London, W1W 7TY, UK
- Laurence Lovat
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London, W1W 7TY, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London, W1W 7TY, UK
6
Nogueira-Rodríguez A, Reboiro-Jato M, Glez-Peña D, López-Fernández H. Performance of Convolutional Neural Networks for Polyp Localization on Public Colonoscopy Image Datasets. Diagnostics (Basel) 2022; 12:898. [PMID: 35453946] [PMCID: PMC9027927] [DOI: 10.3390/diagnostics12040898] [Received: 02/23/2022] [Revised: 03/31/2022] [Accepted: 04/01/2022] [Indexed: 01/10/2023]
Abstract
Colorectal cancer is one of the most frequent malignancies. Colonoscopy is the de facto standard for precancerous lesion detection in the colon, i.e., polyps, during screening studies or after facultative recommendation. In recent years, artificial intelligence, and especially deep learning techniques such as convolutional neural networks, have been applied to polyp detection and localization in order to develop real-time CADe systems. However, the performance of machine learning models is very sensitive to changes in the nature of the testing instances, especially when trying to reproduce results for totally different datasets to those used for model development, i.e., inter-dataset testing. Here, we report the results of testing our previously published polyp detection model on ten public colonoscopy image datasets and analyze them in the context of the results of 20 other state-of-the-art publications using the same datasets. The F1-score of our recently published model was 0.88 when evaluated on a private test partition, i.e., intra-dataset testing, but it decayed, on average, by 13.65% when tested on the ten public datasets. In the published research, the average intra-dataset F1-score is 0.91, and we observed that it also decays in the inter-dataset setting, to an average F1-score of 0.83.
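The decay arithmetic above can be made explicit. Assuming the reported 13.65% is a relative drop from the intra-dataset F1 of 0.88 (an interpretation, not stated explicitly in the abstract; the helper name is ours):

```python
# Hedged sketch: relative performance decay in inter-dataset testing.

def decayed_score(intra_f1, relative_drop_pct):
    """Apply a relative percentage drop to an intra-dataset score."""
    return intra_f1 * (1 - relative_drop_pct / 100)

print(round(decayed_score(0.88, 13.65), 3))  # ≈ 0.76 average inter-dataset F1
```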
Affiliation(s)
- Alba Nogueira-Rodríguez
- CINBIO, Department of Computer Science, ESEI-Escuela Superior de Ingeniería Informática, Universidade de Vigo, 32004 Ourense, Spain
- SING Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), SERGAS-UVIGO, 36213 Vigo, Spain
- Miguel Reboiro-Jato
- CINBIO, Department of Computer Science, ESEI-Escuela Superior de Ingeniería Informática, Universidade de Vigo, 32004 Ourense, Spain
- SING Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), SERGAS-UVIGO, 36213 Vigo, Spain
- Daniel Glez-Peña
- CINBIO, Department of Computer Science, ESEI-Escuela Superior de Ingeniería Informática, Universidade de Vigo, 32004 Ourense, Spain
- SING Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), SERGAS-UVIGO, 36213 Vigo, Spain
- Hugo López-Fernández
- CINBIO, Department of Computer Science, ESEI-Escuela Superior de Ingeniería Informática, Universidade de Vigo, 32004 Ourense, Spain
- SING Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), SERGAS-UVIGO, 36213 Vigo, Spain
7
Hsieh YH, Tang CP, Tseng CW, Lin TL, Leung FW. Computer-Aided Detection False Positives in Colonoscopy. Diagnostics (Basel) 2021; 11:1113. [PMID: 34207226] [PMCID: PMC8235696] [DOI: 10.3390/diagnostics11061113] [Received: 05/17/2021] [Revised: 06/08/2021] [Accepted: 06/14/2021] [Indexed: 12/24/2022]
Abstract
Randomized control trials and meta-analyses comparing colonoscopies with and without computer-aided detection (CADe) assistance showed significant increases in adenoma detection rates (ADRs) with CADe. A major limitation of CADe is its false positives (FPs), ranked 3rd in importance among 59 research questions in a modified Delphi consensus review. The definition of FPs varies. One commonly used definition counts as an FP any activation of the CADe system, irrespective of the number of frames or duration, that is not due to a polypoid or nonpolypoid lesion. Although only 0.07 to 0.2 FPs were observed per colonoscopy, video analysis studies using FPs as the primary outcome reported much higher numbers of 26 to 27 per colonoscopy. Most FPs were of short duration (91% lasted <0.5 s). A higher number of FPs was also associated with suboptimal bowel preparation. The appearance of FPs can lead to user fatigue. Polypectomy of FPs results in increased procedure time and added use of resources. Re-training the CADe algorithms is one way to reduce FPs but is not practical in the clinical setting during colonoscopy. Water exchange (WE) is an emerging method that the colonoscopist can use to provide salvage cleaning during insertion. We discuss the potential of WE for reducing FPs as well as the augmentation of ADRs through CADe.
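One practical consequence of the duration statistic above (91% of FPs lasting under 0.5 s) is that simply filtering CADe activations by duration before alerting would suppress most of them. A hedged sketch, not a method from the review; the activation intervals are hypothetical and the 0.5 s threshold is taken from the abstract's figure:

```python
# Hedged sketch: suppress brief CADe activations before raising an alert.

def filter_brief_activations(activations, min_duration=0.5):
    """Keep only activations (start_s, end_s) lasting at least `min_duration` s."""
    return [(s, e) for s, e in activations if e - s >= min_duration]

# Hypothetical activation intervals within one colonoscopy (seconds):
activations = [(10.0, 10.1), (42.0, 42.3), (90.0, 93.5), (120.0, 120.05)]
print(filter_brief_activations(activations))  # only the sustained 3.5 s activation remains
```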
Affiliation(s)
- Yu-Hsi Hsieh
- Division of Gastroenterology, Department of Internal Medicine, Dalin Tzu Chi Hospital, Buddhist Tzu Chi Medical Foundation, Chiayi 62247, Taiwan
- School of Medicine, Tzu Chi University, Hualien City 97004, Taiwan
- Chia-Pei Tang
- Division of Gastroenterology, Department of Internal Medicine, Dalin Tzu Chi Hospital, Buddhist Tzu Chi Medical Foundation, Chiayi 62247, Taiwan
- School of Medicine, Tzu Chi University, Hualien City 97004, Taiwan
- Chih-Wei Tseng
- Division of Gastroenterology, Department of Internal Medicine, Dalin Tzu Chi Hospital, Buddhist Tzu Chi Medical Foundation, Chiayi 62247, Taiwan
- School of Medicine, Tzu Chi University, Hualien City 97004, Taiwan
- Tu-Liang Lin
- Department of Management Information Systems, National Chiayi University, Chiayi 60054, Taiwan
- Felix W. Leung
- Sepulveda Ambulatory Care Center, Veterans Affairs Greater Los Angeles Healthcare System, North Hills, CA 91343, USA
- David Geffen School of Medicine at University of California at Los Angeles, Los Angeles, CA 90024, USA