1. Gravino G, Patel J, Ratneswaren T, Craven I, Chandran A. Diagnostic and interventional neuroradiology training in the UK: a national trainee survey. Clin Radiol 2024;79:e854-e867. [PMID: 38527920] [DOI: 10.1016/j.crad.2024.02.003]
Abstract
AIM Training structure in neuroradiology varies, both nationally and internationally. Globally, there is a trend towards standardised training pathways, curricula, and targeted competencies. Currently, there is limited understanding of the structure of neuroradiology training in the UK. This survey aims to: (1) identify different contemporary models of neuroradiology training in the UK, (2) compare UK trainees' commitments against national and international standards, and (3) understand whether career expectations match the predicted future demands for neuroradiologists. MATERIALS AND METHODS A survey was developed after consultation with BSNR and UKNG representatives. Eligible respondents were current neuroradiology trainees in the UK with at least 3 months of experience, or those who had recently completed neuroradiology training, provided less than 18 months had elapsed since achieving a certificate of completion of training. RESULTS A total of 50 trainees responded to the survey: 26 (52%) diagnostic neuroradiologists (DNRs) and 24 (48%) interventional neuroradiologists (INRs), with an overall mean age of 33 years. The mean duration of training at the time of the survey was 18 months. The survey details trainee demographics, experience at work, research and teaching commitments, and future goals. CONCLUSION Most respondents are satisfied with their training, and 90% want to remain in the UK after completion of training. There is room for improvement, but the future of training and working in neuroradiology seems promising internationally, with ever-evolving techniques and developments. ADVANCES IN KNOWLEDGE This study evaluates neuroradiology training in the UK to enhance the training of future neuroradiologists and safeguard the future of the speciality.
Affiliation(s)
- G Gravino: Department of Neuroradiology, The Walton Centre NHS Trust, Liverpool, UK
- J Patel: Department of Neuroradiology, The Walton Centre NHS Trust, Liverpool, UK
- T Ratneswaren: Department of Radiology, Imperial College NHS Trust, London, UK
- I Craven: Department of Neuroradiology, Leeds Teaching Hospitals NHS Trust, Leeds, UK
- A Chandran: Department of Neuroradiology, The Walton Centre NHS Trust, Liverpool, UK

2. Tang SM, Durieux JC, Faraji N, Mohamed I, Wien M, Nayate AP. "Are They Listening, and Do They Find It Useful?"-Evaluation of Mid-Rotation Formative Subjective and Objective Feedback to Radiology Trainees. Curr Probl Diagn Radiol 2024;53:114-120. [PMID: 37690968] [DOI: 10.1067/j.cpradiol.2023.08.006]
Abstract
BACKGROUND Residents commonly receive only end-of-rotation evaluations and are therefore often unaware of their progress during a rotation. In 2021, our neuroradiology section instituted mid-rotation feedback in which rotating residents received formative subjective and objective feedback. The purpose of this study was to describe our feedback method and to evaluate whether residents found it helpful. METHODS Radiology residents rotate 3-4 times on the neuroradiology service in 1-month blocks. At the midpoint of the rotation (2 weeks), 7-10 neuroradiology attendings discussed the rotating residents' subjective performance; one attending was tasked with facilitating this discussion and taking notes. Objective metrics were obtained from our dictation software. Compiled feedback was relayed to residents via email. A 16-question anonymous survey was sent to 39 radiology residents (R1-R4) to evaluate the perceived value of mid-rotation feedback. Odds ratios (ORs) and 95% confidence intervals (CIs) were computed using logistic regression. RESULTS Sixty-nine percent (27/39) of residents responded to the survey: 92.6% (25/27) reported receiving mid-rotation feedback in ≥50% of neuroradiology rotations; 92.3% (24/26) found the subjective feedback helpful; 88.4% (23/26) reported modifying their performance as suggested (100% R1-R2 vs 70% R3-R4; OR: 15.4, CI: 1.26 to >30.0); 59.1% (13/22) found the objective metrics helpful (75% R1-R2 vs 40% R3-R4; OR: 3.92, CI: 0.74-24.39); 68.2% (15/22) stated they modified their performance based on these metrics (83.3% R1-R2 vs 50.0% R3-R4; OR: 4.2, CI: 0.73-30.55); and 84.6% (22/26) stated that mid-rotation subjective feedback, and 45.5% (10/22) that mid-rotation objective feedback, should be implemented in other sections.
CONCLUSIONS The majority of residents found mid-rotation feedback helpful in informing them about their progress and areas for improvement on the neuroradiology rotation, more so for subjective feedback than objective feedback. The majority of residents stated all rotations should provide mid-rotation subjective feedback.
Affiliation(s)
- Stephen M Tang: Case Western Reserve University School of Medicine, Cleveland, OH
- Jared C Durieux: University Hospitals Cleveland Medical Center, Cleveland, OH
- Navid Faraji: University Hospitals Cleveland Medical Center, Cleveland, OH
- Inas Mohamed: University Hospitals Cleveland Medical Center, Cleveland, OH
- Michael Wien: University Hospitals Cleveland Medical Center, Cleveland, OH
- Ameya P Nayate: University Hospitals Cleveland Medical Center, Cleveland, OH

3. Poyiadji N, Klochko C, Griffith B. Radiology Resident Diagnostic In-Training Exam Scores: Impact of Subspecialty Imaging Volume and Rotation Scheduling. Curr Probl Diagn Radiol 2024;53:111-113. [PMID: 37704488] [DOI: 10.1067/j.cpradiol.2023.08.005]
Abstract
PURPOSE To determine the relationship between resident imaging volumes and number of subspecialty rotations and Diagnostic Radiology In-Training (DXIT) subspecialty scores. METHODS DXIT scaled subspecialty scores from a single large diagnostic radiology training program from 2014 to 2020 were obtained. The cumulative number of imaging studies dictated by each resident and the specific rotations were mapped to each subspecialty for each year of training. DXIT subspecialty scores were compared against the total subspecialty imaging volume and the total number of rotations in a subspecialty for each resident year. A total of 52 radiology residents were trained during the study period and included in the dataset. RESULTS There was a positive linear relationship between the number of neuro studies and scaled neuro DXIT scores for R1s (Pearson coefficient: 0.29; p = 0.034) and between the numbers of breast and neuro studies and DXIT scores for R2s (Pearson coefficients: 0.50 and 0.45; p = 0.001 and 0.003, respectively). Furthermore, there was a significant positive linear relationship between the total number of rotations in the cardiac, breast, neuro, and thoracic subspecialties and their scaled DXIT scores for R2 residents (Pearson coefficients: 0.34, 0.49, 0.33, and 0.32; p = 0.025, 0.001, 0.03, and 0.036, respectively) and between the total number of nuclear medicine rotations and DXIT scores for R3s (Pearson coefficient: 0.41; p = 0.016). CONCLUSION Resident subspecialty imaging volumes and rotations have a variable impact on DXIT scores. Understanding the impact of study volume and the number of subspecialty rotations on resident medical knowledge will help residents and program directors determine how much emphasis to place on these factors during residency.
Affiliation(s)
- Neo Poyiadji: Department of Radiology, Henry Ford Hospital, Detroit, MI
- Chad Klochko: Department of Radiology, Henry Ford Hospital, Detroit, MI
- Brent Griffith: Department of Radiology, Henry Ford Hospital, Detroit, MI; Michigan State University College of Human Medicine, East Lansing, MI

4. Hofmeijer EIS, Wu SC, Vliegenthart R, Slump CH, van der Heijden F, Tan CO. Artificial CT images can enhance variation of case images in diagnostic radiology skills training. Insights Imaging 2023;14:186. [PMID: 37934344] [PMCID: PMC10630276] [DOI: 10.1186/s13244-023-01508-4]
Abstract
OBJECTIVES We sought to investigate whether artificial medical images can blend in with original ones and whether they adhere to the variable anatomical constraints provided. METHODS Artificial images were generated with a generative model trained on publicly available standard and low-dose chest CT images (805 scans; 39,803 2D images), of which 17% contained evidence of pathological formations (lung nodules). The test set (90 scans; 5,121 2D images) was used to assess whether artificial images (512 × 512 primary and control image sets) blended in with original images, using both quantitative metrics and expert opinion. We further assessed whether pathology characteristics in the artificial images can be manipulated. RESULTS Primary and control artificial images attained an average objective similarity of 0.78 ± 0.04 and 0.76 ± 0.06, respectively (on a scale from 0 [entirely dissimilar] to 1 [identical]). Five radiologists with experience in chest and thoracic imaging provided a subjective measure of image quality; they rated artificial images as 3.13 ± 0.46 (on a scale from 1 [unrealistic] to 4 [almost indistinguishable from the original image]), close to their rating of the original images (3.73 ± 0.31). Radiologists clearly distinguished images in the control sets (2.32 ± 0.48 and 1.07 ± 0.19). In almost a quarter of the scenarios, they were not able to distinguish primary artificial images from the original ones. CONCLUSION Artificial images can be generated such that they blend in with original images and adhere to anatomical constraints, which can be manipulated to augment the variability of cases. CRITICAL RELEVANCE STATEMENT Artificial medical images can be used to enhance the availability and variety of medical training images by creating new but comparable images that can blend in with original images. KEY POINTS • Artificial images, similar to original ones, can be created using generative networks.
• Pathological features of artificial images can be adjusted by guiding the network. • Artificial images proved viable for broadening and deepening diagnostic training.
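The abstract reports a 0-to-1 objective similarity index but does not name the metric; structural similarity (SSIM) is one common choice for this kind of comparison. As an illustrative sketch only (a single global window; real evaluations use a sliding-window SSIM over local patches, and the function name and test images below are hypothetical):

```python
import numpy as np

def global_ssim(x, y, data_range=1.0):
    """Minimal single-window SSIM between two images scaled to [0, data_range].

    Illustrative stand-in for the unnamed similarity metric in the abstract;
    returns a value between -1 and 1, with 1 meaning identical images.
    """
    c1 = (0.01 * data_range) ** 2  # stabilising constants from the SSIM paper
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2)
    )

# Hypothetical 512 x 512 "CT slices": random stand-ins, not real scans.
rng = np.random.default_rng(0)
original = rng.random((512, 512))
identical = original.copy()
noisy = np.clip(original + 0.1 * rng.standard_normal((512, 512)), 0.0, 1.0)

sim_same = global_ssim(original, identical)   # close to 1.0 by construction
sim_noisy = global_ssim(original, noisy)      # strictly lower than sim_same
print(sim_same, sim_noisy)
```

A score of 0.78, as reported for the primary artificial set, would sit well above the control sets on such an index.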
Affiliation(s)
- Elfi Inez Saïda Hofmeijer: Robotics and Mechatronics, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands
- Sheng-Chih Wu: Robotics and Mechatronics, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands
- Rozemarijn Vliegenthart: Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands; Data Science Center in Health (DASH), University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Cornelis Herman Slump: Robotics and Mechatronics, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands
- Ferdi van der Heijden: Robotics and Mechatronics, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands
- Can Ozan Tan: Robotics and Mechatronics, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands

5. Lees AF, Beni C, Lee A, Wedgeworth P, Dzara K, Joyner B, Tarczy-Hornoch P, Leu M. Uses of Electronic Health Record Data to Measure the Clinical Learning Environment of Graduate Medical Education Trainees: A Systematic Review. Acad Med 2023;98:1326-1336. [PMID: 37267042] [PMCID: PMC10615720] [DOI: 10.1097/acm.0000000000005288]
Abstract
PURPOSE This study systematically reviews the uses of electronic health record (EHR) data to measure graduate medical education (GME) trainee competencies. METHOD In January 2022, the authors conducted a systematic review of original research in MEDLINE from database start to December 31, 2021. The authors searched for articles that used the EHR as their data source and in which the individual GME trainee was the unit of observation and/or unit of analysis. The database query was intentionally broad because an initial survey of pertinent articles identified no unifying Medical Subject Heading terms. Articles were coded and clustered by theme and Accreditation Council for Graduate Medical Education (ACGME) core competency. RESULTS The database search yielded 3,540 articles, of which 86 met the study inclusion criteria. Articles clustered into 16 themes, the largest of which were trainee condition experience (17 articles), work patterns (16 articles), and continuity of care (12 articles). Five of the ACGME core competencies were represented (patient care and procedural skills, practice-based learning and improvement, systems-based practice, medical knowledge, and professionalism). In addition, 25 articles assessed the clinical learning environment. CONCLUSIONS This review identified 86 articles that used EHR data to measure individual GME trainee competencies, spanning 16 themes and 6 competencies and revealing marked between-trainee variation. The authors propose a digital learning cycle framework that arranges sequentially the uses of EHR data within the cycle of clinical experiential learning central to GME. Three technical components necessary to unlock the potential of EHR data to improve GME are described: measures, attribution, and visualization. Partnerships between GME programs and informatics departments will be pivotal in realizing this opportunity.
Affiliation(s)
- A. Fischer Lees: clinical informatics fellow, Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington
- Catherine Beni: general surgery resident, Department of Surgery, University of Washington School of Medicine, Seattle, Washington
- Albert Lee: clinical informatics fellow, Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington
- Patrick Wedgeworth: clinical informatics fellow, Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington
- Kristina Dzara: assistant dean for educator development; director, Center for Learning and Innovation in Medical Education; associate professor of medical education, Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington
- Byron Joyner: vice dean for graduate medical education and designated institutional official, Graduate Medical Education, University of Washington School of Medicine, Seattle, Washington
- Peter Tarczy-Hornoch: professor and chair, Department of Biomedical Informatics and Medical Education; professor, Department of Pediatrics (Neonatology), University of Washington School of Medicine; adjunct professor, Allen School of Computer Science and Engineering, University of Washington, Seattle, Washington
- Michael Leu: professor and director, Clinical Informatics Fellowship, Department of Biomedical Informatics and Medical Education; professor, Department of Pediatrics, University of Washington School of Medicine, Seattle, Washington

6. Gordon EB, Wingrove P, Branstetter BF IV, Hughes MA. Evidence for an adverse impact of remote readouts on radiology resident productivity: Implications for training and clinical practice. PLOS Digit Health 2023;2:e0000332. [PMID: 37738228] [PMCID: PMC10516412] [DOI: 10.1371/journal.pdig.0000332]
Abstract
After their rapid adoption at the onset of the coronavirus pandemic, remote case reviews (remote readouts) between diagnostic radiology residents and their attendings have persisted in an increasingly remote workforce, despite relaxing social distancing guidelines. Our objective was to evaluate the impact of the transition to remote readouts on resident case volumes after the recovery of institutional volumes. We tabulated radiology reports co-authored by first- to third-year radiology residents (R1-R3) between July 1 and December 31 of the first pandemic year, 2020, and compared them to the prior two pre-pandemic years. Half-years were analyzed because institutional volumes had recovered by July 2020. Resident volumes were normalized to rotations, which were in divisions categorized by the location of the supervising faculty during the pandemic period; in 'remote' divisions, all faculty worked off-site, whereas 'hybrid' divisions had a mix of attendings working on-site and remotely. All residents worked on-site. Data analysis was performed with Student's t test and multivariate linear regression. The largest drops in total case volume occurred in the two remote divisions (38% [6,086 to 3,788] and 26% [11,046 to 8,149]). None of the hybrid divisions with both in-person and remote supervision decreased by more than 5%. By multivariate regression, a resident assigned to a standardized remote rotation in 2020 would complete 32% (253 to 172) fewer studies than in identical pre-pandemic rotations (coefficient of -81.6, p = .005), whereas hybrid rotations would be similar. R1 residents would be expected to interpret 40% fewer (180 to 108) cases on remote rotations during the pandemic (coefficient of -72.3, p = .007). No significant effect was seen for R2 or R3 residents (p = .099 and p = .29, respectively). Radiology residents interpreted fewer studies during remote rotations than on hybrid rotations that included in-person readouts.
As resident case volume is correlated with clinical performance and board pass rate, monitoring the readout model for downstream educational effects is essential. Until evidence shows that educational outcomes remain unchanged, radiology residencies may wish to preserve in-person resident readouts, particularly for junior residents.
Affiliation(s)
- Emile B. Gordon: Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, United States of America
- Peter Wingrove: Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, United States of America
- Barton F. Branstetter IV: Departments of Radiology and Otolaryngology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, United States of America
- Marion A. Hughes: Departments of Radiology and Otolaryngology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, United States of America

7. Gunderman P, Gunderman D. There Is No Optimal Case Count for Passing the ABR Core Exam. Acad Radiol 2023;30:1010. [PMID: 36882353] [DOI: 10.1016/j.acra.2023.01.032]

8. Artificial Intelligence for Personalised Ophthalmology Residency Training. J Clin Med 2023;12:jcm12051825. [PMID: 36902612] [PMCID: PMC10002549] [DOI: 10.3390/jcm12051825]
Abstract
Residency training in medicine lays the foundation for future medical doctors. In real-world settings, training centers face challenges in trying to create balanced residency programs, with cases encountered by residents not always being fairly distributed among them. In recent years, there has been a tremendous advancement in developing artificial intelligence (AI)-based algorithms with human expert guidance for medical imaging segmentation, classification, and prediction. In this paper, we turned our attention from training machines to letting them train us and developed an AI framework for personalised case-based ophthalmology residency training. The framework is built on two components: (1) a deep learning (DL) model and (2) an expert-system-powered case allocation algorithm. The DL model is trained on publicly available datasets by means of contrastive learning and can classify retinal diseases from color fundus photographs (CFPs). Patients visiting the retina clinic will have a CFP performed and afterward, the image will be interpreted by the DL model, which will give a presumptive diagnosis. This diagnosis is then passed to a case allocation algorithm which selects the resident who would most benefit from the specific case, based on their case history and performance. At the end of each case, the attending expert physician assesses the resident's performance based on standardised examination files, and the results are immediately updated in their portfolio. Our approach provides a structure for future precision medical education in ophthalmology.
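The abstract describes the allocation step only at a high level (select the resident "who would most benefit from the specific case, based on their case history and performance"). One plausible reading is a greedy heuristic; the paper's actual expert-system rules are not given here, so the scoring key, field names, and example diagnoses below are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Resident:
    name: str
    # diagnosis -> number of cases already seen (hypothetical portfolio)
    case_history: dict = field(default_factory=dict)
    # diagnosis -> mean assessment score on a 0-1 scale (hypothetical)
    performance: dict = field(default_factory=dict)

def allocate_case(residents, diagnosis):
    """Greedy allocation sketch: favour the resident with the least exposure
    to this diagnosis, breaking ties by the lowest assessment score."""
    return min(
        residents,
        key=lambda r: (r.case_history.get(diagnosis, 0),
                       r.performance.get(diagnosis, 0.0)),
    )

residents = [
    Resident("A", {"diabetic retinopathy": 5}, {"diabetic retinopathy": 0.9}),
    Resident("B", {"diabetic retinopathy": 1}, {"diabetic retinopathy": 0.6}),
    Resident("C", {}, {}),
]
chosen = allocate_case(residents, "diabetic retinopathy")
print(chosen.name)  # resident C: no prior exposure to this diagnosis
```

In the paper's framework the diagnosis fed into such a step would come from the DL model's presumptive classification of the fundus photograph, and the portfolio would be updated after the attending's standardised assessment.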
9. Morgan DE. Use of Attending Radiologist Reviews of Resident Clinical Performance to Predict Outcomes on the American Board of Radiology Qualifying (Core) Exam: A Call to Action. Acad Radiol 2022;29:1727-1729. [PMID: 36050263] [DOI: 10.1016/j.acra.2022.07.024]
Affiliation(s)
- Desiree E Morgan: University of Alabama at Birmingham, Department of Radiology, JTN456, 619 South 19th Street, Birmingham, AL 35249

10. Horn GL, Masood I, Heymann JC, Saleem A, Nguyen QD. Attending Reviews of Residents Correlate with ABR Qualifying (Core) Examination Failure. Acad Radiol 2022;29:1723-1726. [PMID: 35232656] [DOI: 10.1016/j.acra.2022.01.002]
Abstract
RATIONALE AND OBJECTIVES Since the American Board of Radiology (ABR) instituted the new system of board certification, there has been much discussion of the test's validity. We evaluated whether subjective evaluation of resident performance correlated with ABR Qualifying (Core) Examination performance at a single institution. MATERIALS AND METHODS Data on resident evaluation scores by attending physicians and on passage of board examinations were gathered for the 42 residents who took the ABR Qualifying (Core) Examination from 2013 through 2019, eight of whom failed on their first attempt. A univariate analysis comparing scores against first-attempt passage or failure was performed, along with analyses correcting for class year only and for class year and number of evaluations. RESULTS The non-weighted average evaluation score over years 1, 2, and 3 was 80.24% for those who failed the ABR Qualifying (Core) Examination and 83.71% for those who passed. On univariate analysis, and on the analyses correcting for class year only and for class year plus number of evaluations, there was a statistically significant association between lower evaluation scores averaged over the three years of residency and failure of the ABR Qualifying (Core) Examination (p = 0.0102, p = 0.003, and p = 0.0043, respectively). The statistical significance held for the average numerical score in each individual year of training in all analyses except year 1 of the univariate analysis (p = 0.1264). CONCLUSION At the studied institution, there was a statistically significant correlation between lower subjective faculty evaluation scores and failure of the ABR Qualifying (Core) Examination.
Affiliation(s)
- Gary Lloyd Horn: Baylor College of Medicine, Department of Radiology, One Baylor Plaza, Houston, Texas 77573
- Irfan Masood: University of Texas Medical Branch at Galveston, Department of Radiology, Galveston, Texas
- John C Heymann: University of Texas Medical Branch at Galveston, Department of Radiology, Galveston, Texas
- Arsalan Saleem: University of Texas Medical Branch at Galveston, Department of Radiology, Galveston, Texas
- Quan Dang Nguyen: University of Texas Medical Branch at Galveston, Department of Radiology, Galveston, Texas

11. Davenport MS, Higgins EJ, Weinstein S. Physician Extenders in Radiology Education. J Am Coll Radiol 2022;19:754-756. [DOI: 10.1016/j.jacr.2022.03.004]

12. Freeman CW, Dhanaliwala A, Moore S, Kunchala S, Scanlon MH. Homeward Bound: A Comparison of Resident Case Volume on Home-Read Workstations and On-Site during the COVID-19 Pandemic. J Am Coll Radiol 2022;19:476-479. [PMID: 35123956] [PMCID: PMC8786630] [DOI: 10.1016/j.jacr.2021.12.002]

13. Jewell C, Kraut A, Miller D, Ray K, Werley E, Schnapp B. Metrics of Resident Achievement for Defining Program Aims. West J Emerg Med 2022;23:1-8. [PMID: 35060852] [PMCID: PMC8782131] [DOI: 10.5811/westjem.2021.12.53554]
Abstract
Introduction Resident achievement data are a powerful but underutilized means of program evaluation, allowing programs to empirically measure whether they are meeting their program aims, refine curricula, and improve resident recruitment efforts. Our goal was to provide an overview of available metrics of resident achievement and of how these metrics can be used to inform program aims. Methods A literature search was performed using PubMed and Google Scholar between May and November of 2020. Publications were eligible for inclusion if they discussed or assessed "excellence" or "success" during residency training. A narrative review structure was chosen because the intention was to examine the literature on available resident achievement metrics. Results Fifty-seven publications met the inclusion criteria and were included in the review. Metrics of excellence were grouped into larger categories, including success defined by program factors, academics, national competencies, employer factors, and possible new metrics. Conclusions Programs can best evaluate whether they are meeting their program aims by creating a list of important resident-level metrics based on their stated goals and values, using one or more of the published definitions as a foundation. Each program must define which metrics align best with its individual program aims and mission.
Affiliation(s)
- Corlin Jewell: University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Aaron Kraut: University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Danielle Miller: University of Colorado School of Medicine, Department of Emergency Medicine, Aurora, Colorado
- Kaitlin Ray: University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Elizabeth Werley: Penn State College of Medicine, Department of Emergency Medicine, Hershey, Pennsylvania
- Benjamin Schnapp: University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin

14. Kwan BYM, Mbanwi A, Cofie N, Rogoza C, Islam O, Chung AD, Dalgarno N, Dagnone D, Wang X, Mussari B. Creating a Competency-Based Medical Education Curriculum for Canadian Diagnostic Radiology Residency (Queen’s Fundamental Innovations in Residency Education)—Part 1: Transition to Discipline and Foundation of Discipline Stages. Can Assoc Radiol J 2021;72:372-380. [DOI: 10.1177/0846537119894723]
Abstract
Purpose: The Royal College of Physicians and Surgeons of Canada (RCPSC) has mandated the transition of postgraduate medical training in Canada to a competency-based medical education (CBME) model divided into 4 stages of training. As part of the Queen’s University Fundamental Innovations in Residency Education proposal, Queen’s University in Canada is the first institution to transition all of its residency programs simultaneously to this model, including Diagnostic Radiology. The objective of this report is to describe the Queen’s Diagnostic Radiology Residency Program’s implementation of a CBME curriculum. Methods: At Queen’s University, the novel curriculum was developed using the RCPSC’s competency continuum and the CanMEDS framework to create radiology-specific entrustable professional activities (EPAs) and milestones. In addition, new committees and assessment strategies were established. As of July 2015, 3 cohorts of residents (n = 9) have been enrolled in this new curriculum. Results: EPAs, milestones, and methods of evaluation for the Transition to Discipline and Foundations of Discipline stages, as well as the opportunities and challenges associated with the implementation of a competency-based curriculum in a Diagnostic Radiology Residency Program, are described. Challenges include the increased frequency of resident assessments, establishing stage-specific learner expectations, and the creation of volumetric guidelines for case reporting and procedures. Conclusions: Development of a novel CBME curriculum requires significant resources and dedicated administrative time within an academic Radiology department. This article highlights challenges and provides guidance for this process.
Affiliation(s)
- Benjamin Yin Ming Kwan, Department of Diagnostic Radiology, Kingston Health Sciences Centre, Kingston, Ontario, Canada
- Achire Mbanwi, Queen’s University Faculty of Health Sciences, Kingston, Ontario, Canada
- Nicholas Cofie, Queen’s University Faculty of Health Sciences, Kingston, Ontario, Canada
- Christina Rogoza, Queen’s University Faculty of Health Sciences, Kingston, Ontario, Canada
- Omar Islam, Department of Diagnostic Radiology, Kingston Health Sciences Centre, Kingston, Ontario, Canada
- Andrew D. Chung, Department of Diagnostic Radiology, Kingston Health Sciences Centre, Kingston, Ontario, Canada
- Nancy Dalgarno, Queen’s University Faculty of Health Sciences, Kingston, Ontario, Canada
- Damon Dagnone, Department of Emergency Medicine, Kingston Health Sciences Centre, Kingston, Ontario, Canada
- Xi Wang, Department of Diagnostic Radiology, Kingston Health Sciences Centre, Kingston, Ontario, Canada
- Ben Mussari, Department of Diagnostic Radiology, Kingston Health Sciences Centre, Kingston, Ontario, Canada

15
Richards TJ, Schmitt JE, Wolansky LJ, Nayate AP. Radiology Performed Fluoroscopy-Guided Lumbar Punctures Decrease Volume of Diagnostic Study Interpretation - Impact on Resident Training and Potential Solutions. J Clin Imaging Sci 2021; 11:39. [PMID: 34345529] [PMCID: PMC8326109] [DOI: 10.25259/jcis_2_2021]
Abstract
Objectives: Lumbar punctures performed in radiology departments have increased significantly over the last few decades and are typically performed in academic centers by radiology trainees using fluoroscopy guidance. Performing fluoroscopy-guided lumbar punctures (FGLPs) can often constitute a large portion of a trainee’s workday, and the impact of performing FGLPs on the trainee’s clinical productivity (i.e., dictating reports on neuroradiology cross-sectional imaging) has not been studied. The purpose of the study was to evaluate the relationship between the number of FGLPs performed and cross-sectional neuroimaging studies dictated by residents during their neuroradiology rotation (NR). Material and Methods: The number of FGLPs and myelograms performed and neuroimaging studies dictated by radiology residents on our neuroradiology service from July 2008 to December 2017 were retrospectively reviewed. The relationship between the number of FGLPs performed and neuroimaging studies (CT and MRI) dictated per day by residents was examined. Results: Radiology residents (n = 84) performed 3437 FGLPs and myelograms and interpreted 33,402 cross-sectional studies. Poisson regression demonstrated an exponential decrease in the number of studies dictated daily with a rising number of FGLPs performed (P = 0.0001), and the following formula was derived: number of expected studies dictated per day assuming no FGLPs × e^(−0.25 × number of FGLPs) = adjusted expected studies dictated for the day. Conclusion: We quantified the impact performing FGLPs can have on the number of neuroimaging reports residents dictate on the NR. We described solutions to potentially decrease unnecessary FGLP referrals, including establishing departmental guidelines for FGLP referrals and encouraging bedside lumbar puncture attempts before referral. We also emphasized equally distributing the FGLPs among trainees to mitigate procedural burden.
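The exponential relationship this abstract reports can be sketched numerically (a minimal illustration; the function name and the baseline of 12 dictations per day are our assumptions, while the −0.25 coefficient is the paper's fitted estimate):

```python
import math

def adjusted_expected_studies(baseline_per_day: float, n_fglps: int) -> float:
    """Apply the derived Poisson-regression relationship: expected
    dictations fall exponentially as the number of FGLPs rises."""
    return baseline_per_day * math.exp(-0.25 * n_fglps)

# Hypothetical resident who would dictate 12 studies on a procedure-free day:
for n in range(4):
    print(n, round(adjusted_expected_studies(12.0, n), 1))
```

Each additional FGLP multiplies the expected daily dictation count by e^(−0.25), roughly a 22% reduction per procedure under this model.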
Affiliation(s)
- Tyler John Richards, Department of Radiology, University of Utah, Salt Lake City, Utah, United States
- James Eric Schmitt, Department of Penn Radiology, Hospital of the University of Pennsylvania, Philadelphia, Pennsylvania, United States
- Leo J Wolansky, Department of Diagnostic Imaging and Therapeutics, University of Connecticut School of Medicine, Farmington, Connecticut, United States
- Ameya P Nayate, Department of Radiology, University Hospitals Cleveland Medical Center, Cleveland, Ohio, United States

16
Geanacopoulos AT, Sundheim KM, Greco KF, Michelson KA, Parsons CR, Hron JD, Winn AS. Pediatric Intern Clinical Exposure During the COVID-19 Pandemic. Hosp Pediatr 2021; 11:e106-e110. [PMID: 33863816] [PMCID: PMC8650022] [DOI: 10.1542/hpeds.2021-005899]
Abstract
BACKGROUND AND OBJECTIVES Pediatric health care encounters declined during the coronavirus disease 2019 (COVID-19) pandemic, and pediatric residency programs have adapted trainee schedules to meet the needs of this changing clinical environment. We sought to evaluate the impact of the pandemic on pediatric interns' clinical exposure. METHODS In this retrospective cohort study, we quantified patient exposure among pediatric interns from a single large pediatric residency program at a freestanding children's hospital. Patient encounters and shifts per pediatric intern in the inpatient and emergency department settings were evaluated during the COVID-19 pandemic, from March to June 2020, as compared with these 3 months in 2019. Patient encounters by diagnosis were also evaluated. RESULTS The median number of patient encounters per intern per 2-week block declined on the pediatric hospital medicine service (37.5 vs 27.0; P < .001) and intensive care step-down unit (29.0 vs 18.8; P = .004) during the pandemic. No significant difference in emergency department encounters was observed (63.0 vs 40.5; P = .06). The median number of shifts worked per intern per 2-week block also decreased on the pediatric hospital medicine service (10.5 vs 9.5, P < .001). Across all settings, there were more encounters for screening for infectious disease and fewer encounters for respiratory illnesses. CONCLUSIONS Pediatric interns at the onset of the COVID-19 pandemic were exposed to fewer patients and had reduced clinical schedules. Careful consideration is needed to track and supplement missed clinical experiences during the pandemic.
Affiliation(s)
- Kenneth A Michelson, Department of Pediatrics, Division of Emergency Medicine, Boston Children's Hospital, Boston, Massachusetts

17
Case Volume Analysis of Neurological Surgery Training Programs in the United States: 2017-2019. Neurosurgery Open 2021. [DOI: 10.1093/neuopn/okaa017]
18
Quality and Safety in Healthcare, Part LXIV. Clin Nucl Med 2020; 45:954-956. [DOI: 10.1097/rlu.0000000000002932]
19
Kwan BYM, Mussari B, Moore P, Meilleur L, Islam O, Menard A, Soboleski D, Cofie N. A Pilot Study on Diagnostic Radiology Residency Case Volumes From a Canadian Perspective: A Marker of Resident Knowledge. Can Assoc Radiol J 2020; 71:490-494. [PMID: 32037849] [DOI: 10.1177/0846537119899227]
Abstract
Purpose: New guidelines from the Accreditation Council for Graduate Medical Education (ACGME) have proposed minimum case volumes to be obtained during residency. While radiology residency programs in Canada are accredited by the Royal College of Physicians and Surgeons of Canada, there are currently no minimum case volume standards for radiology residency training in Canada. New changes in residency training throughout Canada are coming in the form of competency-based medical education. Using data from a pilot study, this article examines radiology resident case volumes among recently graduated cohorts of residents and determines whether there is a correlation between case volumes and measures of resident success. Materials and Methods: Resident case volumes for 3 cohorts of graduated residents (2016-2018) were extracted from the institutional database. Achievement of minimum case volumes based on the ACGME guidelines was assessed for each resident. Pearson correlation analysis (n = 9) was performed to examine the relationships between resident case volumes and markers of resident success, including residents’ relative knowledge ranking and their American College of Radiology (ACR) in-training exam scores. Results: A statistically significant, positive correlation was observed between residents’ case volume and their relative knowledge ranking (r = 0.682, P < .05). Residents’ relative knowledge ranking was also significantly and positively correlated with their ACR in-training percentile score (r = 0.715, P < .05). Conclusions: This study suggests that residents who interpret more cases are more likely to demonstrate higher knowledge, thereby highlighting the utility of case volumes as a prognostic marker of resident success. As well, the results underscore the potential use of ACGME minimum case volumes as a prognostic marker. These findings can inform future curriculum planning and development in radiology residency training programs.
Affiliation(s)
- Benjamin Y. M. Kwan, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Benedetto Mussari, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Pam Moore, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Lynne Meilleur, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Omar Islam, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Alexandre Menard, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Don Soboleski, Department of Radiology, School of Medicine, Queen’s University, Kingston, Ontario, Canada
- Nicholas Cofie, Faculty of Health Sciences, Queen’s University, Kingston, Ontario, Canada

20
Duong MT, Rauschecker AM, Mohan S. Diverse Applications of Artificial Intelligence in Neuroradiology. Neuroimaging Clin N Am 2020; 30:505-516. [PMID: 33039000] [DOI: 10.1016/j.nic.2020.07.003]
Abstract
Recent advances in artificial intelligence (AI) and deep learning (DL) hold promise to augment neuroimaging diagnosis for patients with brain tumors and stroke. Here, the authors review the diverse landscape of emerging neuroimaging applications of AI, including workflow optimization, lesion segmentation, and precision education. Given the many modalities used in diagnosing neurologic diseases, AI may be deployed to integrate across modalities (MR imaging, computed tomography, PET, electroencephalography, clinical and laboratory findings), facilitate crosstalk among specialists, and potentially improve diagnosis in patients with trauma, multiple sclerosis, epilepsy, and neurodegeneration. Together, there are myriad applications of AI for neuroradiology.
Affiliation(s)
- Michael Tran Duong, Department of Radiology, Perelman School of Medicine at the University of Pennsylvania, 3400 Spruce Street, 219 Dulles Building, Philadelphia, PA 19104, USA. https://twitter.com/MichaelDuongMD
- Andreas M Rauschecker, Department of Radiology & Biomedical Imaging, University of California, San Francisco, 513 Parnassus Avenue, Room S-261, San Francisco, CA 94143, USA. https://twitter.com/DrDreMDPhD
- Suyash Mohan, Department of Radiology, Perelman School of Medicine at the University of Pennsylvania, 3400 Spruce Street, 219 Dulles Building, Philadelphia, PA 19104, USA

21
Woznitza N, Steele R, Groombridge H, Compton E, Gower S, Hussain A, Norman H, O'Brien A, Robertson K. Clinical reporting of radiographs by radiographers: Policy and practice guidance for regional imaging networks. Radiography (Lond) 2020; 27:645-649. [PMID: 32814647] [DOI: 10.1016/j.radi.2020.08.004]
Abstract
OBJECTIVES Radiographer reporting is an essential component of imaging across the United Kingdom. Since the previous policy and practice guidance in 2004, the role and contribution of reporting radiographers have changed significantly. The move to imaging networks further reinforces the need for consistency in scope of practice and clinical governance for radiographer reporting. KEY FINDINGS This guidance provides a consistent, evidence-based template for planning a reporting service, covering resourcing, clinical governance, preceptorship, volume and frequency of reporting, a peer learning framework, and expected standards. CONCLUSION Developed for North Central and East London, this framework and its standards will help reduce unwarranted variation. IMPLICATIONS FOR PRACTICE Consistency in practice could help maximise the contribution of radiographer reporting.
Affiliation(s)
- N Woznitza, Radiology Department, Homerton University Hospital, UK; School of Allied and Public Health Professions, Canterbury Christ Church University, UK; North Central and East London Cancer Alliance, UK; Health Education England, London, UK
- R Steele, North Central and East London Cancer Alliance, UK; Radiology Department, University College London Hospitals, UK
- H Groombridge, Radiology Department, University College London Hospitals, UK
- E Compton, Radiology Department, Guys & St Thomas' Hospitals, UK
- S Gower, Radiology Department, Kings College Hospitals, UK
- A Hussain, North Central and East London Cancer Alliance, UK
- H Norman, North Central and East London Cancer Alliance, UK
- A O'Brien, Radiology Department, Kings College Hospitals, UK
- K Robertson, NHS England and Improvement, London, UK; South East London Cancer Alliance, UK

22
Patel MD, Tomblinson CM, Benefield T, Ali K, DeBenedectis CM, England E, Gaviola GC, Ho CP, Jay AK, Milburn JM, Ong S, Robbins JB, Sarkany DS, Heitkamp DE, Jordan SG. The Relationship Between US Medical Licensing Examination Step Scores and ABR Core Examination Outcome and Performance: A Multi-institutional Study. J Am Coll Radiol 2020; 17:1037-1045. [PMID: 32220580] [DOI: 10.1016/j.jacr.2020.02.017]
Abstract
PURPOSE We analyzed multi-institutional data to understand the relationship of US Medical Licensing Examination (USMLE) Step scores to ABR Core examination performance to identify Step score tiers that stratify radiology residents into different Core performance groups. METHODS We collected USMLE Step scores and ABR Core examination outcomes and scores for anonymized residents from 13 different diagnostic radiology residency programs taking the ABR Core examination between 2013 and 2019. USMLE scores were grouped into noniles using z scores and then aggregated into three tiers based on similar Core examination pass-or-fail outcomes. Core performance was grouped using standard deviation from the mean and then measured by the percent of residents with scores below the mean. Differences between Step tiers for Core outcome and Core performance were statistically evaluated (P < .05 considered significant). RESULTS Differences in Step 1 terciles Core failure rates (45.9%, 11.9%, and 3.0%, from lowest to highest Step tiers; n = 416) and below-mean Core performance (83.8%, 54.1%, and 21.1%, respectively; n = 402) were significant. Differences in Step 2 groups Core failure rates (30.0%, 10.6%, and 2.0%, from lowest to highest Step tiers; n = 387) and below-mean Core performance (80.0%, 43.7%, and 14.0%, respectively; n = 380) were significant. Step 2 results modified Core outcome and performance predictions for residents in Step 1 terciles of varying statistical significance. CONCLUSIONS Tiered scoring of USMLE Step results has value in predicting radiology resident performance on the ABR Core examination; effective stratification of radiology resident applicants can be done without reporting numerical Step scores.
Affiliation(s)
- Maitray D Patel, Executive Board, Society of Radiologists in Ultrasound; Department of Radiology, Mayo Clinic Arizona, Phoenix, Arizona
- Courtney M Tomblinson, Associate Program Director, Diagnostic Radiology Residency; Associate Director, Women in Radiology, Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center, Nashville, Tennessee
- Thad Benefield, Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, North Carolina
- Kamran Ali, Program Director, Diagnostic Radiology Residency; President, Radiology Group, Department of Radiology, University of Kansas School of Medicine, Wichita, Kansas
- Carolynn M DeBenedectis, Vice Chair for Education; Program Director, Radiology Residency Program, Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts
- Eric England, Vice Chair of Education, Department of Radiology; Program Director, Diagnostic Radiology Residency; Jerome F. Wiot Endowed Chair of Radiology Residency Education, University of Cincinnati Medical Center, Cincinnati, Ohio
- Glenn C Gaviola, Program Director, Diagnostic Radiology Residency, Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Christopher P Ho, Program Director, Diagnostic Radiology Residency, Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia
- Ann K Jay, Vice Chair of Education and Program Director, Diagnostic Radiology Residency, Department of Radiology, MedStar Georgetown University Hospital, Washington, DC
- James M Milburn, Vice Chair of Radiology; Section Head, Neuroradiology; Program Director, Diagnostic Radiology Residency; ACR Louisiana State Councilor; Department of Radiology, Ochsner Clinic Foundation, New Orleans, Louisiana
- Seng Ong, Program Director, Diagnostic Radiology Residency, Department of Radiology, University of Chicago Medical Center, Chicago, Illinois
- Jessica B Robbins, Vice Chair of Faculty Development and Enrichment; Associate Program Director, Diagnostic Radiology and Integrated Diagnostic/Interventional Radiology Residencies, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- David S Sarkany, Diagnostic Radiology Program Director, Department of Radiology, Staten Island University Hospital Northwell Health, Staten Island, New York
- Sheryl G Jordan, Education Director, Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, North Carolina

23
Nickerson JP, Koski C, Anderson JC, Beckett B, Jackson VP. Correlation Between Radiology ACGME Case Logs Values and ABR Core Exam Pass Rate. Acad Radiol 2020; 27:269-273. [PMID: 31694780] [DOI: 10.1016/j.acra.2019.10.004]
Abstract
RATIONALE AND OBJECTIVES There is discordance between the American Board of Radiology (ABR) and many radiology trainees with respect to the most appropriate means to prepare for the ABR Core Examination. Whereas the ABR suggests that participation in routine clinical examination interpretation best prepares a trainee for the practical material of the test, residents and many program directors feel that time away from clinical service for study and review courses is necessary. This study examines the relationship between studies interpreted in the first three years of residency, as reported in the Accreditation Council for Graduate Medical Education (ACGME) case logs, and performance of first-time test takers on the ABR Core Examination. MATERIALS AND METHODS ACGME case log data were anonymized for a single-year cohort of residents in all accredited radiology residencies. These data were then provided to the ABR and matched with performance on the Core Examination. A random effects logistic regression model was used to evaluate the relationship between the number of examinations read and the pass/fail status of the Core Exam. RESULTS Modeling using a linear and a quadratic term yields a significant relationship between case log values and Core Exam performance. There is a positive correlation until an inflection point of approximately 11,000 examinations, at which point a negative correlation develops. CONCLUSION The data support that active engagement in clinical duties is associated with better performance on the ABR Core Examination, with the caveat that there appears to be a point at which service outweighs educational value. Beyond this point, performance on the examination declines.
24
ACGME Case Log Values Correlate with Performance on ABR Core Exam. Acad Radiol 2020; 27:274-275. [PMID: 31759797] [DOI: 10.1016/j.acra.2019.10.014]
25
Duong MT, Rauschecker AM, Rudie JD, Chen PH, Cook TS, Bryan RN, Mohan S. Artificial intelligence for precision education in radiology. Br J Radiol 2019; 92:20190389. [PMID: 31322909] [DOI: 10.1259/bjr.20190389]
Abstract
In the era of personalized medicine, the emphasis of health care is shifting from populations to individuals. Artificial intelligence (AI) is capable of learning without explicit instruction and has emerging applications in medicine, particularly radiology. Whereas much attention has focused on teaching radiology trainees about AI, here our goal is to instead focus on how AI might be developed to better teach radiology trainees. While the idea of using AI to improve education is not new, the application of AI to medical and radiological education remains very limited. Based on the current educational foundation, we highlight an AI-integrated framework to augment radiology education and provide use case examples informed by our own institution's practice. The coming age of "AI-augmented radiology" may enable not only "precision medicine" but also what we describe as "precision medical education," where instruction is tailored to individual trainees based on their learning styles and needs.
Affiliation(s)
- Michael Tran Duong, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA; Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- Andreas M Rauschecker, Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA; Department of Radiology & Biomedical Imaging, University of California San Francisco, San Francisco, CA, USA
- Jeffrey D Rudie, Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA; Department of Radiology & Biomedical Imaging, University of California San Francisco, San Francisco, CA, USA
- Po-Hao Chen, Imaging Institute, Cleveland Clinic, Cleveland, OH, USA
- Tessa S Cook, Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA
- R Nick Bryan, Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA; Department of Diagnostic Medicine, Dell Medical School, University of Texas at Austin, Austin, TX, USA
- Suyash Mohan, Department of Radiology, University of Pennsylvania, Philadelphia, PA, USA; Department of Neurosurgery, University of Pennsylvania, Philadelphia, PA, USA