1
Plewa MC, Ledrick DJ, Jenkins K, Orqvist A, McCrea M. Can USMLE and COMLEX-USA Scores Predict At-Risk Emergency Medicine Residents' Performance on In-Training Examinations? Cureus 2024; 16:e58684. PMID: 38651085; PMCID: PMC11033967; DOI: 10.7759/cureus.58684.
Abstract
PURPOSE The United States Medical Licensing Examination (USMLE) and Comprehensive Osteopathic Medical Licensing Examination (COMLEX) scores are standard measures used to gauge residency candidates' medical knowledge. The authors were interested in using USMLE and COMLEX part 2 scores in their emergency medicine (EM) residency program to identify at-risk residents who may have difficulty on the in-training examination (ITE) and to determine cutoff values below which an intern could be given an individualized study plan to ensure medical knowledge competency.

METHODS The authors abstracted the USMLE and COMLEX part 2 scores and the American Board of Emergency Medicine (ABEM) ITE scores for a cohort of first-year EM residents in the graduating years 2010-2022, converted raw scores to percentiles, and compared part 2 and ABEM ITE scores with Pearson's correlation, a Bland-Altman analysis of bias and 95% limits of agreement, and ROC analysis to determine the optimal cut-off values for predicting an ABEM ITE score below the 50th percentile and the estimated test characteristics.

RESULTS Scores were available for 152 residents, comprising 93 USMLE and 88 COMLEX exams. The correlations between part 2 scores and ABEM ITE were r = 0.36 (95% CI: 0.17, 0.52; p < 0.001) for USMLE and r = 0.50 (95% CI: 0.33, 0.64; p < 0.001) for COMLEX. Bias and limits of agreement in predicting ABEM ITE scores were -14 ± 63% for USMLE and 13 ± 50% for COMLEX. USMLE < 37th percentile and COMLEX < 53rd percentile identified 42% (N = 39) and 27% (N = 24) of EM residents, respectively, as at risk, with sensitivities of 61% and 49% and specificities of 71% and 92%, respectively.

CONCLUSION USMLE and COMLEX part 2 scores have a very limited role in identifying residents at risk of low ITE performance, suggesting that other factors should be considered to identify interns in need of medical knowledge remediation.
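The analytic workflow described in this abstract (Pearson correlation, Bland-Altman bias with 95% limits of agreement, and an ROC-derived cut-off for a binary "at-risk" outcome) can be illustrated with a short script. The sketch below is not the authors' code: it uses synthetic percentile data and assumes Youden's J as the cut-off criterion, which the abstract does not specify.

```python
# Minimal sketch of the analyses described above, on synthetic data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
licensing_pct = rng.uniform(5, 95, 90)                                   # licensing exam percentiles (synthetic)
ite_pct = np.clip(0.5 * licensing_pct + rng.normal(20, 18, 90), 1, 99)   # ITE percentiles (synthetic)

r, p = pearsonr(licensing_pct, ite_pct)          # Pearson correlation

diff = licensing_pct - ite_pct                   # Bland-Altman: bias and 95% limits of agreement
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)

at_risk = (ite_pct < 50).astype(int)             # outcome: ITE below the 50th percentile
fpr, tpr, thr = roc_curve(at_risk, -licensing_pct)   # lower licensing score => higher risk
best = np.argmax(tpr - fpr)                      # Youden's J as the cut-off criterion (assumption)
cutoff = -thr[best]

print(f"r = {r:.2f} (p = {p:.3f}); bias = {bias:.1f}, limits of agreement = ±{loa:.1f}")
print(f"cut-off < {cutoff:.0f}th percentile: sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%}")
```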
Affiliation(s)
- Michael C Plewa
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
- David J Ledrick
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
- Kenneth Jenkins
- Emergency Medicine, Ohio University Heritage College of Osteopathic Medicine, Athens, USA
- Aaron Orqvist
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
- Michael McCrea
- Emergency Medicine, Mercy Health - St. Vincent Medical Center, Toledo, USA
2
Hong CX, Russell CB, Southworth EA, Fairchild PS. Use of In-Training Examination Scores as a Fellowship Candidate Evaluation Metric: Time for a Change. Urogynecology (Phila) 2024; 30:394-398. PMID: 38564624; DOI: 10.1097/spv.0000000000001489.
Abstract
In the field of obstetrics and gynecology (OB/GYN), the Council on Resident Education in Obstetrics and Gynecology (CREOG) administers an annual in-training examination to all OB/GYN residents as a formative educational tool for assessing medical knowledge and promoting self-improvement. Although the CREOG examination is not designed or intended for knowledge certification, many OB/GYN subspecialty fellowship programs request and use CREOG examination scores as a metric to evaluate fellowship candidates. As of March 2023, 30 of the 57 gynecology-based urogynecology fellowship programs (53%) request that candidates submit CREOG examination scores. Although the use of CREOG examination scores as an evaluation metric may constitute a minor component of the fellowship match process, this practice fundamentally contradicts the intended purpose of the examination as an educational self-assessment. In addition, it introduces the potential for bias in fellowship recruitment, lacks psychometric validity in predicting specialty board examination failure, and shifts the CREOG examination from its original intention as a low-stakes self-assessment into a high-stakes examination akin to a certification examination. For these reasons, we call upon the urogynecology community to prioritize the educational mission of the CREOG examination and reconsider the practice of requesting or using CREOG examination scores in the fellowship match process.
Affiliation(s)
- Christopher X Hong
- From the Department of Obstetrics and Gynecology, University of Michigan, Ann Arbor, MI
3
Abstract
In this critical narrative review, we challenge the belief that single-moment-in-time high-stakes examinations (SMITHSEx) are an essential component of contemporary specialist training. We explore the arguments both for and against SMITHSEx, examine potential alternatives, and discuss the barriers to change.

SMITHSEx are viewed as the "gold standard" assessment of competence but focus excessively on knowledge assessment rather than capturing the essential competencies required for safe and competent workplace performance. Contrary to popular belief, regulatory bodies do not mandate SMITHSEx in specialist training. Although SMITHSEx act as significant drivers of learning and professional identity formation, these attributes are not exclusive to them.

Skills such as crisis management, procedural skills, professionalism, communication, collaboration, lifelong learning, reflection on practice, and judgement are often overlooked by SMITHSEx. Their inherent design raises questions about the validity and objectivity of SMITHSEx as a measure of workplace competence. They also have a detrimental impact on trainee well-being, contributing to burnout and differential attainment.

Alternatives to SMITHSEx include continuous low-stakes assessments throughout training, ongoing evaluation of competence in the workplace, and competency-based medical education (CBME) concepts. These aim to provide a more comprehensive and context-specific assessment of trainees' competence while also improving trainee welfare.

Specialist training colleges should evolve from exam providers to holistic education sources. Assessments should emphasise essential practical knowledge over trivia, align with clinical practice, aid learning, and be part of a diverse toolkit. Eliminating SMITHSEx from specialist training would foster a competency-based approach, benefiting future medical professionals' well-being and success.
Affiliation(s)
- Navdeep S Sidhu
- Department of Anaesthesiology, School of Medicine, University of Auckland, Auckland, New Zealand
- Department of Anaesthesia and Perioperative Medicine, North Shore Hospital, Auckland, New Zealand
- Simon Fleming
- Department of Hand Surgery, Royal North Shore Hospital, Sydney, New South Wales, Australia
4
Baynouna AlKetbi L, Nagelkerke N, AlZarouni AA, AlKuwaiti MM, AlDhaheri R, AlNeyadi AM, AlAlawi SS, AlKuwaiti MH. Assessing the impact of adopting a competency-based medical education framework and ACGME-I accreditation on educational outcomes in a family medicine residency program in Abu Dhabi Emirate, United Arab Emirates. Front Med (Lausanne) 2024; 10:1257213. PMID: 38259827; PMCID: PMC10802161; DOI: 10.3389/fmed.2023.1257213.
Abstract
BACKGROUND Competency-Based Medical Education (CBME) is now mandated by many graduate and undergraduate accreditation standards. Evaluating CBME is essential for quantifying its impact, finding supporting evidence for the efforts invested in accreditation processes, and determining future steps. The Ambulatory Healthcare Services (AHS) family medicine residency program has been accredited by the Accreditation Council for Graduate Medical Education-International (ACGME-I) since 2013. This study reports the Abu Dhabi program's experience in implementing CBME and accreditation.

OBJECTIVES To compare the performance of the two resident cohorts before and after ACGME-I accreditation, and to study the biannually reported milestones as a prognostic tool for graduating residents' performance.

METHODS All residents in the program from 2008 to 2019 were included. Cohort one comprised the intake from 2008 to 2012, before ACGME-I accreditation; Cohort two comprised the intake from 2013 to 2019, after accreditation, when milestones were in use. The mandatory annual in-training exam was used as an indicator of the change in competency between the two cohorts. Within Cohort two, biannual milestone data were studied to find the correlation between residents' early and graduating milestones.

RESULTS A total of 112 residents were included: 36 in Cohort one and 76 in Cohort two. In Cohort one, before accreditation, no significant associations were identified between residents' graduation in-training exam and their early performance indicators, whereas in Cohort two there were significant correlations between almost all performance metrics. Early milestones correlated with the graduation in-training exam score, and linear regression confirmed this relationship after controlling for residents' undergraduate Grade Point Average (GPA). Competency continued to develop even after residents completed training at postgraduate year (PGY) 4, as achievement in PGY5 continued to improve.

CONCLUSION Residents' achievement improved after the introduction of ACGME-I accreditation. Additionally, the correlation of earlier milestones with the graduation in-training exam and graduation milestones suggests that early milestones may be useful for predicting outcomes.
Affiliation(s)
- Nico Nagelkerke
- Community Medicine Department, UAEU, Al Ain, United Arab Emirates
- Amal A. AlZarouni
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Mariam M. AlKuwaiti
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Ruwaya AlDhaheri
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Amna M. AlNeyadi
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Shamma S. AlAlawi
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
- Mouza H. AlKuwaiti
- Abu Dhabi Healthcare Services, Ambulatory Healthcare Services, Al Ain, United Arab Emirates
5
Weaver ML, Carter T, Yamazaki K, Hamstra SJ, Holmboe E, Chaer R, Park YS, Smith BK. The Association of ACGME Milestones With Performance on American Board of Surgery Assessments: A National Investigation of Surgical Trainees. Ann Surg 2024; 279:180-186. PMID: 37436889; DOI: 10.1097/sla.0000000000005998.
Abstract
OBJECTIVE To determine the relationship between, and predictive utility of, milestone ratings and subsequent American Board of Surgery (ABS) vascular surgery in-training examination (VSITE), vascular qualifying examination (VQE), and vascular certifying examination (VCE) performance in a national cohort of vascular surgery trainees.

BACKGROUND Specialty board certification is an important indicator of physician competence. However, predicting future board certification examination performance during training continues to be challenging.

METHODS This is a national longitudinal cohort study examining relational and predictive associations between Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings and performance on the VSITE, VQE, and VCE for all vascular surgery trainees from 2015 to 2021. Predictive associations between milestone ratings and VSITE were assessed using cross-classified random-effects regression; cross-classified random-effects logistic regression was used to identify predictive associations between milestone ratings and VQE and VCE.

RESULTS Milestone ratings were obtained for all residents and fellows (n = 1,118) from 164 programs during the study period (July 2015 to June 2021), including 145,959 total trainee assessments. Medical knowledge (MK) and patient care (PC) milestone ratings were strongly predictive of VSITE performance across all postgraduate years (PGYs) of training, with MK ratings demonstrating a slightly stronger predictive association overall (MK coefficient 17.26 to 35.76, β = 0.15 to 0.23). All core competency ratings were predictive of VSITE performance in PGYs 4 and 5. PGY 5 MK was highly predictive of VQE performance (OR 4.73, 95% CI 3.87-5.78, P < 0.001). PC subcompetencies were also highly predictive of VQE performance in the final year of training (OR 4.14, 95% CI 3.17-5.41, P < 0.001), and all other competencies were significantly predictive of a first-attempt VQE pass with ORs of 1.53 and higher. PGY 4 interpersonal and communication skills (ICS) ratings (OR 4.0, 95% CI 3.06-5.21, P < 0.001) emerged as the strongest predictor of a VCE first-attempt pass; again, all subcompetency ratings remained significant predictors of a first-attempt VCE pass with ORs of 1.48 and higher.

CONCLUSIONS ACGME Milestone ratings are highly predictive of future VSITE performance and of first-attempt pass achievement on the VQE and VCE in a national cohort of surgical trainees.
Affiliation(s)
- M Libby Weaver
- Division of Vascular and Endovascular Surgery, University of Virginia, Charlottesville, VA
- Taylor Carter
- Department of Surgery, University of North Carolina, Chapel Hill, NC
- Kenji Yamazaki
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Stanley J Hamstra
- Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Department of Medical Education, Northwestern University Feinberg School of Medicine, Chicago, IL
- Eric Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL
- Rabih Chaer
- Division of Vascular Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA
- Yoon Soo Park
- Massachusetts General Hospital, Harvard Medical School, Boston, MA
- Brigitte K Smith
- Division of Vascular Surgery, University of Utah, Salt Lake City, UT
6
Seaberg PH, Kling JM, Klanderman MC, Mead-Harvey C, Williams KE, Labonte HR, Jain A, Taylor GE, Blair JE. Resident factors associated with American Board of Internal Medicine certification exam failure. Med Educ Online 2023; 28:2152162. PMID: 36443907; PMCID: PMC9718560; DOI: 10.1080/10872981.2022.2152162.
Abstract
INTRODUCTION Performance on certifying examinations such as the American Board of Internal Medicine Certification Exam (ABIM-CE) is of great interest to residents and their residency programs. Identifying factors associated with certification exam results may allow residency programs to recognize and intervene for residents at risk of failing. Despite this, residency programs have few evidence-based predictors of certification exam outcome, and the change to pass-or-fail score reporting of the United States Medical Licensing Examination (USMLE) Step 1 removes one such predictor.

MATERIALS AND METHODS We performed a retrospective study of residents from a medium-sized internal medicine residency program who graduated from 1998 through 2017. We used univariate tests of association between ABIM-CE result and various demographic and scholastic factors.

RESULTS Of 166 graduates, 14 (8.4%) failed the ABIM-CE on the first attempt. Failing the first attempt of the ABIM-CE was associated with older median age on entering residency (29 vs 27 years; P = 0.01); lower percentile rank on the Internal Medicine In-Training Examination (IM-ITE) in each of the first, second, and third years of training (P < 0.001 for all); and lower scores on USMLE Steps 1, 2 Clinical Knowledge, and 3 (P < 0.05 for all). No association was seen between a variety of other scholastic or demographic factors and first-attempt ABIM-CE result.

DISCUSSION Although USMLE Step 1 has changed to a pass-or-fail reporting structure, other characteristics still allow residency programs to identify residents at risk of first-time ABIM-CE failure who may benefit from intervention.
Affiliation(s)
- Preston H. Seaberg
- Department of Internal Medicine Charleston Division, West Virginia University School of Medicine, Charleston, West Virginia, USA
- Juliana M. Kling
- Division of Women’s Health Internal Medicine, Mayo Clinic, Scottsdale, Arizona, USA
- Molly C. Klanderman
- Division of Clinical Trials and Biostatistics, Mayo Clinic, Scottsdale, Arizona, USA
- Carolyn Mead-Harvey
- Division of Clinical Trials and Biostatistics, Mayo Clinic, Scottsdale, Arizona, USA
- Helene R. Labonte
- Division of Community Internal Medicine, Mayo Clinic, Scottsdale, Arizona, USA
- Atul Jain
- Division of General Internal Medicine, Mayo Clinic, Scottsdale, Arizona, USA
- Gretchen E. Taylor
- Division of Hospital Internal Medicine, Mayo Clinic, Phoenix, Arizona, USA
- Janis E. Blair
- Division of Infectious Diseases, Mayo Clinic, Phoenix, AZ, USA
7
Mickey W. Critical care medicine training in the age of COVID-19. J Osteopath Med 2023; 123:427-434. PMID: 37307290; DOI: 10.1515/jom-2022-0244.
Abstract
CONTEXT The COVID-19 pandemic caused the largest disruption to graduate medical education in modern history. The danger associated with SARS-CoV-2 necessitated a paradigm shift in the fundamental approach to the education of medical residents and fellows. Whereas prior work has examined the effect of the pandemic on residents' experiences during training, the effect of the pandemic on the academic performance of critical care medicine (CCM) fellows is not well understood.

OBJECTIVES This study examined the relationship between CCM fellows' lived experiences during the COVID-19 pandemic and performance on in-training examinations.

METHODS This mixed-methods study consisted of a quantitative retrospective analysis of CCM fellows' in-training examination scores and a qualitative, interview-based phenomenological examination of fellows' experiences during the pandemic while training in a single large academic hospital in the American Midwest. Quantitative: prepandemic (2019 and 2020) and intrapandemic (2021 and 2022) in-training examination scores were analyzed with an independent-samples t test to determine whether a significant change occurred during the pandemic. Qualitative: individual semi-structured interviews were conducted with CCM fellows, exploring their lived experiences during the pandemic and their perception of its effect on their academic performance. Transcribed interviews were analyzed for thematic patterns; themes were coded and categorized, subcategories were developed as indicated, and the codes were then examined for thematic connections and apparent patterns. Relationships between themes and categories were analyzed, and this process continued until a coherent picture could be assembled from the data to answer the research questions. Analysis was performed from a phenomenological perspective, with an emphasis on interpreting the data from the participants' perspectives.

RESULTS Quantitative: fifty-one in-training examination scores from 2019 to 2022 were obtained for analysis. Scores from 2019 and 2020 were grouped as prepandemic scores, and scores from 2021 and 2022 as intrapandemic scores; 24 prepandemic and 27 intrapandemic scores were included in the final analysis. A significant difference was found between mean prepandemic and intrapandemic in-training examination scores (t(49) = 2.64, p = 0.01), with mean intrapandemic scores 4.5 points lower than prepandemic scores (95% CI, 1.08-7.92). Qualitative: interviews were conducted with eight CCM fellows. Thematic analysis revealed three main themes: psychosocial/emotional effects, effects on training, and effects on health. The factors that most affected participants' perceptions of their training were burnout, isolation, increased workload, decreased bedside teaching, decreased formal academic training opportunities, decreased procedural experience, the lack of an external reference point for normal training in CCM, fear of spreading COVID-19, and neglect of personal health during the pandemic.

CONCLUSIONS In-training examination scores decreased significantly during the COVID-19 pandemic for the CCM fellows in this study. The fellows reported perceived effects of the pandemic on their psychosocial/emotional well-being, medical training, and health.
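The quantitative comparison described above is an independent-samples t test with a confidence interval for the mean difference in scores. The following is a minimal sketch on synthetic data: the score scale and group means are assumptions, so the output will not reproduce the study's values.

```python
# Hedged illustration of an independent-samples t test with a 95% CI for the mean difference.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(70, 6, 24)      # 24 prepandemic scores (assumed scale)
intra = rng.normal(65.5, 6, 27)  # 27 intrapandemic scores (assumed scale)

t_stat, p_val = stats.ttest_ind(pre, intra)   # pooled-variance independent-samples t test
diff = pre.mean() - intra.mean()
df = len(pre) + len(intra) - 2
pooled_var = ((len(pre) - 1) * pre.var(ddof=1) + (len(intra) - 1) * intra.var(ddof=1)) / df
se = np.sqrt(pooled_var * (1 / len(pre) + 1 / len(intra)))
margin = stats.t.ppf(0.975, df) * se          # 95% CI half-width for the mean difference

print(f"t({df}) = {t_stat:.2f}, p = {p_val:.3f}")
print(f"mean difference = {diff:.1f} (95% CI {diff - margin:.1f} to {diff + margin:.1f})")
```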
8
Jacobs JW, Booth GS, Usmani A, Burner J, Adkins BD. Fellowship Board Pass Rates Rising: Analysis of Pathology Subspecialty Board Examination Performance. Arch Pathol Lab Med 2023; 147:964-968. PMID: 36343371; DOI: 10.5858/arpa.2022-0129-oa.
Abstract
CONTEXT The American Board of Pathology (ABPath) publishes annual performance data for the anatomic pathology (AP) and clinical pathology (CP) board examinations, as well as for ABPath subspecialty examinations. Overall board pass rates for all AP and CP board examinees have increased during the past decade; however, no study has analyzed board pass rates for pathology subspecialty examinations or whether these follow the same trend.

OBJECTIVE To evaluate ABPath subspecialty examination pass rates and assess the trend in certification.

DESIGN We analyzed the total number of first-time test takers and the board pass rates for 11 pathology subspecialties recognized by the ABPath from 2007 to 2021, acquired from annual reports published by the ABPath. We compared pass rates across 5-year intervals (2007-2011, 2012-2016, 2017-2021) for each individual subspecialty and also compared the pass rates of CP subspecialties with those of AP subspecialties.

RESULTS The overall mean pass rate for ABPath subspecialty examinations during the previous 15 years was 89% (range, 78.9%-100%), with the overall pass rate being significantly higher in 2017-2021 (P = .02). The contemporary overall pass rate was significantly higher for AP subspecialty examinations (P < .001) and higher, though not significantly, for CP subspecialties (P = .13). There were significant differences between first-time test takers' mean pass rate (92.1%), repeat test takers' mean pass rate (54.5%), and the overall rate (P < .001).

CONCLUSIONS Contemporary pathology subspecialty board examination pass rates are significantly higher than historic rates, possibly reflecting continuously improving and readily available preparatory materials.
Affiliation(s)
- Jeremy W Jacobs
- From the Department of Laboratory Medicine, Yale School of Medicine, New Haven, Connecticut (Jacobs)
- Garrett S Booth
- The Department of Pathology, Microbiology, and Immunology, Vanderbilt University Medical Center, Nashville, Tennessee (Booth)
- Amena Usmani
- The Department of Pathology, University of Texas Southwestern Medical Center, Dallas (Usmani, Burner, Adkins)
- James Burner
- The Department of Pathology, University of Texas Southwestern Medical Center, Dallas (Usmani, Burner, Adkins)
- Brian D Adkins
- The Department of Pathology, University of Texas Southwestern Medical Center, Dallas (Usmani, Burner, Adkins)
9
Watari T, Nishizaki Y, Houchens N, Kataoka K, Sakaguchi K, Shiraishi Y, Shimizu T, Yamamoto Y, Tokuda Y. Medical resident's pursuing specialty and differences in clinical proficiency among medical residents in Japan: a nationwide cross-sectional study. BMC Med Educ 2023; 23:464. PMID: 37349724; DOI: 10.1186/s12909-023-04429-4.
Abstract
IMPORTANCE Standardized examinations assess both learners and training programs within the medical training system in Japan. However, it is unknown whether there is an association between clinical proficiency, as assessed by the General Medicine In-Training Examination (GM-ITE), and residents' intended specialty.

OBJECTIVE To determine the relative achievement of fundamental skills, as assessed by the standardized GM-ITE, by intended career specialty among residents in the Japanese training system.

DESIGN Nationwide cross-sectional study.

SETTING Medical residents in Japan who attempted the GM-ITE in their first or second year were surveyed.

PARTICIPANTS A total of 4,363 postgraduate year 1 and 2 residents who completed the GM-ITE were surveyed between January 18 and March 31, 2021.

MAIN MEASURES GM-ITE total scores and individual scores in each of four domains assessing clinical knowledge: 1) medical interview and professionalism, 2) symptomatology and clinical reasoning, 3) physical examination and treatment, and 4) detailed disease knowledge.

RESULTS Compared with residents pursuing the most commonly chosen specialty, internal medicine, only those who chose general medicine achieved higher GM-ITE scores (coefficient 1.38, 95% CI 0.08 to 2.68, p = 0.038); the remaining nine specialty groups and the "Other/Not decided" group scored significantly lower. Higher scores were noted among residents entering general medicine, emergency medicine, and internal medicine and among those who trained in community hospitals with more beds, were more advanced in their training, spent more time working and studying, and cared for a moderate but not an extreme number of patients at a time.

CONCLUSIONS Levels of basic skill achievement differed by chosen future specialty among residents in Japan. Scores were higher among those pursuing careers in general medical fields and lower among those pursuing highly specialized careers. Residents in training programs devoid of specialty-specific competition may not possess the same motivations as those in competitive systems.
Affiliation(s)
- Takashi Watari
- General Medicine Center, Shimane University Hospital, 89-1, Enya-Cho, Izumo Shi, Shimane, 693-8501, Japan.
- Medicine Service, VA Ann Arbor Healthcare System, Ann Arbor, MI, USA.
- Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI, USA.
- Yuji Nishizaki
- Division of Medical Education, Juntendo University School of Medicine, Tokyo, Japan
- Nathan Houchens
- Medicine Service, VA Ann Arbor Healthcare System, Ann Arbor, MI, USA
- Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI, USA
- Koshi Kataoka
- Division of Medical Education, Juntendo University School of Medicine, Tokyo, Japan
- Kota Sakaguchi
- General Medicine Center, Shimane University Hospital, 89-1, Enya-Cho, Izumo Shi, Shimane, 693-8501, Japan
- Yoshihiko Shiraishi
- General Medicine Center, Shimane University Hospital, 89-1, Enya-Cho, Izumo Shi, Shimane, 693-8501, Japan
- Taro Shimizu
- Department of Diagnostic and Generalist Medicine, Dokkyo Medical University Hospital, Tochigi, Japan
- Yu Yamamoto
- Division of General Medicine, Center for Community Medicine, Jichi Medical University, Tochigi, Japan
- Yasuharu Tokuda
- Muribushi Okinawa Project for Teaching Hospitals, Okinawa, Japan
10
Scott NP, Martin TW, Schmidt AM, Shanks AL. Impact of an Online Question Bank on Resident In-Training Exam Performance. J Med Educ Curric Dev 2023; 10:23821205231206221. PMID: 37822782; PMCID: PMC10563493; DOI: 10.1177/23821205231206221.
Abstract
OBJECTIVE In-training exams (ITEs) are administered annually to obstetrics and gynecology (OBGYN) residents and have been demonstrated to correlate with success on licensing examinations. Our study objective was to determine the impact of a question bank and mock exam on performance on the Council on Resident Education in Obstetrics and Gynecology (CREOG) ITE. Secondarily, we investigated the correlation between the extent of question bank usage and performance on the exam.

METHODS This was a pre-post intervention study of resident performance on the CREOG ITE before and after implementation of the question bank and mock ITE at Indiana University in 2018. Performance was measured as year-to-year improvement in percent correct on ITE exams. Scores were excluded if a resident did not have a pre-question bank score report or did not sit for all eligible ITE exams.

RESULTS There were 51 OBGYN residents at Indiana University during the study period, with 38 available for analysis (75%). Before implementation, the average year-to-year improvements for the PGY1-2, PGY2-3, and PGY3-4 classes were 0.60%, 1.0%, and -1.6%, respectively. After implementation, all resident classes had significant improvements in ITE scores of 6.6% (P < .01), 9.0% (P < .01), and 7.2% (P < .01), respectively. There was a moderate program-wide correlation between the number of questions completed and percent improvement on the ITE (R = 0.36, P = .046).

CONCLUSIONS Our study demonstrated that access to a question bank and mock ITE significantly improved the CREOG ITE performance of OBGYN residents at Indiana University.
Affiliation(s)
- Nicole P. Scott
- Indiana University School of Medicine, Indianapolis, IN, USA
11
Kern MW, Jewell CM, Hekman DJ, Schnapp BH. Number of Patient Encounters in Emergency Medicine Residency Does Not Correlate with In-Training Exam Domain Scores. West J Emerg Med 2023; 24:114-118. PMID: 36602486; PMCID: PMC9897253; DOI: 10.5811/westjem.2022.11.57997.
Abstract
INTRODUCTION Emergency medicine (EM) residents take the American Board of Emergency Medicine (ABEM) In-Training Examination (ITE) every year. This examination is based on the ABEM Model of Clinical Practice (Model). The purpose of this study was to determine whether a relationship exists between the number of patient encounters a resident sees within a specific clinical domain and their ITE performance on questions related to that domain.

METHODS Chief complaint data for each patient encounter were taken from the electronic health record for EM residents graduating in three consecutive years between 2016 and 2021. We excluded patient encounters without an assigned resident or a listed chief complaint. Chief complaints were then categorized into one of 20 domains based on the 2016 Model. We calculated correlations between the total number of encounters seen by a resident across all clinical years and that resident's ITE performance in the corresponding clinical domain from the third year of training.

RESULTS A total of 232,625 patient encounters treated by 69 eligible residents were available for analysis. We found no statistically significant correlations after Bonferroni correction for multiple analyses.

CONCLUSION There was no correlation between the number of patient encounters a resident had within a clinical domain and their ITE performance on questions corresponding to that domain. This suggests the need for separate but parallel educational missions to achieve success in both the clinical environment and standardized testing.
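The analysis described above amounts to a set of per-domain correlations tested against a Bonferroni-adjusted significance threshold. A minimal sketch follows; the 20-domain structure and resident count come from the abstract, but the encounter counts, score distributions, and simple Pearson correlation are assumptions rather than the study's actual data or code.

```python
# Sketch of per-domain correlations with a Bonferroni-adjusted threshold, on synthetic data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_residents, n_domains = 69, 20
encounters = rng.poisson(150, size=(n_residents, n_domains))       # encounters per domain (synthetic)
domain_scores = rng.normal(75, 8, size=(n_residents, n_domains))   # ITE domain % correct (synthetic)

alpha = 0.05 / n_domains                                            # Bonferroni-adjusted threshold
for d in range(n_domains):
    r, p = pearsonr(encounters[:, d], domain_scores[:, d])
    verdict = "significant" if p < alpha else "not significant"
    print(f"domain {d + 1:2d}: r = {r:+.2f}, p = {p:.3f} ({verdict} at alpha = {alpha:.4f})")
```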
Affiliation(s)
- Michael W. Kern
- Mayo Clinic Health System – Northwest Wisconsin Region, Department of Emergency Medicine, Eau Claire, Wisconsin
- Corlin M. Jewell
- BerbeeWalsh Department of Emergency Medicine, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- Dann J. Hekman
- BerbeeWalsh Department of Emergency Medicine, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
- Benjamin H. Schnapp
- BerbeeWalsh Department of Emergency Medicine, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin
12
Green I, Weaver A, Kircher S, Levy G, Michael Brady R, Flicker AB, Gala RB, Peterson J, Decesare J, Breitkopf D. Impact of Question Bank Use for In-Training Examination Preparation by OBGYN Residents - A Multicenter Study. J Surg Educ 2022; 79:775-782. PMID: 35086789; DOI: 10.1016/j.jsurg.2021.12.014.
Abstract
OBJECTIVE To examine the impact of access to and utilization of a commercially available question bank (TrueLearn) for in-training examination (ITE) preparation in obstetrics and gynecology (OBGYN).

DESIGN This was a retrospective cohort study examining the impact of TrueLearn usage on ITE performance outcomes. The Council on Resident Education in Obstetrics and Gynecology (CREOG), the educational arm of the American College of Obstetricians and Gynecologists, produces the CREOG examination, a multiple-choice test given to all residents annually. Residency programs participating in this study provided program mean CREOG scores from the year prior (2015) and the first (2016) and second (2017) years of TrueLearn usage. Programs also contributed resident-specific CREOG scores for each resident for 2016 and 2017. These data were combined with each resident's TrueLearn usage data, provided by TrueLearn with residency program consent. The CREOG scores consisted of the CREOG score standardized to all program years, the CREOG score standardized to the same program year (PGY), and the total percent correct. TrueLearn usage data included the number of practice questions completed, the number of practice tests taken, the average number of days between successive tests, and the percent of answered practice questions that were correct.

SETTING OBGYN residency training programs.

PARTICIPANTS OBGYN residency programs that purchased and utilized TrueLearn for the 2016 CREOG examination were eligible for participation (n = 14). Ten residency programs participated, comprising 212 residents in 2016 and 218 residents in 2017.

RESULTS TrueLearn was used by 78.8% (167/212) of residents in 2016 and 84.9% (185/218) of residents in 2017. No significant difference was seen in the average CREOG scores available at the per-program level before versus after the first year of implementation, either using the CREOG score standardized to all PGYs (mean difference 1.0; p = 0.58) or standardized to the same PGY (mean difference 3.1; p = 0.25). Using resident-level data, there was no significant difference in mean CREOG score standardized to all PGYs between users and non-users of TrueLearn in 2016 (mean, 199.4 vs 196.7; p = 0.41) or 2017 (mean, 198.2 vs 203.4; p = 0.19). The percent of practice questions answered correctly on TrueLearn was positively correlated with the CREOG score standardized to all PGYs (r = 0.47 for 2016 and r = 0.60 for 2017), as well as with the CREOG total percent correct (r = 0.47 for 2016 and r = 0.61 for 2017). Based on a simple linear regression, for every 500 practice questions completed, the CREOG score significantly increased for PGY-2 residents by an average (±SE) of 7.3 ± 2.8 points (p = 0.013); the average increase was 0.7 ± 2.5 points (p = 0.79) for PGY-3 residents and 5.8 ± 3.3 points (p = 0.09) for PGY-4 residents.

CONCLUSIONS Adoption of an online question bank did not result in higher mean CREOG scores at participating institutions. However, performance on the TrueLearn questions correlated with ITE performance, supporting predictive validity and the use of this question bank as a formative assessment for resident education and exam preparation.
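The per-500-questions effect reported above comes from a simple linear regression of exam score on the number of practice questions completed, with the slope rescaled. Below is a hedged sketch on synthetic data (not TrueLearn or CREOG data); the score scale, effect size, and noise level are assumptions chosen only to make the rescaling step concrete.

```python
# Sketch of a simple linear regression with the slope expressed per 500 questions, on synthetic data.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)
questions = rng.integers(0, 2000, 60)                     # practice questions completed (synthetic)
score = 195 + 0.012 * questions + rng.normal(0, 10, 60)   # standardized exam score (assumed scale)

fit = linregress(questions, score)    # simple linear regression: score ~ questions
per_500 = 500 * fit.slope             # rescale the slope to "points per 500 questions"
se_500 = 500 * fit.stderr

print(f"slope = {fit.slope:.4f} points per question")
print(f"≈ {per_500:.1f} ± {se_500:.1f} points per 500 questions (p = {fit.pvalue:.3f})")
```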
Affiliation(s)
- Isabel Green
- Department of Obstetrics and Gynecology, Division of Gynecologic Surgery Mayo Clinic, Rochester, Minnesota.
- Amy Weaver
- Department of Quantitative Health Sciences, Mayo Clinic, Rochester, Minnesota
- Samantha Kircher
- Department of Obstetrics and Gynecology, MetroHealth Medical Center, Cleveland, Ohio
- Gary Levy
- Uniformed Services University, Bethesda, Maryland, Tripler Army Medical Center, Honolulu, Hawaii
- Robert Michael Brady
- Creighton University School of Medicine - Phoenix, Department of Obstetrics and Gynecology, Phoenix, Arizona
- Amanda B Flicker
- Department of Obstetrics and Gynecology, Lehigh Valley Health Network, Allentown, Pennsylvania
- Rajiv B Gala
- Ochsner Health - Baptist, McFarland Medical Plaza, New Orleans, Louisiana
- Joseph Peterson
- Ascension Sacred Heart Women's Care Center, Pensacola, Florida
- Julie Decesare
- West Florida Medical Group - Obstetrics & Gynecology, Pensacola, Florida
- Daniel Breitkopf
- Department of Obstetrics and Gynecology, Division of Gynecologic Surgery Mayo Clinic, Rochester, Minnesota