1
Kulasegaram K, Archibald D, Bartman I, Chahine S, Kirpalani A, Wilson C, Ross B, Cameron E, Hogenbirk J, Barber C, Burgess R, Katsoulas E, Touchie C, Grierson L. Can all roads lead to competency? School levels effects in licensing examinations scores. Advances in Health Sciences Education: Theory and Practice 2025; 30:37-52. [PMID: 39636529] [DOI: 10.1007/s10459-024-10398-0]
Abstract
At the foundation of research concerned with professional training is the idea of an assumed causal chain between the policies and practices of education and the eventual behaviours of those who graduate from these programs. In medicine, given the social accountability to ensure that teaching and learning give way to a health human resource that is willing and able to provide the healthcare that patients and communities need, it is critically important to generate evidence regarding this causal relationship. One question that medical education scholars ask regularly is the degree to which the unique features of training programs and learning environments affect trainee achievement of the intended learning outcomes. To date, this evidence has been difficult to generate because data pertaining to learners are only rarely brought together systematically across institutions or periods of training. We describe new research that leverages an inter-institutional, data-driven approach to investigate the influence of school-level factors on the licensing outcomes of medical students. Specifically, we bring together sociodemographic, admissions, and in-training assessment variables pertaining to medical graduates of each of the six medical schools in Ontario, Canada in multilevel stepwise regression models that determine the degree of association between these variables and graduate performance on the Medical Council of Canada Qualifying Examinations (Part 1, n = 1097 observations; Part 2, n = 616 observations), established predictors of downstream physician performance. As part of this analysis, we include an anonymized school-level (School 1, School 2) independent variable in each of these models. Our results demonstrate that the variable most strongly associated with performance on both the first and second parts of the licensing examinations is prior academic achievement, notably clerkship performance. Ratings of biomedical knowledge were also significantly associated with the first examination, while clerkship OSCE scores and enrollment in a family medicine residency were significantly associated with Part 2. Small but significant school effects were observed in both models, accounting for 4% and 2% of the variance in the first and second examinations, respectively. These findings highlight that school of enrollment plays a minor role relative to individual student performance in influencing examination outcomes.
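The school-level variance shares reported above (4% and 2%) are intraclass-correlation-type quantities from the multilevel models. As an illustration only (synthetic numbers, not the study's data, and a simple one-way ANOVA estimator rather than the authors' stepwise multilevel models), the share of exam-score variance attributable to school membership can be estimated like this:

```python
import random
import statistics

def school_variance_share(groups):
    """Share of total variance attributable to group (school) membership,
    estimated via a one-way random-effects ANOVA decomposition."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    # Effective average group size for unbalanced designs
    n0 = (n - sum(len(g) ** 2 for g in groups) / n) / (k - 1)
    var_between = max(0.0, (ms_between - ms_within) / n0)
    return var_between / (var_between + ms_within)

# Hypothetical data: six "schools" with small shifts in mean exam score
random.seed(1)
schools = [[random.gauss(500 + shift, 40) for _ in range(100)]
           for shift in (0, 5, -5, 10, -10, 3)]
icc = school_variance_share(schools)
print(f"school-level share of variance: {icc:.1%}")
```

With the small mean shifts chosen here, the school-level share comes out near the few-percent range the paper reports, dwarfed by within-school (individual) variation.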
Affiliation(s)
- Kulamakan Kulasegaram
- Wilson Centre, University of Toronto, Toronto, Canada
- Department of Family & Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
- Douglas Archibald
- Department of Family Medicine, University of Ottawa, Ottawa, Canada
- Bruyère Research Institute, Ottawa, Canada
- Amrit Kirpalani
- Division of Paediatric Surgery, London Health Sciences Centre, London, Canada
- Department of Pediatrics, Schulich School of Medicine & Dentistry, Western University, London, ON, Canada
- Department of Pediatrics, London Health Sciences Centre, London, ON, Canada
- Claire Wilson
- Division of Paediatric Surgery, London Health Sciences Centre, London, Canada
- Brian Ross
- Northern Ontario School of Medicine University, Thunder Bay, Canada
- Erin Cameron
- Northern Ontario School of Medicine University, Thunder Bay, Canada
- Dr. Gilles Arcand Centre for Health Equity, NOSM University, Thunder Bay, Canada
- John Hogenbirk
- Centre for Rural and Northern Health Research, Laurentian University, Sudbury, Canada
- Cassandra Barber
- Department of Family Medicine, McMaster University, Hamilton, Canada
- School of Health Profession Education, Faculty of Health Medicine & Life Sciences, Maastricht University, Maastricht, Netherlands
- Raquel Burgess
- Department of Family Medicine, McMaster University, Hamilton, Canada
- Department of Social and Behavioral Sciences, Yale School of Public Health, Yale University, New Haven, USA
- Claire Touchie
- Departments of Medicine and of Innovation in Medical Education, University of Ottawa, Ottawa, Canada
- Ottawa Hospital, Ottawa, Canada
- Lawrence Grierson
- Department of Family Medicine, McMaster University, Hamilton, Canada
- McMaster Education Research, Innovation, and Theory Program, McMaster University, 200 Elizabeth St, Toronto, ON, M5G 2C4, Canada
2
Dale ED, Abulela MAA, Jia H, Violato C. Are medical school preclinical tests biased for sex and race? A differential item functioning analysis. BMC Medical Education 2025; 25:146. [PMID: 39881271] [PMCID: PMC11780802] [DOI: 10.1186/s12909-024-06540-6]
Abstract
BACKGROUND A common practice in assessment development, fundamental for fairness and consequently for the validity of test score interpretations and uses, is to ascertain whether test items function equally across test-taker groups. Accordingly, we conducted differential item functioning (DIF) analysis, a psychometric procedure for detecting potential item bias, on three preclinical medical school foundational courses, based on students' sex and race. METHODS The sample included 520, 519, and 344 medical students for anatomy, histology, and physiology, respectively, collected from 2018 to 2020. To conduct the DIF analysis, we used the Wald test based on the two-parameter logistic model, as implemented in the IRTPRO software. RESULTS Up to one-fifth of the items on the three assessments functioned statistically differentially across sex, race, or both: 10 of 49 items (20%) in anatomy, 6 of 40 items (15%) in histology, and 5 of 45 items (11%) in physiology showed statistically significant DIF. Measurement specialists and subject matter experts independently reviewed the items to identify construct-irrelevant factors as potential sources of DIF, as demonstrated in Appendix A. Most identified items were poorly written or had unclear images. CONCLUSIONS The validity of score-based inferences, particularly for group comparisons, requires test items to function equally across test-taker groups. In the present study, we found DIF of some items for sex and race in three content areas. The present approach should be utilized in other medical schools to address the generalizability of the present findings. Item-level DIF analysis should also be routinely conducted as part of psychometric analyses for basic science courses and other assessments. CLINICAL TRIAL NUMBER Not applicable.
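The authors screened items with IRT-based Wald tests in the commercial IRTPRO package. A common non-IRT alternative for the same purpose, shown here purely as an illustration on made-up counts, is the Mantel-Haenszel common odds ratio computed across ability strata:

```python
import math

def mantel_haenszel_or(strata):
    """Mantel-Haenszel common odds ratio for one item.
    Each stratum is (a, b, c, d):
      a = reference group correct,  b = reference group incorrect,
      c = focal group correct,      d = focal group incorrect.
    Strata are typically formed by matching on total test score."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical item: at matched ability levels the reference group
# answers correctly more often, so the common odds ratio exceeds 1.
strata = [
    (30, 20, 20, 30),   # low scorers
    (40, 10, 30, 20),   # mid scorers
    (45, 5, 40, 10),    # high scorers
]
or_mh = mantel_haenszel_or(strata)
# ETS delta scale: |delta| >= 1.5 is conventionally "large" (category C) DIF
delta = -2.35 * math.log(or_mh)
print(f"MH odds ratio = {or_mh:.2f}, ETS delta = {delta:.2f}")
```

An odds ratio near 1 (delta near 0) indicates no DIF; the synthetic item above would be flagged for review.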
Affiliation(s)
- Esther Dasari Dale
- University of Minnesota Medical School, 420 Delaware Street SE, Mayo Building, Minneapolis, MN, 55455, USA
- Mohammed A A Abulela
- University of Minnesota Medical School, 420 Delaware Street SE, Mayo Building, Minneapolis, MN, 55455, USA
- Department of Educational Psychology, University of Minnesota, Minneapolis, MN, 55455, USA
- Associate Professor of Educational Psychology, South Valley University, Qena, Egypt
- Hao Jia
- University of Minnesota Medical School, 420 Delaware Street SE, Mayo Building, Minneapolis, MN, 55455, USA
- Department of Educational Psychology, University of Minnesota, Minneapolis, MN, 55455, USA
- Claudio Violato
- University of Minnesota Medical School, 420 Delaware Street SE, Mayo Building, Minneapolis, MN, 55455, USA
3
Kelleher M, Schumacher DJ, Zhou C, Kwakye D, Santen SA, Warm E, Kinnear B. Public Board Score Reporting Undermines Holistic Review for Residency Selection. J Gen Intern Med 2025; 40:17-21. [PMID: 39496852] [PMCID: PMC11780034] [DOI: 10.1007/s11606-024-09133-7]
Abstract
Holistic review has become the gold standard for residency selection. As a result, many programs are de-emphasizing standardized exam scores and other normative metrics. However, if standardized exam scores predict passing of an initial certifying exam, board failure rates may rise within residency programs that do not emphasize test scores at entry. Currently, the board pass rates of residency programs for many of the American Board of Medical Specialties (ABMS) member boards are publicly reported as a rolling average. In theory, this should create accountability, but it may also create pressure and distort the way residency programs select applicants. The risk of having a lower board pass rate publicly reported incentivizes programs to focus increasingly on standardized test scores, threatening holistic review. Not all programs recruit students who enter residency with an identical chance of passing boards. Therefore, we believe the ABMS member boards should stop publicly reporting raw certifying exam pass rates above a certain threshold for normative comparison. We strongly encourage the use of learning analytics to create a residency "expected board pass rate," which would be a better metric for program evaluation and accreditation.
Affiliation(s)
- Matthew Kelleher
- Internal Medicine and Pediatrics Hospital Medicine, University of Cincinnati College of Medicine/Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Daniel J Schumacher
- University of Cincinnati College of Medicine/Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
- Christine Zhou
- University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Derek Kwakye
- University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Sally A Santen
- University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric Warm
- University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Benjamin Kinnear
- Internal Medicine and Pediatrics Hospital Medicine, University of Cincinnati College of Medicine/Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA
4
Tolleson S, Diec S, Listiyo D, Al-Mallah A, Varisco T. Assessing the relationship between curricular placement of law courses and multistate pharmacy jurisprudence examination pass rates. Currents in Pharmacy Teaching & Learning 2024; 16:102202. [PMID: 39293210] [DOI: 10.1016/j.cptl.2024.102202]
Abstract
OBJECTIVE To determine whether there is a relationship between the placement of standalone pharmacy law courses within the PharmD curriculum and Multistate Pharmacy Jurisprudence Examination (MPJE) first-time pass rates. METHODS Colleges of pharmacy were identified using the MPJE passing rates for 2019-2022 graduates published on the National Association of Boards of Pharmacy (NABP) website. Characteristics of pharmacy law content delivery within the curriculum were extracted from the program, Pharmacy College Application Service, American Association of Colleges of Pharmacy (AACP), and NABP websites. Pharmacy programs with standalone law courses, MPJE pass rates reported by NABP, and data obtainable from publicly available sources were included. To standardize between three-year and four-year programs, law course delivery within the curriculum was measured as the number of semesters (fall, spring, or summer) before graduation. RESULTS One hundred nine schools met the inclusion criteria. Linear path analysis revealed no relationship between the number of semesters a law course was scheduled before graduation and either 4-year average first-time or 4-year average all-time MPJE pass rates. CONCLUSION The findings did not show that earlier placement of pharmacy law courses predicted MPJE first-time pass rates. However, a strong correlation existed between NAPLEX and MPJE pass rates, suggesting NAPLEX performance may indicate overall licensure exam preparedness. Notable differences in pass rates were observed between public and private pharmacy programs, highlighting the need to investigate program characteristics that impact exam success. Further research is warranted to identify predictive factors for MPJE outcomes.
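The NAPLEX-MPJE relationship the authors describe is a plain Pearson correlation between per-school pass rates. A from-scratch sketch (all pass rates below are invented for illustration, not the study's data):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-school first-time pass rates (%) for eight programs
naplex = [92, 85, 78, 95, 88, 70, 90, 82]
mpje   = [84, 80, 70, 90, 83, 62, 85, 75]
r = pearson_r(naplex, mpje)
print(f"NAPLEX-MPJE correlation: r = {r:.2f}")
```

With rates that move together like these, r lands close to 1, the kind of "strong correlation" the abstract refers to.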
Affiliation(s)
- Shane Tolleson
- Clinical Assistant Professor; Director, Ambulatory Care APPEs, University of Houston College of Pharmacy, 4349 Martin Luther King Blvd., Houston, TX 77204, United States
- Sandy Diec
- University of Houston College of Pharmacy, 4349 Martin Luther King Jr. Blvd, Houston, TX 77204, United States
- Daniel Listiyo
- University of Houston College of Pharmacy, 4349 Martin Luther King Jr. Blvd, Houston, TX 77204, United States
- Asma Al-Mallah
- University of Houston College of Pharmacy, 4349 Martin Luther King Jr. Blvd, Houston, TX 77204, United States
- Tyler Varisco
- University of Houston College of Pharmacy, 4349 Martin Luther King Jr. Blvd, Houston, TX 77204, United States
5
Goldenberg MG. Surgical Artificial Intelligence in Urology: Educational Applications. Urol Clin North Am 2024; 51:105-115. [PMID: 37945096] [DOI: 10.1016/j.ucl.2023.06.003]
Abstract
Surgical education has seen immense change recently. Increased demand for iterative evaluation of trainees from medical school to independent practice has led to the generation of an overwhelming amount of data related to an individual's competency. Artificial intelligence has been proposed as a solution to automate and standardize the ability of stakeholders to assess the technical and nontechnical abilities of a surgical trainee. In both the simulation and clinical environments, evidence supports the use of machine learning algorithms to both evaluate trainee skill and provide real-time and automated feedback, enabling a shortened learning curve for many key procedural skills and ensuring patient safety.
Affiliation(s)
- Mitchell G Goldenberg
- Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, 1441 Eastlake Avenue, Suite 7416, Los Angeles, CA 90033, USA.
6
Tello C, Goode CA. Factors and barriers that influence the matriculation of underrepresented students in medicine. Front Psychol 2023; 14:1141045. [PMID: 37303920] [PMCID: PMC10247986] [DOI: 10.3389/fpsyg.2023.1141045]
Abstract
Despite many initiatives over more than four decades, the diversity of United States physicians still does not reflect the diversity of the United States population. The present study undertakes a review of the literature from the last 30 years to investigate the barriers and protective factors underrepresented college students encounter as applicants to medical school. Known barriers that influence matriculation into medical school, such as academic metrics and test scores, were analyzed. Additionally, less well studied elements were investigated: factors perceived as barriers by underrepresented applicants, as well as protective factors that allow them to persist in their journey in the face of difficulties and adversity.
Affiliation(s)
- Cynthia Tello
- American University of the Caribbean School of Medicine, Cupecoy, Sint Maarten
- Graduate College of Biomedical Sciences and College of Dental Medicine, Western University of Health Sciences, Pomona, CA, United States
- Christine A. Goode
- Graduate College of Biomedical Sciences and College of Dental Medicine, Western University of Health Sciences, Pomona, CA, United States
7
McCray E, Atkinson WR, McCray CE, Hubler Z, Maher Y, Waguia R, Kearney M, Kaprielian V. Impact of Medical Student Participation in Student-Run Clinics on Education, Residency Selection, and Patient Care: A Review of Selected Articles. Cureus 2022; 14:e26183. [PMID: 35891868] [PMCID: PMC9306404] [DOI: 10.7759/cureus.26183]
Abstract
Student-run clinics (SRCs) are becoming increasingly popular at medical schools in the United States. These clinics provide a variety of benefits, including serving disadvantaged populations and providing early clinical exposure for students. There has been no consensus on the impact of SRCs on medical education, specialty selection, and patient care. This review provides a thorough overview of student and patient outcomes as a function of medical students volunteering at SRCs. We queried PubMed for original literature published in English between 2000 and 2020. Inclusion criteria were primary research articles evaluating the impact of medical student participation in SRCs on education, specialty selection, and patient care. All articles included in the final review were agreed upon by three reviewers, and the pertinent data were extracted. Of 10,200 initial search results, seven papers were included in this review: two studies evaluating medical education, five evaluating residency selection, and three analyzing patient care. Three studies were included in multiple evaluations. The relationship between volunteering at SRCs and academic performance is unclear. Clinic volunteers had greater retention of empathy compared to non-volunteers. Additionally, clinic volunteers provided satisfactory care as determined by patient-reported outcomes and were not more likely to pursue primary care specialties. As SRCs increase in number, research into their impact on medical students and patients is necessary to understand how these clinics may affect the field of health care. It is important to further evaluate how medical student involvement in SRCs can improve patient care and outcomes.
8
Jacobparayil A, Ali H, Pomeroy B, Baronia R, Chavez M, Ibrahim Y. Predictors of Performance on the United States Medical Licensing Examination Step 2 Clinical Knowledge: A Systematic Literature Review. Cureus 2022; 14:e22280. [PMID: 35350504] [PMCID: PMC8933259] [DOI: 10.7759/cureus.22280]
9
Puri N, McCarthy M, Miller B. Validity and Reliability of Pre-matriculation and Institutional Assessments in Predicting USMLE STEP 1 Success: Lessons From a Traditional 2 x 2 Curricular Model. Front Med (Lausanne) 2022; 8:798876. [PMID: 35155475] [PMCID: PMC8829749] [DOI: 10.3389/fmed.2021.798876]
Abstract
Purpose We have observed that students' performance in our pre-clerkship curriculum does not align well with their United States Medical Licensing Examination (USMLE) STEP 1 scores. Students at risk of failing or underperforming on STEP 1 have often excelled on our institutional assessments. We sought to test the validity and reliability of our course assessments in predicting STEP 1 scores and, in the process, to generate and validate a more accurate prediction model for STEP 1 performance. Methods Pre-matriculation and course assessment data for students in the Class of 2020 (n = 76) were used to generate a stepwise STEP 1 prediction model, which was tested on the students of the Class of 2021 (n = 71). Predictions were developed at the time of matriculation and subsequently at the end of each course, in the programming language R. For the Class of 2021, the predicted STEP 1 scores were correlated with actual STEP 1 scores, and data agreement was tested with means-difference plots. A similar model was generated and tested for the Class of 2022. Results STEP 1 predictions based on pre-matriculation data were unreliable and failed to identify at-risk students (R2 = 0.02). STEP 1 predictions for most year one courses (anatomy, biochemistry, physiology) correlated poorly with students' actual STEP 1 scores (R2 = 0.30). STEP 1 predictions improved for year two courses (microbiology, pathology, and pharmacology), but integrated courses with customized NBMEs provided the most reliable predictions (R2 = 0.66). Predictions based on these integrated courses were reproducible for the Class of 2022. Conclusion MCAT scores and undergraduate GPA are poor predictors of students' STEP 1 scores. Partially integrated courses with biweekly assessments do not promote problem-solving skills and leave students at risk of failing STEP 1. Only courses with integrated and comprehensive assessments are reliable indicators of students' STEP 1 preparation.
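Testing agreement between predicted and actual scores with a means-difference (Bland-Altman) plot reduces numerically to a mean bias and 95% limits of agreement. A minimal sketch of that computation, with hypothetical score pairs rather than the study's data:

```python
import statistics

def limits_of_agreement(predicted, actual):
    """Bland-Altman style summary: mean bias and 95% limits of agreement
    between paired predicted and actual scores."""
    diffs = [p - a for p, a in zip(predicted, actual)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical predicted vs actual STEP 1 scores for eight students
predicted = [231, 245, 218, 252, 226, 240, 210, 235]
actual    = [228, 249, 215, 247, 230, 238, 205, 237]
bias, (lo, hi) = limits_of_agreement(predicted, actual)
print(f"bias = {bias:+.1f}, 95% limits of agreement = ({lo:.1f}, {hi:.1f})")
```

A near-zero bias with narrow limits indicates good agreement; a systematic bias or wide limits would signal an unreliable predictor, as the authors found for pre-matriculation data.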
Affiliation(s)
- Nitin Puri
- Office of Medical Education, Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, United States
- Michael McCarthy
- Office of Medical Education, Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, United States
- Bobby Miller
- Office of Medical Education, Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, United States
10
Kortz MW, Kongs BM, Bisesi DR, Roffler M, Sheehy RM. A retrospective and correlative analysis of academic and nonacademic predictors of COMLEX Level 1 performance. J Osteopath Med 2022; 122:187-194. [PMID: 35084145] [DOI: 10.1515/jom-2021-0175]
Abstract
CONTEXT National licensing exams (NLEs), including the Comprehensive Osteopathic Medical Licensing Examination (COMLEX) Level 1, evaluate student achievement. Scores have historically been used to stratify medical student applicants for residency. Grade point average (GPA), number of practice questions completed, and performance on practice exams have been shown to predict NLE performance, while test anxiety and acute stress have been shown to impair it. The role of study behaviors and other nonacademic factors in COMLEX Level 1 performance is unknown. OBJECTIVES This study aims to evaluate academic and nonacademic factors and to correlate them with COMLEX Level 1 performance. Additional analysis associates COMLEX Level 1 performance with academic and nonacademic factors while controlling for GPA. METHODS An anonymous online survey was administered to third- (OMS III) and fourth-year (OMS IV) osteopathic medical students at Kansas City University who had completed the COMLEX Level 1 examination. In total, 72 students responded to the survey. Survey results were linked to student records of GPA and COMLEX Level 1 scores, yielding 59 complete responses for analysis. Independent-sample t-tests and linear ordinary least squares regression were used to analyze the results. RESULTS The majority of participants were male (62.7%) and OMS III (98.3%), with an average age of 27.14 ± 2.58 years (mean ± standard deviation). Further data describe hours per week of personal time during dedicated study (n=46, 19.7 ± 18.53), hours of sleep per night during dedicated study (7.34 ± 0.92), and money spent on board preparation ($1,319.12 ± $689.17). High ($1,600-$3,000), average ($1,000-$1,500), and low ($100-$900) spenders did not differ statistically, and COMLEX Level 1 performance was not related to the number of resources utilized (F statistics <1; p>0.05). Pearson correlations revealed statistically significant relationships between COMLEX Level 1 scores and GPA (0.73, p<0.001), number of practice exams completed (0.39, p<0.001), number of questions completed (0.46, p<0.001), number of weeks of study (0.55, p<0.001), and preparation cost (0.28, p<0.05). The regression analysis revealed that money spent on board preparation, number of questions completed, and time spent studying accounted for 75.8% of the variance in COMLEX Level 1 scores after controlling for GPA. CONCLUSIONS The data show an association of money spent on board preparation, number of questions completed, and time spent studying with a student's COMLEX Level 1 score. The results also highlight the amount of money students spend on extracurricular board preparation materials, yet the number of resources utilized was not related to COMLEX Level 1 performance.
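"Controlling for GPA" in an OLS setting can be made concrete by residualization: regress both the outcome and a predictor on GPA, then correlate the residuals. A self-contained sketch on invented numbers (not the study's data; the variable names are illustrative):

```python
import statistics

def ols_residuals(y, x):
    """Residuals from a simple least-squares fit of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    alpha = my - beta * mx
    return [b - (alpha + beta * a) for a, b in zip(x, y)]

def pearson(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sum((a - mx) ** 2 for a in x)
                  * sum((b - my) ** 2 for b in y)) ** 0.5

def partial_corr(y, x, control):
    """Correlation of y and x after removing the control variable from
    both, i.e. 'controlling for' the variable by residualization."""
    return pearson(ols_residuals(y, control), ols_residuals(x, control))

# Hypothetical data: exam score built from GPA, study volume, and noise
gpa   = [3.2, 3.5, 3.8, 3.0, 3.6, 3.9, 3.3, 3.7]
hours = [120, 200, 260, 100, 240, 300, 180, 220]
noise = [3, -2, 1, -3, 2, -1, 2, -2]
score = [100 * g + 0.5 * h + e for g, h, e in zip(gpa, hours, noise)]
r = partial_corr(score, hours, gpa)
print(f"partial r (score, hours | GPA) = {r:.2f}")
# r squared approximates the extra variance explained beyond GPA alone
```

Because the synthetic score really does depend on hours beyond GPA, the partial correlation stays high even after the control.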
Affiliation(s)
- Michael W Kortz
- College of Osteopathic Medicine at Kansas City University, Kansas City, MO, USA
- Brian M Kongs
- College of Osteopathic Medicine at Kansas City University, Kansas City, MO, USA
- Dominic R Bisesi
- College of Osteopathic Medicine at Kansas City University, Kansas City, MO, USA
- Marissa Roffler
- Department of Psychology, Rockhurst University, Kansas City, MO, USA
- Ryan M Sheehy
- Department of Medical Education, College of Medicine, The University of Tennessee Health Science Center, Memphis, TN, USA
11
Ellis R, Brennan PA, Scrimgeour DSG, Lee AJ, Cleland J. Does performance at the intercollegiate Membership of the Royal Colleges of Surgeons (MRCS) examination vary according to UK medical school and course type? A retrospective cohort study. BMJ Open 2022; 12:e054616. [PMID: 34987044] [PMCID: PMC8734024] [DOI: 10.1136/bmjopen-2021-054616]
Abstract
OBJECTIVES The knowledge, skills and behaviours required of new UK medical graduates are the same, but how these are achieved differs, given that medical schools vary in their mission, curricula and pedagogy. Medical school differences seem to influence performance on postgraduate assessments. To date, the relationship between medical schools, course types and performance at the Membership of the Royal Colleges of Surgeons examination (MRCS) has not been investigated. Understanding this relationship is vital to achieving alignment across undergraduate and postgraduate training, learning and assessment values. DESIGN AND PARTICIPANTS A retrospective longitudinal cohort study of UK medical graduates who attempted MRCS Part A (n=9730) and MRCS Part B (n=4645) between 2007 and 2017, using individual-level linked sociodemographic and prior academic attainment data from the UK Medical Education Database. METHODS We studied MRCS performance across all UK medical schools and examined relationships between potential predictors and MRCS performance using χ2 analysis. Multivariate logistic regression models identified independent predictors of MRCS success at first attempt. RESULTS MRCS pass rates differed significantly between individual medical schools (p<0.001) but not after adjusting for prior A-level performance. Candidates from courses other than those described as problem-based learning (PBL) were 53% more likely to pass MRCS Part A (OR 1.53, 95% CI 1.25 to 1.87) and 54% more likely to pass Part B (OR 1.54, 95% CI 1.05 to 2.25) at first attempt after adjusting for prior academic performance. Attending a standard-entry 5-year medicine programme, having no prior degree and attending a Russell Group university were independent predictors of MRCS success in regression models (p<0.05). CONCLUSIONS There are significant differences in MRCS performance between medical schools. However, this variation is largely due to individual factors such as academic ability rather than medical school factors. This study also highlights group-level attainment differences that warrant further investigation to ensure equity within medical training.
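The odds ratios above come from logistic regression, where the OR is the exponentiated coefficient and its confidence interval is formed on the log-odds scale before exponentiating. A small sketch that reconstructs the paper's headline figure from an assumed standard error (the SE here is our guess, chosen to reproduce the reported interval):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval. The interval is
    symmetric on the log-odds scale, then exponentiated."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Hypothetical inputs: beta = ln(1.53); an assumed SE of ~0.103 happens
# to reproduce the reported interval of roughly 1.25 to 1.87.
beta = math.log(1.53)
se = 0.103
or_, (lo, hi) = odds_ratio_ci(beta, se)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Note the asymmetry of the interval around the OR on the natural scale (1.53 is not midway between the bounds); that is a direct consequence of working on the log scale.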
Affiliation(s)
- Ricky Ellis
- University of Aberdeen Institute of Applied Health Sciences, Aberdeen, UK
- Urology Department, Nottingham University Hospitals NHS Trust, Nottingham, UK
- Peter A Brennan
- Department of Maxillo-Facial Surgery, Queen Alexandra Hospital, Portsmouth, UK
- Duncan S G Scrimgeour
- University of Aberdeen Institute of Applied Health Sciences, Aberdeen, UK
- Department of Colorectal Surgery, Aberdeen Royal Infirmary, Aberdeen, UK
- Amanda J Lee
- Medical Statistics Team, University of Aberdeen Institute of Applied Health Sciences, Aberdeen, UK
- Jennifer Cleland
- Medical Education Research and Scholarship Unit (MERSU), Lee Kong Chian School of Medicine, Singapore
12
Lin JC, Lokhande A, Chen AJ, Scott IU, Greenberg PB. Characteristics of First-Year Residents in Top-Ranked United States Ophthalmology Residency Programs. J Acad Ophthalmol 2022; 14:e7-e17. [PMID: 37388472] [PMCID: PMC9927972] [DOI: 10.1055/s-0041-1735152]
Abstract
Objective The aim of the study is to investigate the characteristics of first-year residents associated with attending a top-ranked United States (U.S.) ophthalmology residency program over the past decade. Methods First-year ophthalmology residents in 2009, 2013, 2016, and 2019 were identified from institutional websites, Doximity, LinkedIn and the Wayback Machine. Publications were obtained from Scopus and Google Scholar; research productivity was measured using the h -index, and medical school region based on U.S. Census Bureau designations. Medical school and ophthalmology training program rankings were based on U.S. News & World Report (U.S. News) rankings and National Institutes of Health (NIH) funding. One-way ANOVA, Wilcoxon rank sum, χ 2 , and t -tests were used to analyze trends, and odds ratios (ORs) were calculated using logistic regression. Results Data were obtained on 81% (1,496/1,850) of the residents; 43% were female; 5% were international medical graduates (IMGs); and 10% had other graduate degrees. Over the decade, the mean h -index increased (0.87-1.26; p <0.05) and the proportion of residents who attended a top 20 medical school decreased (28-18%; p <0.05). In a multivariate logistic regression model, characteristics associated with being a first-year resident in a top 20 program ranked by U.S. News were female gender [OR: 1.32, 95% CI: 1.02-1.72], having a Master's degree [OR: 2.28, 95% CI: 1.29-4.01] or PhD [OR: 2.23, 95% CI: 1.32-3.79], attending a top 20 [OR: 5.26, 95% CI: 3.66-7.55] or a top 40 medical school by NIH funding [OR: 2.45, 95% CI: 1.70-3.54], attending a medical school with a mean USMLE Step 2 score above 243 [OR: 1.64, 95% CI: 1.01-2.67] or located in the Northeast [OR: 2.00, 95% CI: 1.38-2.89] and having an h -index of one or more [OR: 1.92, 95% CI: 1.47-2.51]. Except for gender, these characteristics were also significantly associated with matching to a top 20 ophthalmology program by NIH funding. 
Conclusion Female gender, graduate degrees, research productivity, and attending a medical school with high research productivity, a high mean USMLE Step 2 score, or a Northeast location were key characteristics of first-year residents in top-ranked U.S. ophthalmology residency programs.
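The multivariate logistic regression behind odds ratios like these can be sketched compactly. The example below fits a logistic model by Newton-Raphson on hypothetical toy data (a single binary predictor — say, holding a graduate degree — invented for illustration, not the study's dataset) and recovers the odds ratio as exp(β):

```python
import numpy as np

def logistic_odds_ratios(X, y, n_iter=25):
    """Fit logistic regression by Newton-Raphson; return exp(beta)
    for each predictor (intercept excluded)."""
    Xd = np.column_stack([np.ones(len(y)), X])       # prepend intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))         # predicted probabilities
        grad = Xd.T @ (y - p)                        # score vector
        hess = Xd.T @ (Xd * (p * (1 - p))[:, None])  # observed information
        beta += np.linalg.solve(hess, grad)          # Newton step
    return np.exp(beta[1:])                          # odds ratios per predictor

# Hypothetical data: a binary trait that triples the odds of the outcome.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=2000).astype(float)
true_logit = -1.0 + np.log(3.0) * x
y = (rng.random(2000) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)
or_hat = logistic_odds_ratios(x[:, None], y)
```

Packages such as statsmodels report the same quantity along with standard errors and confidence intervals; the hand-rolled Newton loop here only makes the computation explicit.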
Affiliation(s)
- John C. Lin
- Program in Liberal Medical Education, Brown University, Providence, Rhode Island
- Division of Ophthalmology, Alpert Medical School, Brown University, Providence, Rhode Island
- Anagha Lokhande
- Division of Ophthalmology, Alpert Medical School, Brown University, Providence, Rhode Island
- Allison J. Chen
- Shiley Eye Institute, University of California San Diego, La Jolla, California
- Ingrid U. Scott
- Departments of Ophthalmology and Public Health Sciences, Penn State College of Medicine, Hershey, Pennsylvania
- Paul B. Greenberg
- Division of Ophthalmology, Alpert Medical School, Brown University, Providence, Rhode Island
- Section of Ophthalmology, Providence VA Medical Center, Providence, Rhode Island
13
Kracaw RA, Dizon W, Antonio S, Simanton E. Predicting United States Medical Licensing Examination Step 2 Clinical Knowledge Scores from Previous Academic Performance Measures within a Longitudinal Interleaved Curriculum. Cureus 2021; 13:e18143. [PMID: 34584812 PMCID: PMC8457316 DOI: 10.7759/cureus.18143] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/20/2021] [Indexed: 11/26/2022] Open
Abstract
Background United States Medical Licensing Examination (USMLE) Step 1 is a common metric used by residency programs to determine which candidates to invite to interview. However, USMLE Step 2 Clinical Knowledge (CK) has also become an important factor in selecting applicants to interview and plays a significant role during applicant selection. This study aims to identify academic performance measures that correlate with USMLE Step 2 CK scores and to develop a model predicting USMLE Step 2 CK scores from previous academic measures in the first two cohorts of the longitudinal interleaved clerkship (LInC) at the Kirk Kerkorian School of Medicine at the University of Nevada, Las Vegas (KSOM). Setting KSOM is a newly accredited US allopathic medical school that accepted its first class in 2017. At KSOM, a LInC model is used in the primary clinical year: rotations are two weeks in duration before students move on to the next specialty. Students complete the National Board of Medical Examiners (NBME) subject examinations in all six specialties in one week at the midpoint and again at the end of the LInC. Students who pass an exam at the midpoint can opt out of that exam at the end, as the higher of the two scores is recorded; however, most students choose to retake all the exams to improve their scores and prepare for USMLE Step 2 CK. Methodology Academic performance measures were gathered from the classes of 2021 and 2022 (n = 101), including undergraduate grade point average (GPA), undergraduate science GPA, Medical College Admission Test (MCAT) score, USMLE Step 1 score, NBME clinical subject exam scores, and USMLE Step 2 CK scores. Pearson correlations between each performance variable and USMLE Step 2 CK scores measured individual associations; a regression model then measured the impact of the variables together. Results All variables except undergraduate science GPA correlated significantly with USMLE Step 2 CK score. 
USMLE Step 1 had the strongest correlation (r = 0.752, p < 0.001). The regression model had an R of 0.859 with the internal medicine subject exam showing the highest beta coefficient (0.327, p < 0.001). Conclusions This study determined that USMLE Step 2 CK scores can be effectively predicted using available performance measures. With USMLE Step 1 becoming pass/fail in January 2022, the importance of USMLE Step 2 CK as a screening tool in the residency application process will likely increase. This study was conducted within a LInC curriculum and may have limited value in the prediction of scores within other clinical year curricula.
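The correlate-then-regress workflow reported here is straightforward to reproduce. The sketch below computes a Pearson correlation and a simple least-squares prediction line on invented Step 1 / Step 2 CK score pairs (illustrative numbers only, not KSOM data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation of two 1-D arrays."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

def predict_ols(x, y, x_new):
    """Predict y at x_new from the simple least-squares line y = a + b*x."""
    b = (x - x.mean()) @ (y - y.mean()) / ((x - x.mean()) @ (x - x.mean()))
    a = y.mean() - b * x.mean()
    return a + b * np.asarray(x_new, dtype=float)

# Invented score pairs for illustration.
step1 = np.array([210.0, 225.0, 232.0, 240.0, 248.0, 255.0])
step2ck = np.array([215.0, 230.0, 238.0, 242.0, 252.0, 260.0])
r = pearson_r(step1, step2ck)                 # strong positive correlation
pred = predict_ols(step1, step2ck, [235.0])   # predicted Step 2 CK at Step 1 = 235
```

A full multiple regression, as in the study, simply generalizes `predict_ols` to several predictor columns at once.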
Affiliation(s)
- Rachel A Kracaw
- Medical Education, University of Nevada, Las Vegas School of Medicine, Las Vegas, USA
- Wynona Dizon
- Medical Education, University of Nevada, Las Vegas School of Medicine, Las Vegas, USA
- Sabrina Antonio
- Medical Education, University of Nevada, Las Vegas School of Medicine, Las Vegas, USA
- Edward Simanton
- Medical Education, University of Nevada, Las Vegas School of Medicine, Las Vegas, USA
14
Kane KY, Hosokawa MC, Quinn KJ, Young-Walker L. University of Missouri-Columbia School of Medicine. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2020; 95:S277-S281. [PMID: 33626700 DOI: 10.1097/acm.0000000000003370] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
15
Torre DM, Dong T, Schreiber-Gregory D, Durning SJ, Pangaro L, Pock A, Hemmer PA. Exploring the Predictors of Post-Clerkship USMLE Step 1 Scores. TEACHING AND LEARNING IN MEDICINE 2020; 32:330-336. [PMID: 32075437 DOI: 10.1080/10401334.2020.1721293] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Theory: We used two theoretical frameworks for this study: a) experiential learning, whereby learners construct new knowledge based on prior experience, and learning grows out of a continuous process of reconstructing experience, and b) deliberate practice, whereby the use of testing (test-enhanced learning) promotes learning and produces better long-term retention. Hypothesis: We hypothesized that moving the USMLE Step 1 exam to follow the clerkship year would provide students with a context for basic science learning that may enhance exam performance. We also hypothesized that examination performance variables, specifically National Board of Medical Examiners (NBME) Customized Basic Science Examinations and NBME subject examinations in clinical disciplines would account for a moderate to large amount of the variance in Step 1 scores. Thus we examined predictors of USMLE Step 1 scores when taken after the core clerkship year. Method: In 2011, we revised our medical school curriculum and moved the timing of Step 1 to follow the clerkship year. We performed descriptive statistics, an ANCOVA to compare Step 1 mean scores for three graduating classes of medical students before and after the curriculum changes, and stepwise linear regression to investigate the association between independent variables and the primary outcome measure after curriculum changes. Results: 993 students took the Step 1 exam, which included graduating classes before (2012-2014, N = 491) and after (2015-2017, N = 502) the curriculum change. Step 1 scores increased significantly following curricular revision (mean 218, SD 18.2, vs. 228, SD 16.7, p < 0.01) after controlling for MCAT and undergraduate GPA. 
Overall, 66.4% of the variance in Step 1 scores after the clerkship year was explained by: the mean score on fourteen pre-clerkship customized NBME exams (p < 0.01, 57.0% R2); performance on the surgery NBME subject exam (p < 0.01, 3.0% R2); the pediatrics NBME subject exam (p < 0.01, 2.0% R2); the Comprehensive Basic Science Self-Assessment (p < 0.01, 2.0% R2); the internal medicine NBME subject exam (p < 0.01, 0.03% R2); the pre-clerkship Integrated Clinical Skills score (p < 0.01, 0.05% R2); and the pre-matriculation MCAT (p < 0.01, 0.01% R2). Conclusion: In our institution, nearly two-thirds of the variance in performance on Step 1 taken after the clerkship year was explained mainly by pre-clerkship variables, with a smaller contribution from clerkship measures. Further study is needed to uncover the specific aspects of the clerkship experience that might contribute to success on high-stakes licensing examinations.
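The stepwise approach described here — adding, at each step, the predictor with the largest incremental R² — can be sketched as a greedy loop. The simulation below uses invented predictors (one strong, one weak, one pure noise), not the study's data, and a 2% minimum-gain stopping rule chosen for illustration:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (intercept added)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def forward_stepwise(X, y, min_gain=0.02):
    """Greedily add the predictor giving the largest R^2 increase,
    stopping when no candidate improves R^2 by at least min_gain."""
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining:
        score, j = max((r_squared(X[:, selected + [j]], y), j) for j in remaining)
        if score - best < min_gain:
            break
        best = score
        selected.append(j)
        remaining.remove(j)
    return selected, best

# Simulated data: column 0 strong, column 1 weak, column 2 unrelated.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 1.0, 1000)
selected, r2 = forward_stepwise(X, y)   # picks 0 then 1; noise column excluded
```

Published stepwise analyses typically use p-value entry criteria rather than a raw R² threshold, but the greedy structure is the same.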
Affiliation(s)
- Dario M Torre
- Department of Medicine, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
- Ting Dong
- Curriculum, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
- Deanna Schreiber-Gregory
- Department of Medicine, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
- Steven J Durning
- Department of Medicine, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
- Louis Pangaro
- Department of Medicine, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
- Arnyce Pock
- Curriculum, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
- Paul A Hemmer
- Department of Medicine, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
16
Burk-Rafel J, Pulido RW, Elfanagely Y, Kolars JC. Institutional differences in USMLE Step 1 and 2 CK performance: Cross-sectional study of 89 US allopathic medical schools. PLoS One 2019; 14:e0224675. [PMID: 31682639 PMCID: PMC6827894 DOI: 10.1371/journal.pone.0224675] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/21/2019] [Accepted: 10/19/2019] [Indexed: 11/22/2022] Open
Abstract
Introduction The United States Medical Licensing Examination (USMLE) Step 1 and Step 2 Clinical Knowledge (CK) are important for trainee medical knowledge assessment and licensure, medical school program assessment, and residency program applicant screening. Little is known about how USMLE performance varies between institutions. This observational study attempts to identify institutions with above-predicted USMLE performance, which may indicate educational programs that are successful at promoting students’ medical knowledge. Methods Self-reported institution-level data were tabulated from publicly available US News and World Report and Association of American Medical Colleges publications for 131 US allopathic medical schools from 2012–2014. Bivariate and multiple linear regression were performed. The primary outcome was an institutional mean USMLE Step 1 or Step 2 CK score outside a 95% prediction interval (≥2 standard deviations above or below predicted) based on multiple regression accounting for students’ prior academic performance. Results Eighty-nine US medical schools (54 public, 35 private) reported complete USMLE scores over the three-year study period, representing over 39,000 examinees. Institutional mean grade point average (GPA) and Medical College Admission Test (MCAT) score achieved an adjusted R2 of 72% for Step 1 (standardized βMCAT 0.7, βGPA 0.2) and 41% for Step 2 CK (standardized βMCAT 0.5, βGPA 0.3) in multiple regression. Using this regression model, 5 institutions were identified with above-predicted institutional USMLE performance, while 3 institutions had below-predicted performance. Conclusions This exploratory study identified several US allopathic medical schools with significantly above- or below-predicted USMLE performance. Although limited by self-reported data, the findings raise questions about inter-institutional USMLE performance parity, and thus educational parity. 
Additional work is needed to determine the etiology and robustness of the observed performance differences.
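The ≥2-SD flagging criterion can be sketched directly. The example below regresses simulated school-level mean Step 1 scores on mean MCAT and GPA (all numbers invented) and flags schools whose residual exceeds two residual standard deviations — a simplified stand-in for the paper's 95% prediction interval, which additionally accounts for each point's leverage:

```python
import numpy as np

def flag_extreme_schools(X, y, k=2.0):
    """OLS of y on the columns of X (intercept added); return indices of
    points whose residual exceeds k residual standard deviations."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    s = resid.std(ddof=Xd.shape[1])   # residual SD, df-adjusted
    return np.flatnonzero(np.abs(resid) > k * s)

# Simulated school-level data: mean MCAT and GPA predict mean Step 1;
# one school is bumped well above its predicted score.
rng = np.random.default_rng(1)
mcat = rng.uniform(28, 36, 40)
gpa = rng.uniform(3.5, 3.9, 40)
step1 = 100.0 + 3.2 * mcat + 5.0 * gpa + rng.normal(0.0, 1.0, 40)
step1[7] += 12.0                      # the above-predicted school
flagged = flag_extreme_schools(np.column_stack([mcat, gpa]), step1)
```

With only school-level means as inputs, this kind of flag identifies candidate institutions for closer study rather than proving an educational effect.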
Affiliation(s)
- Jesse Burk-Rafel
- Department of Internal Medicine, New York University Langone Health, New York, NY, United States of America
- Ricardo W. Pulido
- Department of Otolaryngology–Head and Neck Surgery, University of Washington, Seattle, WA, United States of America
- Yousef Elfanagely
- Department of Internal Medicine, Brown University, Providence, RI, United States of America
- Joseph C. Kolars
- Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI, United States of America