1
Bird JB, Olvet DM, Willey JM, Brenner JM. A Generalizable Approach to Predicting Performance on USMLE Step 2 CK. Advances in Medical Education and Practice 2022; 13:939-944. [PMID: 36039184] [PMCID: PMC9419904] [DOI: 10.2147/amep.s373300]
Abstract
INTRODUCTION The elimination of the USMLE Step 1 three-digit score has created a deficit in standardized performance metrics for undergraduate medical educators and residency program directors. It is likely that greater emphasis will be placed on USMLE Step 2 CK, an exam found to be associated with later clinical performance in residents and physicians. Because many previous models relied on Step 1 scores to predict student performance on Step 2 CK, we developed a model using other metrics. MATERIALS AND METHODS Assessment data for 228 students in three cohorts (classes of 2018, 2019, and 2020) were collected, including Medical College Admission Test (MCAT) scores, NBME Customized Assessment Service (CAS) exams, and NBME Subject exams. A linear regression model was used to predict Step 2 CK scores at five time-points: at the end of years one and two and at three trimester intervals in year three. An additional cohort (class of 2021) was used to validate the model. RESULTS Significant models were found at all five time-points in the curriculum and increased in predictive power as students progressed: end of year 1 (adj R2 = 0.29), end of year 2 (adj R2 = 0.34), clerkship trimester 1 (adj R2 = 0.52), clerkship trimester 2 (adj R2 = 0.58), clerkship trimester 3 (adj R2 = 0.62). Including Step 1 scores did not significantly improve the final model. Using metrics from the class of 2021, the model predicted Step 2 CK performance within a mean square error (MSE) of 8.3 points (SD = 6.8) at the end of year 1, improving incrementally to within a mean of 5.4 points (SD = 4.1) by the end of year 3. CONCLUSION This model is highly generalizable and enables medical educators to predict student performance on Step 2 CK in the absence of Step 1 quantitative data as early as the end of the first year of medical education, with increasingly strong predictions as students progress through the clerkship year.
Affiliation(s)
- Jeffrey B Bird
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Doreen M Olvet
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Joanne M Willey
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Judith M Brenner
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
- Department of Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA
2
Khalil MK. Weekly near-peer tutoring sessions improve students' performance on basic medical sciences and USMLE Step1 examinations. Medical Teacher 2022; 44:752-757. [PMID: 35073221] [DOI: 10.1080/0142159x.2022.2027901]
Abstract
PURPOSE This study examined the relationship between attendance at weekly near-peer tutoring (NPT) sessions offered in the second year of medical school and academic performance on basic science and USMLE Step 1 examinations. METHODS Twenty-four weekly NPT sessions were delivered across all modules in the second year of medical school. Attendance at the sessions was recorded and students were divided into three groups: high (16-24 sessions), moderate (7-15 sessions), and low-to-no (0-6 sessions) attendance. Pearson product-moment correlation coefficients were computed to determine the relationship between students' frequency of attendance and their performance on overall basic sciences, two NBME CBSEs, and the USMLE Step 1 examination. Students' academic performance was also analyzed using ANOVA and Bonferroni post hoc tests (p < 0.05) to compare differences between the three groups. RESULTS Pearson correlation analyses revealed that attendance at peer tutoring sessions was significantly correlated with students' performance on overall basic sciences, the CBSE mid-year, the CBSE final, and the USMLE Step 1 examination. The high attendance group significantly outperformed the low-to-no attendance group on overall basic sciences (p = 0.007), CBSE mid-year (p < 0.001), CBSE final (p = 0.018), and USMLE Step 1 (p = 0.048) examinations. CONCLUSIONS Attendance at NPT sessions is significantly correlated with students' performance on basic sciences and on USMLE Step 1 examinations. Attendance at weekly NPT sessions is a valuable experience for M2 students.
Affiliation(s)
- Mohammed K Khalil
- Department of Biomedical Sciences, University of South Carolina School of Medicine Greenville, Greenville, SC, USA
3
Puri N, McCarthy M, Miller B. Validity and Reliability of Pre-matriculation and Institutional Assessments in Predicting USMLE STEP 1 Success: Lessons From a Traditional 2 x 2 Curricular Model. Front Med (Lausanne) 2022; 8:798876. [PMID: 35155475] [PMCID: PMC8829749] [DOI: 10.3389/fmed.2021.798876]
Abstract
Purpose We have observed that students' performance in our pre-clerkship curriculum does not align well with their United States Medical Licensing Examination (USMLE) STEP1 scores. Students at risk of failing or underperforming on STEP1 have often excelled on our institutional assessments. We sought to test the validity and reliability of our course assessments in predicting STEP1 scores and, in the process, to generate and validate a more accurate prediction model for STEP1 performance. Methods Student pre-matriculation and course assessment data from the Class of 2020 (n = 76) are used to generate a stepwise STEP1 prediction model, which is tested with the students of the Class of 2021 (n = 71). Predictions are developed at the time of matriculation and subsequently at the end of each course in the programming language R. For the Class of 2021, the predicted STEP1 scores are correlated with their actual STEP1 scores, and data agreement is tested with means-difference plots. A similar model is generated and tested for the Class of 2022. Results STEP1 predictions based on pre-matriculation data are unreliable and fail to identify at-risk students (R2 = 0.02). STEP1 predictions for most year one courses (anatomy, biochemistry, physiology) correlate poorly with students' actual STEP1 scores (R2 = 0.30). STEP1 predictions improve for year two courses (microbiology, pathology, and pharmacology), but integrated courses with customized NBMEs provide more reliable predictions (R2 = 0.66). Predictions based on these integrated courses are reproducible for the Class of 2022. Conclusion MCAT scores and undergraduate GPA are poor predictors of students' STEP1 scores. Partially integrated courses with biweekly assessments do not promote problem-solving skills and leave students at risk of failing STEP1. Only courses with integrated and comprehensive assessments are reliable indicators of students' STEP1 preparation.
Affiliation(s)
- Nitin Puri
- Office of Medical Education, Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, United States
- Michael McCarthy
- Office of Medical Education, Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, United States
- Bobby Miller
- Office of Medical Education, Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, United States
4
Keltner C, Haedinger L, Carney PA, Bonura EM. Preclinical Assessment Performance as a Predictor of USMLE Step 1 Scores or Passing Status. Medical Science Educator 2021; 31:1453-1462. [PMID: 34457984] [PMCID: PMC8368122] [DOI: 10.1007/s40670-021-01334-7]
Abstract
PURPOSE To determine the association between student performance on preclinical pass/fail assessments in an allopathic medical school curriculum and Step 1 scores or passing status. MATERIALS AND METHODS This observational retrospective study involved preclinical assessments, including National Board of Medical Examiners Customized Assessment Services (NBME CAS) exams, faculty-developed exams, and the United States Medical Licensing Examination (USMLE) Step 1, from 582 medical students in four cohorts (2018-2021). Analyses included descriptive statistics, Pearson's correlation coefficient (ρ), and logistic regression, presented as odds ratios (ORs) and associated p values. RESULTS Mean scores on Component 4 end-of-block NBME CAS examinations positively correlated with Step 1 scores (ρ = 0.83, p < .001), as did mean scores on both Component 1 weekly faculty-created assessments and Component 3 end-of-block faculty-created assessments (ρ = 0.70, p < .001; ρ = 0.73, p < .001). Passing all Component 3 end-of-block faculty-created assessments in all blocks was associated with passing Step 1 (OR = 8.66, p < .001). Independently, passing all Component 4 NBME CAS exams or passing all Component 1 weekly faculty-derived assessments in all blocks did not correlate with passing Step 1 (OR = 2.40, p = .12; OR = 0.29, p = .30). Passing all assessment types in all blocks was among the strongest correlates of passing Step 1 (OR = 9.026, p < .001). CONCLUSIONS Scores on faculty-derived and NBME CAS end-of-block assessments were positively correlated with Step 1 scores. Passing status on institution-derived end-of-block assessments was associated with passing Step 1, whereas passing status on weekly institution-derived assessments or end-of-block NBME CAS assessments was not. End-of-block pass/fail NBME CAS and faculty-derived preclinical examinations may help prepare students for Step 1 and predict their outcomes. Weekly faculty-created assessments should primarily be used to continuously reinforce educational material.
Affiliation(s)
- Case Keltner
- School of Medicine, Oregon Health & Science University, 3181 SW Sam Jackson Park Rd, Portland, OR 97239 USA
- Erin M. Bonura
- Division of Infectious Diseases, Oregon Health & Science University, Portland, OR USA
5
Hauer KE, Boscardin C, Brenner JM, van Schaik SM, Papp KK. Twelve tips for assessing medical knowledge with open-ended questions: Designing constructed response examinations in medical education. Medical Teacher 2020; 42:880-885. [PMID: 31282798] [DOI: 10.1080/0142159x.2019.1629404]
Abstract
Medical knowledge examinations employing open-ended (constructed response) items can be useful for assessing medical students' factual and conceptual understanding. Modern-day curricula that emphasize active learning in small groups and other interactive formats lend themselves to an assessment format that prompts students to share conceptual understanding, explain, and elaborate. The open-ended question examination format can give faculty insight into learners' abilities to apply information to clinical or scientific problems, and can reveal learners' misunderstandings about essential content. To implement formative or summative assessments with open-ended questions in a rigorous manner, educators must design systems for exam creation and scoring, including systems for constructing exam blueprints, items, and scoring rubrics, as well as procedures for scoring and standard setting. Information gained through review of students' responses can guide future educational sessions and curricular changes in a cycle of continuous improvement.
Affiliation(s)
- Karen E Hauer
- Department of Medicine, School of Medicine, University of California, San Francisco, CA, USA
- Christy Boscardin
- Department of Medicine, School of Medicine, University of California, San Francisco, CA, USA
- Judith M Brenner
- Department of Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
- Sandrijn M van Schaik
- Department of Pediatrics, University of California, San Francisco School of Medicine, San Francisco, CA, USA
- Klara K Papp
- Division of General Medical Sciences, Case Western Reserve University School of Medicine, Cleveland, OH, USA
6
Kauffman CA, Derazin M, Asmar A, Kibble JD. Patterns of medical student engagement in a second-year pathophysiology course: relationship to USMLE Step 1 performance. Advances in Physiology Education 2019; 43:512-518. [PMID: 31553640] [DOI: 10.1152/advan.00082.2019]
Abstract
Historically, attendance has been considered a marker of academic performance, but the current medical education literature reports mixed results. In addition, attendance is dropping in preclinical curricula while, at the same time, the focus on United States Medical Licensing Examination Step 1 performance is increasing. The present study used a mixed-methods approach to correlate student attendance and access to the formal curriculum in a second-year pathophysiology course with performance on Step 1. Additionally, survey and focus group data evaluated the usage and importance of both the formal curriculum and third-party resources. Of 112 eligible students, 77 participated in the study. There was no correlation between attendance or access to the learning materials and Step 1 performance. There was a strong correlation between performance on the final examination and that on Step 1 (r = 0.813; P < 0.001), and moderate correlations between formative quiz performance (r = 0.321; P = 0.005) and individual readiness assessment test performance (r = 0.351; P = 0.002) and Step 1 performance. Survey and focus group data show that students place high importance on faculty-developed materials they can use on their own, but not on attendance. Third-party resources are heavily used as an adjunct to the formal curriculum and to focus Step 1 study. Attendance and access to the formal curriculum do not predict Step 1 performance, whereas performance on high- and low-stakes internal assessments does. Further study on how the loss of social interaction gained from attendance affects the development of other competencies and the learning climate is warranted.
Affiliation(s)
| | - Megan Derazin
- College of Medicine, University of Central Florida, Orlando, Florida
- Abdo Asmar
- College of Medicine, University of Central Florida, Orlando, Florida
- Jonathan D Kibble
- College of Medicine, University of Central Florida, Orlando, Florida
7
Chen PH, Scanlon MH. Teaching Radiology Trainees From the Perspective of a Millennial. Acad Radiol 2018; 25:794-800. [PMID: 29573938] [DOI: 10.1016/j.acra.2018.02.008]
Abstract
The millennial generation consists of today's medical students, radiology residents, fellows, and junior staff. Millennials' comfort with immersive technology, high expectations for success, and desire for constant feedback differentiate them from previous generations. Drawing from an author's experiences through radiology residency and fellowship as a millennial, from published literature, and from the mentorship of a long-time radiology educator, this article explores educational strategies that embrace these characteristics to engage today's youngest generation both in and out of the reading room.
8
Varpio L, Farnan JM, Park YS. Summary: Research Diseases Need Holistic Care. Academic Medicine 2017; 92:S7-S11. [PMID: 29065017] [DOI: 10.1097/acm.0000000000001923]
Affiliation(s)
- Lara Varpio
- L. Varpio is associate professor, Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland. J.M. Farnan is associate professor of medicine and assistant dean of curriculum development and evaluation, University of Chicago Pritzker School of Medicine, Chicago, Illinois. Y.S. Park is associate professor, Department of Medical Education, University of Illinois at Chicago College of Medicine, Chicago, Illinois.