1. Toonkel RL, Pock AR, Hauer KE, Kogan JR, Seibert CS, Swan Sein A, Monrad SU, Gordon D, Daniel M, Ryan MS, Ismail N, Fazio SB, Santen SA. Stepping Back: How Should Pass/Fail Scoring Influence Step 1 Timing? Acad Med 2025;100:137-143. PMID: 39316463. DOI: 10.1097/acm.0000000000005887.
Abstract
Although most students complete Step 1 before clerkships, some institutions delay the exam until after clerkships. The change to pass/fail grading adds complexity that should be considered when deciding on exam timing. Both early and late administration may affect learning outcomes, learner behavior, student well-being, and residency match success. Step 1 completion before clerkships promotes learning outcomes (e.g., integration and mastery of foundational material), may encourage students to focus on the curriculum, and may better prepare students for clinical subject examinations (CSEs). However, delaying the exam ensures that students maintain foundational knowledge and may encourage clinical educators to incorporate basic science illustrations. An early Step 1 may affect learner behavior by allowing clerkship students to focus on clinical learning. The associated National Board of Medical Examiners performance report may also be used for Step 2 and CSE preparation. However, delaying Step 1 allows greater scheduling flexibility based on developmental milestones. Administration of Step 1 before clerkships removes a significant stressor from the clinical year and decompresses the residency application period. However, a delayed Step 1 reduces the pressure, heightened by the change to pass/fail, for students to distinguish themselves through numerous extracurricular and research activities. An early Step 1 exam may also lead to improved CSE performance, which is often linked to clerkship honors criteria, an increasingly valuable distinction for residency match success after the change to pass/fail. In contrast, delaying Step 1 is associated with higher first-time pass rates, which may be especially important for students at risk for failure. Medical educators and students should approach the question of Step 1 timing collaboratively, considering these factors within the context of the medical school program, curricular constraints and priorities, and students' individual needs and goals.
2. Chosie K. Using Osteopathic Medical School Institutional Data to Establish the Correlation Between Academic Performance and First-Time Board Pass Rates to Develop Academic Interventions for Students at Risk of First Board Failure. Cureus 2024;16:e72354. PMID: 39583446. PMCID: PMC11585896. DOI: 10.7759/cureus.72354.
Abstract
The rapid increase in the number of osteopathic medical schools creates a highly competitive environment, as each osteopathic medical school seeks to enroll top-performing students. As a result, osteopathic colleges need to be innovative in the academic support interventions they provide to increase their students' chances of passing the Comprehensive Osteopathic Medical Licensing Examination (COMLEX) Level 1 and Level 2 board exams on the first attempt. A correlation analysis was performed on the Alabama College of Osteopathic Medicine (ACOM) class of 2024 (188 students), examining pre-matriculation metrics, including grade point average (GPA), science GPA (SGPA), and Medical College Admission Test (MCAT) score, along with performance in each course and on two specific exams, to determine whether these variables were associated with first-time COMLEX Level 1 and Level 2 pass rates. Spearman correlations (significant at the p < 0.01 level) showed that students in the lowest quartile of any of the following courses were at risk of failing COMLEX Level 1: medical biochemistry (0.156), neuroanatomy (0.168), cardiology (0.275), renal (0.176), respiratory (0.235), or gastrointestinal (0.209). Lowest-quartile performance in neuroanatomy (0.228) and gastrointestinal (0.154, significant at the p < 0.05 level) correlated with first-time failure of COMLEX Level 2. Specific interventions have been developed for students in the lowest quartile of the courses identified in the institutional data analysis to better prepare them to pass the COMLEX Level 1 and Level 2 board exams.
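As a rough illustration of the kind of quartile-versus-outcome analysis described above, the sketch below computes Spearman correlations between a lowest-quartile course indicator and a first-attempt failure flag. The data, course names, and failure rate are synthetic assumptions, not ACOM's institutional data.

```python
# Illustrative only: synthetic data standing in for course scores and a
# first-attempt COMLEX Level 1 outcome (1 = fail); not the study's dataset.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 188  # class size reported in the abstract

df = pd.DataFrame({
    "biochem_score": rng.normal(75, 8, n),
    "neuroanatomy_score": rng.normal(78, 7, n),
})
df["failed_level1"] = (rng.random(n) < 0.08).astype(int)  # hypothetical failure flag

for course in ["biochem_score", "neuroanatomy_score"]:
    # Flag students in the lowest quartile of the course, then correlate that
    # indicator with the first-attempt failure flag.
    lowest_quartile = (df[course] <= df[course].quantile(0.25)).astype(int)
    rho, p = spearmanr(lowest_quartile, df["failed_level1"])
    print(f"{course}: rho = {rho:.3f}, p = {p:.4f}")
```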
Affiliation(s)
- Kim Chosie: Academic Support Services, Alabama College of Osteopathic Medicine, Dothan, USA
3. Alexander SM, Shenvi CL, Nichols KR, Dent G, Smith KL. Multivariate Modeling of Student Performance on NBME Subject Exams. Cureus 2023;15:e40809. PMID: 37485212. PMCID: PMC10362906. DOI: 10.7759/cureus.40809.
Abstract
Aim This study sought to determine whether statistical models could be developed to accurately predict student performance on clinical subject exams from National Board of Medical Examiners (NBME) self-assessment performance and other variables, described below, as such tools are not currently available. Methods Students at a large public medical school were provided fee vouchers for NBME self-assessments before clinical subject exams. Multivariate regression models were then developed based on how self-assessment performance correlated with student success on the subsequent subject exam (Medicine, Surgery, Family Medicine, Obstetrics-Gynecology, Pediatrics, and Psychiatry) while controlling for the proximity of the self-assessment to the exam, USMLE Step 1 score, and the academic quarter. Results The variables analyzed satisfied the requirements of linear regression. The correlation strength of individual variables and overall models varied by discipline and outcome (equated percent correct or percentile; model R² range: 0.1799-0.4915). All models showed statistical significance on the omnibus F-test (p < 0.001). Conclusion The correlation coefficients demonstrate that these models have weak to moderate value in predicting student performance, varying widely by subject exam. The next step is to use these models to identify struggling students, determine whether their use reduces failure rates, and further improve model accuracy by controlling for additional variables.
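To make the modeling approach concrete, here is a minimal sketch of a multivariable linear regression of a subject-exam score on self-assessment performance while controlling for Step 1 score, proximity of the self-assessment to the exam, and academic quarter, reporting R² and the omnibus F-test. All variable names, values, and effect sizes are invented for illustration; they are not taken from the study.

```python
# Minimal sketch with synthetic data; variable names and coefficients are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "self_assessment": rng.normal(70, 10, n),      # equated percent correct
    "step1": rng.normal(232, 15, n),               # numeric Step 1 score (pre-pass/fail style)
    "days_before_exam": rng.integers(1, 28, n),    # proximity of self-assessment to exam
    "quarter": rng.integers(1, 5, n),              # academic quarter
})
df["subject_exam"] = (
    20 + 0.5 * df["self_assessment"] + 0.08 * df["step1"]
    - 0.1 * df["days_before_exam"] + rng.normal(0, 5, n)
)

# Multivariable OLS with the academic quarter treated as a categorical covariate.
model = smf.ols(
    "subject_exam ~ self_assessment + step1 + days_before_exam + C(quarter)", data=df
).fit()
print(f"R^2 = {model.rsquared:.3f}, omnibus F-test p = {model.f_pvalue:.2e}")
```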
Affiliation(s)
- Seth M Alexander: Medicine, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, USA; Education, Harvard Graduate School of Education, Cambridge, USA
- Christina L Shenvi: Emergency Medicine, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, USA
- Kimberley R Nichols: Anesthesiology, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, USA
- Georgette Dent: Pathology and Laboratory Medicine, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, USA
- Kelly L Smith: Family Medicine, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, USA
4. Zubiaurre Bitzer LA, Dathatri S, Fine JB, Swan Sein A. Building a student learning-focused assessment and grading system in dental school: One school's experience. J Dent Educ 2023;87:614-624. PMID: 36607618. DOI: 10.1002/jdd.13158.
Abstract
PURPOSE/OBJECTIVES As health professions education moves toward competency-based education, there has been increased focus on the structure of assessment systems that support student competency development and learning. This has been buoyed by a growing body of research supporting assessment for learning processes that promote student growth and learning rather than relying on assessment systems primarily to measure performance. This paper presents the rationale and evidence for moving to an assessment for learning system and the results of a quasi-experimental interrupted time series study using data from 2015 to 2022 to evaluate the impacts of these changes. METHODS Columbia University College of Dental Medicine faculty voted to implement assessment for learning system changes beginning in 2017 with the graduating class of 2021. These changes included moving from a didactic course grading system with Honors, Pass, and Fail as available grades to one with only Pass and Fail, as well as creating synthesis and assessment weeks, weekly problem sets, post-exam review sessions, exam remediation opportunities, and formative progress exams throughout the curriculum. The revised assessment and grading system changes were communicated to residency program directors, and programmatic competency data about student performance across the curriculum were shared with programs in Dean's Letters. RESULTS Once the assessment system changes were implemented, student exam failure rates were lower, course exam scores were the same or higher, and performance on board exams improved compared to the national average. Students reported positive perceptions of well-being and the learning climate that they associated with the adoption of Pass/Fail grading. Match outcomes, including student satisfaction and program director ratings, have remained consistently positive. CONCLUSION As dental educators, our goal is to nurture students to become lifelong learners. Adopting a Pass/Fail grading structure and an assessment system that fosters learning allows students to shape learning practices that favor long-term retention and application of information, while also enhancing the learning environment and student well-being. These system changes may also facilitate the inclusion and support of students whose backgrounds are underrepresented in dentistry.
Affiliation(s)
- Shubha Dathatri: Vagelos College of Physicians and Surgeons, Columbia University, New York, New York, USA
- James B Fine: College of Dental Medicine, Columbia University, New York, New York, USA
- Aubrie Swan Sein: Vagelos College of Physicians and Surgeons, Columbia University, New York, New York, USA
5. Jeyaraju M, Linford H, Bosco Mendes T, Caufield-Noll C, Tackett S. Factors Leading to Successful Performance on U.S. National Licensure Exams for Medical Students: A Scoping Review. Acad Med 2023;98:136-148. PMID: 35857389. DOI: 10.1097/acm.0000000000004877.
Abstract
PURPOSE To synthesize the evidence of the factors leading to successful performance on knowledge-based national licensure exams (NLEs) for medical students. METHOD The authors conducted a scoping review to summarize the peer-reviewed empiric literature that used United States Medical Licensing Examination (USMLE) Step 1 or Step 2 Clinical Knowledge or Comprehensive Osteopathic Medical Licensing Examination (COMLEX) Level 1 or Level 2 Cognitive Evaluation scores as outcomes. The authors searched PubMed and Scopus without date restrictions through April 30, 2021. Two reviewers independently screened and selected studies for inclusion. Data were summarized narratively and with descriptive statistics. RESULTS The authors screened 1,185 unique citations and included 233 full-text studies in their review. Of these, 201 (86%) were studies of USMLE exams, 31 (13%) were studies of COMLEX exams, and 1 (0.4%) reported on both. The authors classified 29 studies (12%) as informing NLE preparation, 163 (70%) as attempting to identify predictive variables, and 76 (33%) as using NLE scores for program evaluation. Preparation studies found that the number of practice test items, practice exam scores, and less time in dedicated preparation correlated with higher NLE scores. Use of other commercial resources or study strategies was not consistently associated with higher scores. Predictive studies found the strongest relationships between individuals' performance on past assessments and their NLE scores. CONCLUSIONS The factors leading to successful performance on knowledge-based NLEs align with well-known principles from the cognitive sciences. Learners build on existing foundations of knowledge (reflected in their prior academic performance) and are likely to learn more efficiently with testing and spaced learning over time. While commercial test preparation resources are ubiquitous, there is no evidence that a single resource gives students a competitive advantage on NLEs. Developing habits of regular and continuous learning is necessary for clinical practice and successful NLE performance.
Affiliation(s)
- Maniraj Jeyaraju: medical student, University of Maryland School of Medicine, Baltimore, Maryland, at the time this study was completed; now a family medicine resident, University of North Carolina School of Medicine, Chapel Hill, North Carolina. ORCID: https://orcid.org/0000-0003-1170-2422
- Henry Linford: postgraduate year 1 transitional resident, Crozer Health, Upland, Pennsylvania, at the time this study was completed; now a psychiatry resident, Texas Institute for Graduate Medical Education and Research, San Antonio, Texas
- Thiago Bosco Mendes: endocrinologist, Departamento de Medicina Interna, Universidade do Estado de São Paulo (Unesp), Botucatu, São Paulo, Brasil, at the time this study was completed; now an internal medicine resident, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania. ORCID: https://orcid.org/0000-0001-8349-3303
- Christine Caufield-Noll: informationist, National Institutes of Health Library, National Institutes of Health, Bethesda, Maryland, at the time this study was completed. ORCID: https://orcid.org/0000-0002-5637-3717
- Sean Tackett: associate professor of medicine and international medical education director, Division of General Internal Medicine, Johns Hopkins Bayview Medical Center, Baltimore, Maryland. ORCID: https://orcid.org/0000-0001-5369-7225
6. Lynch TV, Beach IR, Kajtezovic S, Larkin OG, Rosen L. Step Siblings: a Novel Peer-Mentorship Program for Medical Student Wellness During USMLE Step 1 Preparation. Med Sci Educ 2022;32:803-810. PMID: 35729988. PMCID: PMC9189789. DOI: 10.1007/s40670-022-01571-4.
Abstract
Introduction The US Medical Licensing Examination (USMLE) Step 1 exam has proven to be a major stressor for medical students during their training, even with the advent of pass-fail scoring. The preparation period before the exam places students at high risk for burnout and depression, leading to impaired exam performance and other serious consequences, including suicide. Many medical schools already provide academic support for students during USMLE Step 1 preparation, yet to date there are no published programs specifically geared toward mental health support during this time. Methods Students from the Larner College of Medicine at the University of Vermont developed the "Step Siblings" program to partner pre-clinical students preparing for Step 1 (Little Sibs) with clinical-level students (Big Sibs) in an effort to promote near-peer mentorship and support for those studying. Big Sibs were trained to offer emotional support and wellness advice, but specifically not to provide academic counseling. The pilot program was evaluated by student surveys. Results Our program successfully paired Little Sibs (n = 125) with Big Sibs (n = 75) several months before the Step 1 dedicated study period, achieving the intended effect of reducing burnout and fostering a supportive community during a notoriously isolating and emotionally challenging time. Survey results indicated that a majority of Little and Big Sibs found the program helpful. Conclusions This student-driven mentorship model is simple to implement, easily generalizable to other medical schools and other board exams, and bears the lasting benefit of combating the stress and burnout so prevalent in medical education.
Affiliation(s)
- Tierra V. Lynch: Larner College of Medicine, University of Vermont, Burlington, VT, USA
- Isidora R. Beach: Larner College of Medicine, University of Vermont, Burlington, VT, USA
- Sidika Kajtezovic: Larner College of Medicine, University of Vermont, Burlington, VT, USA
- Olivia G. Larkin: Larner College of Medicine, University of Vermont, Burlington, VT, USA
- Lee Rosen: Larner College of Medicine, University of Vermont, Burlington, VT, USA
7. Impact of USMLE Step-1 accommodation denial on US medical schools: A national survey. PLoS One 2022;17:e0266685. PMID: 35421144. PMCID: PMC9009603. DOI: 10.1371/journal.pone.0266685.
Abstract
INTRODUCTION In 2019, 4.6% of US-MD students self-identified as students with disabilities (SWD); many of these students will require accommodations on the USMLE Step-1 examination. Given the high-stakes nature of Step-1 for medical school advancement and residency match, SWD denied accommodations on Step-1 face considerable consequences. To date, no study has investigated the rate of accommodation denial and its impact on medical school operations. METHODS To investigate the rate of accommodation denial and evaluate whether Step-1 accommodation denial impacts medical school operations, a 10-question survey was sent to Student Affairs Deans and disability resource professionals at all fully accredited US-MD granting programs. Two open-ended questions were analyzed using qualitative content analysis. RESULTS Seventy-three of the 141 schools responded (52%). In the 2018-2019 academic year, 276 students from the 73 responding schools applied for Step-1 accommodations. Of these, 144 (52%) were denied. Of those denied, 74/144 (51%) were delayed in entering the next phase of the curriculum and 110/144 (76%) took the Step-1 exam unaccommodated. Of the 110 who took Step-1 without accommodations, 35/110 (32%) failed the exam, and 4/110 (3%) withdrew or were dismissed following exam failure. Schools reported varied investments of time and financial support for students denied accommodations, with most schools investing less than 20 hours (67%) and less than $1,000 (69%). Open-ended responses revealed details regarding the impact of denial on schools and students, including frustration with the process; financial and human resource allocation; delays in student progression; lack of resourcing and expertise; and emotional and financial burdens on students. DISCUSSION Step-1 accommodation denial has non-trivial financial, operational, and career impacts on medical schools and students alike. The cause of accommodation denial in this population requires further exploration.
8. The US Medical Licensing Examination Step 1 Scoring Change: A Survey of Orthopaedic Surgery Residency Applicants From the 2019 to 2020 Match Cycle. J Am Acad Orthop Surg 2022;30:240-246. PMID: 35025821. DOI: 10.5435/jaaos-d-21-00615.
Abstract
INTRODUCTION The USMLE Step 1 examination has been used as an objective measure for comparing residency applicants. Recently, the National Board of Medical Examiners and the Federation of State Medical Boards decided that the USMLE Step 1 examination will transition to pass/fail reporting starting no earlier than 2022. The purpose of this study was to investigate the perspectives of medical students who applied for orthopaedic surgery residency positions during the 2019 to 2020 interview cycle on the USMLE scoring change and the potential effects this change may have on future applicants and the residency selection process. METHODS A 15-item anonymous web-based survey was sent to 1,090 orthopaedic surgery residency applicants from four regionally diverse residency programs. The survey elicited attitudes toward the transition of the Step 1 examination to pass/fail and the effects respondents believed this change may or may not have on the residency selection process. RESULTS Responses were received from 356 applicants (32.7%). The majority (61.6%) disagreed with the change to pass/fail scoring, and 68.5% do not believe the change will decrease stress levels in medical students. For interview invitations, respondents chose Step 2 Clinical Knowledge, letters of recommendation, and performance on away rotations as the most influential factors in the absence of a Step 1 score. CONCLUSION Most of the surveyed students who applied for an orthopaedic surgery residency position during the most recent application cycle disagreed with the National Board of Medical Examiners/Federation of State Medical Boards decision to change Step 1 to pass/fail and feel that the change may disadvantage certain student groups while either increasing or having no effect on medical student stress. LEVEL OF EVIDENCE IV.
9. Burton E, Assi L, Vongsachang H, Swenor BK, Srikumaran D, Woreta FA, Johnson TV. Demographics, clinical interests, and ophthalmology skills confidence of medical student volunteers and non-volunteers in an extracurricular community vision screening service-learning program. BMC Med Educ 2022;22:143. PMID: 35246114. PMCID: PMC8894556. DOI: 10.1186/s12909-022-03194-0.
Abstract
BACKGROUND Medical school curricular hours dedicated to ophthalmology are low and declining. Extracurricular ophthalmology activities, such as participation in community vision screenings, may serve an important adjunctive role in medical school curricula. The Johns Hopkins University (JHU) Vision Screening In Our Neighborhoods (ViSION) Program is an example of a voluntary, medical student-directed community service-learning program. METHODS We used a mixed-methods cross-sectional approach, including an online survey and semi-structured interviews. JHU School of Medicine students enrolled in MD or MD/PhD programs during the 2019-2020 academic year were surveyed regarding demographics, career and service interests, involvement in ophthalmology-related activities, and confidence in their ophthalmology-related skills. Survey responses were compared between ViSION volunteers and non-volunteers using Fisher's exact tests. Semi-structured interviews were conducted via webconference with 8 prior or current ViSION volunteers, and responses were analyzed using inductive thematic analysis. Data were collected when ViSION volunteers were at varying stages of their medical education and involvement with the ViSION program. RESULTS A total of 118 medical students were included, representing an overall response rate of 24.6% of JHU medical students. ViSION volunteers reported greater involvement in ophthalmology-related research (42% vs. 4%, p < 0.001), greater intent to apply to ophthalmology residency programs (35% vs. 1%, p = 0.001), and greater confidence in multiple ophthalmology knowledge and clinical skill domains. In particular, ViSION volunteers were more likely to feel confident estimating cup-to-disc ratio using direct ophthalmoscopy (20% vs. 0%, p < 0.001). In open-ended survey responses and interviews, most volunteers attributed at least some of their ophthalmology skill development and their desire to pursue ophthalmology and public health careers to their ViSION experience. CONCLUSIONS Medical students who volunteered with a student-led community vision screening program were more likely to have a prior interest in ophthalmology than those who did not volunteer, but only one-third of volunteers planned to pursue a career in ophthalmology. Overall, volunteers reported higher confidence performing ophthalmology-related clinical skills, suggesting that student-led community vision screening programs may provide an important avenue for medical students to explore public health aspects of ophthalmology while practicing ophthalmology exam skills and learning about common ophthalmic pathologies, regardless of their career intentions.
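As a simple illustration of the volunteer versus non-volunteer comparisons reported above, the sketch below runs a Fisher's exact test on a 2x2 table. The counts are hypothetical values chosen only to roughly match the reported percentages; they are not the study's data.

```python
# Hypothetical 2x2 table: rows = ViSION volunteers / non-volunteers,
# columns = involved in ophthalmology research (yes / no). Counts are illustrative.
from scipy.stats import fisher_exact

table = [[13, 18],   # volunteers: 13/31, roughly 42% involved
         [3, 84]]    # non-volunteers: 3/87, roughly 4% involved
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```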
Affiliation(s)
- Eleanor Burton: Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Lama Assi: Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 N Wolfe Street, Maumenee B-110, Baltimore, MD, 21287, USA
- Bonnielin K Swenor: Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 N Wolfe Street, Maumenee B-110, Baltimore, MD, 21287, USA
- Divya Srikumaran: Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 N Wolfe Street, Maumenee B-110, Baltimore, MD, 21287, USA
- Fasika A Woreta: Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 N Wolfe Street, Maumenee B-110, Baltimore, MD, 21287, USA
- Thomas V Johnson: Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 N Wolfe Street, Maumenee B-110, Baltimore, MD, 21287, USA
10. Raborn LN, Janis JE. Current Views on the New United States Medical Licensing Examination Step 1 Pass/Fail Format: A Review of the Literature. J Surg Res 2022;274:31-45. PMID: 35121548. DOI: 10.1016/j.jss.2022.01.002.
Abstract
INTRODUCTION Residency programs have historically used numerical Step 1 scores to screen applicants, making it a career-defining, high-stakes examination. Step 1 scores will be reported as pass/fail starting in January 2022, fundamentally reshaping the residency application review process. This review aimed to identify the opinions of physicians and medical students about the new format, identify arguments for or against the change, and determine the implications of this change for the residency selection process. METHODS A comprehensive PubMed review was performed in May 2021 to identify articles that discussed the new Step 1 format. Non-English and duplicate articles were excluded. Data collected from each article included publication year, specialty, subjects, and key findings. RESULTS A total of 81 articles were included, 26 of which discussed the impact of the new format within surgical fields (32.1%). The remaining articles discussed the implications within the medical community as a whole (n = 33, 40.7%) and within nonsurgical fields (n = 22, 27.2%). Studies suggest program directors will rely on Step 2 Clinical Knowledge (CK) scores, medical school reputation, applicant familiarity, Dean's letters, recommendation letters, and research in lieu of numerical Step 1 scores. In addition, concerns have been raised that the new format will disadvantage international, osteopathic, and minority applicants while increasing stress surrounding Step 2 CK. CONCLUSIONS Within the medical community, there are concerns that Step 2 CK will be used as a substitute for Step 1 and that resident diversity will diminish under the new Step 1 format. Holistic candidate consideration will be increasingly important.
Affiliation(s)
- Layne N Raborn: Department of Surgery, Louisiana State University Health Sciences Center, New Orleans, Louisiana
- Jeffrey E Janis: Department of Plastic and Reconstructive Surgery, Wexner Medical Center, Ohio State University, Columbus, Ohio
11. Go C, Sachdev U. Letters of recommendation: Nuanced bias or useful affirmation? J Vasc Surg 2021;74:29S-32S. PMID: 34303456. DOI: 10.1016/j.jvs.2021.03.050.
Abstract
Narrative letters of recommendation (NLORs) have become key elements of the application process for residency and fellowship. The potential to inadvertently admit bias into these subjective narratives has become an area of research focus across many disciplines. In the present review, we have highlighted the current data regarding bias in NLORs. We also believe that one of the most effective methods to eliminate bias from written recommendations is to first understand that it exists. Thus, the objective measures that have been taken to identify bias in NLORs are important steps in the right direction. We have presented and reflected on the accrued data on bias in NLORs pertaining to surgical training.
Affiliation(s)
- Catherine Go: Division of Vascular Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pa
- Ulka Sachdev: Division of Vascular Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pa
12. Jurich D, Daniel M, Hauer KE, Seibert C, Chandran L, Pock AR, Fazio SB, Fleming A, Santen SA. Does Delaying the United States Medical Licensing Examination Step 1 to After Clerkships Affect Student Performance on Clerkship Subject Examinations? Teach Learn Med 2021;33:366-381. PMID: 33356583. DOI: 10.1080/10401334.2020.1860063.
Abstract
Phenomenon: Schools are considering the optimal timing of Step 1 of the United States Medical Licensing Examination (USMLE). Two primary reasons for moving Step 1 after the core clerkships are to promote deeper, more integrated basic science learning in clinical contexts and to better prepare students for the increasingly clinical focus of Step 1. Positioning Step 1 after the core clerkships leverages a major national assessment to drive learning, encouraging students to deepen their basic science knowledge while in the clinical setting. Previous studies demonstrated small increases in Step 1 scores, reductions in failure rates, and similar Step 2 Clinical Knowledge scores when Step 1 was taken after the clerkships. Some schools that have moved Step 1 reported declines in clinical subject examination (CSE) performance. This may be due to shortened pre-clerkship curricula, the absence of the Step 1 study period for knowledge consolidation, or exposure to fewer National Board of Medical Examiners-style questions before taking CSEs. This multi-institutional study aimed to determine whether student performance on CSEs was affected by moving Step 1 after the core clerkships. Approach: CSE scores for students from eight schools that moved Step 1 after the core clerkships between 2012 and 2016 were analyzed in a pre-post format. Hierarchical linear modeling was used to quantify the effect of the curriculum change on CSE performance. Additional analyses determined whether clerkship order affected CSE performance and whether the curriculum change resulted in more students scoring in the lowest percentiles (defined as below the national fifth percentile). Findings: After moving Step 1 to after the clerkships, these eight schools collectively demonstrated statistically significantly lower performance on four CSEs (Medicine, Neurology, Pediatrics, and Surgery) but not on Obstetrics/Gynecology or Psychiatry. Comparing performance in the three years before and after the Step 1 change, differences across all clerkships ranged from 0.3 to -2.0 points, with an average difference of -1.1. CSE performance in clerkships taken early in the sequence was more affected by the curricular change, and differences gradually disappeared with subsequent examinations. Medicine and Neurology showed the largest average differences between curricular groups when taken early in the clinical year. Finally, there was a slightly higher chance of scoring below the national fifth percentile on four of the CSEs (Medicine, Neurology, Pediatrics, and Psychiatry) for the cohort with Step 1 after the clerkships. Insights: Moving Step 1 after the core clerkships had a small impact on CSE scores overall, with decreased scores for exams early in the clerkship sequence and an increased number of students below the fifth percentile. Score differences have minor effects on clerkship grades, and overall the size of the effect is unlikely to be educationally meaningful. Schools can use a variety of mitigation strategies to address CSE performance and Step 1 preparation during the clerkship phase.
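For readers unfamiliar with the analytic setup, here is a rough sketch of a pre-post hierarchical (mixed) model with a random intercept per school and a fixed effect for whether Step 1 was taken after the clerkships. The data are simulated; the -1.1-point shift only echoes the average difference quoted above and is not a re-analysis of the study data.

```python
# Synthetic pre/post data for 8 schools; the school effect, score scale, and
# -1.1-point shift are assumptions used purely for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for school in range(8):
    school_effect = rng.normal(0, 1.5)             # school-level intercept shift
    for post in (0, 1):                            # 0 = Step 1 before clerkships, 1 = after
        scores = 74 + school_effect - 1.1 * post + rng.normal(0, 6, 200)
        rows.append(pd.DataFrame({"school": school, "post_change": post, "cse": scores}))
df = pd.concat(rows, ignore_index=True)

# Mixed model: fixed effect of the curriculum change, random intercept for school.
mixed = smf.mixedlm("cse ~ post_change", data=df, groups="school").fit()
print(mixed.summary())
```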
Affiliation(s)
- Daniel Jurich: National Board of Medical Examiners, Philadelphia, Pennsylvania, USA
- Michelle Daniel: Department of Emergency Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Karen E Hauer: Department of Medicine, University of California School of Medicine, San Francisco, California, USA
- Christine Seibert: Department of Medicine, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin, USA
- Latha Chandran: Department of Pediatrics, Renaissance School of Medicine at Stony Brook University, New York, New York, USA
- Arnyce R Pock: Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
- Sara B Fazio: Department of Medicine, Harvard Medical School, Boston, Massachusetts, USA
- Amy Fleming: Department of Pediatrics, Vanderbilt University School of Medicine, Nashville, Tennessee, USA
- Sally A Santen: Department of Pediatrics, Virginia Commonwealth University School of Medicine, Richmond, Virginia, USA
13. Swan Sein A, Dathatri S, Bates TA. Twelve tips on guiding preparation for both high-stakes exams and long-term learning. Med Teach 2021;43:518-523. PMID: 33032481. DOI: 10.1080/0142159x.2020.1828570.
Abstract
High-stakes exams, including admissions, licensing, and maintenance of certification examinations, are commonplace in health professions education. Although exam scores and performance often serve gatekeeping purposes, the broader goal of health professions education is to foster deep, self-directed, meaningful, motivated learning. Establishing strong support structures that emphasize deep learning and understanding rather than exam scores can help prepare learners who have the knowledge base to be excellent practitioners. This article offers guidance that can be used by academic support centres, medical educators, learning specialists, faculty advisors, and even test-takers to help learners balance score achievement and knowledge development while cultivating more efficient and motivated studying and increasingly self-regulated learning. The series of tips details considerations for building academic success supports, fostering a growth mindset, planning efficient and effective studying, using test-enhanced learning strategies, practicing exam-taking skills, and creating other support structures that can strengthen learning experiences overall.
Affiliation(s)
- Aubrie Swan Sein: Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
- Shubha Dathatri: Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
- Todd A Bates: Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
14. Swan Sein A, Daniel M, Hauer KE, Santen SA. Educational and Practical Implications of Step 1 Timing in the Context of COVID-19. Med Sci Educ 2021;31:911-916. PMID: 33777488. PMCID: PMC7987737. DOI: 10.1007/s40670-021-01255-5.
Affiliation(s)
- Aubrie Swan Sein: Columbia Vagelos College of Physicians and Surgeons, Center for Education Research and Evaluation, New York, NY, USA
- Michelle Daniel: San Diego School of Medicine, University of California, San Diego, CA, USA; Medical School, University of Michigan, Ann Arbor, MI, USA
- Karen E. Hauer: San Francisco School of Medicine, University of California, San Francisco, CA, USA
- Sally A. Santen: School of Medicine, Virginia Commonwealth University, Richmond, VA, USA; College of Medicine, University of Cincinnati, Cincinnati, OH, USA
15. Swan Sein A, Rashid H, Meka J, Amiel J, Pluta W. Twelve tips for embedding assessment for and as learning practices in a programmatic assessment system. Med Teach 2021;43:300-306. PMID: 32658603. DOI: 10.1080/0142159x.2020.1789081.
Abstract
Programmatic assessment supports the evolution from assessment of learning to fostering assessment for learning and assessment as learning practices. A well-designed programmatic assessment system aligns educational objectives, learning opportunities, and assessments with the goals of supporting student learning, making decisions about student competence and promotion, and supporting curriculum evaluation. We present evidence-based guidance for implementing assessment for and as learning practices in the pre-clinical knowledge assessment system to help students learn, synthesize, master, and retain content for the long term so that they can apply knowledge to patient care. Practical tips fall into several domains: the culture and motivation of assessment, including how an honour code and a competency-based grading system can help an assessment system develop student self-regulated learning and professional identity; curricular assessment structure, such as how and when to use low-stakes and cumulative assessment to drive learning; exam and question structure, including which authentic question and exam types best facilitate learning; and assessment follow-up and review considerations, such as exam retake processes to support learning and academic success structures. A culture change is likely necessary for administrators, faculty members, and students to embrace assessment, most importantly, as a learning tool for students and programs.
Affiliation(s)
- Aubrie Swan Sein: Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
- Hanin Rashid: Rutgers Robert Wood Johnson Medical School, Piscataway, NJ, USA
- Jennifer Meka: Jacobs School of Medicine and Biomedical Sciences, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA
- Jonathan Amiel: Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
- William Pluta: Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
16. Harmon KS, Gonzales AD, Fenn NE. Remediation and reassessment methods in pharmacy education: A systematic review. Curr Pharm Teach Learn 2021;13:81-90. PMID: 33131623. DOI: 10.1016/j.cptl.2020.07.005.
Abstract
BACKGROUND Colleges of pharmacy are currently required to implement a remediation program within their curricula, but no specifications are provided on the ideal methodology. While the need for successful remediation strategies continues to grow, literature describing positive or negative outcomes of different approaches is scarce. The objective of this literature review was to describe and evaluate remediation methodologies in pharmacy education. METHODS This literature review was completed following PRISMA criteria. A search of the PubMed, Cochrane Library, Cumulative Index to Nursing and Allied Health Literature, Academic Search Complete, PsycInfo, Scopus, and ProQuest Central databases was conducted in July 2019. Studies were included if they involved pharmacy student education and described either remediation or reassessment. RESULTS The evaluated studies discussed a range of course types being remediated, a large variety of remediation strategies and timeframes, and differing overall outcomes. No studies compared remediation techniques or provided details on the implementation of their chosen approaches. A consistent finding within the evaluated studies was the inclusion of prevention strategies intended to preemptively avoid the need for remediation. Overall outcomes for each remedial program were inconsistent, and no clear patterns were evident other than an improvement in student performance following remediation. IMPLICATIONS Remediation strategies included course repetition, summer restudy, reassessment, and individualized plans. Outcomes varied significantly between studies, making methodology comparisons difficult. Future studies that include more detail and consistency in the reported outcomes would benefit students and help clarify remediation for colleges of pharmacy.
Affiliation(s)
- Kiersi S Harmon: The University of Texas at Tyler, 3900 University Blvd, Tyler, TX 75799, United States
- Alessa D Gonzales: The University of Texas at Tyler, 3900 University Blvd, Tyler, TX 75799, United States
- Norman E Fenn: The University of Texas at Tyler, 3900 University Blvd, Tyler, TX 75799, United States