1. Cottrell SA, Hedrick JS, Lama A, Sofka S, Ferrari ND. The Urgent Need for Reporting Accurate and Fair Student Comparisons in the Medical Student Performance Evaluation. J Grad Med Educ 2024;16:257-260. PMID: 38882437; PMCID: PMC11173022; DOI: 10.4300/jgme-d-23-00862.1.
Affiliations:
- Scott A Cottrell is Professor, Department of Medical Education, West Virginia University School of Medicine, Morgantown, West Virginia, USA
- Jason S Hedrick is Assistant Professor, Department of Medical Education, West Virginia University School of Medicine, Morgantown, West Virginia, USA
- Anna Lama is Assistant Professor, Department of Medical Education, West Virginia University School of Medicine, Morgantown, West Virginia, USA
- Sarah Sofka is Professor and Vice Chair of Education, Department of Internal Medicine, West Virginia University School of Medicine, Morgantown, West Virginia, USA
- Norman D Ferrari is Professor and Chair, Department of Medical Education, West Virginia University School of Medicine, Morgantown, West Virginia, USA
2. Engel-Rebitzer E, Kogan JR, Heath JK. Gender-Based Differences in Language Used by Students to Describe Their Noteworthy Characteristics in Medical Student Performance Evaluations. Acad Med 2023;98:844-850. PMID: 36606764; DOI: 10.1097/acm.0000000000005141.
Abstract
PURPOSE: The noteworthy characteristic (NC) section of the medical student performance evaluation (MSPE) was introduced to facilitate holistic review of residency applications and mitigate biases. The student-written nature of the characteristics, however, may introduce biases arising from gender differences in self-promotion behaviors. The authors conducted an exploratory analysis of potential gender-based differences in the language used in NCs.
METHOD: The authors performed a single-center cohort analysis of all student-written NCs at the Perelman School of Medicine (2018-2022). NCs were converted into single words, which were assigned to word categories: ability (e.g., "talent"), standout ("best"), grindstone ("meticulous"), communal ("caring"), or agentic ("ambitious"). The authors also qualitatively analyzed NC topics (e.g., scholarship, community service). Logistic regression was used to identify gender differences in the word categories and topics used in NCs.
RESULTS: The cohort included 2,084 characteristics from 783 MSPEs (47.5%, n = 371, written by women). After adjusting for underrepresented-in-medicine status, honor society membership, and intended specialty, men were more likely than women to use standout (OR = 2.00; 95% confidence interval [CI] = 1.35, 2.96; P = .001) and communal (OR = 1.40; 95% CI = 1.03, 1.90; P = .03) words in their NCs but less likely to use grindstone words (OR = 0.72; 95% CI = 0.53, 0.98; P = .04). Men were also more likely than women to discuss scholarship (OR = 2.03; 95% CI = 1.27, 3.23; P = .003), hobbies (OR = 1.45; 95% CI = 1.07, 1.96; P = .02), and/or awards (OR = 1.59; 95% CI = 1.16, 2.16; P = .004) and less likely to highlight community service (OR = 0.66; 95% CI = 0.48, 0.92; P = .02).
CONCLUSIONS: The self-written nature of NCs permits language differences that may contribute to gender bias in residency applications.
Affiliations:
- Eden Engel-Rebitzer is an internal medicine resident, Brigham and Women's Hospital, Boston, Massachusetts, and was, at the time of the study, a medical student, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0002-5013-554X
- Jennifer R Kogan is associate dean, Student Success and Professional Development, and professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-8426-9506
- Janae K Heath is assistant professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0002-0533-3088
3. Maxfield CM, Cao JY, Martin JG, Grimm LJ. An Elite Privilege: Top-Ranked Medical Schools Provide Fewer Comparative Performance Data on Their Students. J Am Coll Radiol 2023;20:446-451. PMID: 36682646; DOI: 10.1016/j.jacr.2022.12.011.
Abstract
PURPOSE: The objective of this study was to determine differences in the reporting of performance data on medical student performance evaluations (MSPEs) by medical school ranking.
METHODS: MSPEs from all US allopathic and osteopathic medical schools received by a single diagnostic radiology residency program during the 2021-2022 application cycle were retrospectively reviewed. Preclinical class and core clerkship grades were categorized as pass/fail or multitiered. Comparative summative assessments provided in the MSPEs were recorded. Medical schools were grouped by their US News & World Report rankings, and the proportion of reported performance metrics for each group was compared.
RESULTS: Information from 95% of US allopathic medical schools (148 of 155) and 73% of osteopathic medical schools (27 of 37) was collected, on the basis of 1,046 applications received. For preclinical classes, multitiered grading was reported by no schools ranked in the top 10, 17% of schools ranked 11th to 50th, 52% of schools ranked 51st to 100th, and 59% of unranked schools (P < .001). For core clinical clerkships, multitiered grades were reported by 70% of the top 10 ranked schools, 90% of schools ranked 11th to 50th, 94% of those ranked 51st to 100th, and 94% of unranked schools (P = .0463). Comparative summative assessments were reported by none of the top 10 ranked schools, 56% of schools ranked 11th to 50th, 80% of those ranked 51st to 100th, and 81% of unranked schools (P < .001).
CONCLUSIONS: Higher ranked medical schools are less likely to provide comparative assessment data in their MSPEs, which may disadvantage students from lower ranked medical schools.
Affiliations:
- Charles M Maxfield, Vice Chair of Education, Department of Radiology, Duke University Medical Center, Durham, North Carolina
- Joseph Y Cao, Department of Radiology, Duke University Medical Center, Durham, North Carolina
- Jonathan G Martin, Codirector of Undergraduate Medical Education, Department of Radiology, Duke University Medical Center, Durham, North Carolina; https://twitter.com/JonMartinMD
- Lars J Grimm, Department of Radiology, Duke University Medical Center, Durham, North Carolina; https://twitter.com/Dr_Lars_Grimm
4. A Retrospective Analysis of Medical Student Performance Evaluations, 2014-2020: Recommend with Reservations. J Gen Intern Med 2022;37:2217-2223. PMID: 35710660; PMCID: PMC9296706; DOI: 10.1007/s11606-022-07502-8.
Abstract
BACKGROUND: The Medical Student Performance Evaluation (MSPE) is a cornerstone of residency applications. Little is known regarding adherence to Association of American Medical Colleges (AAMC) MSPE recommendations or longitudinal changes in MSPE content.
OBJECTIVES: To evaluate current MSPE quality and longitudinal changes in MSPEs and grading practices.
DESIGN: Retrospective analysis.
PARTICIPANTS: Students from all Liaison Committee on Medical Education (LCME)-accredited medical schools from which the Stanford University Internal Medicine residency program received applications between 2014-2015 and 2019-2020.
MAIN MEASURES: Inclusion of key words describing applicant performance and associated metrics, including their distribution among students and explanations of how key words were assigned; inclusion of clerkship grades, grade distributions, and grade composition; and evidence of grade inflation over time.
KEY RESULTS: MSPE comprehensiveness varied substantially among the 149 schools analyzed. In total, 25% of schools provided complete information consistent with AAMC recommendations regarding key words/categorization of medical students and clerkship grades in 2019-2020. Seventy-seven distinct key word terms appeared across the 139 schools examined in 2019-2020. Grading practices varied markedly, with 2-83% of students receiving the top internal medicine clerkship grade depending on the year and school. Individual schools frequently changed key word and grading practices, with 33% and 18% of schools starting and/or stopping the use of key words and grades, respectively. Significant grade inflation occurred over the 6-year study period, with an average 14% relative increase in the proportion of students receiving top clerkship grades.
CONCLUSIONS: A minority of schools complies with AAMC MSPE guidelines, and MSPEs are inconsistent across time and schools. These practices may impair evaluation of students within and between schools.
5. Ryan MS, Holmboe ES, Chandra S. Competency-Based Medical Education: Considering Its Past, Present, and a Post-COVID-19 Era. Acad Med 2022;97:S90-S97. PMID: 34817404; PMCID: PMC8855766; DOI: 10.1097/acm.0000000000004535.
Abstract
Advancement toward competency-based medical education (CBME) has been hindered by inertia and a myriad of implementation challenges, including those associated with assessment of competency, accreditation/regulation, and logistical considerations. The COVID-19 pandemic disrupted medical education at every level. Time in training was sometimes shortened or significantly altered, and there were reductions in the number and variety of clinical exposures. These and other unanticipated changes to existing models highlighted the need to advance the core principles of CBME. This manuscript describes the impact of COVID-19 on the ongoing transition to CBME, including the effects on training, curricular, and assessment processes for medical school and graduate medical education programs. The authors outline consequences of the COVID-19 disruption for learner training and assessment of competency, such as conversion to virtual learning modalities in medical school, redeployment of residents within health systems, and early graduation of trainees based on achievement of competency. Finally, the authors reflect on what the COVID-19 pandemic taught them about the realization of CBME as the medical education community looks forward to a postpandemic future.
Affiliations:
- Michael S. Ryan is professor and vice chair of education, Department of Pediatrics, Children’s Hospital of Richmond at Virginia Commonwealth University, Richmond, Virginia
- Eric S. Holmboe is chief research, milestone development, and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Subani Chandra is associate professor and residency program director, Department of Internal Medicine, Columbia University Vagelos College of Physicians and Surgeons, New York, New York
6. Compliance with CDIM-APDIM Guidelines for Department of Medicine Letters: an Opportunity to Improve Communication Across the Continuum. J Gen Intern Med 2022;37:125-129. PMID: 33791934; PMCID: PMC8739400; DOI: 10.1007/s11606-021-06744-2.
Abstract
BACKGROUND: With rising numbers of applications to internal medicine programs and pending changes in United States Medical Licensing Examination Step 1 score reporting, program directors want transparent data for comparing applicants. Department of Medicine Letters of Recommendation (DOM LORs) are frequently used to assess applicants and have the potential to provide clearly defined performance data, including stratification of a medical school class. Despite published guidelines on the expected content of DOM LORs, these letters do not always meet that need.
OBJECTIVES: To better understand the degree to which DOM LORs comply with published guidelines.
METHODS: We reviewed DOM LORs from 146 of 155 LCME-accredited medical schools in the 2019 Match cycle, assessing compliance with published guidelines.
RESULTS: Adherence to the recommendation that DOM LORs provide a final characterization of performance relative to peers was low (68/146, 47%). Of the letters that provided a final characterization, 19/68 (28%) used a quantitative measure and 49/68 (72%) used a qualitative descriptor. Only 17/49 (35%) of those using qualitative terms defined them, and 13 distinct qualitative scales were identified. Ranking systems varied, with seven different titles given to the highest performers. Explanations of how ranking groups were determined were provided in 12% of cases.
CONCLUSIONS: Adherence to published guidelines for DOM LORs varies but is generally low. For program directors who want transparent data for application review, clearly defined data on student performance, stratification groupings, and common language across schools would improve the utility of DOM LORs.
7. Brenner JM, Bird JB, Brenner J, Orner D, Friedman K. Current State of the Medical Student Performance Evaluation: A Tool for Reflection for Residency Programs. J Grad Med Educ 2021;13:576-580. PMID: 34434519; PMCID: PMC8370358; DOI: 10.4300/jgme-d-20-01373.1.
Abstract
BACKGROUND: The Medical Student Performance Evaluation (MSPE) provides important information to residency programs. Despite recent recommendations for standardization, it is not clear how much variation exists in MSPE content among schools.
OBJECTIVES: We describe the current section content of the MSPE in US allopathic medical schools, with a particular focus on variations in the presentation of student performance.
METHODS: A representative MSPE was obtained from 95.3% (143 of 150) of allopathic US medical schools through residency applications to select programs at the Zucker School of Medicine at Hofstra/Northwell for the 2019-2020 academic year. A manual data abstraction tool was piloted in 2018-2019. After training, it was used to code all portions of the MSPEs in this study. The results were analyzed, and descriptive statistics were reported.
RESULTS: For the preclinical years, 30.8% of MSPEs reported data on student performance beyond achieving "passes" in a pass/fail curriculum. Only half referenced performance in the fourth year, including electives, acting internships, or both. About two-thirds of schools included an overall descriptor of comparative performance in the final paragraph. Among these schools, a majority provided adjectives such as "outstanding/excellent/very good/good," while one-quarter reported numerical data categories. Numerous nomenclature systems were used for clerkship grades.
CONCLUSIONS: This analysis demonstrates extreme variability in the content of MSPEs submitted by US allopathic medical schools in the 2019-2020 cycle, including the components and nomenclature of grades and descriptors of comparative performance, the display of data, and the inclusion of data across all years of the medical education program.
Affiliations:
- Judith M. Brenner, MD, is Associate Dean for Curricular Integration and Assessment and Associate Professor of Science Education and Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Jeffrey B. Bird, MA, is Educational Research & Strategic Assessment Analyst and Assistant Professor of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Jason Brenner, BS, is a Volunteer Research Assistant, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, and Student, University of Michigan
- David Orner, MPH, is a Research Assistant, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
- Karen Friedman, MS, MD, is Vice Chair for Education, Department of Medicine, Northwell Health, and Professor of Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell
8. Beck Dallaghan GL, Alexandraki I, Christner J, Keeley M, Khandelwal S, Steiner B, Hemmer PA. Medical School to Residency: How Can We Trust the Process? Cureus 2021;13:e14485. PMID: 34007741; PMCID: PMC8121123; DOI: 10.7759/cureus.14485.
Abstract
BACKGROUND: To say that the transition from undergraduate medical education (UME) to graduate medical education (GME) is under scrutiny would be an understatement. A panel discussion at the 2018 Association of American Medical Colleges annual meeting, "Pass-Fail in Medical School and the Residency Application Process and Graduate Medical Education Transition," addressed what information should be shared with residency programs and how and when it should be shared.
MATERIALS AND METHODS: Over 250 participants representing UME and GME (e.g., leadership, faculty, medical students) completed worksheets addressing these questions. During report-back times, verbal comments were transcribed in real time, and written comments on worksheets were later transcribed. All comments were anonymous. The research team manually conducted a thematic analysis of the worksheet responses and report-back comments.
RESULTS: Themes regarding what information should be shared included: (1) developmental/assessment benchmarks, such as demonstrating the ability/competencies to do clinical work; (2) performance on examinations; (3) grades and class ranking; (4) 360-degree evaluations; (5) narrative evaluations; (6) failures/remediation/gaps in training; (7) professionalism lapses; (8) student characteristics such as resiliency/reliability; and (9) service/leadership/participation. Regarding how this information should be shared, participants suggested enhancing the current process of transmitting documents rather than adopting alternative methods (e.g., video, telephonic, or face-to-face discussions), with information shared both at the time of the Match and again near or at graduation to include information about post-Match rotations.
DISCUSSION: Options for addressing concerns with the transition from medical school to residency include further enhancements to the Medical Student Performance Evaluation, viewing departmental letters as letters of evaluation rather than recommendation, a more meaningful educational handoff, and limits on the number of residency applications allowed per student. The current medical education environment is ready for meaningful change in the UME-to-GME transition.
Affiliations:
- Gary L Beck Dallaghan, Office of Medical Education, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, USA
- Irene Alexandraki, Medicine, Florida State University College of Medicine, Tallahassee, USA
- Meg Keeley, Pediatrics, University of Virginia School of Medicine, Charlottesville, USA
- Sorabh Khandelwal, Emergency Medicine, The Ohio State University College of Medicine, Columbus, USA
- Beat Steiner, Family Medicine, University of North Carolina School of Medicine, Chapel Hill, USA
- Paul A Hemmer, Medicine, Uniformed Services University of the Health Sciences, Bethesda, USA
9. Ryan MS, Brooks EM, Safdar K, Santen SA. Clerkship Grading and the U.S. Economy: What Medical Education Can Learn From America's Economic History. Acad Med 2021;96:186-192. PMID: 33492834; PMCID: PMC8325378; DOI: 10.1097/acm.0000000000003566.
Abstract
Clerkship grades, like money, are a social construct that functions as the currency through which value exchanges in medical education are negotiated among the system's various stakeholders. They provide a widely recognizable and efficient medium through which learner development can be assessed, tracked, compared, and demonstrated, and they are commonly used to make decisions regarding progression, distinction, and selection for residency. However, a substantial literature has demonstrated how imprecisely and unreliably grades reflect the value of learners. In this article, the authors suggest that the challenges with clerkship grades are fundamentally tied to their role as currency in the medical education system. They draw associations between clerkship grades and the history of the U.S. economy, highlighting 2 major concepts: regulation and stock prices. The authors describe the history of these economic concepts and how they relate to challenges in clerkship grading. Using lessons learned from the history of the U.S. economy, the authors then propose a 2-step solution to improve grading for future generations of medical students: (1) transition from grades to a federally regulated competency-based assessment model and (2) development of a departmental competency letter that incorporates competency-based assessments rather than letter grades and meets the needs of program directors.
Affiliations:
- Michael S Ryan is associate professor and assistant dean for clinical medical education, Department of Pediatrics, Virginia Commonwealth University, Richmond, Virginia; ORCID: https://orcid.org/0000-0003-3266-9289
- E Marshall Brooks is assistant professor, Department of Family Medicine and Population Health, Virginia Commonwealth University, Richmond, Virginia
- Komal Safdar is a fourth-year medical student, Virginia Commonwealth University, Richmond, Virginia; ORCID: https://orcid.org/0000-0003-1024-2153
- Sally A Santen is professor and senior associate dean, assessment, evaluation and scholarship, Department of Emergency Medicine, Virginia Commonwealth University, Richmond, Virginia; ORCID: http://orcid.org/0000-0002-8327-8002
10. Hauer KE, Giang D, Kapp ME, Sterling R. Standardization in the MSPE: Key Tensions for Learners, Schools, and Residency Programs. Acad Med 2021;96:44-49. PMID: 32167965; DOI: 10.1097/acm.0000000000003290.
Abstract
The Medical Student Performance Evaluation (MSPE), which summarizes a medical student's academic and professional undergraduate medical education performance and provides salient information during the residency selection process, faces persistent criticisms regarding heterogeneity and obscurity. Specifically, MSPEs do not always provide the same type or amount of information about students, especially from diverse schools, and important information is not always easy to find or interpret. To address these concerns, a key guiding principle from the Recommendations for Revising the MSPE Task Force of the Association of American Medical Colleges (AAMC) was to achieve "a level of standardization and transparency that facilitates the residency selection process." Benefits of standardizing the MSPE format include clarification of performance benchmarks or metrics, consistency across schools to enhance readability, and improved quality. In medical education, standardization may be an important mechanism to ensure accountability of the system for all learners, including those with varied backgrounds and socioeconomic resources. In this article, members of the aforementioned AAMC MSPE task force explore 5 tensions inherent in the pursuit of standardizing the MSPE: (1) presenting each student's individual characteristics and strengths in a way that is relevant, while also working with a standard format and providing standard content; (2) showcasing school-specific curricular strengths while also demonstrating standard evidence of readiness for internship; (3) defining and achieving the right amount of standardization so that the MSPE provides useful information, adds value to the residency selection process, and is efficient to read and understand; (4) balancing reporting with advocacy; and (5) maintaining standardization over time, especially given the tendency for the MSPE format and content to drift. Ongoing efforts to promote collaboration and trust across the undergraduate to graduate medical education continuum offer promise to reconcile these tensions and promote successful educational outcomes.
Affiliations:
- Karen E Hauer is associate dean, Assessment, and professor, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California; ORCID: https://orcid.org/0000-0002-8812-4045
- Daniel Giang is associate dean, Graduate Medical Education, and professor, Department of Neurology, Loma Linda University, Loma Linda, California
- Meghan E Kapp is assistant professor, Department of Pathology, Microbiology and Immunology, Vanderbilt University Medical Center, Nashville, Tennessee; ORCID: https://orcid.org/0000-0002-0252-3919
- Robert Sterling is associate professor, Department of Orthopaedic Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland; ORCID: https://orcid.org/0000-0003-2963-3162
11. Rajesh A, Asaad M, Sridhar M. Binary Reporting of USMLE Step 1 Scores: Resident Perspectives. J Surg Educ 2021;78:304-307. PMID: 32600888; DOI: 10.1016/j.jsurg.2020.06.013.
Abstract
The recent consensus from the Invitational Conference on USMLE Scoring recommended a transition to binary pass/fail reporting for the USMLE Step 1 examination, to be implemented from January 22, 2022. While this change was instituted in an effort to decrease medical student stress and reiterate that Step 1 is merely a licensing or qualifying examination, the decision has profound implications for graduates of both United States and foreign medical schools. In addition to compounding the difficulties residency programs face in resident selection, the new system could place a significant mental and financial burden on medical students and potentially affect the diversity of graduate medical education in the United States. This article draws attention to the downstream effects of a pass/fail system on the future of medical and surgical education.
Affiliations:
- Aashish Rajesh, Department of Surgery, University of Texas Health Science Center, San Antonio, Texas
- Malke Asaad, Department of Plastic Surgery, MD Anderson Cancer Center, Houston, Texas
- Monica Sridhar, Department of Surgery, University of Texas Health Science Center, San Antonio, Texas
12. Prystowsky MB, Cadoff E, Lo Y, Hebert TM, Steinberg JJ. Prioritizing the Interview in Selecting Resident Applicants: Behavioral Interviews to Determine Goodness of Fit. Acad Pathol 2021;8:23742895211052885. PMID: 34722866; PMCID: PMC8552388; DOI: 10.1177/23742895211052885.
Abstract
From our initial screening of applications, we assess that the 10% to 15% of applicants whom we will interview are all academically qualified to complete our residency training program. This initial screening includes a personality assessment drawn from the personal statement, the Dean's letter, and the letters of recommendation, which together begin our evaluation of the applicant's cultural fit for our program. While numerical scoring ranks applicants before the interview, the final ranking into best-fit categories is determined solely on the interview day, at a consensus conference of faculty and residents. We analyzed data on 819 applicants from 2005 to 2017. Most candidates were US medical graduates (62.5%), with 23.7% international medical graduates, 11.7% Doctors of Osteopathic Medicine (DO), and 2.1% Caribbean medical graduates. Because personality assessment began with application review, there was excellent correlation between the preinterview composite score and the final categorical ranking in all 4 categories. For most comparisons, higher scores and categorical rankings were associated with applicants subsequently working in academia rather than private practice. Our 3-step process worked without problems when we employed virtual interviews during the COVID pandemic.
Affiliations:
- Evan Cadoff, Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
- Yungtai Lo, Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
- Tiffany M. Hebert, Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
- Jacob J. Steinberg, Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
13. Joshi ART. The Discouraging Inadequacy of Clerkship Grades to Evaluate Medical Students: Are We Ready for Solutions? J Grad Med Educ 2020;12:150-152. PMID: 32322346; PMCID: PMC7161321; DOI: 10.4300/jgme-d-20-00166.1.
14. Giang D. Medical Student Performance Evaluation (MSPE) 2017 Task Force Recommendations as Reflected in the Format of 2018 MSPE. J Grad Med Educ 2019;11:385-388. PMID: 31440331; PMCID: PMC6699527; DOI: 10.4300/jgme-d-19-00479.1.
15. Pinto-Powell R, Lahey T. Just a Game: the Dangers of Quantifying Medical Student Professionalism. J Gen Intern Med 2019;34:1641-1644. PMID: 31147979; PMCID: PMC6667566; DOI: 10.1007/s11606-019-05063-x.
Abstract
A medical student on her internal medicine clerkship says her numerical medical professionalism grade was "just a game." Building on this anecdote, we suggest there is good reason to believe that numerical summative assessments of medical student professionalism can, paradoxically, undermine that professionalism by sapping internal motivation and converting conversations about core professional values into just another hurdle on the way to residency. We suggest better ways of supporting medical students' professional development, including a portfolio comprising written personal reflections and periodic 360-degree formative assessment in the context of longitudinal faculty coaching.
Affiliations:
- Roshini Pinto-Powell, Department of Medicine and Department of Medical Education, Geisel School of Medicine at Dartmouth, Hanover, NH, USA; Dartmouth-Hitchcock Medical Center, Lebanon, NH, USA
- Timothy Lahey, Larner College of Medicine, University of Vermont, Burlington, VT, USA; The University of Vermont Medical Center, Burlington, VT, USA
16. Brenner JM. The Revised Medical School Performance Evaluation: Does It Meet the Needs of Its Readers? J Grad Med Educ 2019;11:475-478. PMID: 31440345; PMCID: PMC6699531; DOI: 10.4300/jgme-d-19-00089.1.
Abstract
BACKGROUND: The Medical School Performance Evaluation (MSPE) is an important factor in applications to residency programs. Many medical schools are incorporating recent recommendations from the Association of American Medical Colleges MSPE Task Force into their letters. To date, there has been no feedback from the graduate medical education community on the impact of this effort.
OBJECTIVE: We surveyed individuals involved in residency candidate selection for internal medicine programs to understand their perceptions of the new MSPE format.
METHODS: A survey was distributed in March and April 2018 using the Association of Program Directors in Internal Medicine listserv, which comprises 4220 individuals from 439 residency programs. Responses were analyzed, and themes were extracted from open-ended questions.
RESULTS: A total of 140 individuals, predominantly program directors and associate program directors, from across the United States completed the survey. Most were aware of the existence of the MSPE Task Force. Respondents read a median of 200 to 299 letters each recruitment season. The majority reported observing evidence of adoption of the new format in more than one-quarter of all medical schools. Nearly half of respondents reported that the new format made the MSPE more important in decision making about a candidate. Within the MSPE, respondents identified the following areas as most influential: academic progress, the summary paragraph, the graphic representation of class performance, academic history, and the overall adjective of performance indicator (rank).
CONCLUSIONS: The internal medicine graduate medical education community finds value in many components of the new MSPE format while recognizing that there are further opportunities for improvement.