1. Mattes MD, Ferrari ND III. Impact of Medical Student Disciplinary Actions on the United States National Resident Match. Cureus 2022;14:e24583. PMID: 35651435; PMCID: PMC9138807; DOI: 10.7759/cureus.24583.
Abstract
Introduction: Each year, the United States National Resident Matching Program describes the relative importance of a number of factors in the residency match for each specialty. However, the impact of disciplinary actions taken by a school when a student fails to meet certain expectations is not specifically evaluated, yet it may have a major impact on a physician's future performance. Methods: This study used electronic surveys sent to deans of medical education and residency program directors (PDs) to assess how disciplinary actions are used at US allopathic medical schools and the perceived implications of those actions for the residency match. Results: Thirty-three deans and 158 PDs participated (response rates of 26% and 22%, respectively). The median percentage of students placed on probation each year, as a function of class size, was 3.3% (interquartile range [IQR] 2% to 6%). Three institutions reported placing more than 10% of their students on probation each year, and one institution reported placing 22% of its students on probation each year. A student's risk of failing to match was rated very or extremely likely by deans and PDs, respectively, if there was a history of failed coursework (18.8% and 41.2%, p = 0.017), academic probation (34.4% and 67.1%, p = 0.009), or professionalism probation (78.1% and 83.9%, p = 0.016). The differences among these types of disciplinary action in their impact on the likelihood of interviewing (p < 0.001) and the risk of failing to match (p < 0.001) were also significant in both groups. Conclusion: Significant variability exists in the use and reporting of disciplinary actions at US medical schools. A history of these adverse actions, even if successfully remediated, was thought to reduce a student's likelihood of interviewing and matching. Greater standardization in the use and reporting of disciplinary actions would help ensure equitable treatment of students nationwide.
2. Warm EJ, Englander R, Pereira A, Barach P. Improving Learner Handovers in Medical Education. Acad Med 2017;92:927-931. PMID: 27805952; DOI: 10.1097/acm.0000000000001457.
Abstract
Multiple studies have demonstrated that the information included in the Medical Student Performance Evaluation fails to reliably predict medical students' future performance. This faulty transfer of information can lead to harm when poorly prepared students fail out of residency or, worse, are shuttled through the medical education system without an honest accounting of their performance. Such poor learner handovers likely arise from two root causes: (1) the absence of agreed-on outcomes of training and/or accepted assessments of those outcomes, and (2) the lack of standardized ways to communicate the results of those assessments. To improve the current learner handover situation, an authentic, shared mental model of competency is needed; high-quality tools to assess that competency must be developed and tested; and transparent, reliable, and safe ways to communicate this information must be created. To achieve these goals, the authors propose a learner handover process modeled after a patient handover process. The CLASS model includes a description of the learner's Competency attainment, a summary of the Learner's performance, an Action list and statement of Situational awareness, and Synthesis by the receiving program. The model also includes coaching oriented toward improvement along the continuum of education and care. Just as studies have evaluated patient handover models using the metrics that matter most to patients, studies must evaluate this learner handover model using the metrics that matter most to providers, patients, and learners.
Affiliation(s)
- Eric J Warm
- E.J. Warm is the Sue P. and Richard W. Vilter Professor of Medicine and categorical medicine residency program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. R. Englander is associate dean for undergraduate medical education, University of Minnesota Medical School, Minneapolis, Minnesota. A. Pereira is associate professor and assistant dean for clinical education, University of Minnesota Medical School, Minneapolis, Minnesota. P. Barach is clinical professor, Department of Pediatrics, Wayne State University School of Medicine, Detroit, Michigan
3. Andolsek KM. Improving the Medical Student Performance Evaluation to Facilitate Resident Selection. Acad Med 2016;91:1475-1479. PMID: 27603040; DOI: 10.1097/acm.0000000000001386.
Abstract
The Medical Student Performance Evaluation (MSPE) was introduced as a refinement of the prior "dean's letter" to provide residency program directors with a standardized comprehensive assessment of a medical student's performance throughout medical school. The author argues that, although the MSPE was created with good intentions, many have questioned its efficacy in predicting performance during residency. The author asserts that, despite decades of use and some acknowledged improvement, the MSPE remains a suboptimal tool for informing program directors' decisions about which applicants to interview and rank. In the current approach to MSPEs, there may even be some inherent conflicts of interest that cannot be overcome. In January 2015, an MSPE Task Force was created to review the MSPE over three years and recommend changes to its next iteration. The author believes, however, that expanding this collaborative effort between undergraduate and graduate medical education and other stakeholders could optimize the MSPE's standardization and transparency. The author offers six recommendations for achieving this goal: developing a truly standardized MSPE template; improving faculty accountability in student assessment; enhancing transparency in the MSPE; reconsidering the authorship responsibility of the MSPE; including assessment of compliance with administrative tasks and peer assessments in student evaluations; and embracing milestones for evaluation of medical student performance.
Affiliation(s)
- Kathryn M Andolsek
- K.M. Andolsek is professor of community and family medicine and assistant dean for premedical education, Duke University School of Medicine, Durham, North Carolina
4. A Program Director's Guide to the Medical Student Performance Evaluation (Former Dean's Letter) With a Database. J Am Coll Radiol 2014;11:611-615. DOI: 10.1016/j.jacr.2013.11.012.
5. Lang VJ, Aboff BM, Bordley DR, Call S, Dezee KJ, Fazio SB, Fitz M, Hemmer PA, Logio LS, Wayne DB. Guidelines for writing department of medicine summary letters. Am J Med 2013;126:458-463. PMID: 23582937; DOI: 10.1016/j.amjmed.2013.01.018.
Affiliation(s)
- Valerie J Lang
- University of Rochester School of Medicine & Dentistry, Rochester, NY 14642, USA.
6. Axelson RD, Solow CM, Ferguson KJ, Cohen MB. Assessing implicit gender bias in Medical Student Performance Evaluations. Eval Health Prof 2011;33:365-385. PMID: 20801977; DOI: 10.1177/0163278710375097.
Abstract
For medical schools, the increasing presence of women makes it especially important that potential sources of gender bias be identified and removed from student evaluation methods. Our study looked for patterns of gender bias in the adjective data used to inform our Medical Student Performance Evaluations (MSPEs). Multigroup confirmatory factor analysis (CFA) was used to model the latent structure of the adjectives attributed to students (n = 657) and to test for systematic scoring errors by gender. Gender bias was evident in two areas: (a) women were more likely than comparable men to be described as "compassionate," "sensitive," and "enthusiastic," and (b) men were more likely than comparable women to be seen as "quick learners." The gender gap in "quick learner" attribution grows with increasing student proficiency; men's rate of increase is over twice that of women's. Technical and nontechnical approaches for ameliorating the impact of gender bias on student recommendations are suggested.
Affiliation(s)
- Rick D Axelson
- Office of Consultation and Research in Medical Education, University of Iowa, Iowa City, IA, USA.
7. Chen HC, Teherani A, O'Sullivan P. How does a comprehensive clinical performance examination relate to ratings on the medical school student performance evaluation? Teach Learn Med 2011;23:12-14. PMID: 21240776; DOI: 10.1080/10401334.2011.536752.
Abstract
Background: U.S. medical schools have long used the Medical Student Performance Evaluation (MSPE) to represent overall student performance, while comprehensive clinical performance examinations (CPX) are beginning to emerge as a new standard for determining student competence. Purpose: This study describes the association between the MSPE and the CPX as independent measures of student competence. Methods: We examined the relationship between CPX scores and students' MSPE ratings at our institution, where the MSPE was completed independently of the CPX. Results: Students with higher CPX scores had better MSPE ratings, but the associations were small, ranging from rs = .13 for history-taking skills to rs = .31 for interpersonal skills. Conclusions: CPX results are not strongly related to MSPE ratings and thus may provide information on clinical competencies that should be included in the MSPE.
Affiliation(s)
- Huiju Carrie Chen
- Department of Pediatrics, University of California, San Francisco, California 94143, USA.
8. Swide C, Lasater K, Dillman D. Perceived predictive value of the Medical Student Performance Evaluation (MSPE) in anesthesiology resident selection. J Clin Anesth 2009;21:38-43. DOI: 10.1016/j.jclinane.2008.06.019.
9. Shea JA, O'Grady E, Wagner BR, Morris JB, Morrison G. Professionalism in clerkships: an analysis of MSPE commentary. Acad Med 2008;83:S1-S4. PMID: 18820484; DOI: 10.1097/acm.0b013e318183e547.
Abstract
Background: Professionalism in medical school predicts future behaviors. The authors assessed the prevalence of references to professionalism behaviors in the clerkship commentary portion of Medical Student Performance Evaluations (MSPEs). Method: Content analyses of 293 MSPEs submitted for 2005 graduates. Results: Overall, 70% of MSPEs specifically mentioned professionalism; 96% included information about at least 1 of 16 professional behaviors. The internal medicine clerkship referenced significantly more behaviors than other clerkships. Commentary about behaviors such as interactions (94%) and motivation (91%) was common; behaviors such as truthfulness (8%) and confidentiality (6%) were rarely mentioned. Fewer than 1% of comments could be considered negative. Conclusions: Most professionalism comments in MSPEs are generic and somewhat bland, tending to concern students' collegial interactions and hard work. More detail and breadth may be facilitated by wider use of behavior-centered evaluation in clerkships.
Affiliation(s)
- Judy A Shea
- University of Pennsylvania, 1223 Blockley Hall, 423 Guardian Drive, Philadelphia, PA 19104-6021, USA.
10. Shea JA, O'Grady E, Morrison G, Wagner BR, Morris JB. Medical Student Performance Evaluations in 2005: an improvement over the former dean's letter? Acad Med 2008;83:284-291. PMID: 18316879; DOI: 10.1097/acm.0b013e3181637bdd.
Abstract
Purpose: To collect information regarding the preparation, content, and format of Medical Student Performance Evaluations (MSPEs) and to evaluate a sample of 2005 MSPEs for compliance with the 2002 Association of American Medical Colleges MSPE guidelines. Method: Cross-sectional survey of all 126 U.S. allopathic medical schools. Associate deans of student affairs were sent an eight-item questionnaire in June 2006 and asked to submit a sample of redacted MSPEs for 2005 graduates, choosing one from each tertile of the class. Survey data are summarized; MSPEs were abstracted, and results are presented descriptively. Results: The survey response rate was 84%. Most associate deans (71%) reported having primary responsibility for composing MSPEs; 78% adhered to the format and content guidelines three fourths of the time. Abstraction of 293 MSPEs (78%) showed that more than 80% adhered to format recommendations. However, only 70% to 80% stated grades clearly, avoided the word "recommendation," and stated whether the student had completed remediation. Fewer than 70% indicated whether the student had had any adverse actions or provided adequate comparative data. Strikingly, only 17% provided comparative data in the summary paragraph. Overall, 75% of the MSPEs were judged to be "adequate." Conclusions: MSPEs are somewhat variable in which specific items are included. There has been steady quality improvement since prior surveys, primarily in formatting and labeling. However, a sizable minority of writers are still using the MSPE as a recommendation, and too few are providing helpful comparative data.
Affiliation(s)
- Judy A Shea
- Department of Medicine, University of Pennsylvania, 1223 Blockley Hall, 423 Guardian Drive, Philadelphia, PA 19104-6021, USA.
11. Naidich JB, Lee JY, Hansen EC, Smith LG. The meaning of excellence. Acad Radiol 2007;14:1121-1126. PMID: 17707321; DOI: 10.1016/j.acra.2007.05.022.
Abstract
Rationale and Objectives: Program directors would like to interview the very best students applying to their programs. The summary paragraph of the dean's letter should provide useful information about a student's performance in medical school. One frequently found descriptor is "excellent"; its very frequency, however, suggests the word may be loosely used. The purpose of this investigation was to determine the meaning of "excellence." Materials and Methods: Summary paragraphs were searched for the descriptor "excellent." An effort was made to determine how many medical schools used the word, how precisely each school defined it, whether numbers were used to set the upper and lower boundaries of "excellent," and what other buzzwords were used in the summary paragraphs for students not described as excellent. Results: "Excellent" was the most common descriptor, used by 75% of medical schools. Defined numeric boundaries were used by 47% of schools. Within a single school, the range of excellence varied from as tight as 20 percentile points to so broad that 65% of the students were classified as excellent. Among different schools, the boundaries of "excellent" varied from as low as the third percentile to as high as the ninety-second. In half the schools, students described as excellent might be in the bottom half of their class. A total of 28% of schools used "excellent" without any numeric definition. No school used "excellent" to describe its best students. Conclusions: Medical student deans often exaggerate the quality of their graduates by using the word "excellent" at variance with its dictionary definition of exceptionally good. Inaccurate descriptions by deans of their graduating medical students diminish the value of the MSPE.
Affiliation(s)
- James B Naidich
- Department of Radiology, North Shore University Hospital, Manhasset, NY 11030, USA.
12. Lurie SJ, Lambert DR, Grady-Weliky TA. Relationship between dean's letter rankings and later evaluations by residency program directors. Teach Learn Med 2007;19:251-256. PMID: 17594220; DOI: 10.1080/10401330701366523.
Abstract
Background: It is not known how well dean's letter rankings predict later performance in residency. Purpose: To assess the accuracy of dean's letter rankings in predicting clinical performance during internship. Methods: Participants were medical students who graduated from the University of Rochester School of Medicine and Dentistry in the classes of 2003 and 2004. In their dean's letter, each student was ranked as "outstanding" (upper quartile), "excellent" (second quartile), "very good" (lower two quartiles), or "good" (lowest few percentiles). We compared these dean's letter rankings against the results of questionnaires sent to program directors 9 months after graduation. Results: The questionnaire response rate was 58.9% (109 of 185 eligible graduates), with no differences in response rate across the four dean's letter ranking categories. Program directors rated students in the top two ranking categories significantly higher than those in the very good group, and students in all three of these groups were rated significantly higher than those in the good group, F(3, 105) = 13.37, p < .001. Ratings of students in the very good group were the most variable, with many receiving ratings as high as those of students in the upper two groups. There were no differences by gender or specialty. Conclusion: Dean's letter rankings are a significant predictor of later performance in internship among graduates of our medical school. Students in the bottom half of the class are the most likely either to underperform or to overperform in internship.
Affiliation(s)
- Stephen J Lurie
- Department of Family Medicine, University of Rochester School of Medicine and Dentistry, Rochester, New York 14642, USA.
13. Lurie SJ, Lambert DR, Nofziger AC, Epstein RM, Grady-Weliky TA. Relationship between peer assessment during medical school, dean's letter rankings, and ratings by internship directors. J Gen Intern Med 2007;22:13-16. PMID: 17351836; PMCID: PMC1824780; DOI: 10.1007/s11606-007-0117-4.
Abstract
Background: It is not known to what extent the dean's letter (Medical Student Performance Evaluation [MSPE]) reflects peer-assessed work habits (WH) and/or interpersonal attributes (IA) of students. Objective: To compare peer ratings of the WH and IA of second- and third-year medical students with later MSPE rankings and with ratings by internship program directors. Design and Participants: Participants were 281 medical students from the classes of 2004, 2005, and 2006 at a private medical school in the northeastern United States who had participated in peer assessment exercises in the second and third years of medical school. For students from the class of 2004, peer assessment data were also compared against later evaluations obtained from internship program directors. Results: Peer-assessed WH were predictive of later MSPE groups in both the second (F = 44.90, P < .001) and third (F = 29.54, P < .001) years of medical school. IA were not related to MSPE rankings in either year. MSPE rankings for a majority of students were predictable from peer-assessed WH scores. Internship directors' ratings were significantly related to second- and third-year peer-assessed WH scores (r = .32 [P = .15] and r = .43 [P = .004], respectively), but not to peer-assessed IA. Conclusions: Peer assessment of WH, as early as the second year of medical school, can predict later MSPE rankings and internship performance. Although peer-assessed IA can be measured reliably, they are unrelated to either outcome.
Affiliation(s)
- Stephen J Lurie
- Office of Educational Evaluation and Research, University of Rochester School of Medicine and Dentistry, Rochester, NY 14624, USA.